Cobb, Ernest D.
1986-01-01
The U.S. Geological Survey evaluated 2,349 communities in 1984 for the application of limited-detail flood-insurance study methods, that is, methods with a reduced effort and cost compared to detailed studies. Limited-detail study methods were found to be appropriate for 1,705 communities, while detailed studies were appropriate for 62 communities and no studies were appropriate for 582 communities. The total length of streams for which limited-detail studies are recommended is 9,327 miles, with a corresponding cost of $23,007,000, an average estimated cost of about $2,500 per mile of studied stream length. The purpose of the report is to document the limited-detail study methods and the results of the evaluation. (USGS)
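The per-mile figure quoted above is a rounded average; the underlying arithmetic can be checked directly (an illustrative back-of-the-envelope sketch, not part of the report):

```python
# Reported totals from the abstract.
total_cost = 23_007_000   # dollars, limited-detail studies
stream_miles = 9_327      # miles of stream recommended for study

cost_per_mile = total_cost / stream_miles
print(round(cost_per_mile))  # 2467, consistent with the rounded ~$2,500/mile figure
```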
NASA Astrophysics Data System (ADS)
Shea, Joan-Emma; Brooks, Charles L., III
2001-10-01
Beginning with simplified lattice and continuum "minimalist" models and progressing to detailed atomic models, simulation studies have augmented and directed development of the modern landscape perspective of protein folding. In this review we discuss aspects of detailed atomic simulation methods applied to studies of protein folding free energy surfaces, using biased-sampling free energy methods and temperature-induced protein unfolding. We review studies using each approach on systems of particular experimental interest and assess the strengths and weaknesses of each approach in the context of "exact" results for both free energies and kinetics of a minimalist model for a beta-barrel protein. We illustrate in detail how each approach is implemented and discuss analysis methods that have been developed as components of these studies. We describe key insights into the relationship between protein topology and the folding mechanism emerging from folding free energy surface calculations. We further describe the determination of detailed "pathways" and models of folding transition states that have resulted from unfolding studies. Our assessment of the two methods suggests that both can provide often complementary details of the folding mechanism and thermodynamics, but this success relies on (a) adequate sampling of diverse conformational regions for the biased-sampling free energy approach and (b) many trajectories at multiple temperatures for unfolding studies. Furthermore, we find that temperature-induced unfolding provides representatives of folding trajectories only when the topology and sequence (energy) provide a relatively funneled landscape and "off-pathway" intermediates do not exist.
IMPROVING THE REPORTING OF THERAPEUTIC EXERCISE INTERVENTIONS IN REHABILITATION RESEARCH.
Page, Phil; Hoogenboom, Barb; Voight, Michael
2017-04-01
The foundation of evidence-based practice lies in clinical research, which is based on the utilization of the scientific method. The scientific method requires that all details of the experiment be provided in publications to support replication of the study in order to evaluate and validate the results. More importantly, clinical research can only be translated into practice when researchers provide explicit details of the study. Too often, rehabilitation exercise intervention studies lack the appropriate detail to allow clinicians to replicate the exercise protocol in their patient populations. Therefore, the purpose of this clinical commentary is to provide guidelines for optimal reporting of therapeutic exercise interventions in rehabilitation research. Level of evidence: 5.
Key Features of Academic Detailing: Development of an Expert Consensus Using the Delphi Method
Yeh, James S.; Van Hoof, Thomas J.; Fischer, Michael A.
2016-01-01
Background Academic detailing is an outreach education technique that combines the direct social marketing traditionally used by pharmaceutical representatives with unbiased content summarizing the best evidence for a given clinical issue. Academic detailing is conducted with clinicians to encourage evidence-based practice in order to improve the quality of care and patient outcomes. The adoption of academic detailing has increased substantially since the original studies in the 1980s. However, the lack of standard agreement on its implementation makes the evaluation of academic detailing outcomes challenging. Objective To identify consensus on the key elements of academic detailing among a group of experts with varying experiences in academic detailing. Methods This study is based on an online survey of 20 experts with experience in academic detailing. We used the Delphi process, an iterative and systematic method of developing consensus within a group. We conducted 3 rounds of online surveys, which addressed 72 individual items derived from a previous literature review of 5 features of academic detailing, including (1) content, (2) communication process, (3) clinicians targeted, (4) change agents delivering intervention, and (5) context for intervention. Nonrespondents were removed from later rounds of the surveys. For most questions, a 4-point ordinal scale was used for responses. We defined consensus agreement as 70% of respondents for a single rating category or 80% for dichotomized ratings. Results The overall survey response rate was 95% (54 of 57 surveys), and consensus agreement was reached on nearly 92% of the survey items (66 of 72) by the end of the Delphi exercise.
The experts' responses suggested that (1) focused clinician education offering support for clinical decision-making is a key component of academic detailing, (2) detailing messages need to be tailored and provide feasible strategies and solutions to challenging cases, and (3) academic detailers need to develop specific skill sets required to overcome barriers to changing clinician behavior. Conclusion Consensus derived from this Delphi exercise can serve as a useful template of general principles in academic detailing initiatives and evaluation. The study findings are limited by the lack of standard definitions of certain terms used in the Delphi process. PMID:27066195
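The consensus rule described in the abstract (70% agreement on a single rating category, or 80% after dichotomizing the 4-point scale) can be sketched as a small function; the example rating vectors below are hypothetical, not data from the study:

```python
from collections import Counter

def reaches_consensus(ratings, single_cut=0.70, dichot_cut=0.80):
    """Consensus rule as described in the abstract: 70% of respondents in
    a single rating category, or 80% after dichotomizing the 4-point
    scale into low (1-2) versus high (3-4)."""
    n = len(ratings)
    counts = Counter(ratings)
    if max(counts.values()) / n >= single_cut:
        return True
    high = sum(1 for r in ratings if r >= 3) / n
    return high >= dichot_cut or (1 - high) >= dichot_cut

# 20 hypothetical experts: 13 rate "agree" (3), 7 "strongly agree" (4).
# No single category reaches 70%, but the dichotomized rule is satisfied.
print(reaches_consensus([3] * 13 + [4] * 7))  # True
```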
How Much Detail Needs to Be Elucidated in Self-Harm Research?
ERIC Educational Resources Information Center
Stanford, Sarah; Jones, Michael P.
2010-01-01
Assessing self-harm through brief multiple choice items is simple and less invasive than more detailed methods of assessment. However, there is currently little validation for brief methods of self-harm assessment. This study evaluates the extent to which adolescents' perceptions of self-harm agree with definitions in the literature, and what…
Methods for comparative evaluation of propulsion system designs for supersonic aircraft
NASA Technical Reports Server (NTRS)
Tyson, R. M.; Mairs, R. Y.; Halferty, F. D., Jr.; Moore, B. E.; Chaloff, D.; Knudsen, A. W.
1976-01-01
The propulsion system comparative evaluation study was conducted to define a rapid, approximate method for evaluating the effects of propulsion system changes for an advanced supersonic cruise airplane, and to verify the approximate method by comparing its mission performance results with those from a more detailed analysis. A table look-up computer program was developed to determine nacelle drag increments for a range of parametric nacelle shapes and sizes. Aircraft sensitivities to propulsion parameters were defined. Nacelle shapes, installed weights, and installed performance were determined for four study engines selected from the NASA supersonic cruise aircraft research (SCAR) engine studies program. Both the rapid evaluation method (using sensitivities) and traditional preliminary design methods were then used to assess the four engines. The rapid method was found to compare well with the more detailed analyses.
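A sensitivity-based rapid evaluation of the kind described amounts to a first-order estimate: the change in a mission figure of merit is approximated as a weighted sum of propulsion parameter changes. The sketch below is a hypothetical illustration; the parameter names and sensitivity values are made up, not taken from the report:

```python
def range_delta(sensitivities, deltas):
    """First-order estimate: sum over parameters of
    (d_range / d_param) * delta_param."""
    return sum(sensitivities[p] * deltas[p] for p in deltas)

# Illustrative (made-up) sensitivities of mission range to propulsion changes.
sens = {"sfc": -45.0, "weight": -0.8, "drag": -120.0}   # n.mi. per unit change
change = {"sfc": 0.02, "weight": 500.0, "drag": 1.5}    # proposed engine deltas
print(range_delta(sens, change))  # ~ -580.9 n.mi. with these made-up numbers
```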
NASA Technical Reports Server (NTRS)
Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.
1993-01-01
Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to insure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross section frame members having design details characteristic of the baseline ATCAS crown design.
Calculation methods study on hot spot stress of new girder structure detail
NASA Astrophysics Data System (ADS)
Liao, Ping; Zhao, Renda; Jia, Yi; Wei, Xing
2017-10-01
To study modeling and calculation methods for the hot spot stress of a new girder structure detail, several finite element models of this welded detail were established in ANSYS, based on the surface extrapolation variant of the hot spot stress method. The influence of element type, mesh density, local modeling of the weld toe, and extrapolation method on the calculated hot spot stress at the weld toe was analyzed. The results show that the smaller the distance from the weld toe, the larger the differences among models in the normal stress along the thickness and surface directions. When the distance from the toe is greater than 0.5t, the normal stress along the surface direction of solid models, shell models with welds, and shell models without welds tends to be consistent. It is therefore recommended that the extrapolation points be selected beyond 0.5t for this new girder welded detail. The shell models showed good mesh stability, and the extrapolated hot spot stress of the solid models was smaller than that of the shell models; accordingly, extrapolation formula 2 and the solid45 element are suggested for the hot spot stress extrapolation of this welded detail. For each finite element model, regardless of shell modeling method, the results calculated by formula 2 are smaller than those of the other two methods, and the results of shell models with welds are the largest. Under the same local mesh density, the extrapolated hot spot stress decreases gradually as the number of element layers through the thickness of the main plate increases, with a variation range within 7.5%.
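Surface extrapolation of hot spot stress is, at its core, a linear extrapolation of surface stresses read out at reference distances from the weld toe back to the toe itself. The sketch below is a generic two-point rule, not the abstract's "formula 2"; the read-out distances (here 0.5t and 1.5t, consistent with the abstract's recommendation to stay beyond 0.5t) and the stress values are illustrative assumptions:

```python
def hot_spot_stress(x1, s1, x2, s2):
    """Linearly extrapolate surface stresses s1, s2 measured at distances
    x1 < x2 from the weld toe back to the toe (x = 0). The specific
    read-out points depend on the code of practice being followed."""
    slope = (s2 - s1) / (x2 - x1)
    return s1 - slope * x1

t = 16.0  # main plate thickness in mm (illustrative)
# Stresses (MPa, made up) read at 0.5t and 1.5t from the toe:
print(hot_spot_stress(0.5 * t, 110.0, 1.5 * t, 95.0))  # 117.5
```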
ERIC Educational Resources Information Center
Swanson, James; Arnold, L. Eugene; Kraemer, Helena; Hechtman, Lily; Molina, Brooke; Hinshaw, Stephen; Vitiello, Benedetto; Jensen, Peter; Steinhoff, Ken; Lerner, Marc; Greenhill, Laurence; Abikoff, Howard; Wells, Karen; Epstein, Jeffery; Elliott, Glen; Newcorn, Jeffrey; Hoza, Betsy; Wigal, Timothy
2008-01-01
Objective: To review the primary and secondary findings from the Multimodal Treatment study of ADHD (MTA) published over the past decade as three sets of articles. Method: In a two-part article--Part I: Executive Summary (without distracting details) and Part II: Supporting Details (with additional background and detail required by the complexity…
Effects of Detailed Illustrations on Science Learning: An Eye-Tracking Study
ERIC Educational Resources Information Center
Lin, Yu Ying; Holmqvist, Kenneth; Miyoshi, Kiyofumi; Ashida, Hiroshi
2017-01-01
The eye-tracking method was used to assess the influence of detailed, colorful illustrations on reading behaviors and learning outcomes. Based on participants' subjective ratings in a pre-study, we selected eight one-page human anatomy lessons. In the main study, participants learned these eight human anatomy lessons; four were accompanied by…
Mei, Liang; Svanberg, Sune
2015-03-20
This work presents a detailed study of the theoretical aspects of the Fourier analysis method, which has been utilized for gas absorption harmonic detection in wavelength modulation spectroscopy (WMS). The lock-in detection of the harmonic signal is accomplished by studying the phase term of the inverse Fourier transform of the Fourier spectrum that corresponds to the harmonic signal. The mathematics and the corresponding simulation results are given for each procedure when applying the Fourier analysis method. The present work provides a detailed view of the WMS technique when applying the Fourier analysis method.
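The lock-in step in such Fourier-analysis WMS schemes reduces to extracting the complex amplitude of a chosen harmonic of the modulation frequency from the sampled detector signal. A minimal sketch, with a synthetic signal standing in for real WMS data (the component amplitudes and phases are assumptions for illustration):

```python
import cmath
import math

def harmonic(signal, n, period):
    """Complex amplitude of the n-th harmonic of a modulation whose period
    is `period` samples; `signal` must span a whole number of periods.
    Magnitude and phase correspond to a lock-in amplifier's output."""
    N = len(signal)
    w = -2j * math.pi * n / period
    return (2.0 / N) * sum(s * cmath.exp(w * k) for k, s in enumerate(signal))

# Synthetic WMS-like signal: DC offset plus 1f and 2f components.
period = 100
sig = [0.5 + 1.0 * math.cos(2 * math.pi * k / period)
           + 0.3 * math.cos(4 * math.pi * k / period + 0.8)
       for k in range(5 * period)]

c2 = harmonic(sig, 2, period)
print(abs(c2), cmath.phase(c2))  # recovers the 2f amplitude (~0.3) and phase (~0.8)
```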
The enhancement of friction ridge detail on brass ammunition casings using cold patination fluid.
James, Richard Michael; Altamimi, Mohamad Jamal
2015-12-01
Brass ammunition is commonly found at firearms-related crime scenes. For this reason, many studies have focused on evidence that can be obtained from brass ammunition, such as DNA, gunshot residue and fingerprints. Latent fingerprints on ammunition can provide good forensic evidence; however, fingerprint development on ammunition casings has proven to be difficult. A method using cold patination fluid is described as a potential tool to enhance friction ridge detail on brass ammunition casings. Current latent fingerprint development methods for brass ammunition have either failed to provide the necessary quality of friction ridge detail or can be very time consuming and require expensive equipment. In this study, the enhancement of fingerprints on live ammunition has been achieved with a good level of detail, whilst development on spent casings has to an extent also been possible. Development with cold patination fluid has proven to be a quick, simple and cost-effective method for fingerprint development on brass ammunition that can be easily implemented for routine police work.
Liao, Hongjing; Hitchcock, John
2018-06-01
This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary (i.e., basic) design techniques, such as sampling/participant recruitment strategies, data collection methods, and analytic details; and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and authors should develop strategies yielding fuller methodological description.
ChIP-seq and RNA-seq methods to study circadian control of transcription in mammals
Takahashi, Joseph S.; Kumar, Vivek; Nakashe, Prachi; Koike, Nobuya; Huang, Hung-Chung; Green, Carla B.; Kim, Tae-Kyung
2015-01-01
Genome-wide analyses have revolutionized our ability to study the transcriptional regulation of circadian rhythms. The advent of next-generation sequencing methods has facilitated the use of two such technologies, ChIP-seq and RNA-seq. In this chapter, we describe detailed methods and protocols for these two techniques, with emphasis on their usage in circadian rhythm experiments in the mouse liver, a major target organ of the circadian clock system. Critical factors for these methods are highlighted and issues arising with time series samples for ChIP-seq and RNA-seq are discussed. Finally detailed protocols for library preparation suitable for Illumina sequencing platforms are presented. PMID:25662462
Verifying detailed fluctuation relations for discrete feedback-controlled quantum dynamics
NASA Astrophysics Data System (ADS)
Camati, Patrice A.; Serra, Roberto M.
2018-04-01
Discrete quantum feedback control consists of dynamics managed according to the information acquired by a previous measurement. Energy fluctuations along such dynamics satisfy generalized fluctuation relations, which are useful tools to study the thermodynamics of systems far from equilibrium. Due to the practical challenge of assessing energy fluctuations in the quantum scenario, the experimental verification of detailed fluctuation relations in the presence of feedback control remains elusive. We present a feasible method to experimentally verify detailed fluctuation relations for discrete feedback-controlled quantum dynamics. Two detailed fluctuation relations are developed and employed. The method is based on a quantum interferometric strategy that allows the verification of fluctuation relations in the presence of feedback control. An analytical example to illustrate the applicability of the method is discussed. The comprehensive technique introduced here can be experimentally implemented at the microscale with current technology in a variety of experimental platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.
NASA Astrophysics Data System (ADS)
Chen, Chun-Nan; Luo, Win-Jet; Shyu, Feng-Lin; Chung, Hsien-Ching; Lin, Chiun-Yan; Wu, Jhao-Ying
2018-01-01
Using a non-equilibrium Green’s function framework in combination with the complex energy-band method, an atomistic full-quantum model for solving quantum transport problems for a zigzag-edge graphene nanoribbon (zGNR) structure is proposed. For transport calculations, the mathematical expressions from the theory for zGNR-based device structures are derived in detail. The transport properties of zGNR-based devices are then calculated and studied using the proposed method.
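The non-equilibrium Green's function approach has a well-known scalar limit that illustrates the machinery without the zGNR-specific complex band structure: a single electronic level coupled to two wide-band leads, where the transmission is T(E) = Γ_L Γ_R |G(E)|² with G(E) = 1/(E − ε + i(Γ_L + Γ_R)/2). This toy sketch is a standard textbook special case, not the paper's model:

```python
def transmission(E, eps, gamma_L, gamma_R):
    """Landauer transmission through a single level (energy eps) coupled
    to two leads with broadenings gamma_L, gamma_R (wide-band limit)."""
    # Retarded Green's function G = 1 / (E - eps + i*(gamma_L + gamma_R)/2)
    G = 1.0 / complex(E - eps, 0.5 * (gamma_L + gamma_R))
    return gamma_L * gamma_R * abs(G) ** 2

# Symmetric coupling on resonance gives perfect transmission T = 1.
print(transmission(0.0, 0.0, 0.1, 0.1))  # 1.0
```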
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Adaptive Single Exponential Smoothing ... Choosing the Smoothing Constant ... the methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data, a... Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and
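The forecasting technique named in the fragments above, single exponential smoothing, updates a level estimate as S_t = αx_t + (1−α)S_{t−1}. A minimal sketch of the plain (non-adaptive) form, with made-up demand data; the adaptive variant mentioned in the record additionally adjusts α over time:

```python
def ses_forecast(demand, alpha, s0=None):
    """Single exponential smoothing: S_t = alpha*x_t + (1-alpha)*S_{t-1}.
    Returns the one-step-ahead forecast after processing the series."""
    s = demand[0] if s0 is None else s0
    for x in demand:
        s = alpha * x + (1 - alpha) * s
    return s

# Illustrative monthly demand for one EOQ item (made-up numbers).
print(ses_forecast([10, 12, 8, 11, 9], alpha=0.2))  # ~9.91
```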
ERIC Educational Resources Information Center
Cunningham, Corbin A.; Yassa, Michael A.; Egeth, Howard E.
2015-01-01
Previous work suggests that visual long-term memory (VLTM) is highly detailed and has a massive capacity. However, memory performance is subject to the effects of the type of testing procedure used. The current study examines detail memory performance by probing the same memories within the same subjects, but using divergent probing methods. The…
An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders
ERIC Educational Resources Information Center
Venker, Courtney E.; Kover, Sara T.
2015-01-01
Purpose: Eye-gaze methods have the potential to advance the study of neurodevelopmental disorders. Despite their increasing use, challenges arise in using these methods with individuals with neurodevelopmental disorders and in reporting sufficient methodological detail such that the resulting research is replicable and interpretable. Method: This…
Care Staff Perceptions of Choking Incidents: What Details Are Reported?
ERIC Educational Resources Information Center
Guthrie, Susan; Lecko, Caroline; Roddam, Hazel
2015-01-01
Background: Following a series of fatal choking incidents in one UK specialist service, this study evaluated the detail included in incident reporting. This study compared the enhanced reporting system in the specialist service with the national reporting and learning system. Methods: Eligible reports were selected from a national organization and…
40 CFR Appendix D to Part 136 - Precision and Recovery Statements for Methods for Measuring Metals
Code of Federal Regulations, 2011 CFR
2011-07-01
... Accuracy Section with the following: Precision and Accuracy An interlaboratory study on metal analyses by... details are found in “USEPA Method Study 7, Analyses for Trace Metals in Water by Atomic Absorption... study on metal analyses by this method was conducted by the Quality Assurance Branch (QAB) of the...
Plume rise study at Colbert Steam Plant--data presentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, T.L.; Coleman, J.H.
1979-05-01
This report makes detailed data on plume rise available for independent analysis by other specialists studying atmospheric dispersion. Techniques of data collection and methods of data reduction are detailed. Data from 24 time-averaged observations of the plume at Colbert Steam Plant, its source, and the meteorological conditions are reported. Most of the data were collected during early to midmorning and are therefore characterized by stable atmospheric conditions. The data are presented in both a summary and a detailed format.
Modeling and scaleup of steamflood in a heterogeneous reservoir
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dehghani, K.; Basham, W.M.; Durlofsky, L.J.
1995-11-01
A series of simulation runs was conducted for different geostatistically derived cross-sectional models to study the degree of heterogeneity required for proper modeling of steamfloods in a thick, heavy-oil reservoir with thin diatomite barriers. Different methods for coarsening the most detailed models were applied, and performance predictions for the coarsened and detailed models were compared. Use of a general scaleup method provided the most accurate coarse-grid models.
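One reason thin low-permeability barriers dominate coarsening decisions is the asymmetry of single-phase averaging rules: flow parallel to layers averages permeability arithmetically, while flow across layers averages harmonically. The sketch below illustrates that asymmetry with made-up layer permeabilities; it is a textbook special case, not the paper's general scaleup method:

```python
def arithmetic_mean(ks):
    """Effective permeability for flow parallel to equal-thickness layers."""
    return sum(ks) / len(ks)

def harmonic_mean(ks):
    """Effective permeability for flow across equal-thickness layers."""
    return len(ks) / sum(1.0 / k for k in ks)

# Illustrative layer permeabilities (md): one thin diatomite-like barrier.
layers = [500.0, 500.0, 0.5, 500.0]
print(arithmetic_mean(layers))  # ~375: the barrier barely affects parallel flow
print(harmonic_mean(layers))    # ~2: the barrier dominates cross-flow
```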
Research on the Hotel Image Based on the Detail Service
NASA Astrophysics Data System (ADS)
Li, Ban; Shenghua, Zheng; He, Yi
Detail service management, initially developed as a marketing program to enhance customer loyalty, has now become an important part of customer relation strategy. This paper analyzes the critical factors of detail service and their influence on the hotel image. We establish a theoretical model of the factors influencing hotel image, propose corresponding hypotheses, and test and verify those hypotheses using statistical methods. This paper provides a foundation for further study of detail service design and planning issues.
A summary of methods for the collection and analysis of basic hydrologic data for arid regions
Rantz, S.E.; Eakin, T.E.
1971-01-01
This report summarizes and discusses current methods of collecting and analyzing the data required for a study of the basic hydrology of arid regions. The fundamental principles behind these methods are no different than those that apply to studies of humid regions, but in arid regions the infrequent occurrence of precipitation, the great variability of the many hydrologic elements, and the inaccessibility of most basins usually make it economically infeasible to use conventional levels of instrumentation. Because of these economic considerations hydrologic studies in arid regions have been commonly of the reconnaissance type; the more costly detailed studies are generally restricted to experimental basins and to those basins that now have major economic significance. A thorough search of the literature and personal communication with workers in the field of arid-land hydrology provided the basis for this summary of methods used in both reconnaissance and detailed hydrologic studies. The conclusions reached from a consideration of previously reported methods are interspersed in this report where appropriate.
Frank, Lawrence D; Fox, Eric H; Ulmer, Jared M; Chapman, James E; Kershaw, Suzanne E; Sallis, James F; Conway, Terry L; Cerin, Ester; Cain, Kelli L; Adams, Marc A; Smith, Graham R; Hinckson, Erica; Mavoa, Suzanne; Christiansen, Lars B; Hino, Adriano Akira F; Lopes, Adalberto A S; Schipperijn, Jasper
2017-01-23
Advancements in geographic information systems over the past two decades have increased the specificity by which an individual's neighborhood environment may be spatially defined for physical activity and health research. This study investigated how different types of street network buffering methods compared in measuring a set of commonly used built environment measures (BEMs) and tested their performance on associations with physical activity outcomes. An internationally developed set of objective BEMs using three different spatial buffering techniques was used to evaluate the relative differences in resulting explanatory power on self-reported physical activity outcomes. BEMs were developed in five countries using 'sausage,' 'detailed-trimmed,' and 'detailed' network buffers at a distance of 1 km around participant household addresses (n = 5883). BEM values were significantly different (p < 0.05) for 96% of sausage versus detailed-trimmed buffer comparisons and 89% of sausage versus detailed network buffer comparisons. Results showed that BEM coefficients in physical activity models did not differ significantly across buffering methods, and in most cases BEM associations with physical activity outcomes had the same level of statistical significance across buffer types. However, BEM coefficients differed in significance for 9% of the sausage versus detailed models, which may warrant further investigation. Results of this study inform the selection of spatial buffering methods to estimate physical activity outcomes using an internationally consistent set of BEMs. Across the three network-based buffering methods, the findings indicate significant variation among BEM values; however, associations with physical activity outcomes were similar across each buffering technique.
The study advances knowledge by presenting consistently assessed relationships between three different network buffer types and utilitarian travel, sedentary behavior, and leisure-oriented physical activity outcomes.
Improved parallel image reconstruction using feature refinement.
Cheng, Jing; Jia, Sen; Ying, Leslie; Liu, Yuanyuan; Wang, Shanshan; Zhu, Yanjie; Li, Ye; Zou, Chao; Liu, Xin; Liang, Dong
2018-07-01
The aim of this study was to develop a novel feature-refinement MR reconstruction method from highly undersampled multichannel acquisitions, improving image quality while preserving more detailed information. The feature refinement technique, which uses a feature descriptor to pick up useful features from the residual image discarded by sparsity constraints, is applied to preserve image detail in compressed sensing and parallel imaging in MRI (CS-pMRI). A texture descriptor and a structure descriptor, recognizing different types of features, together form the feature descriptor. Feasibility of the feature refinement was validated using three different multicoil reconstruction methods on in vivo data. Experimental results show that reconstruction methods with feature refinement improve the quality of the reconstructed image and restore image details more accurately than the original methods, which is also verified by lower values of the root mean square error and high-frequency error norm. A simple and effective way to preserve more useful detailed information in CS-pMRI is proposed. This technique can effectively improve reconstruction quality and has superior performance in terms of detail preservation compared with the original version without feature refinement. Magn Reson Med 80:211-223, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
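The core idea, rescuing structure from the residual that sparsity constraints throw away, can be shown in a schematic 1-D analogue. This is a hypothetical illustration, not the authors' CS-pMRI code: soft thresholding plays the role of the sparsity constraint, and a simple magnitude test stands in for the texture/structure feature descriptor:

```python
def soft_threshold(x, lam):
    """Sparsity-promoting shrinkage: reduce each magnitude by lam, floor at 0."""
    return [max(abs(v) - lam, 0.0) * (1 if v >= 0 else -1) for v in x]

def refine(x, lam, feature_cut):
    """Add back residual entries that the 'descriptor' flags as features."""
    base = soft_threshold(x, lam)
    residual = [v - b for v, b in zip(x, base)]
    # Toy descriptor: keep residual entries whose magnitude clears the cut.
    picked = [r if abs(r) >= feature_cut else 0.0 for r in residual]
    return [b + p for b, p in zip(base, picked)]

# Large entries get their shrunken-off part restored; small ones stay suppressed.
print(refine([0.05, 0.9, -0.02, 0.4], lam=0.1, feature_cut=0.08))
```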
NASA Astrophysics Data System (ADS)
Ziegler, Hannes Moritz
Planners and managers often rely on coarse population distribution data from the census for addressing various social, economic, and environmental problems. In the analysis of physical vulnerabilities to sea-level rise, census units such as blocks or block groups are coarse relative to the required decision-making application. This study explores the benefits offered from integrating image classification and dasymetric mapping at the household level to provide detailed small area population estimates at the scale of residential buildings. In a case study of Boca Raton, FL, a sea-level rise inundation grid based on mapping methods by NOAA is overlaid on the highly detailed population distribution data to identify vulnerable residences and estimate population displacement. The enhanced spatial detail offered through this method has the potential to better guide targeted strategies for future development, mitigation, and adaptation efforts.
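Dasymetric allocation at the household level amounts to redistributing a coarse census count over the residential buildings inside the unit, weighted by some ancillary variable such as building footprint or floor area. A minimal sketch with hypothetical building IDs and areas (the real study derives these weights from image classification):

```python
def allocate_population(block_pop, building_areas):
    """Distribute a census block's population to residential buildings
    in proportion to building area -- a simple dasymetric weighting,
    one of several possible household-level schemes."""
    total = sum(building_areas.values())
    return {b: block_pop * a / total for b, a in building_areas.items()}

# Hypothetical block of 120 people and three residential buildings (m^2).
est = allocate_population(120, {"b1": 200.0, "b2": 300.0, "b3": 500.0})
print(est)  # {'b1': 24.0, 'b2': 36.0, 'b3': 60.0}
```

Overlaying an inundation grid on the resulting per-building estimates then yields the displaced-population count directly as a sum over flagged buildings.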
NASA Technical Reports Server (NTRS)
Hartung, Lin C.
1991-01-01
A method for predicting radiation absorption and emission coefficients in thermochemical nonequilibrium flows is developed. The method is called the Langley optimized radiative nonequilibrium code (LORAN). It applies the smeared band approximation for molecular radiation to produce moderately detailed results and is intended to fill the gap between detailed but costly prediction methods and very fast but highly approximate methods. The optimization of the method to provide efficient solutions allowing coupling to flowfield solvers is discussed. Representative results are obtained and compared to previous nonequilibrium radiation methods, as well as to ground- and flight-measured data. Reasonable agreement is found in all cases. A multidimensional radiative transport method is also developed for axisymmetric flows. Its predictions for wall radiative flux are 20 to 25 percent lower than those of the tangent slab transport method, as expected, though additional investigation of the symmetry and outflow boundary conditions is indicated. The method was applied to the peak heating condition of the aeroassist flight experiment (AFE) trajectory, with results comparable to predictions from other methods. The LORAN method was also applied in conjunction with the computational fluid dynamics (CFD) code LAURA to study the sensitivity of the radiative heating prediction to various models used in nonequilibrium CFD. This study suggests that radiation measurements can provide diagnostic information about the detailed processes occurring in a nonequilibrium flowfield because radiation phenomena are very sensitive to these processes.
Evaluation of fatigue-prone details using a low-cost thermoelastic stress analysis system.
DOT National Transportation Integrated Search
2016-11-01
This study was designed to develop a novel approach for in situ evaluation of stress fields in the vicinity of fatigue-prone details on highway bridges using a low-cost microbolometer thermal imager. The method was adapted into a field-deployable i...
Multiscale infrared and visible image fusion using gradient domain guided image filtering
NASA Astrophysics Data System (ADS)
Zhu, Jin; Jin, Weiqi; Li, Li; Han, Zhenghao; Wang, Xia
2018-03-01
For better surveillance with infrared and visible imaging, a novel hybrid multiscale decomposition fusion method using gradient domain guided image filtering (HMSD-GDGF) is proposed in this study. In this method, hybrid multiscale decomposition with guided image filtering and gradient domain guided image filtering is first applied to the source images; the weight maps at each scale are then obtained using saliency detection and filtering, with three different fusion rules applied at different scales. The three fusion rules are for the small-scale detail level, the large-scale detail level, and the base level. As a result, the target becomes more salient and can be more easily detected in the fusion result, while the detail information of the scene is fully displayed. Experimental comparisons with state-of-the-art fusion methods show that HMSD-GDGF has obvious advantages in fidelity of salient information (including structural similarity, brightness, and contrast), preservation of edge features, and human visual perception. Visual effects can therefore be improved by using the proposed HMSD-GDGF method.
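The multiscale split-and-recombine structure behind such fusion methods can be sketched with a two-scale decomposition: each image is separated into small-scale detail, large-scale detail, and a base layer, then the detail levels are fused by a max-absolute rule and the base by averaging. A plain box filter stands in for the (gradient domain) guided image filtering of HMSD-GDGF, and the scale radii are illustrative assumptions.

```python
import numpy as np

def box_blur(img, r=1):
    """Simple box filter; a crude stand-in for guided image filtering."""
    p = np.pad(img, r, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    k = 2 * r + 1
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(ir, vis, r_small=1, r_large=3):
    """Two-scale fusion sketch: small-scale detail, large-scale detail,
    and base layers, combined with max-absolute (details) and mean (base)
    rules, echoing the three rule types described above."""
    layers = []
    for img in (ir, vis):
        smooth1 = box_blur(img, r_small)
        smooth2 = box_blur(smooth1, r_large)
        layers.append((img - smooth1,      # small-scale detail
                       smooth1 - smooth2,  # large-scale detail
                       smooth2))           # base
    (d1a, d2a, ba), (d1b, d2b, bb) = layers
    d1 = np.where(np.abs(d1a) >= np.abs(d1b), d1a, d1b)
    d2 = np.where(np.abs(d2a) >= np.abs(d2b), d2a, d2b)
    return d1 + d2 + (ba + bb) / 2
```

Note that fusing an image with itself reconstructs it exactly, since the three layers sum back to the original; this is a useful sanity check on any such decomposition.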
Rapid-estimation method for assessing scour at highway bridges
Holnbeck, Stephen R.
1998-01-01
A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.
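The envelope-curve idea can be illustrated with a small sketch: bin paired (surrogate variable, Level 2 scour depth) data, take binwise maxima, and force the curve to be non-decreasing so lookups stay conservative. The binning scheme, function names, and sample numbers below are illustrative assumptions, not the USGS curves themselves.

```python
def envelope_curve(x, y, nbins=5):
    """Build a simple upper-envelope lookup from paired field data.

    x: surrogate variable values (e.g., flow depth at the pier)
    y: scour depths computed in detailed Level 2 studies
    Returns a list of (bin upper edge, envelope depth), made
    non-decreasing so estimates err on the conservative side.
    """
    lo, hi = min(x), max(x)
    width = (hi - lo) / nbins or 1.0
    maxima = [0.0] * nbins
    for xi, yi in zip(x, y):
        i = min(int((xi - lo) / width), nbins - 1)
        maxima[i] = max(maxima[i], yi)
    running, curve = 0.0, []
    for i, m in enumerate(maxima):
        running = max(running, m)
        curve.append((lo + (i + 1) * width, running))
    return curve

def estimate_scour(curve, xq):
    """Conservative estimate: envelope value of the first bin covering xq."""
    for edge, depth in curve:
        if xq <= edge:
            return depth
    return curve[-1][1]

curve = envelope_curve([1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
                       [0.5, 1, 0.8, 2, 1.5, 2.5, 2, 3, 2.8, 3.5])
print(estimate_scour(curve, 3.0))  # 2
```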
2011-01-01
Background Many physicians do not routinely inquire about intimate partner violence. Purpose This qualitative study explores the process of academic detailing as an intervention to change physician behavior with regard to intimate partner violence (IPV) identification and documentation. Method A non-physician academic detailer provided a seven-session modular curriculum over a two-and-a-half-month period. The detailer noted written details of each training session. Audiotapes of training sessions and semi-structured exit interviews with each physician were recorded and transcribed. Transcriptions were qualitatively and thematically coded and analyzed using Atlas.ti®. Results All three study physicians reported increased clarity with regard to the scope of their responsibility to their patients experiencing IPV. They also reported increased levels of comfort in the effective identification and appropriate documentation of IPV and the provision of ongoing support to the patient, including referrals to specialized community services. Conclusion Academic detailing, if presented by a supportive and knowledgeable academic detailer, shows promise to improve physician attitudes and practices with regard to patients in violent relationships. PMID:21679450
X-Ray Topographic Studies of Energetic Materials.
1987-03-01
role of these defects in crystal growth and in the microplasticity of the solid. 1.1 Experimental Techniques The method chosen for the detailed...the electron microscope. The examinations can be readily extended to detailed studies of the microplasticity of the materials using stress/strain...the availability of large, high quality, single crystals. A considerable part of the initial contract was devoted to the determination of conditions
ERIC Educational Resources Information Center
Banyard, Victoria L.; Williams, Linda M.
2007-01-01
Objective: The current study was exploratory and used multiple methods to examine patterns of stability and change in resilient functioning across 7 years of early adulthood. Second, qualitative data were used to examine in greater detail survivors' own narratives about correlates of healing. Method: This study was longitudinal and used both…
Schaafsma, Dilana; Kok, Gerjo; Stoffelen, Joke M. T.; Curfs, Leopold M. G.
2015-01-01
Sex education for individuals with intellectual disabilities is important. However, our knowledge about effective methods for teaching sex education to this population is limited. We report the results of a systematic review identifying methods for sex education programs aimed at individuals with intellectual disabilities. In all, 20 articles were included that met the criteria set in terms of topic—the effectiveness of sex education programs—and population of interest—individuals with intellectual disabilities. In these articles, methods for increasing knowledge and for improving skills and attitudes were reported. However, the studies revealed that generalization of skills to real-life situations was often not achieved. There are indications that the maintenance of knowledge and skills still needs extra attention. Moreover, detailed descriptions of the program materials, program goals, and methods used in the programs were often lacking in the reports. Although there is some evidence for methods that may improve knowledge, attitudes, and skills with regard to sex education aimed at individuals with intellectual disabilities, due to the lack of detailed descriptions provided, it is unclear under which conditions these methods work. We therefore suggest that authors provide additional detail about methods in future publications or in online supplements. PMID:25085114
Structure and information in spatial segregation
Chodrow, Philip S
2017-10-31
Ethnoracial residential segregation is a complex, multiscalar phenomenon with immense moral and economic costs. Modeling the structure and dynamics of segregation is a pressing problem for sociology and urban planning, but existing methods have limitations. In this paper, we develop a suite of methods, grounded in information theory, for studying the spatial structure of segregation. We first advance existing profile and decomposition methods by posing two related regionalization methods, which allow for profile curves with nonconstant spatial scale and decomposition analysis with nonarbitrary areal units. We then formulate a measure of local spatial scale, which may be used for both detailed, within-city analysis and intercity comparisons. These methods highlight detailed insights in the structure and dynamics of urban segregation that would be otherwise easy to miss or difficult to quantify. They are computationally efficient, applicable to a broad range of study questions, and freely available in open source software. PMID:29078323
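One information-theoretic quantity underlying such profile and decomposition methods is the mutual information between areal unit and group membership: zero when every unit has the same composition, and equal to the citywide group entropy under complete segregation. The sketch below computes it from a unit-by-group count table; the function name and data layout are illustrative assumptions.

```python
import math

def mutual_information(counts):
    """Mutual information (in bits) between areal unit and group.

    counts[i][g]: population of group g in unit i.
    """
    total = sum(sum(row) for row in counts)
    unit_p = [sum(row) / total for row in counts]
    ngroups = len(counts[0])
    group_p = [sum(row[g] for row in counts) / total for g in range(ngroups)]
    mi = 0.0
    for i, row in enumerate(counts):
        for g, n in enumerate(row):
            if n:
                p = n / total
                mi += p * math.log2(p / (unit_p[i] * group_p[g]))
    return mi

# Complete segregation: each unit houses a single group -> MI = group entropy
print(mutual_information([[50, 0], [0, 50]]))  # 1.0
# No segregation: identical composition everywhere -> MI = 0
print(mutual_information([[25, 25], [25, 25]]))  # 0.0
```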
2012-01-01
Background Academic detailing is an interactive, convenient, and user-friendly approach to delivering non-commercial education to healthcare clinicians. While evidence suggests academic detailing is associated with improvements in prescribing behavior, uncertainty exists about generalizability and scalability in diverse settings. Our study evaluates different models of delivering academic detailing in a rural family medicine setting. Methods We conducted a pilot project to assess the feasibility, effectiveness, and satisfaction with academic detailing delivered face-to-face as compared to a modified approach using distance-learning technology. The recipients were four family medicine clinics within the Oregon Rural Practice-based Research Network (ORPRN). Two clinics were allocated to receive face-to-face detailing and two received outreach through video conferencing or asynchronous web-based outreach. Surveys at midpoint and completion were used to assess effectiveness and satisfaction. Results Each clinic received four outreach visits over an eight month period. Topics included treatment-resistant depression, management of atypical antipsychotics, drugs for insomnia, and benzodiazepine tapering. Overall, 90% of participating clinicians were satisfied with the program. Respondents who received in-person detailing reported a higher likelihood of changing their behavior compared to respondents in the distance detailing group for five of seven content areas. While 90%-100% of respondents indicated they would continue to participate if the program were continued, the likelihood of participation declined if only distance approaches were offered. Conclusions We found strong support and satisfaction for the program among participating clinicians. Participants favored in-person approaches over distance interactions. Future efforts will be directed at quantitative methods for evaluating the economic and clinical effectiveness of detailing in rural family practice settings.
PMID:23276303
A proof of the DBRF-MEGN method, an algorithm for deducing minimum equivalent gene networks
2011-01-01
Background We previously developed the DBRF-MEGN (difference-based regulation finding-minimum equivalent gene network) method, which deduces the most parsimonious signed directed graphs (SDGs) consistent with expression profiles of single-gene deletion mutants. However, until the present study, we have not presented the details of the method's algorithm or a proof of the algorithm. Results We describe in detail the algorithm of the DBRF-MEGN method and prove that the algorithm deduces all of the exact solutions of the most parsimonious SDGs consistent with expression profiles of gene deletion mutants. Conclusions The DBRF-MEGN method provides all of the exact solutions of the most parsimonious SDGs consistent with expression profiles of gene deletion mutants. PMID:21699737
The secret lives of experiments: methods reporting in the fMRI literature.
Carp, Joshua
2012-10-15
Replication of research findings is critical to the progress of scientific understanding. Accordingly, most scientific journals require authors to report experimental procedures in sufficient detail for independent researchers to replicate their work. To what extent do research reports in the functional neuroimaging literature live up to this standard? The present study evaluated methods reporting and methodological choices across 241 recent fMRI articles. Many studies did not report critical methodological details with regard to experimental design, data acquisition, and analysis. Further, many studies were underpowered to detect any but the largest statistical effects. Finally, data collection and analysis methods were highly flexible across studies, with nearly as many unique analysis pipelines as there were studies in the sample. Because the rate of false positive results is thought to increase with the flexibility of experimental designs, the field of functional neuroimaging may be particularly vulnerable to false positives. In sum, the present study documented significant gaps in methods reporting among fMRI studies. Improved methodological descriptions in research reports would yield significant benefits for the field. Copyright © 2012 Elsevier Inc. All rights reserved.
Personal and professional profile of mountain medicine physicians.
Peters, Patrick
2003-01-01
The purpose of this study was to define and describe the personal and professional profile of mountain medicine physicians including general physical training information and to include a detailed overview of the practice of mountain sports. A group of physicians participating in a specialized mountain medicine education program filled out a standardized questionnaire. The data obtained from this questionnaire were first analyzed in a descriptive way and then by statistical methods (chi-square test, t test, and analysis of variance). Detailed results have been provided for gender, age, marital status, general training frequency and methods, professional status, additional medical qualifications, memberships in professional societies and alpine clubs, mountain sports practice, and injuries sustained during the practice of mountain sports. This study has provided a detailed overview concerning the personal and professional profile of mountain medicine physicians. Course organizers as well as official commissions regulating the education in mountain medicine will be able to use this information to adapt and optimize the courses and the recommendations/requirements as detailed by the UIAA-ICAR-ISMM (Union Internationale des Associations Alpinistes, International Commission for Alpine Rescue, International Society for Mountain Medicine).
Cloke, Jonathan; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen; Holopainen, Jani; Wickstrand, Nina; Kauppinen, Mikko; Leon-Velarde, Carlos; Larson, Nathan; Dave, Keron; Chen, Yi; Ryser, Elliot; Carter, Mark
2016-01-01
The Thermo Scientific™ SureTect™ Listeria species assay is a new real-time PCR assay for the detection of all species of Listeria in food and environmental samples. The assay was originally certified as Performance Tested Methods(SM) (PTM) 071304 in 2013. This report details the method modification study undertaken to extend the performance claims of the assay for matrixes of raw ground turkey, raw ground pork, bagged lettuce, raw pork sausages, pasteurized 2% fat milk, raw cod, pasteurized brie cheese, and ice cream. The method modification study was conducted using the AOAC Research Institute (RI) PTM program to validate the SureTect PCR assay in comparison to the reference method detailed in ISO 11290-1:1996 including amendment 1:2004. All matrixes were tested by Thermo Fisher Scientific (Basingstoke, United Kingdom). In addition, three matrixes (raw cod, bagged lettuce, and pasteurized brie cheese) were analyzed independently as part of the AOAC RI-controlled independent laboratory study by the University of Guelph, Canada. Using probability of detection statistical analysis, there was no significant difference in the performance between the SureTect assay and the International Organization for Standardization reference method for any of the matrixes analyzed in this study.
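The probability of detection (POD) comparison mentioned above reduces to comparing two binomial proportions. The sketch below uses a simple Wald interval on the POD difference; the actual AOAC procedure uses a refined interval, so treat this as an illustrative assumption, along with the function names and the spiked-portion counts.

```python
import math

def pod(detections, trials):
    """Probability of detection: fraction of test portions found positive."""
    return detections / trials

def dpod_ci(x1, n1, x2, n2, z=1.96):
    """Difference in POD between candidate and reference methods, with a
    simple Wald 95% CI (a sketch, not the exact AOAC interval). A
    significant difference is declared only if the CI excludes zero."""
    p1, p2 = x1 / n1, x2 / n2
    d = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# Hypothetical counts: candidate assay finds 17/20 spiked portions,
# reference method finds 15/20.
d, (lo, hi) = dpod_ci(17, 20, 15, 20)
print(lo < 0 < hi)  # True: no significant difference between methods
```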
NASA Astrophysics Data System (ADS)
Teodorani, M.; Strand, E.
Unexplained plasma-like atmospheric `light balls' are observed at very low altitudes during alternating phases of maximum and minimum activity in the Hessdalen area, located in central Norway. Several theories have been presented to explain the observed phenomenon, among these: piezoelectricity from rocks, and atmospheric ionization triggered by solar activity and cosmic rays. The present study proposes a dedicated instrumental set-up, experimental research procedures, and methods to prove or disprove each single theory: in this context, several kinds of observational techniques, measurement strategies, and physical tests of tactical relevance are discussed in detail. An introduction to each considered theory is presented together with a detailed discussion of the subsequent experimental phase. For each specific theory, brief descriptions of the observable parameters and of the essential instrumental choices, and a detailed discussion of measurement procedures coupled with suitable flow charts, are presented.
Magnetic Resonance Imaging (MRI) -- Head
MedlinePlus Videos and Cool Tools
... are clearer and more detailed than other imaging methods. This exam does not use ionizing radiation and ... clearer and more detailed than with other imaging methods. This detail makes MRI an invaluable tool in ...
Theoretical investigation of gas-surface interactions
NASA Technical Reports Server (NTRS)
Lee, Timothy J.
1989-01-01
Four reprints are presented from four projects which are to be published in a refereed journal. Two are of interest to us and are presented herein. One is a description of a very detailed theoretical study of four anionic hydrogen bonded complexes. The other is a detailed study of the first generally reliable diagnostic for determining the quality of results that may be expected from single reference based electron correlation methods.
ERIC Educational Resources Information Center
Brock, Richard; Taber, Keith S.
2017-01-01
This paper examines the role of the microgenetic method in science education. The microgenetic method is a technique for exploring the progression of learning in detail through repeated, high-frequency observations of a learner's "performance" in some activity. Existing microgenetic studies in science education are analysed. This leads…
Stability of Permanent Magnets,
1984-03-06
temperature. The effect of impacts, vibrations, and external magnetic fields is illuminated in less detail. The new, accelerated methods of the study of...accelerated methods, developed by the author, for the study of the stability of magnets and systems, which do not require prolonged time intervals...the proposed accelerated methods for the study of the stability of magnets will contribute to the accumulation of experimental results and to
Historical Development of Asphalt Content Determination by the Ignition Method
DOT National Transportation Integrated Search
1996-01-01
This study was conducted to develop a reliable, detailed test procedure for determining asphalt cement (AC) content by the ignition method. The goal was to minimize the overall test time as well as technician time, and to produce a test method with a...
Lai, Yin-Hung; Wang, Yi-Sheng
2017-01-01
Although matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is one of the most widely used soft ionization methods for biomolecules, the lack of detailed understanding of ionization mechanisms restricts its application in the analysis of carbohydrates. Structural identification of carbohydrates achieved by MALDI mass spectrometry helps us to gain insights into biological functions and pathogenesis of disease. In this review, we highlight mechanistic details of MALDI, including both ionization and desorption. Strategies to improve the ion yield of carbohydrates are also reviewed. Furthermore, commonly used fragmentation methods to identify the structure are discussed. PMID:28959517
Hammerschmidt, Lukas; Maschio, Lorenzo; Müller, Carsten; Paulus, Beate
2015-01-13
We have applied the Method of Increments and the periodic Local-MP2 approach to the study of the (110) surface of magnesium fluoride, a system of significant interest in heterogeneous catalysis. After careful assessment of the approximations inherent in both methods, the two schemes, though conceptually different, are shown to yield nearly identical results. This remains true even when analyzed in fine detail through partition of the individual contribution to the total energy. This kind of partitioning also provides thorough insight into the electron correlation effects underlying the surface formation process, which are discussed in detail.
Telling It All : A Story of Women's Social Capital Using a Mixed Methods Approach
ERIC Educational Resources Information Center
Hodgkin, Suzanne
2008-01-01
The aim of this article is to demonstrate how quantitative and qualitative methods can be used together in feminist research. Despite an increasing number of texts and journal articles detailing mixed methods research, there are relatively few published reports of its use in feminist study. This article draws on a study conducted in regional…
ERIC Educational Resources Information Center
Roberts, Amy; Chou, Prudence; Ching, Greg
2010-01-01
This article details a mixed methods study conducted during the 2007-2008 academic year at the National Chengchi University (NCCU) in Taipei, Taiwan. It contributes to discourse examining the opportunities and challenges of international student enrollments in institutions of higher learning around the globe. In scope it details an empirical study…
NASA Astrophysics Data System (ADS)
Alshakova, E. L.
2017-01-01
A program in the AutoLISP language makes it possible to generate parametric drawings automatically while working in the AutoCAD software product. Students study the development of AutoLISP programs using a methodical complex containing instructions in which real examples of creating images and drawings are implemented. The instructions contain the reference information necessary for performing the proposed tasks. Training in AutoLISP programming is based on step-by-step development of the program: the program draws the elements of a part drawing by means of purpose-built functions whose argument values are written in the same sequence in which AutoCAD issues its prompts when the corresponding command is executed in the editor. The process of program design is thus reduced to the step-by-step formation of functions and of the sequence of their calls. The author considers the development of AutoLISP programs for creating parametric drawings of details (machine parts) of a defined design, with the user entering the dimensions of the parts' elements. These programs generate variants of the graphic tasks performed in the "Engineering Graphics" and "Engineering and Computer Graphics" courses. Individual tasks allow students to develop skills of independent work in reading and creating drawings, as well as in 3D modeling.
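The step-by-step approach described above, where each drawing element is produced by a function whose arguments follow the command's prompt order, can be sketched in Python rather than AutoLISP. The part geometry, function names, and command tuples are illustrative assumptions.

```python
def rectangle(x, y, w, h):
    """Return drawing commands for a rectangle, listing defining points
    in the order a CAD command prompt would request them."""
    return [('line', (x, y), (x + w, y)),
            ('line', (x + w, y), (x + w, y + h)),
            ('line', (x + w, y + h), (x, y + h)),
            ('line', (x, y + h), (x, y))]

def plate_with_hole(width, height, hole_r):
    """Parametric drawing of a simple part: the program is just a
    sequence of function calls, one per drawing element, with the
    user-supplied dimensions as arguments."""
    cmds = rectangle(0, 0, width, height)
    cmds.append(('circle', (width / 2, height / 2), hole_r))
    return cmds

drawing = plate_with_hole(100, 60, 10)
print(len(drawing))  # 5
```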
A Brief Introduction into the Renin-Angiotensin-Aldosterone System: New and Old Techniques.
Thatcher, Sean E
2017-01-01
The renin-angiotensin-aldosterone system (RAAS) is a complex system of enzymes, receptors, and peptides that help to control blood pressure and fluid homeostasis. Techniques for studying the RAAS can be difficult due to such factors as peptide/enzyme stability and receptor localization. This paper gives a brief account of the different components of the RAAS and current methods of measuring each component. There is also a discussion of methods for measuring stem and immune cells by flow cytometry, as well as hypertension, atherosclerosis, oxidative stress, energy balance, and other RAAS-activated phenotypes. While studies on the RAAS have been performed for over 100 years, new techniques have allowed scientists to gain new insights into this system. These techniques are detailed in this Methods in Molecular Biology series and give students new to studying the RAAS the proper controls and technical details needed to perform each procedure.
Using Mixed Methods to Study First-Year College Impact on Liberal Arts Learning Outcomes
ERIC Educational Resources Information Center
Seifert, Tricia A.; Goodman, Kathleen; King, Patricia M.; Baxter Magolda, Marcia B.
2010-01-01
This study details the collection, analysis, and interpretation of data from a national multi-institutional longitudinal mixed methods study of college impact and student development of liberal arts outcomes. The authors found three sets of practices in the quantitative data that corroborated with the themes that emerged from the qualitative data:…
Economic effects of propulsion system technology on existing and future transport aircraft
NASA Technical Reports Server (NTRS)
Sallee, G. P.
1974-01-01
The results of an airline study of the economic effects of propulsion system technology on current and future transport aircraft are presented. This report represents the results of a detailed study of propulsion system operating economics. The study has four major parts: (1) a detailed analysis of current propulsion system maintenance with respect to the material and labor costs encountered versus years in service and the design characteristics of the major elements of the propulsion systems of the B707, B727, and B747; (2) an analysis of the economic impact of a representative 1979 propulsion system, with emphasis on depreciation of investment, fuel costs, and maintenance costs developed on the basis of the historical trends observed; (3) recommendations concerning improved methods of forecasting the maintenance cost of future propulsion systems, including a detailed method based on the summation of the projected labor and material repair costs for each major engine module and its installation, along with a shorter form suitable for quick, less detailed analysis; and (4) recommendations concerning areas where additional technology is needed to improve the economics of future commercial propulsion systems, along with the suggested economic benefits available from such advanced technology efforts.
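The detailed forecasting method summarized in part (3), summing projected labor and material repair costs over each major engine module plus the installation, reduces to a simple accumulation. The module list, labor hours, and cost figures below are illustrative assumptions, not data from the study.

```python
def propulsion_maintenance_cost(modules, labor_rate):
    """Total projected maintenance cost as the sum over major engine
    modules (and the installation) of labor plus material repair costs,
    sketching the detailed forecasting method described above."""
    total = 0.0
    for mod in modules:
        total += mod['labor_hours'] * labor_rate + mod['material_cost']
    return total

# Hypothetical per-module projections (hours and dollars per shop visit).
modules = [
    {'name': 'fan',          'labor_hours': 120, 'material_cost': 40000},
    {'name': 'compressor',   'labor_hours': 200, 'material_cost': 90000},
    {'name': 'combustor',    'labor_hours': 80,  'material_cost': 35000},
    {'name': 'turbine',      'labor_hours': 260, 'material_cost': 150000},
    {'name': 'installation', 'labor_hours': 60,  'material_cost': 10000},
]
cost = propulsion_maintenance_cost(modules, labor_rate=50.0)
print(cost)  # 361000.0
```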
Characteristics of Academic Detailing: Results of a Literature Review
Van Hoof, Thomas J.; Harrison, Lisa G.; Miller, Nicole E.; Pappas, Maryanne S.; Fischer, Michael A.
2015-01-01
Background Academic detailing is an evidence-based strategy to improve patient care. Efforts to understand the intervention and to use it strategically require an understanding of its important characteristics. A recent systematic review and a subsequent reporting framework call for more accurate and complete reporting of continuing medical education interventions. Objectives Building on a previously published systematic review of 69 studies, we sought to determine how an expanded set of 106 academic detailing studies, including many recently published articles, fared with respect to reporting of important data about this intervention. Methods We conducted a search of MEDLINE, the Cumulative Index to Nursing and Allied Health Literature (clinical) database, and Scopus, from which we identified 38 additional randomized controlled trials published from August 2007 through March 2013. Including the original 69 studies, we abstracted 106 available English-language studies and quantitatively analyzed information about 4 important characteristics of academic detailing: content of visits, clinicians being visited, communication process underlying visits, and outreach workers making visits. Results We found considerable variation (36.5%-100%) in the extent of reporting intervention characteristics, especially about the communication process underlying visits and the outreach workers making visits. The best overall documentation of intervention characteristics of any single study was 68%. Results also demonstrate wide variation in the approach to academic detailing. Conclusions This study demonstrates the need for a standardized approach to collecting and reporting data about academic detailing interventions. Our findings also highlight opportunities for using academic detailing more effectively in research and quality-improvement efforts. PMID:26702333
The present study investigates primary and secondary sources of organic carbon for Bakersfield, CA, USA as part of the 2010 CalNex study. The method used here involves integrated sampling that is designed to allow for detailed and specific chemical analysis of particulate matter ...
Iterative image-domain ring artifact removal in cone-beam CT
NASA Astrophysics Data System (ADS)
Liang, Xiaokun; Zhang, Zhicheng; Niu, Tianye; Yu, Shaode; Wu, Shibin; Li, Zhicheng; Zhang, Huailing; Xie, Yaoqin
2017-07-01
Ring artifacts in cone beam computed tomography (CBCT) images are caused by pixel gain variations using flat-panel detectors, and may lead to structured non-uniformities and deterioration of image quality. The purpose of this study is to propose a method of general ring artifact removal in CBCT images. This method is based on the polar coordinate system, where the ring artifacts manifest as stripe artifacts. Using relative total variation, the CBCT images are first smoothed to generate template images with fewer image details and ring artifacts. By subtracting the template images from the CBCT images, residual images with image details and ring artifacts are generated. As the ring artifact manifests as a stripe artifact in a polar coordinate system, the artifact image can be extracted by mean value from the residual image; the image details are generated by subtracting the artifact image from the residual image. Finally, the image details are compensated to the template image to generate the corrected images. The proposed framework is iterated until the differences in the extracted ring artifacts are minimized. We use a 3D Shepp-Logan phantom, Catphan©504 phantom, uniform acrylic cylinder, and images from a head patient to evaluate the proposed method. In the experiments using simulated data, the spatial uniformity is increased by 1.68 times and the structural similarity index is increased from 87.12% to 95.50% using the proposed method. In the experiment using clinical data, our method shows high efficiency in ring artifact removal while preserving the image structure and detail. The iterative approach we propose for ring artifact removal in cone-beam CT is practical and attractive for CBCT guided radiation therapy.
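The iteration described above can be sketched directly in the polar domain, where a ring is a stripe constant along the angle axis. A radial box blur stands in for the relative-total-variation smoothing of the paper, and the synthetic image, radii, and iteration count are illustrative assumptions.

```python
import numpy as np

def remove_rings_polar(polar_img, iters=3, r=2):
    """Iterative ring-artifact removal sketch on a polar-resampled slice
    (rows = radius, columns = angle). Per iteration: smooth to get a
    template, form the residual (details + stripes), estimate the stripe
    as the angular mean of the residual, and subtract it."""
    img = polar_img.astype(float)
    for _ in range(iters):
        # Template with fewer details/artifacts: box blur along radius
        # (a crude stand-in for relative-total-variation smoothing).
        pad = np.pad(img, ((r, r), (0, 0)), mode='edge')
        template = np.zeros(img.shape)
        for d in range(2 * r + 1):
            template += pad[d:d + img.shape[0], :]
        template /= 2 * r + 1
        residual = img - template                      # details + stripes
        stripe = residual.mean(axis=1, keepdims=True)  # angular mean
        img = img - stripe   # stripe removed; details and template kept
    return img

# Synthetic polar image: radial gradient plus angular texture,
# corrupted by one bright ring (a horizontal stripe) at radius 10.
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
radius = np.linspace(0, 1, 32)[:, None]
clean = radius + 0.2 * np.sin(theta)[None, :]
corrupt = clean.copy()
corrupt[10, :] += 0.5
fixed = remove_rings_polar(corrupt)
```

After three iterations the 0.5 stripe at radius 10 is strongly suppressed while the angular texture is untouched, since a pure angular signal has zero angular mean.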
Experimental Study on Fatigue Behaviour of Shot-Peened Open-Hole Steel Plates
Wang, Zhi-Yu; Wang, Qing-Yuan; Cao, Mengqin
2017-01-01
This paper presents an experimental study on the fatigue behaviour of shot-peened open-hole plates made of Q345 steel. The beneficial effects of shot peening on fatigue life improvement are highlighted. The characteristic fatigue crack initiation and propagation modes of open-hole details under fatigue loading are revealed. The surface hardening effect produced by shot peening is analyzed in terms of in-depth micro-hardness and compressive residual stress. The fatigue life results are evaluated and related design suggestions are made by comparison with codified detail categories. In particular, a fracture mechanics-based method is proposed and its validity demonstrated in predicting the fatigue life of the studied shot-peened open-hole details. PMID:28841160
Research notes : improving freight data collection methods.
DOT National Transportation Integrated Search
2004-07-01
The overall goal of this study was to identify data collection methods capable of generating information at a level of detail that would better fill ODOT's modeling and freight planning needs at the metropolitan level. After a review of other r...
Khaksari, Maryam; Mazzoleni, Lynn R; Ruan, Chunhai; Kennedy, Robert T; Minerick, Adrienne R
2017-04-01
Two separate liquid chromatography (LC)-mass spectrometry (MS) methods were developed for the determination and quantification of water-soluble and fat-soluble vitamins in human tear and blood serum samples. The water-soluble vitamin method was originally developed to detect vitamins B1, B2, B3 (nicotinamide), B5, B6 (pyridoxine), B7, B9 and B12, while the fat-soluble vitamin method detected vitamins A, D3, 25(OH)D3, E and K1. These methods were then validated with tear and blood serum samples. In this data-in-brief article, we provide details on the development of the two LC-MS methods and their sensitivity, as well as their precision and accuracy for the determination of vitamins in human tears and blood serum. These methods were then used to determine the vitamin concentrations in infant and parent samples in a clinical study, which was reported in "Determination of Water-Soluble and Fat-Soluble Vitamins in Tears and Blood Serum of Infants and Parents by Liquid Chromatography/Mass Spectrometry", DOI:10.1016/j.exer.2016.12.007 [1]. This article provides more details on the comparison of vitamin concentrations in the samples with the ranges reported in the literature, along with the medically accepted normal ranges. Concentrations below the limits of detection (LOD) and limits of quantification (LOQ) are also discussed. Vitamin concentrations were also compared and cross-correlated with clinical data and nutritional information. Significant differences and strongly correlated data were reported in [1]. This article provides comprehensive details on the data with slight differences or slight correlations.
Route visualization using detail lenses.
Karnick, Pushpak; Cline, David; Jeschke, Stefan; Razdan, Anshuman; Wonka, Peter
2010-01-01
We present a method designed to address some limitations of typical route map displays of driving directions. The main goal of our system is to generate a printable version of a route map that shows the overview and detail views of the route within a single, consistent visual frame. Our proposed visualization provides a more intuitive spatial context than a simple list of turns. We present a novel multifocus technique to achieve this goal, where the foci are defined by points of interest (POI) along the route. A detail lens that encapsulates the POI at a finer geospatial scale is created for each focus. The lenses are laid out on the map to avoid occlusion with the route and each other, and to optimally utilize the free space around the route. We define a set of layout metrics to evaluate the quality of a lens layout for a given route map visualization. We compare standard lens layout methods to our proposed method and demonstrate the effectiveness of our method in generating aesthetically pleasing layouts. Finally, we perform a user study to evaluate the effectiveness of our layout choices.
Resolution for color photography
NASA Astrophysics Data System (ADS)
Hubel, Paul M.; Bautsch, Markus
2006-02-01
Although it is well known that luminance resolution is most important, the ability to accurately render colored details, color textures, and colored fabrics cannot be overlooked. This includes the ability to accurately render single-pixel color details as well as to avoid color aliasing. All consumer digital cameras on the market today record in color, and the scenes people photograph are usually in color. Yet almost all resolution measurements made on color cameras are done using a black-and-white target. In this paper we present several methods for measuring and quantifying color resolution. The first method, detailed in a previous publication, uses a slanted-edge target of two colored surfaces in place of the standard black-and-white edge pattern. The second method employs the standard black-and-white targets recommended in the ISO standard, but records them onto the camera through colored filters, thus giving modulation between black and one particular color component; red, green, and blue color separation filters are used in this study. The third method, conducted at Stiftung Warentest, an independent consumer organization in Germany, uses a white-light interferometer to generate fringe-pattern targets of varying color and spatial frequency.
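The modulation measurement underlying the fringe-based third method can be illustrated with a toy model: generate unit-contrast fringes at several spatial frequencies, blur them with a Gaussian standing in for the camera's optical and sensor response, and report the surviving Michelson modulation. All parameter values here are illustrative, not from the study.

```python
import numpy as np

def modulation(signal):
    # Michelson modulation depth of a fringe: (max - min) / (max + min).
    smax, smin = signal.max(), signal.min()
    return (smax - smin) / (smax + smin)

def mtf_from_fringes(psf_sigma, freqs, n=2048):
    x = np.arange(n)
    out = []
    for f in freqs:
        fringe = 0.5 + 0.5 * np.cos(2 * np.pi * f * x)   # unit-contrast target
        # Gaussian blur stands in for the camera response.
        k = np.exp(-0.5 * (np.arange(-50, 51) / psf_sigma) ** 2)
        k /= k.sum()
        blurred = np.convolve(fringe, k, mode='same')
        out.append(modulation(blurred[100:-100]))        # ignore edge effects
    return np.array(out)
```

The surviving modulation falls with frequency, tracing out one point of the (color-channel) MTF per fringe frequency; repeating this per color filter gives the color-resolution comparison the abstract describes.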
Compiled visualization with IPI method for analysing the liquid-liquid mixing process
NASA Astrophysics Data System (ADS)
Jasikova, Darina; Kotek, Michal; Kysela, Bohus; Sulc, Radek; Kopecky, Vaclav
2018-06-01
The article deals with the study of a mixing process using visualization techniques and the IPI method. The size distribution characteristics and the evolution of the disintegration of the two liquid phases were studied. A methodology has been proposed for the visualization and image analysis of data acquired during the initial phase of the mixing process. The IPI method was used for a subsequent detailed study of the disintegrated droplets. The article describes the advantages of using the appropriate method, presents the limits of each method, and compares them.
ERIC Educational Resources Information Center
Unal, Suat; Calik, Muammer; Ayas, Alipasa; Coll, Richard K.
2006-01-01
This paper presents a detailed thematic review of chemical bonding studies. To achieve this, a matrix is developed to summarize and present the findings by focusing on insights derived from the related studies. The matrix incorporates the following themes: needs, aims, methods of exploring students' conceptions, general knowledge claims,…
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
Illustrating a Mixed-Method Approach for Validating Culturally Specific Constructs
ERIC Educational Resources Information Center
Hitchcock, J.H.; Nastasi, B.K.; Dai, D.Y.; Newman, J.; Jayasena, A.; Bernstein-Moore, R.; Sarkar, S.; Varjas, K.
2005-01-01
The purpose of this article is to illustrate a mixed-method approach (i.e., combining qualitative and quantitative methods) for advancing the study of construct validation in cross-cultural research. The article offers a detailed illustration of the approach using the responses that 612 Sri Lankan adolescents provided to an ethnographic survey. Such…
Development of a Magnetic Attachment Method for Bionic Eye Applications.
Fox, Kate; Meffin, Hamish; Burns, Owen; Abbott, Carla J; Allen, Penelope J; Opie, Nicholas L; McGowan, Ceara; Yeoh, Jonathan; Ahnood, Arman; Luu, Chi D; Cicione, Rosemary; Saunders, Alexia L; McPhedran, Michelle; Cardamone, Lisa; Villalobos, Joel; Garrett, David J; Nayagam, David A X; Apollo, Nicholas V; Ganesan, Kumaravelu; Shivdasani, Mohit N; Stacey, Alastair; Escudie, Mathilde; Lichter, Samantha; Shepherd, Robert K; Prawer, Steven
2016-03-01
Successful visual prostheses require stable, long-term attachment. Epiretinal prostheses, in particular, require attachment methods to fix the prosthesis onto the retina. The most common method is fixation with a retinal tack; however, tacks cause retinal trauma, and surgical proficiency is important to ensure optimal placement of the prosthesis near the macula. Accordingly, alternative attachment methods are required. In this study, we detail a novel method of magnetic attachment for an epiretinal prosthesis using two prosthesis components positioned on opposing sides of the retina. The magnetic attachment technique was piloted in a feline animal model (chronic, nonrecovery implantation). We also detail a new method of reliably controlling the magnet coupling force using heat. It was found that the force exerted upon the tissue separating the two components could be minimized, as the measured force is proportionately smaller at the working distance. We thus detail, for the first time, a surgical method using customized magnets to position and affix an epiretinal prosthesis on the retina. The position of the epiretinal prosthesis is reliable, and its location on the retina is accurately controlled by the placement of a secondary magnet in the suprachoroidal location. The electrode position above the retina is less than 50 microns at the center of the device, although there were pressure points at the two edges due to curvature misalignment. The degree of retinal compression found in this study was unacceptably high; nevertheless, the normal structure of the retina remained intact under the electrodes. Copyright © 2015 International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.
Barnabishvili, Maia; Ulrichs, Timo; Waldherr, Ruth
2016-09-01
This data article presents the supplementary material for the review paper "Role of acceptability barriers in delayed diagnosis of Tuberculosis: Literature review from high burden countries" (Barnabishvili et al., in press) [1]. A general overview of the 12 qualitative papers, including details about the authors, years of publication, data source locations, study objectives, overview of methods, and study population characteristics, as well as details of the interventions and the outcome parameters of the papers, is summarized in the first two tables included in the article. The quality assessment of the methodological strength of the 12 papers and the results of the critical appraisal are further described and summarized in the second part of the article.
Detailed simulation of a Lobster-eye telescope.
Putkunz, Corey T; Peele, Andrew G
2009-08-03
The concept of an x-ray telescope based on the optics of the eye of certain types of crustacea has been in currency for nearly thirty years. However, it is only in the last decade that the technology to make the telescope and the opportunity to mount it on a suitable space platform have combined to allow the idea to become a reality. Accordingly, we have undertaken a detailed simulation study, updating previous simplified models, to properly characterise the performance of the instrument in orbit. The study reveals details of how the particular characteristics of the lobster-eye optics affect the sensitivity of the instrument and allow us to implement new ideas in data extraction methods.
Field Methods for the Study of Slope and Fluvial Processes
Leopold, Luna Bergere
1967-01-01
In Belgium during the summer of 1966, the Commission on Slopes and the Commission on Applied Geomorphology of the International Geographical Union sponsored a joint symposium, with field excursions, and meetings of the two commissions. As a result of the conference and associated discussions, the participants expressed the view that it would be a contribution to scientific work relating to the subject area if the Commission on Applied Geomorphology could prepare a small manual describing the methods of field investigation being used by research scientists throughout the world in the study of various aspects of slope development and fluvial processes. The Commission then assumed this responsibility and asked as many persons as were known to be working on this subject to contribute whatever they wished in the way of descriptions of methods being employed. The purpose of the present manual is to show the variety of study methods now in use, to describe from the experience gained the limitations and advantages of different techniques, and to give pertinent detail which might be useful to other investigators. Some details that would be useful to know are not included in scientific publications, but in a manual on methods the details of how best to use a method have a place. Various persons have learned certain things which cannot be done, as well as some methods that are successful. It is our hope that comparison of the methods tried will give the reader suggestions as to how a particular method might best be applied to his own circumstances. The manual does not purport to include methods used by all workers. In particular, it does not interfere with a more systematic treatment of the subject (1) or with various papers already published in the present journal. In fact, we are sure that there are pertinent research methods that we do not know of, and the Commission would be glad to receive additions and other ideas from those who find they have something to contribute.
Also, the manual describes the methods in brief form. If further details are desired we urge that individual scientists correspond with their colleagues whose contributions are included in this little volume. The Commission thanks all contributors to this manual and hopes that their contributions have been included in a satisfactory way. The Commission also thanks Dr. Luna B. Leopold of the United States Geological Survey, who at our request assumed the task of collecting the contributions, editing them and compiling the present work.
The report discusses an EPA investigation of techniques to improve methods for estimating volatile organic compound (VOC) emissions from area sources. Using the automobile refinishing industry for a detailed area source case study, an emission estimation method is being developed...
ERIC Educational Resources Information Center
Gough, Deborah
1991-01-01
This document summarizes five studies that offer insight into the nature of higher-order thinking skills and the most effective methods for teaching them to students. The reviews outline the conclusions, definitions, recommendations, specific methods of teaching, instructional strategies, and programs detailed in the documents themselves.…
NASA Astrophysics Data System (ADS)
Yu, Yu-Hong; Xu, Hua-Gen; Xu, Hu-Shan; Zhan, Wen-Long; Sun, Zhi-Yu; Guo, Zhong-Yan; Hu, Zheng-Guo; Wang, Jian-Song; Chen, Jun-Ling; Zheng, Chuan
2009-07-01
To achieve a better time resolution of a scintillator-bar detector for a neutron wall at the external target facility of HIRFL-CSR, we have carried out a detailed study of the photomultiplier, the wrapping material and the coupling media. The timing properties of a scintillator-bar detector have been studied in detail with cosmic rays using a high and low level signal coincidence. A time resolution of 80 ps has been achieved in the center of the scintillator-bar detector.
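The abstract reports the 80 ps figure without spelling out the timing reconstruction. A standard technique for scintillator bars read out by a photomultiplier at each end (an assumption here, not a detail taken from the study) is to use the mean time of the two signals, which cancels the unknown hit position along the bar; a small simulation with illustrative numbers shows the effect.

```python
import numpy as np

rng = np.random.default_rng(0)
L, v = 1.0, 0.15       # bar length (m), effective light speed in scintillator (m/ns)
sigma_pmt = 0.08       # per-PMT transit-time jitter (ns); illustrative value
x = rng.uniform(0, L, 100_000)                        # hit positions along the bar
t1 = x / v + rng.normal(0, sigma_pmt, x.size)         # arrival time at left PMT
t2 = (L - x) / v + rng.normal(0, sigma_pmt, x.size)   # arrival time at right PMT
# Mean time: position dependence cancels, (x + (L - x)) / (2 v) = L / (2 v),
# leaving only the jitter, reduced by sqrt(2) relative to a single PMT.
t_mean = 0.5 * (t1 + t2)
```

The spread of a single-end time is dominated by the nanosecond-scale position term, while the mean time retains only tens of picoseconds of jitter, which is why both-end readout and careful choice of photomultiplier, wrapping, and coupling matter for the resolution quoted in the abstract.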
A multiresolution processing method for contrast enhancement in portal imaging.
Gonzalez-Lopez, Antonio
2018-06-18
Portal images have a unique feature among the imaging modalities used in radiotherapy: they provide direct visualization of the irradiated volumes. However, contrast and spatial resolution are strongly limited due to the high energy of the radiation sources. Because of this, imaging modalities using x-ray energy beams have gained importance in the verification of patient positioning, replacing portal imaging. The purpose of this work was to develop a method for the enhancement of local contrast in portal images. The method operates on the subbands of a wavelet decomposition of the image, re-scaling them in such a way that coefficients in the high- and medium-resolution subbands are amplified, an approach entirely different from those operating on the image histogram, which are widely used nowadays. Portal images of an anthropomorphic phantom were acquired with an electronic portal imaging device (EPID). Then, different re-scaling strategies were investigated, studying the effects of the scaling parameters on the enhanced images. Also, the effect of using different types of transforms was studied. Finally, the implemented methods were combined with histogram equalization methods such as contrast-limited adaptive histogram equalization (CLAHE), and these combinations were compared. Uniform amplification of the detail subbands gives the best results in contrast enhancement. On the other hand, linear re-scaling of the high-resolution subbands increases the visibility of fine detail in the images, at the expense of an increase in noise levels. Also, since processing is applied only to the detail subbands, not to the approximation, the mean gray level of the image is minimally modified and no further display adjustments are required. It is shown that re-scaling the detail subbands of portal images can be used as an efficient method for enhancing both the local contrast and the resolution of these images. © 2018 Institute of Physics and Engineering in Medicine.
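The subband re-scaling idea can be sketched with a one-level 2D Haar transform, a simple stand-in for whatever wavelet the study used: amplify the three detail subbands, leave the approximation untouched, and the mean gray level is preserved exactly, as the abstract notes.

```python
import numpy as np

def haar2d(img):
    # One-level 2D Haar transform: approximation (ll) + three detail subbands.
    a = (img[0::2, :] + img[1::2, :]) / 2
    d = (img[0::2, :] - img[1::2, :]) / 2
    ll = (a[:, 0::2] + a[:, 1::2]) / 2
    lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d[:, 0::2] + d[:, 1::2]) / 2
    hh = (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    # Exact inverse of haar2d.
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

def enhance(img, gain=2.0):
    # Amplify detail subbands, keep the approximation: mean gray level preserved.
    ll, lh, hl, hh = haar2d(img.astype(float))
    return ihaar2d(ll, gain * lh, gain * hl, gain * hh)
```

With gain = 1 the round trip is the identity; with gain > 1 the detail energy (and hence local contrast) grows while the image mean is untouched, which is why no further display adjustment is needed.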
Facebook targeted advertisement for research recruitment: A primer for nurse researchers.
Carter-Harris, Lisa
2016-11-01
Recruiting participants for research studies can be challenging and costly. Innovative recruitment methods are needed. Facebook targeted advertisement offers a low-cost alternative to traditional methods that has been successfully used in research study recruitment. This primer offers nurse researchers a method utilizing social media as a recruitment tool and details Facebook targeted advertisement for research recruitment. Copyright © 2016 Elsevier Inc. All rights reserved.
Cloke, Jonathan; Arizanova, Julia; Crabtree, David; Simpson, Helen; Evans, Katharine; Vaahtoranta, Laura; Palomäki, Jukka-Pekka; Artimo, Paulus; Huang, Feng; Liikanen, Maria; Koskela, Suvi; Chen, Yi
2016-01-01
The Thermo Scientific™ SureTect™ Listeria species Real-Time PCR Assay was certified during 2013 by the AOAC Research Institute (RI) Performance Tested Methods(SM) program as a rapid method for the detection of Listeria species from a wide range of food matrixes and surface samples. A method modification study was conducted in 2015 to extend the matrix claims of the product to a wider range of food matrixes. This report details the method modification study undertaken to extend the use of this PCR kit to the Applied Biosystems™ 7500 Fast PCR Instrument and Applied Biosystems RapidFinder™ Express 2.0 software, allowing use of the assay on a 96-well format PCR cycler in addition to the current workflow, which uses the 24-well Thermo Scientific PikoReal™ PCR Instrument and Thermo Scientific SureTect software. The method modification study presented in this report was assessed by the AOAC-RI as a level 2 method modification study, necessitating a method developer study on a representative range of food matrixes covering raw ground turkey, 2% fat pasteurized milk, and bagged lettuce, as well as stainless steel surface samples. All testing was conducted in comparison to the reference method detailed in International Organization for Standardization (ISO) 6579:2002. No significant difference by probability of detection statistical analysis was found between the SureTect Listeria species PCR Assay and the ISO reference method for any of the three food matrixes or the surface samples analyzed during the study.
NASA Astrophysics Data System (ADS)
Fallahi, Arya; Oswald, Benedikt; Leidenberger, Patrick
2012-04-01
We study a 3-dimensional, dual-field, fully explicit method for the solution of Maxwell's equations in the time domain on unstructured, tetrahedral grids. The algorithm uses the element-level time domain (ELTD) discretization of the electric and magnetic vector wave equations. In particular, the suitability of the method for the numerical analysis of nanometer-structured systems in the optical region of the electromagnetic spectrum is investigated. The details of the theory and its implementation as a computer code are introduced, and its convergence behavior as well as the conditions for stable time domain integration are examined. Here, we restrict ourselves to non-dispersive dielectric material properties, since dielectric dispersion will be treated in a subsequent paper. Analytically solvable problems are analyzed in order to benchmark the method. Finally, a dielectric microlens is considered to demonstrate the potential of the method. A flexible method of 2nd-order accuracy is obtained that is applicable to a wide range of nano-optical configurations and can be a serious competitor to more conventional finite difference time domain schemes, which operate only on hexahedral grids. The ELTD scheme can resolve geometries with a wide span of characteristic length scales and with the appropriate level of detail, using small tetrahedra where delicate, physically relevant details must be modeled.
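For contrast, the conventional competitor the abstract mentions, an explicit dual-field update on a regular grid, can be written in a few lines. This 1-D Yee-style leapfrog is an illustration of explicit, CFL-limited dual-field time stepping, not the ELTD scheme itself; units with c = ε = μ = 1 and the pulse placement are illustrative choices.

```python
import numpy as np

def fdtd_1d(n=200, steps=400, c=1.0):
    # Yee-style dual-field leapfrog on a uniform 1-D grid: E and H live on
    # staggered points and are advanced alternately (explicit time stepping).
    dx = 1.0
    dt = 0.99 * dx / c            # CFL condition keeps the explicit scheme stable
    E = np.zeros(n)
    H = np.zeros(n - 1)
    E[n // 2] = 1.0               # initial pulse; E = 0 at the ends (PEC walls)
    for _ in range(steps):
        H += dt / dx * (E[1:] - E[:-1])
        E[1:-1] += dt / dx * (H[1:] - H[:-1])
    return E, H
```

The same stability question (a time step bounded by the smallest cell) is what the ELTD paper must answer on tetrahedral grids, where cell sizes can span a wide range.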
ERIC Educational Resources Information Center
Feltis, Brooke B.; Powell, Martine B.; Snow, Pamela C.; Hughes-Scholes, Carolyn H.
2010-01-01
Objective: This study compared the effects of open-ended versus specific questions, and various types of open-ended questions, in eliciting story-grammar detail in child abuse interviews. Methods: The sample included 34 police interviews with child witnesses aged 5-15 years ("M" age = 9 years, 9 months). The interviewers' questions and their…
ERIC Educational Resources Information Center
Munson, Benjamin; Johnson, Julie M.; Edwards, Jan
2012-01-01
Purpose: This study examined whether experienced speech-language pathologists (SLPs) differ from inexperienced people in their perception of phonetic detail in children's speech. Method: Twenty-one experienced SLPs and 21 inexperienced listeners participated in a series of tasks in which they used a visual-analog scale (VAS) to rate children's…
Ndefo, Uche Anadu; Norman, Rolicia; Henry, Andrea
2017-01-01
Background: When initiated by a health plan, academic detailing can be used to change prescribing practices, which can lead to increased safety and savings. Objective: To evaluate the impact of academic detailing on the prescribing and prescription drug costs of cefixime to a health plan. Methods: A prospective intervention study was carried out that evaluated the prescribing practices and prescription drug costs of cefixime. A total of 11 prescribers were detailed by 1 pharmacist between August 2014 and March 2015. Two of the 11 prescribers did not respond to the academic detailing and were not followed up. The physicians' prescribing habits and prescription costs were compared before and after detailing to evaluate the effectiveness of the intervention. Data were collected for approximately 5 months before and after the intervention. Each prescriber served as his or her own control. Results: Overall, an approximate 36% reduction in the number of cefixime prescriptions written and an approximate 20% decrease in prescription costs were seen with academic detailing compared with the year before the intervention. In 9 of 11 (82%) prescribers, the intervention with academic detailing was successful and resulted in fewer prescriptions for cefixime during the study period. Conclusion: Academic detailing had a positive impact on prescribing by decreasing the number of cefixime prescriptions and lowering the drug costs to the health plan. PMID:28626509
How Seductive Are Decorative Elements in Learning Materials?
ERIC Educational Resources Information Center
Rey, Gunter Daniel
2012-01-01
The seductive detail effect arises when people learn more deeply from a multimedia presentation when interesting but irrelevant adjuncts are excluded. However, previous studies of this effect are rather inconclusive and contained various methodological problems. The present experiment attempted to overcome these problems. Undergraduate…
Comparing Three Methods for Teaching Newton's Third Law
ERIC Educational Resources Information Center
Smith, Trevor I.; Wittman, Michael C.
2007-01-01
Although guided-inquiry methods for teaching introductory physics have been individually shown to be more effective at improving conceptual understanding than traditional lecture-style instruction, researchers in physics education have not studied differences among reform-based curricula in much detail. Several researchers have developed…
MEASUREMENT OF INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPY MACHINES
The article provides background information on indoor air emissions from office equipment, with emphasis on dry-process photocopy machines. The test method is described in detail along with results of a study to evaluate the test method using four dry-process photocopy machines. ...
Promoting a smokers' quitline in Ontario, Canada: an evaluation of an academic detailing approach.
Kirst, Maritt; Schwartz, Robert
2015-06-01
This study assesses the impact of an academic detailing quitline promotional outreach program on integration of patient referrals to the quitline by fax in healthcare settings and quitline utilization in Ontario, Canada. The study employed a mixed methods approach for evaluation, with trend analysis of quitline administrative data from the year before program inception (2005) to 2011 and qualitative interviews with quitline stakeholders. Participants in the qualitative interviews included academic detailing program staff, regional tobacco control stakeholders and quitline promotion experts. Quantitative outcomes included the number of fax referral partners and fax referrals received, and quitline reach. Trends in proximal and distal outreach program outcomes were assessed. The qualitative data were analysed through a process of data coding involving the constant comparative technique derived from grounded theory methods. The study identified that the outreach program has had some success in integrating the fax referral program in healthcare settings through evidence of increased fax referrals since program inception. However, organizational barriers to program partner engagement have been encountered. While referral from health professionals through the fax referral programs has increased since the inception of the outreach program, the overall reach of the quitline has not increased. The study findings highlight that an academic detailing approach to quitline promotion can have some success in achieving increased fax referral program integration in healthcare settings. However, findings suggest that investment in a comprehensive promotional strategy, incorporating academic detailing, media and the provision of free cessation medications may be a more effective approach to quitline promotion. © The Author (2013). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Wang, Shuo; Poon, Gregory M K; Wilson, W David
2015-01-01
Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.
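The steady-state branch of such an SPR analysis can be sketched for the standard 1:1 binding model: the equilibrium response at each analyte concentration follows R_eq = R_max·C/(C + K_D), and a double-reciprocal plot recovers K_D and R_max. All rate constants and concentrations below are illustrative values, not data from the chapter.

```python
import numpy as np

# 1:1 binding model: KD = koff / kon; steady-state response per injection.
kon, koff, Rmax = 1e5, 1e-2, 100.0            # M^-1 s^-1, s^-1, response units
KD = koff / kon                                # 1e-7 M
conc = np.array([2e-8, 5e-8, 1e-7, 2e-7, 5e-7, 1e-6])   # analyte series (M)
Req = Rmax * conc / (conc + KD)                # noise-free equilibrium responses

# Double-reciprocal linearization: 1/Req = (KD/Rmax)(1/C) + 1/Rmax.
slope, intercept = np.polyfit(1.0 / conc, 1.0 / Req, 1)
Rmax_fit = 1.0 / intercept
KD_fit = slope * Rmax_fit
```

In practice one would fit the nonlinear isotherm (or the full association/dissociation kinetics) directly rather than linearize, since the reciprocal transform distorts noise; the sketch only shows how equilibrium responses encode the affinity.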
Prussin, Aaron J; Zigler, David F; Jain, Avijita; Brown, Jared R; Winkel, Brenda S J; Brewer, Karen J
2008-04-01
Methods for the study of DNA photocleavage are illustrated using a mixed-metal supramolecular complex [{(bpy)(2)Ru(dpp)}(2)RhCl(2)]Cl(5). The methods use supercoiled pUC18 plasmid as a DNA probe and either filtered light from a xenon arc lamp source or monochromatic light from a newly designed, high-intensity light-emitting diode (LED) array. Detailed methods for performing the photochemical experiments and analysis of the DNA photoproduct are delineated. Detailed methods are also given for building an LED array to be used for DNA photolysis experiments. The Xe arc source has a broad spectral range and high light flux. The LEDs have a high-intensity, nearly monochromatic output. Arrays of LEDs have the advantage of allowing tunable, accurate output to multiple samples for high-throughput photochemistry experiments at relatively low cost.
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. The means of the estimated failure rates provided quantitative data for fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
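The arithmetic behind "frequent step failure but rare overall failure" can be made concrete with a tiny fault-tree calculation; the gate structure and all probabilities below are illustrative, not the study's actual tree. Steps along one path combine through an OR gate (any step failing breaks the path), while workaround paths combine through an AND gate at the top (the system fails only if every path fails).

```python
import numpy as np

def path_failure(step_probs):
    # A path fails if any of its sequential steps fails (OR gate),
    # assuming independent step failures.
    p = np.asarray(step_probs, dtype=float)
    return 1.0 - np.prod(1.0 - p)

def system_failure(paths):
    # The system fails only if every alternative path (workaround) fails (AND gate).
    return float(np.prod([path_failure(p) for p in paths]))
```

For example, a primary path with step failure rates 0.2 and 0.1 fails 28% of the time, yet a single workaround that fails only 5% of the time drops overall system failure to 1.4%: frequent step failure, rare system failure, but extra time spent on workarounds.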
Individual Patterns of Enrolment in Primary Schools in the Republic of Honduras
ERIC Educational Resources Information Center
Sekiya, Takeshi
2014-01-01
The Reconstructed Cohort Method is often used to examine the status of national education. However, this method does not account for individual details and we know little about the status of school enrolments by tracking individual students from entrance until dropout or graduation. This study employs the True Cohort Method to analyse data for…
Home Brew Salinity Measuring Devices: Their Construction and Use.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This paper discusses several inexpensive methods of evaluating the salinity of seawater. One method is presented in some detail. This method has several attractive features. First, it can be used to provide instruction, not only in marine chemistry, but also in studying the mathematics of the point slope formula, and as an aid in teaching students…
Complex Langevin method: When can it be trusted?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aarts, Gert; Seiler, Erhard; Stamatescu, Ion-Olimpiu
2010-03-01
We analyze to what extent the complex Langevin method, which is in principle capable of solving the so-called sign problem, can be considered reliable. We give a formal derivation of its correctness and then point out various mathematical loopholes. The detailed study of some simple examples leads to practical suggestions about the application of the method.
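The kind of simple example such an analysis uses can be sketched with the complex Gaussian model: for the action S(z) = a z²/2 with Re(a) > 0, complexifying the variable and running Langevin dynamics with real noise should reproduce the exact expectation ⟨z²⟩ = 1/a. The step size and run lengths below are illustrative choices, not values from the paper.

```python
import numpy as np

# Complex Langevin for the Gaussian action S(z) = a z^2 / 2 with Re(a) > 0;
# the exact answer <z^2> = 1/a makes this a standard trust check.
rng = np.random.default_rng(1)
a = 1.0 + 1.0j
dt, n_therm, n_meas = 2e-3, 50_000, 1_000_000
noise = np.sqrt(2 * dt) * rng.normal(size=n_therm + n_meas)
z = 0.0 + 0.0j                 # the real variable is complexified: z = x + iy
acc = 0.0 + 0.0j
for i, xi in enumerate(noise):
    z += -a * z * dt + xi      # drift -dS/dz plus real noise
    if i >= n_therm:
        acc += z * z
mean_z2 = acc / n_meas         # should approach 1/a = 0.5 - 0.5j
```

In this linear case the complexified process is a stable Ornstein-Uhlenbeck flow and converges to the correct result; the loopholes the abstract refers to arise in nonlinear actions, where the distribution in the complexified plane can develop heavy tails and spoil exactly this kind of agreement.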
NASA Astrophysics Data System (ADS)
Pawłowicz, Joanna A.
2017-10-01
The TLS method (Terrestrial Laser Scanning) may replace traditional building survey methods, e.g. those requiring the use of measuring tapes or range finders. This technology allows for collecting digital data in the form of a point cloud, which can be used to create a 3D model of a building. In addition, it allows for collecting data with remarkable precision, which translates into the possibility to reproduce all architectural features of a building. These data are applied in reverse engineering to create a 3D model of an object existing in physical space. This study presents the results of research carried out using a point cloud to recreate the architectural features of a historical building with the application of reverse engineering. The research was conducted on a two-storey residential building with a basement and an attic. A veranda with a complicated wooden structure protrudes from the building's façade. The measurements were taken at medium and the highest resolution using a Leica ScanStation C10 laser scanner. The data obtained were processed using specialist software, which allowed for the application of reverse engineering, especially for reproducing the sculpted details of the veranda. Following digitization, all redundant data were removed from the point cloud and the cloud was subjected to modelling. For testing purposes, a selected part of the veranda was modelled by means of two methods: surface matching and Triangulated Irregular Network (TIN). Both modelling methods were applied to the data collected at medium and at the highest resolution. Creating a model based on data obtained at medium resolution, whether by surface matching or by the TIN method, does not allow for a precise recreation of architectural details. The study presents certain sculpted elements recreated from the highest-resolution data with a superimposed TIN, juxtaposed against a digital image. The resulting model is very precise.
Creating good models requires highly accurate field data. It is important to properly choose the distance between the measuring station and the measured object in order to keep the (horizontal and vertical) angles of incidence of the laser beam as close to perpendicular as possible. A model created from medium-resolution data offers very poor detail quality: only the bigger, basic elements of each detail are clearly visible, while the smaller ones are blurred. This is why, in order to obtain data sufficient to reproduce architectural details, laser scanning should be performed at the highest resolution. In addition, modelling by means of the surface matching method should be avoided; the TIN method is a better choice. Besides providing a realistic-looking visualization, the TIN method has one more important advantage: it is four times faster than the surface matching method.
Zero leakage separable and semipermanent ducting joints
NASA Technical Reports Server (NTRS)
Mischel, H. T.
1973-01-01
A study program has been conducted to explore new methods of achieving zero leakage, separable and semipermanent, ducting joints for space flight vehicles. The study consisted of a search of literature of existing zero leakage methods, the generation of concepts of new methods of achieving the desired zero leakage criteria and the development of detailed analysis and design of a selected concept. Other techniques of leak detection were explored with a view toward improving this area.
Tomographic assessment of the spine in children with spondylocostal dysostosis syndrome
Kaissi, Ali Al; Klaushofer, Klaus; Grill, Franz
2010-01-01
OBJECTIVE: The aim of this study was to perform a detailed tomographic analysis of the skull base, craniocervical junction, and the entire spine in seven patients with spondylocostal dysostosis syndrome. METHOD: Detailed scanning images were organized in accordance with the most prominent clinical pathology. The causes of plagiocephaly, torticollis, short immobile neck, scoliosis, and rigid back were identified. Radiography proved to be an insufficient imaging modality. RESULTS: Detailed computed tomography scans provided excellent delineation of the pattern of osseous abnormalities in our patients. CONCLUSION: This article throws light on the most serious osseous manifestations of spondylocostal dysostosis syndrome. PMID:21120293
Auricular Acupuncture with Laser
Bahr, Frank
2013-01-01
Auricular acupuncture is a method which has been successfully used in various fields of medicine, especially in the treatment of pain. The introduction of lasers, especially low-level lasers, into medicine added a further stimulation technique to auricular acupuncture, alongside the existing stimulation with needles and electricity. This literature review looks at the historical background, the development, and the anatomical and neurological aspects of auricular acupuncture in general and auricular laser acupuncture in detail. Preliminary scientific findings on auricular acupuncture with laser are described in detail and discussed critically in this review article. The results of the studies show evidence of an effect of auricular laser acupuncture. However, a comparison of these studies was impossible due to their different study designs. The most important technical and study parameters are described in detail in order to provide more sufficient evidence and to improve the quality of future studies. PMID:23935695
Assessing the reliability of ecotoxicological studies: An overview of current needs and approaches.
Moermond, Caroline; Beasley, Amy; Breton, Roger; Junghans, Marion; Laskowski, Ryszard; Solomon, Keith; Zahner, Holly
2017-07-01
In general, reliable studies are well designed and well performed, and enough details on study design and performance are reported to assess the study. For hazard and risk assessment in various legal frameworks, many different types of ecotoxicity studies need to be evaluated for reliability. These studies vary in study design, methodology, quality, and level of detail reported (e.g., reviews, peer-reviewed research papers, or industry-sponsored studies documented under Good Laboratory Practice [GLP] guidelines). Regulators have the responsibility to make sound and verifiable decisions and should evaluate each study for reliability in accordance with scientific principles regardless of whether they were conducted in accordance with GLP and/or standardized methods. Thus, a systematic and transparent approach is needed to evaluate studies for reliability. In this paper, 8 different methods for reliability assessment were compared using a number of attributes: categorical versus numerical scoring methods, use of exclusion and critical criteria, weighting of criteria, whether methods are tested with case studies, domain of applicability, bias toward GLP studies, incorporation of standard guidelines in the evaluation method, number of criteria used, type of criteria considered, and availability of guidance material. Finally, some considerations are given on how to choose a suitable method for assessing reliability of ecotoxicity studies. Integr Environ Assess Manag 2017;13:640-651. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
NASA Astrophysics Data System (ADS)
Sedukhin, V. V.; Anikeev, A. N.; Chumanov, I. V.
2017-11-01
This work examines a method for optimizing the hardening of the working layer of parts operating under highly abrasive conditions: a blend of refractory WC and TiC particles in a 70/30 wt.% ratio is prepared in advance and applied to the polystyrene pattern in the casting mould. Metal is then poured into the mould and held for crystallization, after which the castings are studied. Examination of the macro- and microstructure of the resulting samples shows that the thickness and structure of the hardened layer depend on the duration of the interaction between the carbide blend and the liquid metal. Under identical conditions, differently dispersed particles interact with the matrix metal in different ways. The abrasive wear resistance of the resulting materials was tested in the laboratory by the residual-mass method. The wear-resistance results show that producing a hard coating from a blend of tungsten carbide and titanium carbide, applied to the surface of the foam polystyrene pattern before moulding, yields parts whose surface wear resistance is 2.5 times higher than that of parts made of the same steel without a coating, while the energy required to reduce a unit mass of the hardened layer to powder is 2.06 times higher than for the uncoated material.
Stochastic reconstructions of spectral functions: Application to lattice QCD
NASA Astrophysics Data System (ADS)
Ding, H.-T.; Kaczmarek, O.; Mukherjee, Swagato; Ohno, H.; Shu, H.-T.
2018-05-01
We present a detailed study of the application of two stochastic approaches, the stochastic optimization method (SOM) and stochastic analytical inference (SAI), to the extraction of spectral functions from Euclidean correlation functions. SOM has the advantage that it does not require prior information. SAI, on the other hand, is a more general method based on Bayesian inference. Under a mean-field approximation SAI reduces to the often-used maximum entropy method (MEM), and for a specific choice of the prior SAI becomes equivalent to SOM. To test the applicability of these two stochastic methods to lattice QCD, we first apply them to various reasonably chosen model correlation functions and present detailed comparisons of the reconstructed spectral functions obtained from SOM, SAI, and MEM. Next, we present similar studies for charmonium correlation functions obtained from lattice QCD computations using clover-improved Wilson fermions on large, fine, isotropic lattices at 0.75 and 1.5 Tc, Tc being the deconfinement transition temperature of a pure gluon plasma. We find that SAI and SOM give results consistent with those from MEM at these two temperatures.
Screening Tools to Estimate Mold Burdens in Homes
Objective: The objective of this study was to develop screening tools that could be used to estimate the mold burden in a home which would indicate whether more detailed testing might be useful. Methods: Previously, in the American Healthy Home Survey, a DNA-based method of an...
Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S
2001-11-15
Detailed quantitative analysis of lignin monomer composition (p-coumaryl, coniferyl, and sinapyl alcohols, and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde) in plants has remained difficult, mainly because of artifact formation during the lignin isolation procedure, partial loss of lignin components inherent in chemical degradative methods, and the difficulty of interpreting the complex spectra generally observed for the lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. The lignin acetylation procedure helps prevent the secondary formation of cinnamaldehydes from the corresponding alcohol forms during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks in the pyrograms of the acetylated samples, lignin monomer compositions in various dehydrogenative polymers (DHPs), used as lignin model compounds, were determined, taking even minor components such as cinnamaldehydes into consideration. The compositions observed by Py-GC were in good agreement with the lignin monomer contents supplied during DHP synthesis. The new Py-GC method combined with sample preacetylation allowed accurate quantitative analysis of detailed lignin monomer composition using microgram quantities of extractive-free plant samples.
A detail-preserved and luminance-consistent multi-exposure image fusion algorithm
NASA Astrophysics Data System (ADS)
Wang, Guanquan; Zhou, Yue
2018-04-01
When irradiance across a scene varies greatly, we can hardly get an image of the scene without over- or underexposed areas, because of the constraints of cameras. Multi-exposure image fusion (MEF) is an effective way to deal with this problem by fusing multi-exposure images of a static scene. A novel MEF method is described in this paper. In the proposed algorithm, coarser-scale luminance consistency is preserved by contribution adjustment using the luminance information between blocks, and a detail-preserving smoothing filter stitches blocks smoothly without losing detail. Experimental results show that the proposed method performs well in preserving both luminance consistency and detail.
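For orientation, a generic per-pixel well-exposedness fusion baseline can be sketched as follows; this is a conventional MEF building block, not the block-based algorithm proposed in the paper, and the Gaussian weighting around mid-gray with sigma = 0.2 is an illustrative assumption:

```python
import numpy as np

def fuse_exposures(stack, sigma=0.2):
    """Per-pixel well-exposedness fusion of grayscale exposures in [0, 1].

    Pixels near mid-gray (0.5) in any exposure get the largest weight,
    so well-exposed content dominates the fused result.
    """
    stack = np.asarray(stack, dtype=float)            # shape: (n_exposures, H, W)
    w = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2)) + 1e-12
    return (w * stack).sum(axis=0) / w.sum(axis=0)

# Toy stack: an under-, a well-, and an over-exposed frame
stack = np.stack([np.full((4, 4), 0.1), np.full((4, 4), 0.5), np.full((4, 4), 0.9)])
fused = fuse_exposures(stack)
```

Pixel-wise weighting like this tends to produce luminance inconsistencies across regions, which is exactly the problem the paper's block-level contribution adjustment targets.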
Studying DNA in the Classroom.
ERIC Educational Resources Information Center
Zarins, Silja
1993-01-01
Outlines a workshop for teachers that illustrates a method of extracting DNA and provides instructions on how to do some simple work with DNA without sophisticated and expensive equipment. Provides details on viscosity studies and breaking DNA molecules. (DDR)
Numerical optimization of conical flow waveriders including detailed viscous effects
NASA Technical Reports Server (NTRS)
Bowcutt, Kevin G.; Anderson, John D., Jr.; Capriotti, Diego
1987-01-01
A family of optimized hypersonic waveriders is generated and studied wherein detailed viscous effects are included within the optimization process itself. This is in contrast to previous optimized waverider work, wherein purely inviscid flow is used to obtain the waverider shapes. For the present waveriders, the undersurface is a streamsurface of an inviscid conical flowfield, the upper surface is a streamsurface of the inviscid flow over a tapered cylinder (calculated by the axisymmetric method of characteristics), and the viscous effects are treated by integral solutions of the boundary layer equations. Transition from laminar to turbulent flow is included within the viscous calculations. The optimization is carried out using a nonlinear simplex method. The resulting family of viscous hypersonic waveriders yields predicted high values of lift/drag, high enough to break the L/D barrier based on experience with other hypersonic configurations. Moreover, the numerical optimization process for the viscous waveriders results in distinctly different shapes compared to previous work with inviscid-designed waveriders. Also, the fine details of the viscous solution, such as how the shear stress is distributed over the surface, and the location of transition, are crucial to the details of the resulting waverider geometry. Finally, the moment coefficient variations and heat transfer distributions associated with the viscous optimized waveriders are studied.
Becoming a Teacher Educator: A Self-Study of the Use of Inquiry in a Mathematics Methods Course
ERIC Educational Resources Information Center
Marin, Katherine Ariemma
2014-01-01
This article details the self-study of a beginning teacher educator in her first experience in teaching a mathematics methods course. The transition from teacher to teacher educator is explored through the experience of a course focused on inquiry. Inquiry is embedded within the course from two perspectives: mathematical inquiry and teaching as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao Hanqing; Fu Zhiguo; Lu Xiaoguang
Guided by sedimentation theory and knowledge of modern and ancient fluvial deposition, and utilizing the abundant information on sedimentary series, microfacies types, and petrophysical parameters from the well-logging curves of thousands of closely spaced wells located in a large area, a new method for establishing detailed sedimentation and permeability distribution models for fluvial reservoirs has been developed successfully. This study addressed the geometry and internal architecture of sandbodies in accordance with their hierarchical levels of heterogeneity, building up sedimentation and permeability distribution models of fluvial reservoirs and describing the reservoir heterogeneity in light of fluvial sedimentary rules. The results and methods obtained in outcrop and modern sedimentation studies have successfully supported this work. Taking advantage of this method, the major producing layers (PI1-2), which have been considered heterogeneous, thick fluvial reservoirs extending widely laterally, were researched in detail. These layers are subdivided vertically into single sedimentary units and the microfacies are identified horizontally. Furthermore, a complex system is recognized according to its hierarchical levels from large to small: meander belt, single channel sandbody, meander scroll, point bar, and lateral accretion bodies of point bars. The results improved the description of the areal distribution of point-bar sandbodies and provide an accurate and detailed framework for establishing a high-resolution predictive model. Combined with geostatistical techniques, the method also plays an important role in searching for enriched zones of residual oil distribution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilgu, Muslum
A detailed study was done of the neomycin-B RNA aptamer to determine its selectivity and binding to both neomycin- and kanamycin-class aminoglycosides. A novel method to increase drug concentrations in cells for more efficient killing is described. To test the method, a bacterial model system was adopted and several small RNA molecules interacting with aminoglycosides were cloned downstream of the T7 RNA polymerase promoter in an expression vector. The growth of E. coli expressing the aptamers was then monitored over a 12-hour period. Our analysis indicated that the aptamers helped to increase the intracellular concentration of aminoglycosides, thereby increasing their efficacy.
Jacques, Eveline; Wells, Darren M; Bennett, Malcolm J; Vissenberg, Kris
2015-01-01
High-resolution imaging of cytoskeletal structures paves the way for standardized methods to quantify cytoskeletal organization. Here we provide a detailed description of the analysis performed to determine the microtubule patterns in gravistimulated roots, using the recently developed software tool MicroFilament Analyzer.
Applied Epistemology and Understanding in Information Studies
ERIC Educational Resources Information Center
Gorichanaz, Tim
2017-01-01
Introduction: Applied epistemology allows information studies to benefit from developments in philosophy. In information studies, epistemic concepts are rarely considered in detail. This paper offers a review of several epistemic concepts, focusing on understanding, as a call for further work in applied epistemology in information studies. Method:…
[Psychophysiological selection: status and prospects].
Gurovskiĭ, N N; Novikov, M A
1981-01-01
The major stages in the development of psychophysiological selection of cosmonauts in the USSR are discussed. The psychophysiological selection was originally based on the data of psychoneurological expertise of the flight personnel and achievements of aviation psychology in the USSR. This was followed by the development of psychophysiological research, using instrumentation and simulation flights. Further complication of flight programs and participation of non-pilot cosmonauts (engineers, scientists) necessitated detailed study of personality properties and application of personality tests. At the present stage in the development of psychophysiological selection great importance is attached to the biorhythmological selection and methods for studying man's capabilities to control his own emotional, behavioral and autonomic reactions as well as environmental parameters. The review also discusses in detail methods of group selection and problems of rational selection of space crews.
ERIC Educational Resources Information Center
Kingsley, Barbara E.; Robertson, Julia M.
2017-01-01
As a fundamental element of any psychology degree, the teaching and learning of research methods is repeatedly brought into sharp focus, and it is often regarded as a real challenge by undergraduate students. The reasons for this are complex, but frequently attributed to an aversion of maths. To gain a more detailed understanding of students'…
Lattice quantum chromodynamical approach to nuclear physics
NASA Astrophysics Data System (ADS)
Aoki, Sinya; Doi, Takumi; Hatsuda, Tetsuo; Ikeda, Yoichi; Inoue, Takashi; Ishii, Noriyoshi; Murano, Keiko; Nemura, Hidekatsu; Sasaki, Kenji; HAL QCD Collaboration
2012-09-01
We review recent progress in the HAL QCD method, which was recently proposed to investigate hadron interactions in lattice quantum chromodynamics (QCD). The strategy to extract the energy-independent non-local potential in lattice QCD is explained in detail. The method is applied to study nucleon-nucleon, nucleon-hyperon, hyperon-hyperon, and meson-baryon interactions. Several extensions of the method are also discussed.
[Application of numerical convolution in in vivo/in vitro correlation research].
Yue, Peng
2009-01-01
This paper introduced the concept and principle of in vivo/in vitro correlation (IVIVC) and convolution/deconvolution methods, and elucidated in detail the convolution strategy and the method for calculating the in vivo absorption of a pharmaceutical formulation from its pharmacokinetic data in Excel, then applied the results to IVIVC research. First, the pharmacokinetic data were fitted with mathematical software to fill in missing points. Second, the parameters of the optimal fitted input function were determined by a trial-and-error method according to the convolution principle in Excel, under the hypothesis that all input functions follow Weibull functions. Finally, the IVIVC between the in vivo input function and the in vitro dissolution was studied. In the examples, not only was the application of this method demonstrated in detail, but its simplicity and effectiveness were also shown by comparison with the compartment-model method and the deconvolution method. It proved to be a powerful tool for IVIVC research.
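The convolution step described above can be sketched outside Excel as well; the following is a minimal illustration (with a hypothetical elimination rate constant, Weibull parameters, and time grid, none taken from the paper) of convolving a Weibull-type input rate with a unit impulse response to predict a plasma concentration profile:

```python
import numpy as np

dt = 0.05                      # time step in hours (hypothetical grid)
t = np.arange(0.0, 24.0, dt)   # 24-hour profile

# Hypothetical unit impulse response: one-compartment elimination, k = 0.3 /h
k = 0.3
uir = np.exp(-k * t)

# Hypothetical Weibull cumulative input F(t) and its rate (in vivo absorption)
td, b = 2.0, 1.5
F = 1.0 - np.exp(-((t / td) ** b))
rate = np.gradient(F, dt)

# Numerical convolution of input rate with impulse response -> predicted profile
conc = np.convolve(rate, uir)[: len(t)] * dt
```

Since the total input here integrates to 1 dose-unit, the area under the predicted curve approaches 1/k, which gives a quick sanity check on the discretization.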
Advances in Statistical Methods for Substance Abuse Prevention Research
MacKinnon, David P.; Lockwood, Chondra M.
2010-01-01
The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
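As one concrete example of the mediation analysis mentioned above, the standard product-of-coefficients estimate of an indirect effect can be sketched on simulated data; the effect sizes and sample size here are invented for illustration and are not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Simulated prevention-style data: program X affects mediator M, which affects outcome Y
x = rng.integers(0, 2, n).astype(float)          # 0 = control, 1 = program
m = 0.5 * x + rng.standard_normal(n)             # a-path (true a = 0.5)
y = 0.4 * m + 0.1 * x + rng.standard_normal(n)   # b-path (true b = 0.4), direct effect 0.1

# a: regress M on X; b: regress Y on M, controlling for X
a = np.linalg.lstsq(np.column_stack([np.ones(n), x]), m, rcond=None)[0][1]
b = np.linalg.lstsq(np.column_stack([np.ones(n), x, m]), y, rcond=None)[0][2]

indirect = a * b   # mediated (indirect) effect; true value is 0.5 * 0.4 = 0.2
```

In practice the point estimate would be accompanied by a confidence interval (e.g. from bootstrapping), in line with the paper's emphasis on interval estimation over bare hypothesis tests.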
Trends in study design and the statistical methods employed in a leading general medicine journal.
Gosho, M; Sato, Y; Nagashima, K; Takahashi, S
2018-02-01
Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate study design and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Due to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (eg Kaplan-Meier estimator and Cox regression model) were most frequently applied, the Gray test and Fine-Gray proportional hazard model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. These methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential design with interim analyses was one of the standard designs, and novel design, such as adaptive dose selection and sample size re-estimation, was sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in the light of the information found in some publications. 
Use of adaptive designs with interim analyses has been increasing since the publication of the FDA guidance on adaptive design. © 2017 John Wiley & Sons Ltd.
Mori, Yoshiharu; Okumura, Hisashi
2015-12-05
Simulated tempering (ST) is a useful method to enhance the sampling of molecular simulations. When ST is used, the Metropolis algorithm, which satisfies the detailed balance condition, is usually applied to calculate the transition probability. Recently, an alternative method that satisfies the global balance condition instead of the detailed balance condition was proposed by Suwa and Todo. In this study, an ST method with the Suwa-Todo algorithm is proposed. Molecular dynamics simulations with ST are performed with three algorithms (the Metropolis, heat bath, and Suwa-Todo algorithms) for calculating the transition probability. Among the three, the Suwa-Todo algorithm yields the highest acceptance ratio and the shortest autocorrelation time, suggesting that sampling by an ST simulation with the Suwa-Todo algorithm is the most efficient. In addition, because the acceptance ratio of the Suwa-Todo algorithm is higher than that of the Metropolis algorithm, the number of temperature states can be reduced by 25% for the Suwa-Todo algorithm when compared with the Metropolis algorithm. © 2015 Wiley Periodicals, Inc.
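The Metropolis transition probability for a simulated-tempering temperature move can be sketched as follows; the function interface and the weight factors g_m below are illustrative assumptions, not the paper's implementation:

```python
import math
import random

def st_accept(energy, beta_old, beta_new, g_old, g_new):
    """Metropolis acceptance for a simulated-tempering temperature switch.

    Satisfies detailed balance for the extended distribution
    p(x, m) ~ exp(-beta_m * E(x) + g_m), where g_m are the ST weight factors.
    """
    delta = -(beta_new - beta_old) * energy + (g_new - g_old)
    return delta >= 0.0 or random.random() < math.exp(delta)
```

A move to a higher temperature (smaller beta) at positive energy is always accepted here; downward moves are accepted with probability exp(delta), which is the detailed-balance condition the Suwa-Todo algorithm relaxes in favor of global balance.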
Review of digital holography reconstruction methods
NASA Astrophysics Data System (ADS)
Dovhaliuk, Rostyslav Yu.
2018-01-01
The development of digital holography has opened new ways for the non-destructive study of both transparent and opaque objects. In this paper, the digital hologram reconstruction process is investigated. The advantages and limitations of common wave propagation methods are discussed. The details of a software implementation of digital hologram reconstruction methods are presented. Finally, the performance of each wave propagation method is evaluated, and recommendations about possible use cases for each of them are given.
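As a concrete example of one common wave propagation method reviewed in this area, the angular spectrum method can be sketched as follows; the grid size, pixel pitch, and wavelength in the usage are arbitrary assumptions, not values from the paper:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field over distance z (angular spectrum method).

    field: 2-D complex array; dx: pixel pitch; all lengths in the same units.
    """
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=dx)               # spatial frequencies, cycles/unit
    fy = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    # Transfer function; evanescent components (arg <= 0) are suppressed
    H = np.where(arg > 0,
                 np.exp(2j * np.pi * (z / wavelength) * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Because the transfer function is a pure phase for propagating components, propagation by z followed by -z recovers the input field, a useful correctness check when implementing reconstruction code.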
Eroğlu, S
2010-02-01
In this study, the frequency of bridging of the hypoglossal canal was investigated on 324 skulls belonging to 10 ancient Anatolian populations recovered from various archaeological sites and dated from Early Bronze Age to the first quarter of the 20th century. The change in the frequency of bridging trait in the hypoglossal canal that has already been recorded according to both the traditional method (absent or present) and the graded method (0-5) was analysed here in relationship to age, sex, skull side and population. The results revealed no significant relation between the bridging of hypoglossal canal and age or sex. Both recording methods showed that the studied samples of ancient Anatolian populations exhibited a homogenous structure and they were found to differ considerably from other populations which inhabited lands other than Anatolia. This indicates that these two recording methods produce similar results in comparing populations. The differences between the sides were found to be significant with the detailed recording method as opposed to the dichotomous method. This asymmetry emerging with the detailed recording method is considered to be important in determining the effect of environmental factors upon the trait. Copyright (c) 2010 Elsevier GmbH. All rights reserved.
A Review on Microdialysis Calibration Methods: the Theory and Current Related Efforts.
Kho, Chun Min; Enche Ab Rahim, Siti Kartini; Ahmad, Zainal Arifin; Abdullah, Norazharuddin Shah
2017-07-01
Microdialysis is a sampling technique first introduced in the late 1950s. Although this technique was originally designed to study endogenous compounds in the animal brain, it was later modified for use in other organs. Additionally, microdialysis is not only able to collect unbound concentrations of compounds from tissue sites; the technique can also be used to deliver exogenous compounds to a designated area. Due to its versatility, the microdialysis technique is widely employed in a number of areas, including biomedical research. However, for most in vivo studies, the concentration of a substance obtained directly by microdialysis does not accurately describe the concentration of the substance on-site. In order to relate the results collected from microdialysis to the actual in vivo condition, a calibration method is required. To date, various microdialysis calibration methods have been reported, with each method capable of providing valuable insights into the technique itself and its applications. This paper aims to provide a critical review of the various calibration methods used in microdialysis applications, starting with a detailed description of the microdialysis technique itself. The various calibration methods employed are reviewed in detail, with examples of related work including clinical efforts, plus the advantages and disadvantages of each method.
Hand, Carri; Huot, Suzanne; Laliberte Rudman, Debbie; Wijekoon, Sachindri
2017-06-01
Research exploring how places shape and interact with the lives of aging adults must be grounded in the places where aging adults live and participate. Combined participatory geospatial and qualitative methods have the potential to illuminate the complex processes enacted between person and place to create much-needed knowledge in this area. The purpose of this scoping review was to identify methods that can be used to study person-place relationships among aging adults and their neighborhoods by determining the extent and nature of research with aging adults that combines qualitative methods with participatory geospatial methods. A systematic search of nine databases identified 1,965 articles published from 1995 to late 2015. We extracted data and assessed whether the geospatial and qualitative methods were supported by a specified methodology, the methods of data analysis, and the extent of integration of geospatial and qualitative methods. Fifteen studies were included and used the photovoice method, global positioning system tracking plus interview, or go-along interviews. Most included articles provided sufficient detail about data collection methods, yet limited detail about methodologies supporting the study designs and/or data analysis. Approaches that combine participatory geospatial and qualitative methods are beginning to emerge in the aging literature. By more explicitly grounding studies in a methodology, better integrating different types of data during analysis, and reflecting on methods as they are applied, these methods can be further developed and utilized to provide crucial place-based knowledge that can support aging adults' health, well-being, engagement, and participation. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Pisoni, Ronald L; Bieber, Brian A; Al Wakeel, Jamal; Al Arrayed, Sameer; Alkandari, Naser; Hassan, Mohamed; Karkar, Ayman; Al Lawati, Nabil M; Al Ali, Fadwa; Albert, Justin M; Robinson, Bruce M
2016-11-01
The Dialysis Outcomes and Practice Patterns Study (DOPPS) is an international prospective cohort study of the relationships between hemodialysis (HD) care practices and HD patient outcomes. The DOPPS began in 1996, in the United States, and has since expanded to 21 countries, collecting detailed data from >75,000 HD patients, with >200 scientific publications, focused on describing HD practices associated with improved HD patient outcomes. The goal of DOPPS is to help HD patients "live better and live longer." Starting in 2012, the DOPPS was able to expand to all six of the Gulf Cooperation Council (GCC) countries, namely, Bahrain, Kuwait, Oman, Qatar, Saudi Arabia, and the United Arab Emirates. The DOPPS study design consists of selecting HD facilities for study participation in each country to represent the different types of HD facilities and geographic regions within each GCC country. Within each study site, HD patients were randomly selected for detailed data collection to represent the HD practices within each participating HD facility. Altogether, 41 HD facilities have participated in the GCC-DOPPS Phase 5 study including 20 facilities from Saudi Arabia, nine from the United Arab Emirates, four each from Kuwait and Oman, two from Qatar, and one from Bahrain. Herein, we provide a detailed description of the study design and methods, data collection, study management, scientific investigator oversight and guidance, and study governance and support for the GCC-DOPPS Phase 5 study.
The ratio method: A new tool to study one-neutron halo nuclei
Capel, Pierre; Johnson, R. C.; Nunes, F. M.
2013-10-02
Recently a new observable to study halo nuclei was introduced, based on the ratio between breakup and elastic angular cross sections. Analysis of specific reactions shows this new observable to be independent of the reaction mechanism and to provide nuclear-structure information about the projectile. Here we explore the details of this ratio method, including its sensitivity to the binding energy and angular momentum of the projectile. We also study the reliability of the method as a function of breakup energy. Lastly, we provide guidelines and specific examples for experimentalists who wish to apply this method.
Method for rapid estimation of scour at highway bridges based on limited site data
Holnbeck, S.R.; Parrett, Charles
1997-01-01
Limited site data were used to develop a method for rapid estimation of scour at highway bridges. The estimates can be obtained in a matter of hours rather than several days as required by more-detailed methods. Such a method is important because scour assessments are needed to identify scour-critical bridges throughout the United States. Using detailed scour-analysis methods and scour-prediction equations recommended by the Federal Highway Administration, the U.S. Geological Survey, in cooperation with the Montana Department of Transportation, obtained contraction, pier, and abutment scour-depth data for sites from 10 States. The data were used to develop relations between scour depth and hydraulic variables that can be rapidly measured in the field. Relations between scour depth and hydraulic variables, in the form of envelope curves, were based on simpler forms of detailed scour-prediction equations. To apply the rapid-estimation method, a 100-year recurrence interval peak discharge is determined, and bridge-length data are used in the field with graphs relating unit discharge to velocity and velocity to bridge backwater as a basis for estimating flow depths and other hydraulic variables that can then be applied using the envelope curves. The method was tested in the field. Results showed good agreement among individuals involved and with results from more-detailed methods. Although useful for identifying potentially scour-critical bridges, the method does not replace more-detailed methods used for design purposes. Use of the rapid-estimation method should be limited to individuals having experience in bridge scour, hydraulics, and flood hydrology, and some training in use of the method.
Flow in curved ducts of varying cross-section
NASA Astrophysics Data System (ADS)
Sotiropoulos, F.; Patel, V. C.
1992-07-01
Two numerical methods for solving the incompressible Navier-Stokes equations are compared with each other by applying them to calculate laminar and turbulent flows through curved ducts of regular cross-section. Detailed comparisons, between the computed solutions and experimental data, are carried out in order to validate the two methods and to identify their relative merits and disadvantages. Based on the conclusions of this comparative study a numerical method is developed for simulating viscous flows through curved ducts of varying cross-sections. The proposed method is capable of simulating the near-wall turbulence using fine computational meshes across the sublayer in conjunction with a two-layer k-epsilon model. Numerical solutions are obtained for: (1) a straight transition duct geometry, and (2) a hydroturbine draft-tube configuration at model scale Reynolds number for various inlet swirl intensities. The report also provides a detailed literature survey that summarizes all the experimental and computational work in the area of duct flows.
NASA Astrophysics Data System (ADS)
Tokarczyk, Piotr; Leitao, Joao Paulo; Rieckermann, Jörg; Schindler, Konrad; Blumensaat, Frank
2015-04-01
Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment particularly against the background of a changing climate and an increasing urbanization. These models typically rely on high-quality data for rainfall and surface characteristics of the area. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases as, in many parts of the globe, accurate land-use information is generally lacking because detailed image data are unavailable. Modern unmanned air vehicles (UAVs) allow acquiring high-resolution images on a local level at comparably lower cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored for the purpose of the study. In this study, we investigate for the first time the possibility to derive high-resolution imperviousness maps for urban areas from UAV imagery and to use this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence that different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual sub-catchments regarding relevant attributes, such as peak runoff and volume.
Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated using UAV imagery processed with modern classification methods achieve accuracy comparable with standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps only have a limited influence on modelled surface runoff and pipe flows. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility to flexibly acquire up-to-date aerial images at a superior quality and a competitive price. Our analyses furthermore suggest that spatially more detailed urban drainage models can even better benefit from the full detail of UAV imagery.
High-quality observation of surface imperviousness for urban runoff modelling using UAV imagery
NASA Astrophysics Data System (ADS)
Tokarczyk, P.; Leitao, J. P.; Rieckermann, J.; Schindler, K.; Blumensaat, F.
2015-01-01
Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment particularly against the background of a changing climate and an increasing urbanization. These models typically rely on high-quality data for rainfall and surface characteristics of the area. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases as, in many parts of the globe, accurate land-use information is generally lacking because detailed image data are unavailable. Modern unmanned air vehicles (UAVs) allow acquiring high-resolution images on a local level at comparably lower cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored for the purpose of the study. In this study, we investigate for the first time the possibility to derive high-resolution imperviousness maps for urban areas from UAV imagery and to use this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence that different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual subcatchments regarding relevant attributes, such as peak runoff and volume.
Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated using UAV imagery processed with modern classification methods achieve accuracy comparable with standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps only have a limited influence on modelled surface runoff and pipe flows. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications due to the possibility to flexibly acquire up-to-date aerial images at a superior quality and a competitive price. Our analyses furthermore suggest that spatially more detailed urban drainage models can even better benefit from the full detail of UAV imagery.
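As a back-of-the-envelope illustration of why imperviousness accuracy matters for such models, the following sketch propagates two hypothetical imperviousness estimates (stand-ins for UAV-derived and aerial-derived maps; all numbers are invented) through a simple rational-method-style runoff calculation. It is not the urban drainage model used in the study.

```python
def runoff_volume(rain_mm, area_m2, imperviousness):
    """Effective runoff volume in m^3, assuming only impervious
    surfaces contribute and losses elsewhere are total."""
    return (rain_mm / 1000.0) * area_m2 * imperviousness

# Two hypothetical sub-catchments with imperviousness fractions as
# they might be derived from UAV vs. standard aerial imagery.
subcatchments = [
    {"area_m2": 12000.0, "imp_uav": 0.62, "imp_aerial": 0.58},
    {"area_m2": 8500.0,  "imp_uav": 0.41, "imp_aerial": 0.45},
]

rain_mm = 20.0  # illustrative event rainfall depth
for sc in subcatchments:
    v_uav = runoff_volume(rain_mm, sc["area_m2"], sc["imp_uav"])
    v_aer = runoff_volume(rain_mm, sc["area_m2"], sc["imp_aerial"])
    print(f"UAV: {v_uav:.1f} m^3, aerial: {v_aer:.1f} m^3")
```

Under this linear relation, a few percentage points of imperviousness error translate directly into a proportional runoff-volume error per sub-catchment, which is the quantity the end-to-end comparison above evaluates.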
Exposure studies rely on detailed characterization of air quality, either from sparsely located routine ambient monitors or from central monitoring sites that may lack spatial representativeness. Alternatively, some studies use models of various complexities to characterize local...
DOT National Transportation Integrated Search
2008-01-01
This research involved a detailed laboratory study of a new test method for evaluating road base materials based on the strength of the soil binder. In this test method, small test specimens (5.0 in. length and 0.75 in. square cross section) of binde...
USDA-ARS?s Scientific Manuscript database
Geometrical isomers of carotenoids behave differently in aspects like stability towards oxidants, bioavailability, vitamin A activity and specificity for enzymes. The availability of HPLC methods for their detailed profiling is therefore advisable to expand our knowledge on their metabolism and biol...
Filtering of high noise breast thermal images using fast non-local means.
Suganthi, S S; Ramakrishnan, S
2014-01-01
Analyses of breast thermograms are still a challenging task, primarily due to limitations such as low contrast, low signal-to-noise ratio and absence of clear edges. Preprocessing is therefore always required before any quantitative analysis. In this work, a noise removal framework using the fast non-local means algorithm, method noise and a median filter was used to denoise breast thermograms. The images considered were subjected to the Anscombe transformation to convert the noise distribution from Poisson to Gaussian. The pre-denoised image was obtained by subjecting the transformed image to fast non-local means filtering. The method noise, which is the difference between the original and the pre-denoised image, was observed to contain the noise component merged with a few structures and fine details of the image. The image details present in the method noise were extracted by smoothing the noise part using the median filter. The retrieved image details were added to the pre-denoised image to obtain the final denoised image. The performance of this technique was compared with that of the Wiener and SUSAN filters. The results show that all the filters considered are able to remove the noise component. The proposed denoising framework performs well in preserving detail while removing noise, and its method noise contains negligible image details. The Wiener filter produced a denoised image with no noise but smoothed edges, and its method noise contained a few structures and image details. The SUSAN filter produced a blurred denoised image with little noise, and its method noise contained extensive structures and image details. Hence, the proposed denoising framework is able to preserve edge information and generate a clear image that could help enhance the diagnostic relevance of breast thermograms.
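The pipeline described in this abstract (Anscombe transform, non-local means, method noise, median filter, add-back) can be sketched as below. This is a minimal reference implementation, not the authors' code: a slow plain-NumPy non-local means stands in for the fast variant, and the patch, search-window and smoothing parameters are illustrative assumptions.

```python
import numpy as np

def anscombe(x):
    # Variance-stabilising transform: Poisson noise -> approx. unit-variance Gaussian
    return 2.0 * np.sqrt(np.asarray(x, dtype=float) + 3.0 / 8.0)

def inv_anscombe(y):
    # Simple algebraic inverse (slightly biased at low counts; fine for a sketch)
    return (y / 2.0) ** 2 - 3.0 / 8.0

def nl_means(img, patch=1, search=5, h=1.5):
    # Reference (slow) non-local means; the "fast" variant accelerates
    # the same weighted patch averaging.
    img = np.asarray(img, dtype=float)
    pad = patch + search
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pad, j + pad
            ref = p[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            wsum = acc = 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = p[ni - patch:ni + patch + 1, nj - patch:nj + patch + 1]
                    w = np.exp(-np.mean((ref - cand) ** 2) / (h * h))
                    wsum += w
                    acc += w * p[ni, nj]
            out[i, j] = acc / wsum
    return out

def median3(img):
    # 3x3 median filter in plain NumPy
    p = np.pad(img, 1, mode="reflect")
    shifts = [p[di:di + img.shape[0], dj:dj + img.shape[1]]
              for di in range(3) for dj in range(3)]
    return np.median(np.stack(shifts), axis=0)

def denoise(img):
    t = anscombe(img)                  # Poisson -> approx. Gaussian
    pre = nl_means(t)                  # pre-denoised image
    method_noise = t - pre             # noise plus lost fine detail
    details = median3(method_noise)    # smooth away the noise part
    return inv_anscombe(pre + details) # add recovered detail back

rng = np.random.default_rng(0)
noisy = rng.poisson(np.full((10, 10), 50.0)).astype(float)
denoised = denoise(noisy)
```

The add-back step is what distinguishes this framework from plain filtering: structures that the pre-denoising step removed are recovered from the method noise instead of being lost.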
In this paper, the introduction, objectives, materials and methods, results and discussion and conclusions are presented in detail.
Memory for the 2008 Presidential election in healthy aging and Mild Cognitive Impairment
Waring, Jill D.; Seiger, Ashley N.; Solomon, Paul R.; Budson, Andrew E.; Kensinger, Elizabeth A.
2014-01-01
Objective The present study examined memory accuracy and confidence for personal and public event details of the 2008 Presidential election in healthy older adults and those with Mild Cognitive Impairment (MCI). Method Participants completed phone interviews within a week after the election and after a 10-month delay. Results MCI patients and healthy older adults had comparable emotional reactions to learning the outcome of the election, with most people finding it to be a positive experience. After the delay period, details about the election were better remembered by all participants than a less emotionally arousing comparison event. However, MCI patients had more difficulty than healthy older adults correctly recalling details of public information about the election, although often the MCI patients could recognize the correct details. Conclusion This is the first study to show that MCI patients’ memory can benefit from emotionally arousing positive events, complementing the literature demonstrating similar effects for negative events. PMID:24533684
NASA Astrophysics Data System (ADS)
Zhou, Xiran; Liu, Jun; Liu, Shuguang; Cao, Lei; Zhou, Qiming; Huang, Huawen
2014-02-01
High spatial resolution and spectral fidelity are basic standards for evaluating an image fusion algorithm. Numerous fusion methods for remote sensing images have been developed. Some of these methods are based on the intensity-hue-saturation (IHS) transform and the generalized IHS (GIHS), which may cause serious spectral distortion. Spectral distortion in the GIHS is proven to result from changes in saturation during fusion. Therefore, reducing such changes can achieve high spectral fidelity. A GIHS-based spectral preservation fusion method that can theoretically reduce spectral distortion is proposed in this study. The proposed algorithm consists of two steps. The first step is spectral modulation (SM), which uses the Gaussian function to extract spatial details and conduct SM of multispectral (MS) images. This method yields a desirable visual effect without requiring histogram matching between the panchromatic image and the intensity of the MS image. The second step uses the Gaussian convolution function to restore lost edge details during SM. The proposed method is proven effective and shown to provide better results compared with other GIHS-based methods.
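For context, the classic GIHS injection that such methods build on fits in a few lines. This sketch shows only the baseline scheme (fused band = MS band plus the PAN-minus-intensity detail), not the spectral-modulation variant proposed in the paper; the array shapes are assumptions.

```python
import numpy as np

def gihs_fuse(ms, pan):
    """Classic GIHS pansharpening: inject PAN spatial detail into each
    multispectral band.
    ms:  (bands, H, W) multispectral image, upsampled to PAN size
    pan: (H, W) panchromatic image
    Returns the fused (bands, H, W) image."""
    intensity = ms.mean(axis=0)       # I = average of the MS bands
    detail = pan - intensity          # spatial detail to inject
    return ms + detail[None, :, :]    # same correction for every band
```

When the PAN image equals the MS intensity, the detail term vanishes and the MS image is returned unchanged; spectral distortion appears exactly when the injected detail shifts the band ratios (i.e., the saturation), which is the change the paper's spectral-modulation step is designed to reduce.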
Ageing airplane repair assessment program for Airbus A300
NASA Technical Reports Server (NTRS)
Gaillardon, J. M.; Schmidt, HANS-J.; Brandecker, B.
1992-01-01
This paper describes the current status of the repair categorization activities and includes all details about the methodologies developed for determining the inspection program for the skin of pressurized fuselages. For inspection threshold determination, two methods based on a fatigue-life approach are defined: a simplified and a detailed method. The detailed method considers 15 different parameters to assess the influences of material, geometry, size, location, aircraft usage, and workmanship on the fatigue life of the repair and the original structure. For definition of the inspection intervals, a general method is developed that applies to all concerned repairs. For this, the initial flaw concept is used, considering six parameters and the detectable flaw sizes for the proposed nondestructive inspection methods. An alternative method is provided for small repairs, allowing visual inspection at shorter intervals.
Demonstration of improved seismic source inversion method of tele-seismic body wave
NASA Astrophysics Data System (ADS)
Yagi, Y.; Okuwaki, R.
2017-12-01
Seismic rupture inversion of tele-seismic body waves has been widely applied to studies of large earthquakes. In general, tele-seismic body waves contain information on the overall rupture process of a large earthquake, but they have been considered inappropriate for analyzing the detailed rupture process of an M6-7 class earthquake. Recently, the quality and quantity of tele-seismic data and the inversion method have been greatly improved. The improved data and method enable us to study the detailed rupture process of an M6-7 class earthquake even if we use only tele-seismic body waves. In this study, we demonstrate the ability of the improved data and method through analyses of the 2016 Rieti, Italy earthquake (Mw 6.2) and the 2016 Kumamoto, Japan earthquake (Mw 7.0), which have been well investigated using InSAR data sets and field observations. We assumed the rupture to occur on a single fault plane model inferred from the moment tensor solutions and the aftershock distribution. We constructed spatiotemporal discretized slip-rate functions with patches arranged as closely as possible. We performed inversions using several fault models and found that the spatiotemporal location of the large slip-rate area was robust. In the 2016 Kumamoto, Japan earthquake, the slip-rate distribution shows that the rupture propagated to the southwest during the first 5 s. At 5 s after the origin time, the main rupture started to propagate toward the northeast. The first and second episodes correspond to rupture propagation along the Hinagu fault and the Futagawa fault, respectively. In the 2016 Rieti, Italy earthquake, the slip-rate distribution shows that the rupture propagated in the up-dip direction during the first 2 s, and then propagated toward the northwest. From both analyses, we propose that the spatiotemporal slip-rate distribution estimated by the improved inversion method of tele-seismic body waves has enough information to study the detailed rupture process of an M6-7 class earthquake.
Fusion and quality analysis for remote sensing images using contourlet transform
NASA Astrophysics Data System (ADS)
Choi, Yoonsuk; Sharifahmadian, Ershad; Latifi, Shahram
2013-05-01
Recent developments in remote sensing technologies have provided various images with high spatial and spectral resolutions. However, multispectral images have low spatial resolution and panchromatic images have low spectral resolution. Therefore, image fusion techniques are necessary to improve the spatial resolution of spectral images by injecting spatial details of high-resolution panchromatic images. The objective of image fusion is to provide useful information by improving the spatial resolution and the spectral information of the original images. The fusion results can be utilized in various applications, such as military, medical imaging, and remote sensing. This paper addresses two issues in image fusion: i) image fusion method and ii) quality analysis of fusion results. First, a new contourlet-based image fusion method is presented, which is an improvement over the wavelet-based fusion. This fusion method is then applied to a case study to demonstrate its fusion performance. Fusion framework and scheme used in the study are discussed in detail. Second, quality analysis for the fusion results is discussed. We employed various quality metrics in order to analyze the fusion results both spatially and spectrally. Our results indicate that the proposed contourlet-based fusion method performs better than the conventional wavelet-based fusion methods.
ERIC Educational Resources Information Center
Simon, Brian
1980-01-01
Presents some of the findings of the ORACLE research program (Observational Research and Classroom Learning Evaluation), a detailed observational study of teacher-student interaction, teaching styles, and management methods within a sample of primary classrooms. (Editor/SJL)
McGovern, Alice E; Mazzone, Stuart B
2014-12-01
Described in this unit are methods for establishing guinea pig models of asthma. Sufficient detail is provided to enable investigators to study bronchoconstriction, cough, airway hyperresponsiveness, inflammation, and remodeling. Copyright © 2014 John Wiley & Sons, Inc.
Method Development for Analysis of Aspirin Tablets.
ERIC Educational Resources Information Center
Street, Kenneth W., Jr.
1988-01-01
Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)
Gilbert H. Schubert
1963-01-01
The objective of the study is to obtain information on the growth and yield of even-aged stands of different stocking levels. The plot will also serve as the start of a detailed growing-stock study for the ponderosa pine region.
Comparison of Provider Types Who Performed Prehospital Lifesaving Interventions: A Prospective Study
2014-12-01
In less than 2 hours, 15 critically ill children were triaged and admitted to the PICU or surge spaces. Conclusions: Identified strengths included...details increasing telemedicine utilization during a 4 year period and outlines program structural changes that improved utilization. Methods: The study...population survival. CSC ICU resource-allocation algorithms (ALGs) exist for adults. Our goal was to evaluate a CSC pandemic ALG for children. Methods
NASA Astrophysics Data System (ADS)
Li, N.; Cheng, Y. M.
2015-01-01
Landslide is a major disaster resulting in considerable loss of human lives and property damages in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization are the main considerations for slope stability analysis in the past, while the detailed post-failure conditions of the slopes have not been considered in sufficient detail. There is however increasing interest in the consequences after the initiation of failure that includes the development and propagation of the failure surfaces, the amount of failed mass and runoff and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study about the development of failure, microcosmic failure mechanisms and the post-failure mechanisms of slopes will be carried out. The numerical modeling method and the various findings from the present work can provide an alternate method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.
Laboratory and 3-D-distinct element analysis of failure mechanism of slope under external surcharge
NASA Astrophysics Data System (ADS)
Li, N.; Cheng, Y. M.
2014-09-01
Landslide is a major disaster resulting in considerable loss of human lives and property damage in hilly terrain in Hong Kong, China and many other countries. The factor of safety and the critical slip surface for slope stabilization have been the main considerations for slope stability analysis in the past, while the detailed post-failure conditions of the slopes have not been considered in sufficient detail. There is, however, increasing interest in the consequences after the initiation of failure, which include the development and propagation of the failure surfaces, the amount of failed mass and runoff, and the affected region. To assess the development of slope failure in more detail and to consider the potential danger of slopes after failure has initiated, the slope stability problem under external surcharge is analyzed by the distinct element method (DEM) and a laboratory model test in the present research. A more refined study of the development of failure, the microcosmic failure mechanism and the post-failure mechanism of slopes is carried out. The numerical modeling method and the various findings from the present work can provide an alternate method of analysis of slope failure, which can give additional information not available from the classical methods of analysis.
Estimating survival of radio-tagged birds
Bunck, C.M.; Pollock, K.H.; Lebreton, J.-D.; North, P.M.
1993-01-01
Parametric and nonparametric methods for estimating survival of radio-tagged birds are described. The general assumptions of these methods are reviewed. An estimate based on the assumption of constant survival throughout the period is emphasized in the overview of parametric methods. Two nonparametric methods, the Kaplan-Meier estimate of the survival function and the log rank test, are explained in detail. The link between these nonparametric methods and traditional capture-recapture models is discussed along with considerations in designing studies that use telemetry techniques to estimate survival.
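The Kaplan-Meier product-limit estimate mentioned above is straightforward to compute. A minimal sketch follows, with invented radio-tracking data in which censoring corresponds to, for example, transmitter failure or loss of contact.

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times:  time of death or censoring for each animal
    events: 1 = death observed, 0 = censored (e.g. radio failure)
    Returns (distinct event times, S(t) just after each event time)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    s = 1.0
    out_t, out_s = [], []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                 # still alive and on the air
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk                  # S(t) = prod(1 - d_i / n_i)
        out_t.append(float(t))
        out_s.append(s)
    return out_t, out_s

# Five tagged birds: deaths at days 2, 3 and 5; censored at days 3 and 7
t, s = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0])
```

Here the survival curve drops only at observed deaths, while censored birds still count toward the number at risk up to their censoring time, which is what distinguishes this estimator from a naive proportion-surviving calculation.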
Interpolation of diffusion weighted imaging datasets.
Dyrby, Tim B; Lundell, Henrik; Burke, Mark W; Reislev, Nina L; Paulson, Olaf B; Ptito, Maurice; Siebner, Hartwig R
2014-12-01
Diffusion weighted imaging (DWI) is used to study white-matter fibre organisation, orientation and structural connectivity by means of fibre reconstruction algorithms and tractography. For clinical settings, limited scan time compromises the possibilities to achieve high image resolution for finer anatomical details and signal-to-noise-ratio for reliable fibre reconstruction. We assessed the potential benefits of interpolating DWI datasets to a higher image resolution before fibre reconstruction using a diffusion tensor model. Simulations of straight and curved crossing tracts smaller than or equal to the voxel size showed that conventional higher-order interpolation methods improved the geometrical representation of white-matter tracts with reduced partial-volume-effect (PVE), except at tract boundaries. Simulations and interpolation of ex-vivo monkey brain DWI datasets revealed that conventional interpolation methods fail to disentangle fine anatomical details if PVE is too pronounced in the original data. For validation, we used ex-vivo DWI datasets acquired at various image resolutions as well as Nissl-stained sections. Increasing the image resolution by a factor of eight yielded finer geometrical resolution and more anatomical details in complex regions such as tract boundaries and cortical layers, which are normally only visualized at higher image resolutions. Similar results were found with a typical clinical human DWI dataset. However, a possible bias in quantitative values imposed by the interpolation method used should be considered. The results indicate that conventional interpolation methods can be successfully applied to DWI datasets for mining anatomical details that are normally seen only at higher resolutions, which will aid in tractography and microstructural mapping of tissue compartments. Copyright © 2014. Published by Elsevier Inc.
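A conventional higher-order interpolation of the kind assessed here can be reproduced with standard tools. The sketch below applies SciPy's cubic-spline `zoom` to an invented volume; the factor of two and the random data are illustrative, not the paper's factor-of-eight ex-vivo datasets.

```python
import numpy as np
from scipy.ndimage import zoom

# Illustrative stand-in for one diffusion-weighted volume: a 3-D array
# of signal values (one array per diffusion direction in a real dataset).
rng = np.random.default_rng(42)
vol = rng.random((8, 8, 8))

# Cubic-spline (order-3) interpolation to double the image resolution
# in each dimension, as a conventional higher-order method would.
vol_hi = zoom(vol, zoom=2, order=3)

print(vol.shape, "->", vol_hi.shape)   # (8, 8, 8) -> (16, 16, 16)
```

In a real workflow each diffusion-weighted volume would be interpolated this way before tensor fitting; as the abstract cautions, the spline can bias quantitative values, so derived metrics should be checked against the original-resolution data.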
Teaching Mathematics in Seven Countries: Results from the TIMSS 1999 Video Study.
ERIC Educational Resources Information Center
Hiebert, James; Gallimore, Ronald; Garnier, Helen; Givvin, Karen Bogard; Hollingsworth, Hilary; Jacobs, Jennifer; Chui, Angel Miu-Ying; Wearne, Diana; Smith, Margaret; Kersting, Nicole; Manaster, Alfred; Tseng, Ellen; Etterbeek, Wallace; Manaster, Carl; Gonzales, Patrick; Stigler, James
This book reports teaching practices in mathematics in seven countries from the Third International Mathematics and Science Study (TIMSS) 1999 video study. A detailed description of the methods in the mathematics portion of the study is presented in an accompanying technical report from an international perspective. Contexts of the lessons, the…
NASA Technical Reports Server (NTRS)
1988-01-01
For the pressure fed engines, detailed trade studies were conducted defining engine features such as thrust vector control methods, thrust chamber construction, etc. This was followed by engine design layouts and booster propulsion configuration layouts. For the pump fed engines, parametric performance and weight data were generated for both O2/H2 and O2/RP-1 engines. Subsequent studies resulted in the selection of both LOX/RP-1 and O2/H2 propellants for the pump fed engines. More detailed analysis of the selected LOX/RP-1 and O2/H2 engines was conducted during the final phase of the study.
Chemometric analysis of soil pollution data using the Tucker N-way method.
Stanimirova, I; Zehl, K; Massart, D L; Vander Heyden, Y; Einax, J W
2006-06-01
N-way methods, particularly the Tucker method, are often the methods of choice when analyzing data sets arranged in three- (or higher) way arrays, which is the case for most environmental data sets. In the future, applying N-way methods will become an increasingly popular way to uncover hidden information in complex data sets. The reason for this is that classical two-way approaches such as principal component analysis are not as good at revealing the complex relationships present in data sets. This study describes in detail the application of a chemometric N-way approach, namely the Tucker method, in order to evaluate the level of pollution in soil from a contaminated site. The analyzed soil data set was five-way in nature. The samples were collected at different depths (way 1) from two locations (way 2) and the levels of thirteen metals (way 3) were analyzed using a four-step-sequential extraction procedure (way 4), allowing detailed information to be obtained about the bioavailability and activity of the different binding forms of the metals. Furthermore, the measurements were performed under two conditions (way 5), inert and non-inert. The preferred Tucker model of definite complexity showed that there was no significant difference in measurements analyzed under inert or non-inert conditions. It also allowed two depth horizons, characterized by different accumulation pathways, to be distinguished, and it allowed the relationships between chemical elements and their biological activities and mobilities in the soil to be described in detail.
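The mechanics of a Tucker decomposition can be illustrated with the non-iterative higher-order SVD (HOSVD), a common starting point for Tucker models. This is a minimal NumPy sketch that works for any number of ways by looping over modes; selecting reduced ranks and the "preferred model of definite complexity", as the authors do, is omitted.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: the mode-n fibres become the columns
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    # Multiply tensor T by matrix M along the given mode
    moved = np.tensordot(M, np.moveaxis(T, mode, 0), axes=(1, 0))
    return np.moveaxis(moved, 0, mode)

def hosvd(T):
    """Higher-order SVD: a (non-iterative) Tucker decomposition.
    Returns the core G and factor matrices U_n with T = G x_n U_n."""
    factors = []
    core = T
    for mode in range(T.ndim):
        # Factor matrix = left singular vectors of the mode-n unfolding
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U)
        core = mode_dot(core, U.T, mode)
    return core, factors

def reconstruct(core, factors):
    T = core
    for mode, U in enumerate(factors):
        T = mode_dot(T, U, mode)
    return T
```

With full ranks the reconstruction is exact; truncating the columns of each factor matrix (and the core accordingly) gives the compressed Tucker models whose complexity is compared when choosing a preferred model for data such as the five-way soil array described above.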
Harbicht, Andrew B.; Castro-Santos, Theodore R.; Ardren, William R.; Gorsky, Dimitry; Fraser, Dylan
2017-01-01
Radio-tag signals from fixed-position antennas are most often used to indicate presence or absence of individuals, or to estimate individual activity levels from signal strength variation within an antenna's detection zone. The potential of such systems to provide more precise information on tag location and movement has not been explored in great detail in an ecological setting. By reversing the roles that transmitters and receivers play in localization methods common to the telecommunications industry, we present a new telemetric tool for accurately estimating the location of tagged individuals from received signal strength values. The methods used to characterize the study area in terms of received signal strength are described, as is the random forest model used for localization. The resulting method is then validated using test data before being applied to true data collected from tagged individuals in the study site. Application of the localization method to test data withheld from the learning dataset indicated a low average error over the entire study area (<1 m), whereas application of the localization method to real data produced highly probable results consistent with field observations. This telemetric approach provided detailed movement data for tagged fish along a single axis (a migratory path) and is particularly useful for monitoring passage along migratory routes. The new methods applied in this study can also be expanded to include multiple axes (x, y, z) and multiple environments (aquatic and terrestrial) for remotely monitoring wildlife movement.
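The study trains a random forest on received-signal-strength (RSS) characterizations of the study area. As a simplified stand-in for that model, the sketch below localizes a tag along a single axis by comparing its RSS vector against fingerprints with an inverse-distance-weighted nearest-neighbour rule; all positions and signal values here are invented for illustration, and the nearest-neighbour rule is a named substitute for the paper's random forest.

```python
import math

# Hypothetical fingerprints: position along the migratory axis (m) mapped to
# received signal strength (dB) at two fixed antennas.
fingerprints = [
    (0.0, (-40.0, -80.0)),
    (5.0, (-50.0, -70.0)),
    (10.0, (-60.0, -60.0)),
    (15.0, (-70.0, -50.0)),
    (20.0, (-80.0, -40.0)),
]

def localize(rss, k=2):
    """Estimate position as the inverse-distance-weighted mean position of
    the k fingerprints nearest to the observed RSS vector."""
    nearest = sorted(fingerprints, key=lambda fp: math.dist(fp[1], rss))[:k]
    weights = [1.0 / (math.dist(fp[1], rss) + 1e-9) for fp in nearest]
    return sum(w * fp[0] for w, fp in zip(weights, nearest)) / sum(weights)

print(round(localize((-55.0, -65.0)), 3))  # 7.5, midway between the 5 m and 10 m fingerprints
```

A learned regressor such as a random forest generalizes this idea: it interpolates position from RSS patterns without requiring an explicit distance metric in signal space.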
Methodology Series Module 10: Qualitative Health Research
Setia, Maninder Singh
2017-01-01
Although quantitative designs are commonly used in clinical research, some studies require qualitative methods. These designs are different from quantitative methods; thus, researchers should be aware of data collection methods and analyses for qualitative research. Qualitative methods are particularly useful to understand patient experiences with the treatment or new methods of management or to explore issues in detail. These methods are useful in social and behavioral research. In qualitative research, often, the main focus is to understand the issue in detail rather than generalizability; thus, the sampling methods commonly used are purposive sampling; quota sampling; and snowball sampling (for hard to reach groups). Data can be collected using in-depth interviews (IDIs) or focus group discussions (FGDs). IDI is a one-to-one interview with the participant. FGD is a method of group interview or discussion, in which more than one participant is interviewed at the same time and is usually led by a facilitator. The commonly used methods for data analysis are: thematic analysis; grounded theory analysis; and framework analysis. Qualitative data collection and analysis require special expertise. Hence, if the reader plans to conduct qualitative research, they should team up with a qualitative researcher. PMID:28794545
Radiation costing methods: a systematic review
Rahman, F.; Seung, S.J.; Cheng, S.Y.; Saherawala, H.; Earle, C.C.; Mittmann, N.
2016-01-01
Objective Costs for radiation therapy (RT) and the methods used to cost RT are highly diverse across the literature. To date, no study has compared various costing methods in detail. Our objective was to perform a thorough review of the radiation costing literature to identify sources of costs and methods used. Methods A systematic review of Ovid MEDLINE, Ovid OLDMEDLINE, EMBASE, Ovid HealthStar, and EconLit from 2005 to 23 March 2015 used search terms such as “radiation,” “radiotherapy,” “neoplasm,” “cost,” “cost analysis,” and “cost benefit analysis” to locate relevant articles. Original papers were reviewed for detailed costing methods. Cost sources and methods were extracted for papers investigating RT modalities, including three-dimensional conformal RT (3D-CRT), intensity-modulated RT (IMRT), stereotactic body RT (SBRT), and brachytherapy (BT). All costs were translated into 2014 U.S. dollars. Results Most of the studies (91%) reported in the 33 articles retrieved provided RT costs from the health system perspective. The cost of RT ranged from US$2,687.87 to US$111,900.60 per treatment for IMRT, followed by US$5,583.28 to US$90,055 for 3D-CRT, US$10,544.22 to US$78,667.40 for BT, and US$6,520.58 to US$19,602.68 for SBRT. Cost drivers were professional or personnel costs and the cost of RT treatment. Most studies did not address the cost of RT equipment (85%) and institutional or facility costs (66%). Conclusions Costing methods and sources were widely variable across studies, highlighting the need for consistency in the reporting of RT costs. More work to promote comparability and consistency across studies is needed. PMID:27536189
1975-06-01
the Air Force Flight Dynamics Laboratory for use in conceptual and preliminary design phases of weapon system development. The methods are a...trade study method provides an iterative capability stemming from a direct interface with design synthesis programs. A detailed cost data base and...system for data expansion is provided. The methods are designed for ease in changing cost estimating relationships and estimating coefficients
Xiao, Yiling; McElheny, Dan; Hoshi, Minako; Ishii, Yoshitaka
2018-01-01
Intense efforts have been made to understand the molecular structures of misfolded amyloid β (Aβ) in order to gain insight into the pathological mechanism of Alzheimer's disease. Solid-state NMR spectroscopy (SSNMR) is considered a primary tool for elucidating the structures of insoluble and noncrystalline amyloid fibrils and other amyloid assemblies. In this chapter, we describe a detailed protocol to obtain the first atomic model of the 42-residue human Aβ peptide Aβ(1-42) in structurally homogeneous amyloid fibrils from our recent SSNMR study (Nat Struct Mol Biol 22:499-505, 2015). Despite great biological and clinical interest in Aβ(1-42) fibrils, their structural details have been long-elusive until this study. The protocol is divided into four sections. First, the solid-phase peptide synthesis (SPPS) and purification of monomeric Aβ(1-42) is described. We illustrate a controlled incubation method to prompt misfolding of Aβ(1-42) into homogeneous amyloid fibrils in an aqueous solution with fragmented Aβ(1-42) fibrils as seeds. Next, we detail analysis of Aβ(1-42) fibrils by SSNMR to obtain structural restraints. Finally, we describe methods to construct atomic models of Aβ(1-42) fibrils based on SSNMR results through two-stage molecular dynamics calculations.
Artali, Roberto; Botta, Mauro; Cavallotti, Camilla; Giovenzana, Giovanni B; Palmisano, Giovanni; Sisti, Massimo
2007-08-07
A novel pyridine-containing DTPA-like ligand, carrying additional hydroxymethyl groups on the pyridine side-arms, was synthesized in 5 steps. The corresponding Gd(III) complex, potentially useful as an MRI contrast agent, was prepared and characterized in detail by relaxometric methods and its structure modeled by computational methods.
Researching Children and Fashion: An Embodied Ethnography
ERIC Educational Resources Information Center
Pole, Christopher
2007-01-01
Child-centred research methods present a range of opportunities for the researcher to gather rich and detailed data on many aspects of the lives of children. This article examines the experience of using such methods in the context of a study of children as consumers of clothing and fashion. Its principal concern is with the application of an…
ERIC Educational Resources Information Center
Normand, Sebastien; Schneider, Barry H.; Lee, Matthew D.; Maisonneuve, Marie-France; Kuehn, Sally M.; Robaey, Philippe
2011-01-01
This multimethod study provides detailed information about the friendships of 87 children (76% boys) with ADHD and 46 comparison children aged 7-13 years. The methods included parent and teacher ratings, self-report measures and direct observation of friends' dyadic behaviors in three structured analogue tasks. Results indicated that, in contrast…
Connecting Generations: Developing Co-Design Methods for Older Adults and Children
ERIC Educational Resources Information Center
Xie, Bo; Druin, Allison; Fails, Jerry; Massey, Sheri; Golub, Evan; Franckel, Sonia; Schneider, Kiki
2012-01-01
As new technologies emerge that can bring older adults together with children, little has been discussed by researchers concerning the design methods used to create these new technologies. Giving both children and older adults a voice in a shared design process comes with many challenges. This paper details an exploratory study focusing on…
Analysis of Semiotic Principles in a Constructivist Learning Environment.
ERIC Educational Resources Information Center
Williams, Paul
To advance nuclear plant simulator training, the industry must focus on a more detailed and theoretical approach to conduct of this training. The use of semiotics is one method of refining the existing training and examining ways to diversify and blend it with new theoretical methods. Semiotics is the study of signs and how humans interpret them.…
DOT National Transportation Integrated Search
2011-07-01
This research focuses on finding a method for creating cost effective and innovative steel bridges in Colorado. The design method that was discovered to create this cost efficiency was designing the beams as simply supported for non-composite dead lo...
ERIC Educational Resources Information Center
Brown, Glyn; Scott-Little, Catherine; Amwake, Lynn; Wynn, Lucy
2007-01-01
The report provides detailed information about the methods and instruments used to evaluate school readiness initiatives, discusses important considerations in selecting instruments, and provides resources and recommendations that may be helpful to those who are designing and implementing school readiness evaluations. Study results indicate that…
ERIC Educational Resources Information Center
Chanda, Jacqueline; Basinger, Ashlee M.
2000-01-01
Describes a case study in which third-grade children (n=19) examined a series of images of Ndop statues and visual information from the Kuba people of the Democratic Republic of the Congo using art history constructivist inquiry methods. Presents the results in detail. Includes references. (CMK)
Integration of Evidence into a Detailed Clinical Model-based Electronic Nursing Record System
Park, Hyeoun-Ae; Jeon, Eunjoo; Chung, Eunja
2012-01-01
Objectives The purpose of this study was to test the feasibility of an electronic nursing record system for perinatal care that is based on detailed clinical models and clinical practice guidelines in perinatal care. Methods This study was carried out in five phases: 1) generating nursing statements using detailed clinical models; 2) identifying the relevant evidence; 3) linking nursing statements with the evidence; 4) developing a prototype electronic nursing record system based on detailed clinical models and clinical practice guidelines; and 5) evaluating the prototype system. Results We first generated 799 nursing statements describing nursing assessments, diagnoses, interventions, and outcomes using entities, attributes, and value sets of detailed clinical models for perinatal care which we developed in a previous study. We then extracted 506 recommendations from nine clinical practice guidelines and created sets of nursing statements to be used for nursing documentation by grouping nursing statements according to these recommendations. Finally, we developed and evaluated a prototype electronic nursing record system that can provide nurses with recommendations for nursing practice and sets of nursing statements based on the recommendations for guiding nursing documentation. Conclusions The prototype system was found to be sufficiently complete, relevant, useful, and applicable in terms of content, and easy to use and useful in terms of system user interface. This study has revealed the feasibility of developing such an ENR system. PMID:22844649
Kimball, Briant A.; Runkel, Robert L.; Gerner, Linda J.
2009-01-01
Land-management agencies are faced with decisions about remediation in streams affected by mine drainage. In support of the U. S. Forest Service, for the Uinta National Forest, the U.S. Geological Survey conducted mass-loading studies in American Fork and Mary Ellen Gulch, Utah. Synoptic samples were collected along a 10,000-meter study reach in American Fork and 4,500-meter reach in Mary Ellen Gulch. Tracer-injection methods were combined with synoptic sampling methods to evaluate discharge and mass loading. This data-series report gives the results of the chemical analyses of these samples and provides the equations used to calculate discharge from tracer concentrations and loads from discharge and concentrations of the constituents. The detailed information from these studies will facilitate the preparation of interpretive reports and discussions with stakeholder groups. Data presented include detailed locations of the sampling sites, results of chemical analyses, and graphs of mass-loading profiles for major and trace elements in American Fork and Mary Ellen Gulch. Ultrafiltration was used to define filtered concentrations and total-recoverable concentrations were measured on unfiltered samples.
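The discharge and load equations referred to above follow standard tracer-dilution gauging: a tracer injected at a known rate and concentration is diluted in proportion to stream discharge, and constituent load is discharge times concentration. The numbers below are hypothetical, not values from the American Fork or Mary Ellen Gulch data.

```python
# Tracer-dilution gauging: at steady state, the injected tracer mass flux is
# conserved, so stream discharge at a synoptic site follows by mass balance:
#     Q = q_inj * (C_inj - C_bg) / (C_site - C_bg)
# All numbers below are hypothetical, for illustration only.

q_inj = 1.9e-3     # tracer injection rate, L/s
C_inj = 70000.0    # tracer concentration in the injectate, mg/L
C_bg = 0.5         # natural background tracer concentration, mg/L
C_site = 4.5       # plateau tracer concentration at the synoptic site, mg/L

Q = q_inj * (C_inj - C_bg) / (C_site - C_bg)  # discharge, L/s

# Constituent load at the site is discharge times concentration.
zinc = 2.0                          # dissolved zinc at the site, mg/L
load = Q * zinc * 86400.0 / 1.0e6   # mg/s -> kg/day

print(round(Q, 1), round(load, 2))  # 33.2 L/s, 5.75 kg/day
```

Repeating this calculation at successive synoptic sites yields the mass-loading profile: jumps in load between sites localize the inflows contributing metals to the stream.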
Simplification of an MCNP model designed for dose rate estimation
NASA Astrophysics Data System (ADS)
Laptev, Alexander; Perry, Robert
2017-09-01
A study was made to investigate the methods of building a simplified MCNP model for radiological dose estimation. The research was done using an example of a complicated glovebox with extra shielding. The paper presents several different calculations for neutron and photon dose evaluations where glovebox elements were consecutively excluded from the MCNP model. The analysis indicated that to obtain a fast and reasonable estimation of dose, the model should be realistic in details that are close to the tally. Other details may be omitted.
Using Toulmin analysis to analyse an instructor's proof presentation in abstract algebra
NASA Astrophysics Data System (ADS)
Fukawa-Connelly, Timothy
2014-01-01
This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of written models for their work. Similarly, the analysis shows that the details the instructor says aloud differ from what she writes down. Although her verbal commentary provides additional detail and appears to have pedagogical value, for instance, by modelling thinking that supports proof writing, this value might be better realized if she were to change her teaching practices.
Geological analysis of parts of the southern Arabian Shield based on Landsat imagery
NASA Astrophysics Data System (ADS)
Qari, Mohammed Yousef Hedaytullah T.
This thesis examines the capability and applicability of Landsat multispectral remote sensing data for geological analysis in the arid southern Arabian Shield, which is the eastern segment of the Nubian-Arabian Shield surrounding the Red Sea. The major lithologies in the study area are Proterozoic metavolcanics, metasediments, gneisses and granites. Three test-sites within the study area, located within two tectonic assemblages, the Asir Terrane and the Nabitah Mobile Belt, were selected for detailed comparison of remote sensing methods and ground geological studies. Selected digital image processing techniques were applied to full-resolution Landsat TM imagery and the results are interpreted and discussed. Methods included: image contrast improvement, edge enhancement for detecting lineaments and spectral enhancement for geological mapping. The last method was based on two principles, statistical analysis of the data and the use of arithmetical operators. New and detailed lithological and structural maps were constructed and compared with previous maps of these sites. Examples of geological relations identified using TM imagery include: recognition and mapping of migmatites for the first time in the Arabian Shield; location of the contact between the Asir Terrane and the Nabitah Mobile Belt; and mapping of lithologies, some of which were not identified on previous geological maps. These and other geological features were confirmed by field checking. Methods of lineament enhancement implemented in this study revealed structural lineaments, mostly mapped for the first time, which can be related to regional tectonics. Structural analysis showed that the southern Arabian Shield has been affected by at least three successive phases of deformation. The third phase is the most dominant and widespread. 
A crustal evolutionary model in the vicinity of the study area is presented showing four stages: an arc stage, an accretion stage, a collision stage and a post-collision stage. The results of this study demonstrate that Landsat TM data can be used reliably for geological investigations in the Arabian Shield and comparable areas, particularly to generate detailed geological maps over large areas by using quantitative remote sensing methods, provided there is prior knowledge of part of the area.
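Edge enhancement for lineament detection, one of the processing methods mentioned above, is commonly done with gradient filters. The sketch below applies 3x3 Sobel kernels to a synthetic brightness step as a stand-in for a TM band; it illustrates the general technique, not the thesis's specific processing chain.

```python
import numpy as np

def sobel_magnitude(band):
    """Gradient magnitude from 3x3 Sobel kernels; large values flag abrupt
    brightness changes such as lineaments or lithological contacts."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = band.shape
    pad = np.pad(band.astype(float), 1, mode="edge")
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = pad[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

# Synthetic "band": a vertical brightness step standing in for a lineament.
band = np.zeros((5, 6))
band[:, 3:] = 100.0
edges = sobel_magnitude(band)
print(edges[2])  # response concentrated at columns 2-3, the step location
```

In practice the enhanced image is thresholded or visually interpreted, and the resulting lineaments are compared with field structural measurements, as done in this study.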
ERIC Educational Resources Information Center
Greenhill, Laurence L.; Vitiello, Benedetto; Fisher, Prudence; Levine, Jerome; Davies, Mark; Abikoff, Howard; Chrisman, Allan K.; Chuang, Shirley; Findling, Robert L.; March, John; Scahill, Lawrence; Walkup, John; Riddle, Mark A.
2004-01-01
Objective: To improve the gathering of adverse events (AEs) in pediatric psychopharmacology by examining the value and acceptability of increasingly detailed elicitation methods. Method: Trained clinicians administered the Safety Monitoring Uniform Report Form (SMURF) to 59 parents and outpatients (mean age [+ or -] SD = 11.9 [+ or -] 3.2 years)…
A feasibility study of methods for stopping the depletion of ozone over Antarctica
NASA Technical Reports Server (NTRS)
1988-01-01
Ways of stopping the ozone depletion in the ozone hole over Antarctica were studied. The basic objectives were: (1) to define and understand the phenomenon of the ozone hole; (2) to determine possible methods of stopping the ozone depletion; (3) to identify unknowns about the hole and possible solutions. Two basic ways of attacking the problem were identified. First is replenishment of ozone as it is being depleted. Second is elimination of ozone destroying agents from the atmosphere. The second method is a more permanent form of the solution. Elimination and replenishment methods are discussed in detail.
ERIC Educational Resources Information Center
Strouse, Lewis H.
2009-01-01
Before rehearsals begin, conductors need to thoroughly study the score. What elements go into a comprehensive score preparation? To learn music scores efficiently, having a detailed and systematic study method helps. The author has developed a score preparation guide that works for directors of bands, choruses, and orchestras, even when there's…
Using Aquatic Insects as Indicators of Water Quality
ERIC Educational Resources Information Center
Dyche, Steven E.
1977-01-01
Described is a science field activity that studies the presence of certain aquatic insects, like stoneflies, as indicators of water quality. Equipment, materials, and methods are listed in detail, including suggestions for building certain supplies. Results of previous studies on the Yellowstone River are included. (MA)
DEVELOPMENT OF A SMALL CHAMBER METHOD FOR SVOC SINK EFFECT STUDY
This paper describes the details of the improved chamber system and reports the sink effect study for organophosphorus flame retardants (OP-FRs), including tris(2-chloroethyl) phosphate (TCEP), tris(1-chloro-2-propyl) phosphate (TCPP) and tris(1,3-dichloro-2-propyl) phosphate (TDC...
Cesarean section using the Misgav Ladach method.
Federici, D; Lacelli, B; Muggiasca, L; Agarossi, A; Cipolla, L; Conti, M
1997-06-01
To stress the advantages of the Misgav Ladach method for cesarean section. In this study, operative details and the postoperative course of 139 patients who underwent cesarean section according to the Misgav Ladach method in 1995-96 are presented. The Misgav Ladach method reduces operation time, time of child delivery, and time of recovery. The rates of febrile morbidity, wound infection and wound dehiscence are not affected by the new technique. Our study highlights the efficiency and safety of the Misgav Ladach method, and points out the speedier recovery, with early ambulation and resumption of drinking and eating, that brings cesarean delivery ever closer to natural childbirth.
Application of the spectral-correlation method for diagnostics of cellulose paper
NASA Astrophysics Data System (ADS)
Kiesewetter, D.; Malyugin, V.; Reznik, A.; Yudin, A.; Zhuravleva, N.
2017-11-01
The spectral-correlation method was described for diagnostics of optically inhomogeneous biological objects and materials of natural origin. The interrelation between parameters of the studied objects and parameters of the cross correlation function of speckle patterns produced by scattering of coherent light at different wavelengths is shown for thickness, optical density and internal structure of the material. A detailed study was performed for cellulose electric insulating paper with different parameters.
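The cross-correlation of speckle patterns at the heart of this method can be sketched as a normalized correlation coefficient between two intensity images. The synthetic patterns below are illustrative; real diagnostics would correlate speckle fields recorded at different wavelengths, with the decorrelation rate reflecting the paper's thickness, optical density and internal structure.

```python
import numpy as np

def norm_xcorr(a, b):
    """Normalized cross-correlation coefficient of two speckle patterns."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

rng = np.random.default_rng(1)
base = rng.random((64, 64))   # speckle pattern at wavelength 1
noise = rng.random((64, 64))

# A structurally similar pattern (e.g., a nearby wavelength) decorrelates
# only partially, while an unrelated pattern decorrelates almost completely.
similar = 0.8 * base + 0.2 * noise
unrelated = rng.random((64, 64))

print(round(norm_xcorr(base, similar), 2))    # high, near 1
print(round(norm_xcorr(base, unrelated), 2))  # near 0
```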
Design and Analysis of a Subcritical Airfoil for High Altitude, Long Endurance Missions.
1982-12-01
Airfoil Design and Analysis Method; Appendix D: Boundary Layer Analysis Method; Appendix E: Detailed Results of...attack. Computer codes designed by Richard Eppler were used for this study. The airfoil was analyzed by using a viscous effects analysis program...inverse program designed by Eppler (Ref 5) was used in this study to accomplish this part. The second step involved the analysis of the airfoil under
Recent Progress in the p and h-p Version of the Finite Element Method.
1987-07-01
code PROBE which was developed recently by NOETIC Technologies, St. Louis [54]. PROBE solves two-dimensional problems of linear elasticity, stationary...of the finite element method was studied in detail from various points of view. We will mention here some essential illustrative results. In one...[28] Bathe, K. J., Brezzi, F., Studies of finite element procedures - the INF-SUP condition, equivalent forms and applications, in Reliability of
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1992-01-01
Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.
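The multigrid concept being implemented in Proteus can be illustrated with a minimal 1-D example: a V-cycle for the Poisson equation with damped-Jacobi smoothing, full-weighting restriction, and linear interpolation. This is a generic sketch of the technique under those standard choices, not the Proteus implementation.

```python
import numpy as np

def smooth(u, f, h, iters=3, w=2.0 / 3.0):
    """Damped-Jacobi sweeps for -u'' = f with u(0) = u(1) = 0."""
    for _ in range(iters):
        u[1:-1] += w * (0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]) - u[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2.0 * u[1:-1] + u[2:]) / (h * h)
    return r

def restrict(r):
    """Full-weighting restriction onto a grid with half the intervals."""
    rc = np.zeros((len(r) - 1) // 2 + 1)
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    return rc

def prolong(ec, n_fine):
    """Linear interpolation of a coarse-grid correction to the fine grid."""
    e = np.zeros(n_fine)
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    return e

def v_cycle(u, f, h):
    u = smooth(u, f, h)                      # pre-smoothing
    if len(u) > 3:
        rc = restrict(residual(u, f, h))     # coarse-grid residual equation
        ec = v_cycle(np.zeros_like(rc), rc, 2.0 * h)
        u += prolong(ec, len(u))             # coarse-grid correction
    return smooth(u, f, h)                   # post-smoothing

# Model problem: -u'' = pi^2 sin(pi x), exact solution u = sin(pi x).
n = 64
x = np.linspace(0.0, 1.0, n + 1)
f = np.pi ** 2 * np.sin(np.pi * x)
u = np.zeros(n + 1)
for _ in range(10):
    u = v_cycle(u, f, 1.0 / n)
err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err)  # algebraic error driven below the O(h^2) discretization error
```

The acceleration comes from the coarse grids damping the smooth error components that relaxation alone removes slowly; the same idea carries over to the Euler and Navier-Stokes solvers discussed in the abstract.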
Concordance Between Current Job and Usual Job in Occupational and Industry Groupings
Luckhaupt, Sara E.; Cohen, Martha A.; Calvert, Geoffrey M.
2015-01-01
Objective To determine whether current job is a reasonable surrogate for usual job. Methods Data from the 2010 National Health Interview Survey were utilized to determine concordance between current and usual jobs for workers employed within the past year. Concordance was quantitated by kappa values for both simple and detailed industry and occupational groups. Good agreement is considered to be present when kappa values exceed 60. Results Overall kappa values ± standard errors were 74.5 ± 0.5 for simple industry, 72.4 ± 0.5 for detailed industry, 76.3 ± 0.4 for simple occupation, 73.7 ± 0.5 for detailed occupation, and 80.4 ± 0.6 for very broad occupational class. Sixty-five of 73 detailed industry groups and 78 of 81 detailed occupation groups evaluated had good agreement between current and usual jobs. Conclusions Current job can often serve as a reliable surrogate for usual job in epidemiologic studies. PMID:23969506
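The kappa statistic used to quantify concordance is Cohen's kappa: observed agreement corrected for the agreement expected by chance, here scaled by 100 to match the values reported above. The job codes below are invented for illustration.

```python
from collections import Counter

def cohens_kappa(pairs):
    """Cohen's kappa (x100, as reported in the study): observed agreement
    between two codings, corrected for agreement expected by chance."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n                     # observed
    current = Counter(a for a, _ in pairs)
    usual = Counter(b for _, b in pairs)
    pe = sum(current[c] * usual[c] for c in current) / n ** 2  # chance
    return 100.0 * (po - pe) / (1.0 - pe)

# Hypothetical current-job vs. usual-job industry codes for ten workers.
pairs = [("mining", "mining"), ("retail", "retail"), ("health", "health"),
         ("mining", "mining"), ("retail", "health"), ("health", "health"),
         ("retail", "retail"), ("mining", "retail"), ("health", "health"),
         ("retail", "retail")]
print(round(cohens_kappa(pairs), 1))  # 69.7: "good agreement" (> 60)
```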
A review of the reporting of web searching to identify studies for Cochrane systematic reviews.
Briscoe, Simon
2018-03-01
The literature searches that are used to identify studies for inclusion in a systematic review should be comprehensively reported. This ensures that the literature searches are transparent and reproducible, which is important for assessing the strengths and weaknesses of a systematic review and re-running the literature searches when conducting an update review. Web searching using search engines and the websites of topically relevant organisations is sometimes used as a supplementary literature search method. Previous research has shown that the reporting of web searching in systematic reviews often lacks important details and is thus not transparent or reproducible. Useful details to report about web searching include the name of the search engine or website, the URL, the date searched, the search strategy, and the number of results. This study reviews the reporting of web searching to identify studies for Cochrane systematic reviews published in the 6-month period August 2016 to January 2017 (n = 423). Of these reviews, 61 reviews reported using web searching using a search engine or website as a literature search method. In the majority of reviews, the reporting of web searching was found to lack essential detail for ensuring transparency and reproducibility, such as the search terms. Recommendations are made on how to improve the reporting of web searching in Cochrane systematic reviews. Copyright © 2017 John Wiley & Sons, Ltd.
Practical Approaches to Forced Degradation Studies of Vaccines.
Hasija, Manvi; Aboutorabian, Sepideh; Rahman, Nausheen; Ausar, Salvador F
2016-01-01
During the early stages of vaccine development, forced degradation studies are conducted to provide information about the degradation properties of vaccine formulations. In addition to supporting the development of analytical methods for the detection of degradation products, these stress studies are used to identify optimal long-term storage conditions and are part of the regulatory requirements for the submission of stability data. In this chapter, we provide detailed methods for forced degradation analysis under thermal, light, and mechanical stress conditions.
ERIC Educational Resources Information Center
Li, Winnie Sim Siew; Arshad, Mohammad Yusof
2015-01-01
Purpose: Inquiry teaching has been suggested as one of the important approaches in teaching chemistry. This study investigates the inquiry practices among chemistry teachers. Method: A combination of quantitative and qualitative study was applied in this study to provide detailed information about inquiry teaching practices. Questionnaires,…
Experimental aspect of solid-state nuclear magnetic resonance studies of biomaterials such as bones.
Singh, Chandan; Rai, Ratan Kumar; Sinha, Neeraj
2013-01-01
Solid-state nuclear magnetic resonance (SSNMR) spectroscopy is increasingly becoming a popular technique to probe micro-structural details of biomaterials such as bone with picometer resolution. Because of the high-resolution structural details probed by SSNMR methods, the handling of bone samples and the experimental protocol are crucial aspects of any study. We present here the first report of the effect of various experimental protocols and handling methods of bone samples on measured SSNMR parameters. Various popular SSNMR experiments were performed on an intact cortical bone sample collected from a fresh animal, immediately after removal, and the results were compared with bone samples preserved in different conditions. We find that the best experimental conditions for SSNMR parameters of bones correspond to preservation at -20 °C and in 70% ethanol solution. Various other SSNMR parameters were compared corresponding to different experimental conditions. Our study has helped in finding the best experimental protocol for SSNMR studies of bone, and will be of further help in applying SSNMR to large animal model systems of bone disease for statistically significant results. © 2013 Elsevier Inc. All rights reserved.
Land cover mapping at sub-pixel scales
NASA Astrophysics Data System (ADS)
Makido, Yasuyo Kato
One of the biggest drawbacks of land cover mapping from remotely sensed images relates to spatial resolution, which determines the level of spatial detail depicted in an image. Fine spatial resolution images from satellite sensors such as IKONOS and QuickBird are now available. However, these images are not suitable for large-area studies, since a single image covers only a small area and such studies therefore become costly. Much research has focused on attempting to extract land cover types at sub-pixel scale, and little research has been conducted concerning the spatial allocation of land cover types within a pixel. This study is devoted to the development of new algorithms for predicting land cover distribution using remotely sensed imagery at the sub-pixel level. The "pixel-swapping" optimization algorithm, which was proposed by Atkinson for predicting sub-pixel land cover distribution, is investigated in this study. Two limitations of this method, the arbitrary spatial range value and the arbitrary exponential model of spatial autocorrelation, are assessed. Various weighting functions, as alternatives to the exponential model, are evaluated in order to derive the optimum weighting function. Two different simulation models were employed to develop spatially autocorrelated binary class maps. In all tested models (Gaussian, exponential, and IDW), the pixel-swapping method improved classification accuracy compared with the initial random allocation of sub-pixels. However, the results suggested that equal weights could be used to increase accuracy and sub-pixel spatial autocorrelation instead of these more complex models of spatial structure. New algorithms for modeling the spatial distribution of multiple land cover classes at sub-pixel scales are developed and evaluated. Three methods are examined: sequential categorical swapping, simultaneous categorical swapping, and simulated annealing.
These three methods are applied to classified Landsat ETM+ data that have been resampled to 210 meters. The results suggested that the simultaneous method can be considered the optimum method in terms of accuracy and computation time. The case study employs remote sensing imagery at two sites: tropical forests in Brazil and a temperate mixed land-cover mosaic in East China. Sub-areas of both sites are used to examine how the characteristics of the landscape affect the performance of the optimum technique. Three measures, Moran's I, mean patch size (MPS), and patch size standard deviation (STDEV), are used to characterize the landscape. All results suggested that this technique can increase classification accuracy more than traditional hard classification. The methods developed in this study can benefit researchers who employ coarse remote sensing imagery but are interested in detailed landscape information. In many cases, a satellite sensor that provides large spatial coverage has insufficient spatial detail to identify landscape patterns. Application of the super-resolution technique described in this dissertation could potentially solve this problem by providing detailed land cover predictions from coarse-resolution satellite sensor imagery.
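As a rough illustration of the pixel-swapping idea described above, the following sketch implements one minimal variant on a binary sub-pixel map; the neighbourhood radius, decay constant, iteration count, and swap rule are illustrative assumptions, not the dissertation's exact algorithm.

```python
import numpy as np

def pixel_swap(alloc, coarse, n_iter=20, radius=2, decay=1.0):
    """Minimal Atkinson-style pixel swapping on a binary sub-pixel map.

    alloc  : 2-D 0/1 array of sub-pixels (initial random allocation).
    coarse : side length, in sub-pixels, of one coarse pixel; swaps stay
             inside a coarse pixel so class proportions are preserved.
    """
    H, W = alloc.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w = np.exp(-np.hypot(ys, xs) / decay)   # exponential distance weighting
    w[radius, radius] = 0.0                 # a cell does not attract itself
    for _ in range(n_iter):
        padded = np.pad(alloc, radius)
        # Attractiveness = distance-weighted count of '1' neighbours.
        attract = np.zeros((H, W))
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                attract += (w[dy + radius, dx + radius]
                            * padded[radius + dy:radius + dy + H,
                                     radius + dx:radius + dx + W])
        for by in range(0, H, coarse):
            for bx in range(0, W, coarse):
                blk = alloc[by:by + coarse, bx:bx + coarse]   # view into alloc
                att = attract[by:by + coarse, bx:bx + coarse].ravel()
                ones = np.flatnonzero(blk.ravel() == 1)
                zeros = np.flatnonzero(blk.ravel() == 0)
                if ones.size == 0 or zeros.size == 0:
                    continue
                worst_one = ones[np.argmin(att[ones])]
                best_zero = zeros[np.argmax(att[zeros])]
                if att[best_zero] > att[worst_one]:  # swap improves clustering
                    blk[divmod(worst_one, blk.shape[1])] = 0
                    blk[divmod(best_zero, blk.shape[1])] = 1
    return alloc
```

Because swaps are confined to a single coarse pixel, the class proportions given by the coarse map are preserved while spatial autocorrelation among sub-pixels increases.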
Malchaire, J B
2004-08-01
The first section of the document describes a risk-prevention strategy, called SOBANE, organized in four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost effective, and better at coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. The four levels are: screening, where risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and their causes and solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements and develop specific solutions; and expertise, where, in rare and very sophisticated cases, an expert is called upon to solve a particular problem. The method for the participatory screening of risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first level, screening, of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the ease, effectiveness and satisfaction at work are discussed, in search of practical prevention measures. The points to be studied in more detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. The method proves to be simple and economical in time and means, and plays a significant role in developing a dynamic plan of risk management and a culture of dialogue in the company.
Study of effects of injector geometry on fuel-air mixing and combustion
NASA Technical Reports Server (NTRS)
Bangert, L. H.; Roach, R. L.
1977-01-01
An implicit finite-difference method has been developed for computing the flow in the near field of a fuel injector as part of a broader study of the effects of fuel injector geometry on fuel-air mixing and combustion. Detailed numerical results have been obtained for cases of laminar and turbulent flow without base injection, corresponding to the supersonic base flow problem. These numerical results indicated that the method is stable and convergent, and that significant savings in computer time can be achieved, compared with explicit methods.
Empirical Force Fields for Mechanistic Studies of Chemical Reactions in Proteins.
Das, A K; Meuwly, M
2016-01-01
Following chemical reactions in atomistic detail is one of the most challenging aspects of current computational approaches to chemistry. In this chapter the application of adiabatic reactive MD (ARMD) and its multistate version (MS-ARMD) is discussed. Both methods make it possible to study bond-breaking and bond-forming processes in chemical and biological systems. Particular emphasis is put on practical aspects of applying the methods to investigate the dynamics of chemical reactions. The chapter closes with an outlook on possible generalizations of the methods discussed. © 2016 Elsevier Inc. All rights reserved.
Van’t Hoff global analyses of variable temperature isothermal titration calorimetry data
Freiburger, Lee A.; Auclair, Karine; Mittermaier, Anthony K.
2016-01-01
Isothermal titration calorimetry (ITC) can provide detailed information on the thermodynamics of biomolecular interactions in the form of equilibrium constants, KA, and enthalpy changes, ΔHA. A powerful application of this technique involves analyzing the temperature dependences of ITC-derived KA and ΔHA values to gain insight into thermodynamic linkage between binding and additional equilibria, such as protein folding. We recently developed a general method for global analysis of variable temperature ITC data that significantly improves the accuracy of extracted thermodynamic parameters and requires no prior knowledge of the coupled equilibria. Here we report detailed validation of this method using Monte Carlo simulations and an application to study coupled folding and binding in an aminoglycoside acetyltransferase enzyme. PMID:28018008
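The underlying van't Hoff analysis can be sketched with synthetic numbers. The study's actual global method also handles temperature-dependent ΔH and coupled equilibria; this minimal example assumes a constant ΔH and ΔS, which makes ln K_A linear in 1/T.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Synthetic "ITC-derived" association constants at five temperatures,
# generated from assumed values dH = -50 kJ/mol, dS = -80 J/(mol K).
dH_true, dS_true = -50e3, -80.0
T = np.array([278.0, 288.0, 298.0, 308.0, 318.0])   # K
lnK = -dH_true / (R * T) + dS_true / R              # van't Hoff relation

# Van't Hoff fit: ln K_A vs 1/T has slope -dH/R and intercept dS/R.
slope, intercept = np.polyfit(1.0 / T, lnK, 1)
dH_fit, dS_fit = -slope * R, intercept * R
```

Curvature in a real ln K_A vs 1/T plot (nonzero ΔCp, or linkage to folding) is exactly what the global analysis described above is designed to capture.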
NASA Astrophysics Data System (ADS)
Fortin, W.; Holbrook, W. S.; Mallick, S.; Everson, E. D.; Tobin, H. J.; Keranen, K. M.
2014-12-01
Understanding the geologic composition of the Cascadia Subduction Zone (CSZ) is critically important in assessing seismic hazards in the Pacific Northwest. Despite being a potential earthquake and tsunami threat to millions of people, key details of the structure and fault mechanisms of the CSZ remain poorly understood. In particular, the position and character of the subduction interface remain elusive due to its relative aseismicity and low seismic reflectivity, which make imaging difficult for both passive and active source methods. Modern active-source reflection seismic data acquired as part of the COAST project in 2012 provide an opportunity to study the transition from the Cascadia basin, across the deformation front, and into the accretionary prism. Coupled with advances in seismic inversion methods, these new data allow us to produce detailed velocity models of the CSZ and accurate pre-stack depth migrations for studying geologic structure. While still computationally expensive, current computing clusters can perform seismic inversions at resolutions that match that of the seismic image itself. Here we present pre-stack full waveform inversions of the central seismic line of the COAST survey offshore Washington state. The resultant velocity model is produced by inversion at every CMP location, 6.25 m laterally, with a vertical resolution of 0.2 times the dominant seismic wavelength. We report a good average correlation value, above 0.8 across the entire seismic line, determined by comparing synthetic gathers to the real pre-stack gathers. These detailed velocity models, both Vp and Vs, along with the density model, are a necessary step toward a detailed porosity cross section to be used to determine the role of fluids in the CSZ. Additionally, the P-velocity model is used to produce a pre-stack depth migration image of the CSZ.
An Overview of Recent Studies in Junior College Electrical-Electronics Curriculum.
ERIC Educational Resources Information Center
Williams, Harry E.
This brief paper discusses in some detail several recent studies of electrical-electronics curricula at the junior college level and presents a 22-item bibliography of studies in this area. Comments based on the findings of the studies include: (1) A method of instruction, particularly applicable to the field of electronics education, seems to be…
NASA Astrophysics Data System (ADS)
Krzyżek, Robert; Przewięźlikowska, Anna
2017-12-01
When surveying the corners of building structures, surveyors frequently combine two surveying methods. The first involves determining several corners with reference to a geodetic control, using classical methods of surveying field details. The second determines the remaining corner points of a structure in sequence by distance-distance intersection, using control linear measurements along the wall faces of the building, the so-called tie distances. This paper assesses the accuracy of the coordinates of building corner points determined by distance-distance intersection, based on corners previously determined by surveys tied to a geodetic control. It should be noted, however, that this method of surveying building corners from linear measures relies on details of first-order accuracy, while the regulations explicitly allow such measurement only for details of second- and third-order accuracy. A question therefore arises: is this legal provision unfounded, or are surveyors acting not only against the applicable standards but also without due diligence when performing such surveys? This study provides answers to this question. The main purpose of the study was to verify whether the method actually used in practice for surveying building structures yields the required accuracy of the coordinates of the determined points, or whether it should be strictly forbidden. The results of the conducted studies clearly demonstrate that the problem is definitely more complex.
Ultimately, however, it may be concluded that the accuracy of determining the location of building corners using a combination of two different surveying methods meets the requirements of the regulation [MIA, 2011], subject to compliance with the relevant baseline criteria presented in this study. Observance of the proposed boundary conditions would allow surveyors to routinely survey building structures from tie distances while maintaining the applicable accuracy criteria. This would in turn allow surveying documentation to be included in the national geodetic and cartographic documentation center database pursuant to the legal bases.
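Geometrically, the distance-distance intersection described above is a circle-circle intersection: the unknown corner lies at one of the two points whose distances to two known corners equal the measured tie distances. A minimal sketch, with illustrative coordinates and distances:

```python
import math

def distance_distance_intersection(p1, r1, p2, r2):
    """Return the two candidate positions of a point given its tie
    distances r1, r2 from two known points p1, p2 (2-D coordinates)."""
    x1, y1 = p1
    x2, y2 = p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        raise ValueError("circles do not intersect")
    a = (r1**2 - r2**2 + d**2) / (2 * d)   # distance from p1 to chord foot
    h = math.sqrt(max(r1**2 - a**2, 0.0))  # half length of the chord
    mx = x1 + a * (x2 - x1) / d            # foot of the chord on the baseline
    my = y1 + a * (y2 - y1) / d
    ux, uy = -(y2 - y1) / d, (x2 - x1) / d  # unit normal to the baseline
    return (mx + h * ux, my + h * uy), (mx - h * ux, my - h * uy)
```

The surveyor selects whichever candidate lies on the correct side of the wall face.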
High-quality observation of surface imperviousness for urban runoff modelling using UAV imagery
NASA Astrophysics Data System (ADS)
Tokarczyk, P.; Leitao, J. P.; Rieckermann, J.; Schindler, K.; Blumensaat, F.
2015-10-01
Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and increasing urbanization. These models typically rely on high-quality data on rainfall and on the surface characteristics of the catchment area as model input. While recent research in urban drainage has focused on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods is growing because, in many parts of the globe, accurate land-use information is lacking and detailed image data are often unavailable. Modern unmanned aerial vehicles (UAVs) make it possible to acquire high-resolution images at the local level at comparably low cost, to perform on-demand repeat surveys, and to obtain a degree of detail tailored to the purpose of the study. In this study, we investigate for the first time the possibility of deriving high-resolution imperviousness maps for urban areas from UAV imagery and of using this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is proposed and evaluated in a state-of-the-art urban drainage modelling exercise. In a real-life case study (Lucerne, Switzerland), we compare imperviousness maps generated using a fixed-wing consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their overall accuracy, we perform an end-to-end comparison in which they are used as input for an urban drainage model. We then evaluate the influence that the different image data sources and their processing methods have on hydrological and hydraulic model performance.
We analyse the surface runoff of the 307 individual subcatchments with respect to relevant attributes, such as peak runoff and runoff volume. Finally, we evaluate the model's channel flow prediction performance through a cross-comparison with reference flow measured at the catchment outlet. We show that imperviousness maps generated from UAV images processed with modern classification methods achieve an accuracy comparable to standard, off-the-shelf aerial imagery. In the examined case study, we find that the different imperviousness maps have only a limited influence on predicted surface runoff and pipe flows when traditional workflows are used. We expect that they will have a substantial influence when more detailed modelling approaches are employed to characterize land use and to predict surface runoff. We conclude that UAV imagery represents a valuable alternative data source for urban drainage model applications, owing to the possibility of flexibly acquiring up-to-date aerial images at a quality comparable to off-the-shelf image products and at a competitive price. We believe that in the future, urban drainage models representing a higher degree of spatial detail will fully benefit from the strengths of UAV imagery.
Gagani, Abedin I.; Echtermeyer, Andreas T.
2018-01-01
Monitoring water content and predicting the water-induced drop in strength of fiber-reinforced composites are of great importance for the oil and gas and marine industries. Fourier transform infrared (FTIR) spectroscopic methods are broadly available and often used for process and quality control in industrial applications. A benefit of using such spectroscopic methods over the conventional gravimetric analysis is the possibility to deduce the mass of an absolutely dry material and subsequently the true water content, which is an important indicator of water content-dependent properties. The objective of this study is to develop an efficient and detailed method for estimating the water content in epoxy resins and fiber-reinforced composites. In this study, Fourier transform near-infrared (FT-NIR) spectroscopy was applied to measure the water content of amine-epoxy neat resin. The method was developed and successfully extended to glass fiber-reinforced composite materials. Based on extensive measurements of neat resin and composite samples of varying water content and thickness, regression was performed, and the quantitative absorbance dependence on water content in the material was established. The mass of an absolutely dry resin was identified, and the true water content was obtained. The method was related to the Beer–Lambert law and explained in such terms. A detailed spectroscopic method for measuring water content in resins and fiber-reinforced composites was developed and described. PMID:29641451
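The regression step can be sketched in Beer-Lambert terms, A = ε·l·c: absorbance is proportional to path length times concentration, here the water content. The calibration below uses synthetic numbers and an assumed effective absorptivity, not the paper's data.

```python
import numpy as np

eps_true = 0.8   # assumed effective absorptivity (per mm per wt%), illustrative

# Synthetic calibration samples: water content c (wt%), thickness l (mm).
c = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
l = np.array([2.0, 2.0, 4.0, 4.0, 6.0])
A = eps_true * l * c                  # Beer-Lambert: A = eps * l * c

# Least-squares fit of eps through the origin, regressing A on l*c.
x = l * c
eps_fit = (A @ x) / (x @ x)

def water_content(absorbance, thickness_mm):
    """Predict wt% water from measured absorbance and sample thickness."""
    return absorbance / (eps_fit * thickness_mm)
```

With the absorptivity calibrated once, the true water content of a sample of known thickness follows directly from its measured NIR absorbance.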
How to write a materials and methods section of a scientific article?
Erdemir, Fikret
2013-09-01
In contrast to past centuries, scientific research is now conducted systematically in all countries as part of their education strategies, and as a consequence scientists publish thousands of reports. Writing an effective article is a significant problem for many researchers. All parts of an article, specifically the abstract, materials and methods, results, discussion and references sections, should contain certain features that should always be considered before sending a manuscript to a journal for publication. The materials and methods section is generally considered a relatively easy section to write; it is therefore often a good idea to begin with it, and it is also a crucial part of an article. Because "reproducible results" are very important in science, a detailed account of the study should be given in this section. If the authors provide sufficient detail, other scientists can repeat their experiments to verify their findings. It is generally recommended that the materials and methods be written in the past tense, in either the active or the passive voice. In this section, ethical approval, study dates, number of subjects, groups, evaluation criteria, exclusion criteria and statistical methods should be described sequentially. It should be noted that a well-written materials and methods section markedly enhances the chances of an article being published.
Studying learning in the healthcare setting: the potential of quantitative diary methods.
Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke
2015-08-01
Quantitative diary methods are longitudinal approaches that involve repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the healthcare setting occurs and how learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.
An improved method for determination of refractive index of absorbing films: A simulation study
NASA Astrophysics Data System (ADS)
Özcan, Seçkin; Coşkun, Emre; Kocahan, Özlem; Özder, Serhat
2017-02-01
In this work, an improved version of the method presented by Gandhi for determining the refractive index of absorbing films is described. In this method, local maxima of consecutive interference orders in the transmittance spectrum are used. The method is based on a minimization procedure that determines the interference order accurately using reasonable Cauchy parameters. It was tested on a theoretically generated transmittance spectrum of an absorbing film, and the details of the minimization procedure are discussed.
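The role of the interference order can be sketched from the standard thin-film relation for transmittance maxima, 2·n·d = m·λ: two consecutive maxima differ by exactly one order, which fixes m. The toy version below assumes a constant refractive index, whereas the paper fits wavelength-dependent Cauchy parameters.

```python
def order_and_thickness(lam1, lam2, n):
    """Interference order at lam1 (the longer wavelength) and film
    thickness d from two consecutive transmittance maxima, assuming a
    constant refractive index n (a sketch, not the paper's full method).

    Maxima satisfy 2*n*d = m*lam1 = (m + 1)*lam2,
    hence m = lam2 / (lam1 - lam2).
    """
    m = lam2 / (lam1 - lam2)
    d = m * lam1 / (2.0 * n)
    return m, d
```

With a dispersive film, m computed this way is not an integer; the minimization the paper describes adjusts the Cauchy parameters until the recovered orders become consistent.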
Comparative evaluation of RetCam vs. gonioscopy images in congenital glaucoma
Azad, Raj V; Chandra, Parijat; Chandra, Anuradha; Gupta, Aparna; Gupta, Viney; Sihota, Ramanjit
2014-01-01
Purpose: To compare the clarity, exposure and quality of anterior chamber angle visualization in congenital glaucoma patients, using RetCam and indirect gonioscopy images. Design: Cross-sectional study. Participants: Congenital glaucoma patients over the age of 5 years. Materials and Methods: A prospective consecutive pilot study was conducted in congenital glaucoma patients older than 5 years, using indirect gonioscopy and RetCam imaging. The clarity of the image, the extent of the angle visible and the details of angle structures seen were graded for both methods on digitally recorded images, in each eye, by two masked observers. Outcome Measures: Image clarity, interobserver agreement. Results: 40 eyes of 25 congenital glaucoma patients were studied. RetCam images had excellent clarity in 77.5% of patients versus 47.5% for gonioscopy. The extent of angle seen was similar with both methods. Agreement between RetCam and gonioscopy images regarding details of angle structures was 72.50% for observer 1 and 65.00% for observer 2. Conclusions: There was good agreement between RetCam and indirect gonioscopy images in detecting angle structures of congenital glaucoma patients. However, RetCam provided greater clarity, better quality, and higher magnification images. RetCam can be a useful alternative to gonioscopy in infants and small children, without the need for general anesthesia. PMID:24008788
DOT National Transportation Integrated Search
2014-05-01
Different problems in straight skewed steel I-girder bridges are often associated with the methods used for detailing the cross-frames. Use of theoretical terms to describe these detailing methods and absence of complete and simplified design approac...
Analytical study of striated nozzle flow with small radius of curvature ratio throats
NASA Technical Reports Server (NTRS)
Norton, D. J.; White, R. E.
1972-01-01
An analytical method was developed which is capable of estimating the chamber and throat conditions in a nozzle with a low radius of curvature throat. The method was programmed in standard FORTRAN IV and includes chemical equilibrium calculation subprograms (a modified NASA Lewis program, CEC71) as an integral part. The method determines detailed and gross rocket characteristics in the presence of striated flows and gives detailed results for the motor chamber and throat plane with as many as 20 discrete zones. The method employs a simultaneous solution of the mass, momentum, and energy equations and allows propellant types, O/F ratios, propellant distribution, nozzle geometry, and injection schemes to be varied so as to predict spatial velocity, density, pressure, and other thermodynamic variable distributions in the chamber as well as the throat. Results for small radii of curvature have shown good agreement with experimental results. Both gaseous and liquid injection may be considered, with frozen or equilibrium flow calculations.
Methods of Visually Determining the Air Flow Around Airplanes
NASA Technical Reports Server (NTRS)
Gough, Melvin N; Johnson, Ernest
1932-01-01
This report describes methods used by the National Advisory Committee for Aeronautics to study visually the air flow around airplanes. The use of streamers, oil and exhaust gas streaks, lampblack and kerosene, powdered materials, and kerosene smoke is briefly described. The generation and distribution of smoke from candles and from titanium tetrachloride are described in greater detail because they appear most advantageous for general application. Examples are included showing results of the various methods.
Service Learning: An Auditing Project Study
ERIC Educational Resources Information Center
Laing, Gregory Kenneth
2013-01-01
There is a growing demand in higher education for universities to introduce teaching methods that achieve the learning outcomes of vocational education. The need for vocational educational outcomes was met in this study involving a service learning activity designed to provide basic professional auditing competencies. The details of the design and…
Lyons, Anthony; Heywood, Wendy; Fileborn, Bianca; Minichiello, Victor; Barrett, Catherine; Brown, Graham; Hinchliff, Sharron; Malta, Sue; Crameri, Pauline
2017-09-01
Older people are often excluded from large studies of sexual health, as it is assumed that they are not having sex or are reluctant to talk about sensitive topics, and are therefore difficult to recruit. We outline the sampling and recruitment strategies from a recent study on sexual health and relationships among older people. Sex, Age and Me was a nationwide Australian study that examined the sexual health, relationship patterns, safer-sex practices and STI knowledge of Australians aged 60 years and over. The study used a mixed-methods approach to establish baseline levels of knowledge and to develop deeper insights into older adults' understandings and practices relating to sexual health. Data collection took place in 2015, with 2137 participants completing a quantitative survey and 53 participating in one-on-one semi-structured interviews. As the feasibility of this type of study had been largely untested until now, we provide detailed information on the study's recruitment strategies and methods. We also compare key characteristics of our sample with national estimates to assess its degree of representativeness. This study provides evidence to challenge the assumption that older people will not take part in sexual health-related research, and details a novel and successful way to recruit participants in this area.
1987-11-30
currently evaluating two instrumental techniques which seem highly appropriate to this LPS project: supercritical fluid chromatography (SFC), and SFC coupled with mass spectrometry; details are discussed below in the appropriate sections.
Student's Perceptions of Quality Learning in a Malaysian University--A Mixed Method Approach
ERIC Educational Resources Information Center
Choy, S. Chee; Yim, Joanne Sau-Ching; Tan, Poh Leong
2017-01-01
Purpose: This paper aims to examine students' perceptions of quality learning using a mixed-methods approach in a Malaysian university, with an aim to fill existing knowledge gaps in the literature on relationships among relevant quality variables. The study also assesses the extent to which detailed results from a few participants can be…
From peds to paradoxes: Linkages between soil biota and their influences on ecological processes
David C. Coleman
2008-01-01
Soils and their biota have been studied by a variety of observational and experimental methods that have allowed biologists to infer their structural and functional interactions. Viewing progress made over the last 10 years, it is apparent that an increasing diversity of analytical and chemical methods is providing much more detailed information about feeding...
Estimation of vegetation-type areas by linear measurement
A.A. Hasel
1941-01-01
Maps are very useful in providing a picture of the location of vegetation types, but mapping as a method for determining type areas may be inadequate or costly. The measurement of vegetation type areas by means of line surveys is discussed in the following article, and the method is tested in connection with detailed studies on plots. The results indicate that the...
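The line-survey idea can be illustrated numerically: the fraction of total transect length intercepting a vegetation type estimates that type's area fraction. The grid, patch, and transect spacing below are invented for illustration, not taken from the article.

```python
import numpy as np

# Toy vegetation map: 1 marks the type of interest.
veg = np.zeros((100, 100), dtype=int)
veg[20:60, 10:90] = 1                 # one 40 x 80 patch -> 32% of total area
true_fraction = veg.mean()

# Equally spaced horizontal survey lines; the fraction of line length
# falling inside the type estimates its area fraction.
rows = np.arange(5, 100, 10)          # 10 transects
line_fraction = veg[rows, :].mean()
```

With enough well-spaced lines the estimate converges on the mapped area fraction, which is the basis for testing the method against detailed plot studies.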
NASA Technical Reports Server (NTRS)
Plumblee, H. E., Jr.; Dean, P. D.; Wynne, G. A.; Burrin, R. H.
1973-01-01
The results of an experimental and theoretical study of many of the fundamental details of sound propagation in hard-wall and soft-wall annular flow ducts are reported. The theory of sound propagation along such ducts and the theory for determining the complex radiation impedance of higher-order modes of an annulus are outlined, and methods for generating acoustic duct modes are developed. The results of a detailed measurement program on propagation in rigid-wall annular ducts, with and without airflow through the duct, are presented. Techniques are described for measuring cut-on frequencies, modal phase speed, and radial and annular mode shapes. The effects of flow velocity on cut-on frequencies and phase speed are measured, and comparisons are made with theoretical predictions for all of the effects studied. The two-microphone method of impedance measurement is used to measure the effects of flow on acoustic liners. A numerical study of sound propagation in annular ducts with one or both walls acoustically lined is presented.
Peer Tutoring Programs in Health Professions Schools
Garavalia, Linda
2006-01-01
Objective: Peer tutoring programs may be one method of maintaining the quality of pharmacy education in the face of growing student enrollment and a small faculty body. A critical review of the literature was performed to ascertain whether peer tutoring programs improve or maintain the academic performance of health care professional students. Methods: Various electronic databases and abstracts from past American Association of Colleges of Pharmacy annual meetings were searched to identify pertinent research. Only those articles with quantitative data, an experimental design, and comparative statistical analysis were included for review. Results: Most studies found that peer tutoring had a positive impact on academic performance. These results may not be readily generalizable, as there were numerous methodological flaws and limited descriptions of the programs and participants. Implications: Studies with better designs and more detail are needed to answer definitively whether peer tutoring is of benefit. Details of the resources required should be included in the study to allow the reader to determine the feasibility of the intervention. PMID:17136190
Kabat, Geoffrey C; Cross, Amanda J; Park, Yikyung; Schatzkin, Arthur; Hollenbeck, Albert R; Rohan, Thomas E; Sinha, Rashmi
2009-05-15
A number of studies have reported that intake of red meat or meat cooked at high temperatures is associated with increased risk of breast cancer, but other studies have shown no association. We assessed the association between meat, meat-cooking methods, and meat-mutagen intake and postmenopausal breast cancer in the NIH-AARP Diet and Health Study cohort of 120,755 postmenopausal women who completed a food frequency questionnaire at baseline (1995-1996) as well as a detailed meat-cooking module within 6 months following baseline. During 8 years of follow-up, 3,818 cases of invasive breast cancer were identified in this cohort. Cox proportional hazards models were used to estimate hazard ratios (HR) and 95% confidence intervals (95% CI). After adjusting for covariates, intake of total meat, red meat, meat cooked at high temperatures, and meat mutagens showed no association with breast cancer risk. This large prospective study with detailed information on meat preparation methods provides no support for a role of meat mutagens in the development of postmenopausal breast cancer. (c) 2008 Wiley-Liss, Inc.
Software Engineering Laboratory (SEL) Ada performance study report
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1991-01-01
The goals, scope, and methods of the Ada Performance Study are described, along with the background of Ada development in the Flight Dynamics Division (FDD). The organization and overall purpose of each test are discussed, followed by the purpose, methods, and results of each test and analyses of those results. Guidelines for future Ada development efforts, based on the analysis of results from this study, are provided, and the approach used in the performance tests is discussed.
Boushey, C J; Spoden, M; Zhu, F M; Delp, E J; Kerr, D A
2017-08-01
For nutrition practitioners and researchers, assessing dietary intake of children and adults with a high level of accuracy continues to be a challenge. Developments in mobile technologies have created a role for images in the assessment of dietary intake. The objective of this review was to examine peer-reviewed published papers covering development, evaluation and/or validation of image-assisted or image-based dietary assessment methods from December 2013 to January 2016. Images taken with handheld devices or wearable cameras have been used to assist traditional dietary assessment methods for portion size estimations made by dietitians (image-assisted methods). Image-assisted approaches can supplement either dietary records or 24-h dietary recalls. In recent years, image-based approaches integrating application technology for mobile devices have been developed (image-based methods). Image-based approaches aim at capturing all eating occasions by images as the primary record of dietary intake, and therefore follow the methodology of food records. The present paper reviews several image-assisted and image-based methods, their benefits and challenges; followed by details on an image-based mobile food record. Mobile technology offers a wide range of feasible options for dietary assessment, which are easier to incorporate into daily routines. The presented studies illustrate that image-assisted methods can improve the accuracy of conventional dietary assessment methods by adding eating occasion detail via pictures captured by an individual (dynamic images). All of the studies reduced underreporting with the help of images compared with results with traditional assessment methods. Studies with larger sample sizes are needed to better delineate attributes with regards to age of user, degree of error and cost.
Satellite Power System (SPS) financial/management scenarios
NASA Technical Reports Server (NTRS)
Vajk, J. P.
1978-01-01
The possible benefits of a Satellite Power System (SPS) program, both domestically and internationally, justify detailed and imaginative investigation of the issues involved in financing and managing such a large-scale program. In this study, ten possible methods of financing a SPS program are identified ranging from pure government agency to private corporations. The following were analyzed and evaluated: (1) capital requirements for SPS; (2) ownership and control; (3) management principles; (4) organizational forms for SPS; (5) criteria for evaluation; (6) detailed description and preliminary evaluation of alternatives; (7) phased approaches; and (8) comparative evaluation. Key issues and observations and recommendations for further study are also presented.
Solar thermoelectric generators
NASA Technical Reports Server (NTRS)
1977-01-01
The methods, findings, and conclusions of a design study for a Solar Thermoelectric Generator (STG) intended for use as a power source for a spacecraft orbiting the planet Mercury are discussed. Several state-of-the-art thermoelectric technologies were considered for the intended application. The design of various STG configurations based on the thermoelectric technology selected from among these technologies was examined in detail, and a recommended STG design was derived. The performance characteristics of the selected STG technology and associated design were studied in detail as a function of the orbital characteristics of the STG about Mercury and throughout the orbit of Mercury around the sun.
HDlive rendering images of the fetal stomach: a preliminary report.
Inubashiri, Eisuke; Abe, Kiyotaka; Watanabe, Yukio; Akutagawa, Noriyuki; Kuroki, Katumaru; Sugawara, Masaki; Maeda, Nobuhiko; Minami, Kunihiro; Nomura, Yasuhiro
2015-01-01
This study aimed to show reconstruction of the fetal stomach using the HDlive rendering mode in ultrasound. Seventeen healthy singleton fetuses at 18-34 weeks' gestational age were observed using the HDlive rendering mode of ultrasound in utero. In all of the fetuses, we identified specific spatial structures, including macroscopic anatomical features (e.g., the pylorus, cardia, fundus, and greater curvature) of the fetal stomach, using the HDlive rendering mode. In particular, HDlive rendering images showed remarkably fine details that appeared as if they were being viewed under an endoscope, with visible rugal folds after 27 weeks' gestational age. Our study suggests that the HDlive rendering mode can be used as an additional method for evaluating the fetal stomach. The HDlive rendering mode shows detailed 3D structural images and anatomically realistic images of the fetal stomach. This technique may be effective in prenatal diagnosis for examining detailed information of fetal organs.
Introducing DeBRa: a detailed breast model for radiological studies
NASA Astrophysics Data System (ADS)
Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.
2009-07-01
Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.
Brown, Adam D; Addis, Donna Rose; Romano, Tracy A; Marmar, Charles R; Bryant, Richard A; Hirst, William; Schacter, Daniel L
2014-01-01
Individuals with post-traumatic stress disorder (PTSD) tend to retrieve autobiographical memories with less episodic specificity, referred to as overgeneralised autobiographical memory. In line with evidence that autobiographical memory overlaps with one's capacity to imagine the future, recent work has shown that individuals with PTSD also imagine themselves in the future with less episodic specificity. To date, most studies have quantified episodic specificity by the presence of a distinct event. However, this method does not distinguish between the numbers of internal (episodic) and external (semantic) details, which can provide additional insights into remembering the past and imagining the future. This study applied the Autobiographical Interview (AI) coding scheme to the autobiographical memory and imagined future event narratives generated by combat veterans with and without PTSD. Responses were coded for the number of internal and external details. Compared to combat veterans without PTSD, those with PTSD generated more external than internal details when recalling past or imagining future events, and fewer internal details were associated with greater symptom severity. The potential mechanisms underlying these bidirectional deficits and clinical implications are discussed.
NASA Astrophysics Data System (ADS)
Altschuler, Bruce R.; Monson, Keith L.
1998-03-01
Representation of crime scenes as virtual reality 3D computer displays promises to become a useful and important tool for law enforcement evaluation and analysis, forensic identification and pathological study and archival presentation during court proceedings. Use of these methods for assessment of evidentiary materials demands complete accuracy of reproduction of the original scene, both in data collection and in its eventual virtual reality representation. The recording of spatially accurate information as soon as possible after first arrival of law enforcement personnel is advantageous for unstable or hazardous crime scenes and reduces the possibility that either inadvertent measurement error or deliberate falsification may occur or be alleged concerning processing of a scene. Detailed measurements and multimedia archiving of critical surface topographical details in a calibrated, uniform, consistent and standardized quantitative 3D coordinate method are needed. These methods would afford professional personnel in initial contact with a crime scene the means for remote, non-contacting, immediate, thorough and unequivocal documentation of the contents of the scene. Measurements of the relative and absolute global positions of objects and victims, and their dispositions within the scene before their relocation and detailed examination, could be made. Resolution must be sufficient to map both small and large objects. Equipment must be able to map regions at varied resolution as collected from different perspectives. Progress is presented in devising methods for collecting and archiving 3D spatial numerical data from crime scenes, sufficient for law enforcement needs, by remote laser structured light and video imagery. Two types of simulation studies were done. One study evaluated the potential of 3D topographic mapping and 3D telepresence using a robotic platform for explosive ordnance disassembly. 
The second study involved using the laser mapping system on a fixed optical bench with simulated crime scene models of the people and furniture to assess feasibility, requirements and utility of such a system for crime scene documentation and analysis.
A brain MRI bias field correction method created in the Gaussian multi-scale space
NASA Astrophysics Data System (ADS)
Chen, Mingsheng; Qin, Mingxin
2017-07-01
A pre-processing step is needed to correct for the bias field signal before submitting corrupted MR images to image-processing algorithms such as segmentation. This study presents a new bias field correction method. The method creates a Gaussian multi-scale space by the convolution of the inhomogeneous MR image with a two-dimensional Gaussian function. In the multi-scale Gaussian space, the method retrieves the image details from the difference between the original image and the convolution image. Then, it obtains an image whose inhomogeneity is eliminated by the weighted sum of the image details in each layer of the space. Next, the bias field-corrected MR image is retrieved after the γ (gamma) correction, which enhances the contrast and brightness of the inhomogeneity-eliminated MR image. We have tested the approach on T1 MRI and T2 MRI with varying bias field levels and have achieved satisfactory results. Comparison experiments with popular software have demonstrated superior performance of the proposed method in terms of quantitative indices, especially an improvement in subsequent image segmentation.
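As a rough illustration of the correction pipeline described above (smooth the image with Gaussians at several scales, keep the detail layers, combine them by a weighted sum, then apply a gamma correction), here is a minimal numpy-only sketch; the scale set, the equal weights, and the gamma value are illustrative assumptions, not the paper's calibrated parameters:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing (a simple stand-in for a library filter)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    # Convolve rows, then columns, padding by reflection.
    pad = np.pad(img, radius, mode="reflect")
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, rows)

def correct_bias_field(img, sigmas=(1, 2, 4), weights=None, gamma=0.7):
    """Sketch of the multi-scale correction: at each scale the 'detail'
    layer is the difference between the image and its smoothed version,
    so the slowly varying bias field is discarded with the smooth part."""
    img = np.asarray(img, dtype=float)
    if weights is None:
        weights = [1.0 / len(sigmas)] * len(sigmas)
    details = [img - gaussian_blur(img, s) for s in sigmas]
    flat = sum(w * d for w, d in zip(weights, details))
    # Normalize to [0, 1] before the gamma (contrast/brightness) correction.
    rng = flat.max() - flat.min()
    flat = (flat - flat.min()) / (rng if rng > 0 else 1.0)
    return flat ** gamma
```

The weighted sum of detail layers suppresses the low-frequency inhomogeneity while keeping edges and texture; the final power-law step is the gamma correction the abstract refers to.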
Lunar-base construction equipment and methods evaluation
NASA Technical Reports Server (NTRS)
Boles, Walter W.; Ashley, David B.; Tucker, Richard L.
1993-01-01
A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.
Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G.
2017-01-01
Introduction: In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. Theory and methods: The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. Results: The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. Conclusion and discussion: We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here. PMID:29042843
Huizinga, Richard J.; Rydlund, Jr., Paul H.
2004-01-01
The evaluation of scour at bridges throughout the state of Missouri has been ongoing since 1991 in a cooperative effort by the U.S. Geological Survey and Missouri Department of Transportation. A variety of assessment methods have been used to identify bridges susceptible to scour and to estimate scour depths. A potential-scour assessment (Level 1) was used at 3,082 bridges to identify bridges that might be susceptible to scour. A rapid estimation method (Level 1+) was used to estimate contraction, pier, and abutment scour depths at 1,396 bridge sites to identify bridges that might be scour critical. A detailed hydraulic assessment (Level 2) was used to compute contraction, pier, and abutment scour depths at 398 bridges to determine which bridges are scour critical and would require further monitoring or application of scour countermeasures. The rapid estimation method (Level 1+) was designed to be a conservative estimator of scour depths compared to depths computed by a detailed hydraulic assessment (Level 2). Detailed hydraulic assessments were performed at 316 bridges that also had received a rapid estimation assessment, providing a broad data base to compare the two scour assessment methods. The scour depths computed by each of the two methods were compared for bridges that had similar discharges. For Missouri, the rapid estimation method (Level 1+) did not provide a reasonable conservative estimate of the detailed hydraulic assessment (Level 2) scour depths for contraction scour, but the discrepancy was the result of using different values for variables that were common to both of the assessment methods. The rapid estimation method (Level 1+) was a reasonable conservative estimator of the detailed hydraulic assessment (Level 2) scour depths for pier scour if the pier width is used for piers without footing exposure and the footing width is used for piers with footing exposure. 
Detailed hydraulic assessment (Level 2) scour depths were conservatively estimated by the rapid estimation method (Level 1+) for abutment scour, but there was substantial variability in the estimates and several substantial underestimations.
Freeze-quench (57)Fe-Mössbauer spectroscopy: trapping reactive intermediates.
Krebs, Carsten; Bollinger, J Martin
2009-01-01
(57)Fe-Mössbauer spectroscopy is a method that probes transitions between the nuclear ground state (I=1/2) and the first nuclear excited state (I=3/2). This technique provides detailed information about the chemical environment and electronic structure of iron. Therefore, it has played an important role in studies of the numerous iron-containing proteins and enzymes. In conjunction with the freeze-quench method, (57)Fe-Mössbauer spectroscopy allows for monitoring changes of the iron site(s) during a biochemical reaction. This approach is particularly powerful for detection and characterization of reactive intermediates. Comparison of experimentally determined Mössbauer parameters to those predicted by density functional theory for hypothetical model structures can then provide detailed insight into the structures of reactive intermediates. We have recently used this methodology to study the reactions of various mononuclear non-heme-iron enzymes by trapping and characterizing several Fe(IV)-oxo reaction intermediates. In this article, we summarize these findings and demonstrate the potential of the method. © Springer Science+Business Media B.V. 2009
Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology.
Haddaway, Neal R; Verhoeven, Jos T A
2015-10-01
Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat methods used, and to assimilate the study findings into powerful and useful meta-analyses. The problem is common in a variety of ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication of an improvement in reporting standards over time. Here, we call on authors, editors, and peer reviewers to consider repeatability as a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.
Vadose Zone Transport Field Study: Detailed Test Plan for Simulated Leak Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Anderson L.; Gee, Glendon W.
2000-06-23
This report describes controlled transport experiments at well-instrumented field tests to be conducted during FY 2000 in support of DOE's Vadose Zone Transport Field Study (VZTFS). The VZTFS supports the Groundwater/Vadose Zone Integration Project Science and Technology Initiative. The field tests will improve understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. These methods will capture the extent of contaminant plumes using existing steel-cased boreholes. Specific objectives are to 1) identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; 2) reduce uncertainty in conceptual models; 3) develop a detailed and accurate data base of hydraulic and transport parameters for validation of three-dimensional numerical models; and 4) identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. Pacific Northwest National Laboratory (PNNL) manages the VZTFS for DOE.
Detailed Aerodynamic Analysis of a Shrouded Tail Rotor Using an Unstructured Mesh Flow Solver
NASA Astrophysics Data System (ADS)
Lee, Hee Dong; Kwon, Oh Joon
The detailed aerodynamics of a shrouded tail rotor in hover has been numerically studied using a parallel inviscid flow solver on unstructured meshes. The numerical method is based on a cell-centered finite-volume discretization and an implicit Gauss-Seidel time integration. The calculation was made for a single blade by imposing a periodic boundary condition between adjacent rotor blades. The grid periodicity was also imposed at the periodic boundary planes to avoid numerical inaccuracy resulting from solution interpolation. The results were compared with available experimental data and those from a disk vortex theory for validation. It was found that realistic three-dimensional modeling is important for the prediction of detailed aerodynamics of shrouded rotors including the tip clearance gap flow.
Workload, study methods, and motivation of students within a BVSc program.
Parkinson, Tim J; Gilling, Marg; Suddaby, Gordon T
2006-01-01
The workloads, study methods, and motivation of students in a five-year BVSc program were studied using questionnaires and focus groups. Students in each year of the program were asked, on three occasions over an academic year, to record details of their out-of-class study time for each course they were taking and to record the study methods they used, how they prioritized their time between subjects, and how they allocated time to study and leisure activities. Mean response rates were 57% (range: 43-85%). Overall mean out-of-class study time ranged from 19 hours per week in Year 2 to 28 hours per week in Year 4. Study time was related to the level of interest the student had in the subject, the demands of assessments, and the number of subjects being studied. Study methods were related to students' perceptions of the requirements of the subject as well as to their interest in it. Reliance on memorization and the use of set study materials were the predominant methods for courses with low interest scores, whereas higher interest was associated with a broad range of study methods. Leisure time was ring-fenced, especially when workloads were high. Students' motivation was high when they were studying subjects that were new or were seen as relevant to clinical practice; when working with animals or with enthusiastic faculty members; and when involved in subjects more tightly focused on the ultimate goal of becoming a practitioner. It was poor when students were faced with high workloads, disciplines becoming "stale," excessive detail, and low perceptions of relevance. Constant assessment activities were also seen as a burden. In terms of good learning practices, workload and the demands of assessment were considered to be antagonistic. A tension between these perceptions of students and the values of faculty in terms of the development of critical thinking skills in the program is evident.
Reserving by detailed conditioning on individual claim
NASA Astrophysics Data System (ADS)
Kartikasari, Mujiati Dwi; Effendie, Adhitya Ronnie; Wilandari, Yuciana
2017-03-01
The estimation of claim reserves is an important activity in insurance companies to fulfill their liabilities. Recently, reserving methods based on individual claims have attracted a lot of interest in actuarial science, as they overcome some deficiencies of aggregated claim methods. This paper explores the Reserving by Detailed Conditioning (RDC) method, which uses all available claim information, for reserving with individual liability insurance claims from an Indonesian general insurance company. Furthermore, we compare it to the Chain Ladder and Bornhuetter-Ferguson methods.
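For readers unfamiliar with the Chain Ladder benchmark mentioned above, a minimal sketch of the classical development-factor calculation is shown below; this is the standard textbook form applied to a cumulative run-off triangle, not the paper's RDC method:

```python
import numpy as np

def chain_ladder_ultimates(triangle):
    """Classical Chain Ladder on a cumulative run-off triangle.
    `triangle` is a 2D array (accident year x development year)
    with np.nan below the diagonal (not yet observed)."""
    tri = np.asarray(triangle, dtype=float)
    n = tri.shape[1]
    # Volume-weighted development factors between adjacent columns.
    factors = []
    for j in range(n - 1):
        mask = ~np.isnan(tri[:, j]) & ~np.isnan(tri[:, j + 1])
        factors.append(tri[mask, j + 1].sum() / tri[mask, j].sum())
    # Project each accident year's latest observed value to ultimate.
    ultimates = []
    for i in range(tri.shape[0]):
        last = np.where(~np.isnan(tri[i]))[0][-1]
        value = tri[i, last]
        for j in range(last, n - 1):
            value *= factors[j]
        ultimates.append(value)
    return factors, ultimates
```

The reserve for each accident year is then the projected ultimate minus the latest observed cumulative claims.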
ATLAS Test Program Generator II (AGEN II). Volume I. Executive Software System.
1980-08-01
features. C. To provide detailed descriptions of each of the system components and modules and their corresponding flowcharts. D. To describe methods of ... contains the FORTRAN source code listings to enable the programmer to do the expansions and modifications. The methods and details of adding another ... characteristics of the network. The top-down implementation method is therefore suggested. This method starts at the top by designing the IVT modules in
An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks
NASA Astrophysics Data System (ADS)
Zhao, Peng-yuan; Huang, Xiao-ping
2018-03-01
Errors arise in calculating the fatigue damage of details in liquid cargo tanks with the traditional spectral analysis method, which is based on a linear system, because of the nonlinear relationship between the dynamic stress and the ship acceleration. An improved spectral analysis method for assessing the fatigue damage of details in liquid cargo tanks is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of each harmonic component with the same frequency. This analysis method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the PSD of the stress. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional spectral analysis method by comparing the calculated damage results with those of the time-domain method. The proposed spectral analysis method is more accurate in calculating the fatigue damage of details in ship liquid cargo tanks.
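The amplitude-summing construction of the stress PSD described above can be sketched as follows. Here `stress_fn` is a hypothetical placeholder for the nonlinear wave-to-stress response of a tank detail (not from the paper), and the sampling parameters are illustrative:

```python
import numpy as np

def stress_psd_from_wave_components(stress_fn, freqs, wave_amps, fs=32.0, T=256.0):
    """Build a one-sided stress PSD by passing each sinusoidal wave
    component through the (possibly nonlinear) wave-to-stress map,
    expanding each resulting stress history in a Fourier series, and
    summing harmonic amplitudes at the same frequency."""
    t = np.arange(0.0, T, 1.0 / fs)
    n = t.size
    fft_freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    total_amp = np.zeros_like(fft_freqs)
    for f, a in zip(freqs, wave_amps):
        # Stress history induced by one sinusoidal wave component.
        stress = stress_fn(a * np.sin(2.0 * np.pi * f * t))
        # One-sided Fourier amplitudes of this stress history.
        amp = np.abs(np.fft.rfft(stress)) * 2.0 / n
        # Add amplitudes of harmonic components at the same frequency.
        total_amp += amp
    df = fft_freqs[1] - fft_freqs[0]
    psd = total_amp**2 / (2.0 * df)  # amplitude spectrum -> one-sided PSD
    return fft_freqs, psd
```

A nonlinear `stress_fn` spreads energy into higher harmonics of each wave frequency, which is exactly the effect a linear transfer-function approach misses; the fatigue damage would then be computed from this PSD with a spectral damage model.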
Kachalo, Sëma; Naveed, Hammad; Cao, Youfang; Zhao, Jieling; Liang, Jie
2015-01-01
Geometric and mechanical properties of individual cells and interactions among neighboring cells are the basis of formation of tissue patterns. Understanding the complex interplay of cells is essential for gaining insight into embryogenesis, tissue development, and other emerging behavior. Here we describe a cell model and an efficient geometric algorithm for studying the dynamic process of tissue formation in 2D (e.g. epithelial tissues). Our approach improves upon previous methods by incorporating properties of individual cells as well as detailed description of the dynamic growth process, with all topological changes accounted for. Cell size, shape, and division plane orientation are modeled realistically. In addition, cell birth, cell growth, cell shrinkage, cell death, cell division, cell collision, and cell rearrangements are all explicitly modeled. Different models of cell-cell interactions, such as lateral inhibition during the process of growth, can be studied in detail. Cellular pattern formation for monolayered tissues from arbitrary initial conditions, including that of a single cell, can also be studied in detail. Computational efficiency is achieved through the employment of a special data structure that ensures access to neighboring cells in constant time, without additional space requirement. We have successfully generated tissues consisting of more than 20,000 cells starting from 2 cells within 1 hour. We show that our model can be used to study embryogenesis, tissue fusion, and cell apoptosis. We give a detailed study of the classical developmental process of bristle formation on the epidermis of D. melanogaster and the fundamental problem of homeostatic size control in epithelial tissues. Simulation results reveal significant roles of solubility of secreted factors in both the bristle formation and the homeostatic control of tissue size. Our method can be used to study broad problems in monolayered tissue formation. 
Our software is publicly available. PMID:25974182
Studies of aerothermal loads generated in regions of shock/shock interaction in hypersonic flow
NASA Technical Reports Server (NTRS)
Holden, Michael S.; Moselle, John R.; Lee, Jinho
1991-01-01
Experimental studies were conducted to examine the aerothermal characteristics of shock/shock/boundary-layer interaction regions generated by single and multiple incident shocks. The experimental studies were conducted over a Mach number range from 6 to 19 and a range of Reynolds numbers to obtain both laminar and turbulent interaction regions. Detailed heat transfer and pressure measurements were made for a range of interaction types and incident shock strengths over a transverse cylinder, with emphasis on the type III and type IV interaction regions. The measurements were compared with the simple Edney, Keyes, and Hains models for a range of interaction configurations and freestream conditions. The complex flowfields and aerothermal loads generated by multiple-shock impingement, while not generating as large peak loads, provide important test cases for code prediction. The detailed heat transfer and pressure measurements provided a good basis for evaluating the accuracy of simple prediction methods and detailed numerical solutions for laminar and transitional regions of shock/shock interaction.
Estimating tree heights from shadows on vertical aerial photographs
Earl J. Rogers
1947-01-01
Aerial photographs are now being applied more and more to practical forestry - especially to forest survey. Many forest characteristics can be recognized on aerial photographs in greater detail than is possible through ground methods alone. The basic need is for tools and methods for interpreting the detail in quantitative terms.
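The shadow method referred to above rests on a simple relation: tree height equals the ground length of the shadow times the tangent of the sun's elevation angle, with the photo measurement converted to ground units by the photo scale. The note does not give the formula, so this small sketch is the usual photogrammetric form, assuming flat ground and a shadow falling clear of obstructions:

```python
import math

def tree_height_from_shadow(shadow_mm, photo_scale, sun_elevation_deg):
    """Estimate tree height (metres) from a shadow measured on a
    vertical aerial photo.  shadow_mm: shadow length on the photo in
    millimetres; photo_scale: denominator of the photo scale (e.g.
    20000 for 1:20,000); sun_elevation_deg: sun's elevation angle at
    the time of exposure."""
    ground_shadow_m = shadow_mm * photo_scale / 1000.0  # photo mm -> ground metres
    return ground_shadow_m * math.tan(math.radians(sun_elevation_deg))
```

For example, a 1 mm shadow on a 1:20,000 photo is 20 m on the ground; with the sun at 45° elevation the estimated tree height is 20 m.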
Real-Time Nonlocal Means-Based Despeckling.
Breivik, Lars Hofsoy; Snare, Sten Roar; Steen, Erik Normann; Solberg, Anne H Schistad
2017-06-01
In this paper, we propose a multiscale nonlocal means-based despeckling method for medical ultrasound. The multiscale approach leads to large computational savings and improves despeckling results over single-scale iterative approaches. We present two variants of the method. The first, denoted multiscale nonlocal means (MNLM), yields uniform robust filtering of speckle both in structured and homogeneous regions. The second, denoted unnormalized MNLM (UMNLM), is more conservative in regions of structure assuring minimal disruption of salient image details. Due to the popularity of anisotropic diffusion-based methods in the despeckling literature, we review the connection between anisotropic diffusion and iterative variants of NLM. These iterative variants in turn relate to our multiscale variant. As part of our evaluation, we conduct a simulation study making use of ground truth phantoms generated from clinical B-mode ultrasound images. We evaluate our method against a set of popular methods from the despeckling literature on both fine and coarse speckle noise. In terms of computational efficiency, our method outperforms the other considered methods. Quantitatively on simulations and on a tissue-mimicking phantom, our method is found to be competitive with the state-of-the-art. On clinical B-mode images, our method is found to effectively smooth speckle while preserving low-contrast and highly localized salient image detail.
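For orientation, a minimal single-scale nonlocal means filter, the baseline that the multiscale MNLM/UMNLM variants build upon rather than the paper's algorithm itself, can be sketched as follows; the patch size, search window, and smoothing parameter are illustrative:

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.1):
    """Single-scale nonlocal means: each pixel becomes a weighted
    average of pixels in a search window, with weights set by the
    similarity of the patches surrounding them."""
    pad, spad = patch // 2, search // 2
    padded = np.pad(img, pad + spad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pad + spad, j + pad + spad
            ref = padded[ci - pad:ci + pad + 1, cj - pad:cj + pad + 1]
            wsum, acc = 0.0, 0.0
            for di in range(-spad, spad + 1):
                for dj in range(-spad, spad + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pad:ni + pad + 1, nj - pad:nj + pad + 1]
                    d2 = np.mean((ref - cand) ** 2)  # patch dissimilarity
                    w = np.exp(-d2 / (h * h))
                    wsum += w
                    acc += w * padded[ni, nj]
            out[i, j] = acc / wsum
    return out
```

The multiscale idea in the paper amounts to applying such filtering across a resolution pyramid, which both cuts computation and improves smoothing of the coarse-grained speckle that a single scale handles poorly.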
Social Studies for EMR Pupils: A Course of Study for Senior High Schools.
ERIC Educational Resources Information Center
Casler, Al; And Others
Presented are 10 units of study for educable mentally retarded (EMR) senior high school students in the area of social studies. The outlined sequence and suggested time allotment for each unit covers grades 10-12. Subject matter content of each unit is outlined in detail; particular teaching methods and materials are not specified. Units cover the…
Estimating the number of animals in wildlife populations
Lancia, R.A.; Kendall, W.L.; Pollock, K.H.; Nichols, J.D.; Braun, Clait E.
2005-01-01
INTRODUCTION In 1938, Howard M. Wight devoted 9 pages, which was an entire chapter in the first wildlife management techniques manual, to what he termed 'census' methods. As books and chapters such as this attest, the volume of literature on this subject has grown tremendously. Abundance estimation remains an active area of biometrical research, as reflected in the many differences between this chapter and the similar contribution in the previous manual. Our intent in this chapter is to present an overview of the basic and most widely used population estimation techniques and to provide an entrée to the relevant literature. Several possible approaches could be taken in writing a chapter dealing with population estimation. For example, we could provide a detailed treatment focusing on statistical models and on derivation of estimators based on these models. Although a chapter using this approach might provide a valuable reference for quantitative biologists and biometricians, it would be of limited use to many field biologists and wildlife managers. Another approach would be to focus on details of actually applying different population estimation techniques. This approach would include both field application (e.g., how to set out a trapping grid or conduct an aerial survey) and detailed instructions on how to use the resulting data with appropriate estimation equations. We are reluctant to attempt such an approach, however, because of the tremendous diversity of real-world field situations defined by factors such as the animal being studied, habitat, and available resources, and because of our resultant inability to provide detailed instructions for all possible cases. We believe it is more useful to provide the reader with the conceptual basis underlying estimation methods. Thus, we have tried to provide intuitive explanations for how basic methods work.
In doing so, we present relevant estimation equations for many methods and provide citations of more detailed treatments covering both statistical considerations and field applications. We have chosen to present methods that are representative of classes of estimators, rather than address every available method. Our hope is that this chapter will provide the reader with enough background to make an informed decision about what general method(s) will likely perform well in any particular field situation. Readers with a more quantitative background may then be able to consult detailed references and tailor the selected method to suit their particular needs. Less quantitative readers should consult a biometrician, preferably one with experience in wildlife studies, for this 'tailoring,' with the hope they will be able to do so with a basic understanding of the general method, thereby permitting useful interaction and discussion with the biometrician. SUMMARY Estimating the abundance or density of animals in wild populations is not a trivial matter. Virtually all techniques involve the basic problem of estimating the probability of seeing, capturing, or otherwise detecting animals during some type of survey and, in many cases, sampling concerns as well. In the case of indices, the detection probability is assumed to be constant (but unknown). We caution against use of indices unless this assumption can be verified for the comparison(s) of interest. In the case of population estimation, many methods have been developed over the years to estimate the probability of detection associated with various kinds of count statistics. Techniques range from complete counts, where sampling concerns often dominate, to incomplete counts where detection probabilities are also important. Some examples of the latter are multiple observers, removal methods, and capture-recapture. 
Before embarking on a survey to estimate the size of a population, one must understand clearly what information is needed and for what purpose the information will be used. The key to derivin
Considerations when conducting e-Delphi research: a case study.
Toronto, Coleen
2017-06-22
Background E-Delphi is a way to access a geographically dispersed group of experts. It is similar to other Delphi methods but conducted online. E-research methodologies, such as the e-Delphi method, have yet to undergo significant critical discussion. Aim To highlight some of the challenges nurse researchers may wish to consider when using e-Delphi in their research. Discussion This paper provides details about the author's approach to conducting an e-Delphi study in which a group of health literacy nurse experts (n=41) used an online survey platform to identify and prioritise essential health literacy competencies for registered nurses. Conclusion This paper advances methodological discourse about e-Delphi by critically assessing an e-Delphi case study. The online survey platform used in this study was advantageous for the researcher and the experts: the experts could participate at any time and place where the internet was available; the researcher could efficiently access a national group of experts, track responses and analyse data in each round. Implications for practice E-Delphi studies create opportunities for nurse researchers to conduct research nationally and internationally. Before conducting an e-Delphi study, researchers should carefully consider the design and methods for collecting data, to avoid challenges that could potentially compromise the quality of the findings. Researchers are encouraged to publish details about their approaches to e-Delphi studies, to advance the state of the science.
Improved numerical solutions for chaotic-cancer-model
NASA Astrophysics Data System (ADS)
Yasir, Muhammad; Ahmad, Salman; Ahmed, Faizan; Aqeel, Muhammad; Akbar, Muhammad Zubair
2017-01-01
In the biological sciences, the dynamical system of the cancer model is well known for its sensitivity and chaotic behavior. The present work provides a detailed computational study of the cancer model that counterbalances its sensitive dependence on initial conditions and parameter values. The chaotic cancer model is discretized into a system of nonlinear equations that are solved using the well-known Successive Over-Relaxation (SOR) method with proven convergence. This technique makes it possible to solve large systems and provides a more accurate approximation, which is illustrated through tables, time-history maps and phase portraits with detailed analysis.
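The SOR iteration the authors apply can be illustrated on a small, diagonally dominant linear system. This is a minimal sketch of the classical method, not the paper's discretized cancer model; the test system and relaxation factor are illustrative.

```python
def sor(A, b, omega=1.25, tol=1e-10, max_iter=500):
    """Successive over-relaxation for A x = b.

    Each sweep computes the Gauss-Seidel update for x[i] and then
    over-relaxes it by the factor omega (1 < omega < 2 typically
    accelerates convergence for diagonally dominant systems)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        max_delta = 0.0
        for i in range(n):
            sigma = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i][i]
            max_delta = max(max_delta, abs(x_new - x[i]))
            x[i] = x_new
        if max_delta < tol:  # converged when no component moves
            break
    return x

# 1-D Poisson-style tridiagonal system with a known solution (1, 2, 3, 4).
A = [[4, -1, 0, 0], [-1, 4, -1, 0], [0, -1, 4, -1], [0, 0, -1, 4]]
x_true = [1.0, 2.0, 3.0, 4.0]
b = [sum(A[i][j] * x_true[j] for j in range(4)) for i in range(4)]
x = sor(A, b)
print([round(v, 6) for v in x])
```

For a nonlinear discretization such as the cancer model's, the same sweep structure applies, with the local update replaced by the solution of each nonlinear scalar equation.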
Evaluating nuclear physics inputs in core-collapse supernova models
NASA Astrophysics Data System (ADS)
Lentz, E.; Hix, W. R.; Baird, M. L.; Messer, O. E. B.; Mezzacappa, A.
Core-collapse supernova models depend on the details of the nuclear and weak interaction physics inputs just as they depend on the details of the macroscopic physics (transport, hydrodynamics, etc.), numerical methods, and progenitors. We present preliminary results from our ongoing comparison studies of nuclear and weak interaction physics inputs to core collapse supernova models using the spherically-symmetric, general relativistic, neutrino radiation hydrodynamics code Agile-Boltztran. We focus on comparisons of the effects of the nuclear EoS and the effects of improving the opacities, particularly neutrino-nucleon interactions.
Transformation of the θ-phase in Mg-Li-Al alloys: a density functional theory study.
Zhang, Caili; Han, Peide; Zhang, Zhuxia; Dong, Minghui; Zhang, Lili; Gu, Xiangyang; Yang, Yanqing; Xu, Bingshe
2012-03-01
In Mg-Li-Al alloys, the θ-phase MgAlLi(2) is a strengthening, metastable phase that is liable to transform into the equilibrium phase AlLi on overaging. However, the structural details of the θ-phase MgAlLi(2) and the microscopic transformation are still unknown. In this paper, the structure of the MgAlLi(2) unit cell was determined through X-ray powder diffraction simulation. The microscopic transformation process of the θ-phase MgAlLi(2) is discussed in detail using a first-principles method.
1988-06-01
partial fulfillment of the requirements for the degree of MASTER OF SCIENCE IN MANAGEMENT from the NAVAL POSTGRADUATE SCHOOL June 1988 Author: Denise M...of work), management study reviews and detailed cost comparisons. A Cost Comparison Handbook (CCH), also published in 1979, provided detailed...1, dated 12 August 1985. The cost comparison methodology was changed from the complex full cost method outlined in the CCH, to a simpler incremental
NASA Technical Reports Server (NTRS)
Gottlieb, D.; Turkel, E.
1985-01-01
After detailing the construction of spectral approximations to time-dependent mixed initial boundary value problems, a study is conducted of differential equations of the form 'partial derivative of u/partial derivative of t = Lu + f', where for each t, u(t) belongs to a Hilbert space such that u satisfies homogeneous boundary conditions. For the sake of simplicity, it is assumed that L is an unbounded, time-independent linear operator. Attention is given to Fourier methods of both Galerkin and pseudospectral method types, the Galerkin method, the pseudospectral Chebyshev and Legendre methods, the error equation, hyperbolic partial differential equations, and time discretization and iterative methods.
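The building block behind such Fourier spectral approximations, pseudospectral differentiation of a periodic function via the FFT, can be sketched as follows. This is a generic illustration of the technique, not the authors' code; the grid size and test function are arbitrary.

```python
import numpy as np

# Periodic grid on [0, 2*pi): N equispaced nodes.
N = 32
x = 2 * np.pi * np.arange(N) / N
u = np.sin(3 * x)                    # smooth periodic test function

# Pseudospectral derivative: transform to Fourier space, multiply each
# mode by i*k, transform back. fftfreq(N, d=1/N) gives integer wavenumbers.
k = np.fft.fftfreq(N, d=1.0 / N)
du = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

# For a band-limited function the error is at machine-precision level,
# illustrating the spectral accuracy these methods rely on.
err = float(np.max(np.abs(du - 3 * np.cos(3 * x))))
print("max derivative error:", err)
```

The same diagonal multiplication in Fourier space is what makes Galerkin and pseudospectral treatments of constant-coefficient operators L so efficient.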
Fatigue design procedure for the American SST prototype
NASA Technical Reports Server (NTRS)
Doty, R. J.
1972-01-01
For supersonic airline operations, significantly higher environmental temperature is the primary new factor affecting structural service life. Methods for incorporating the influence of temperature in detailed fatigue analyses are shown along with current test indications. Thermal effects investigated include real-time compared with short-time testing, long-time temperature exposure, and stress-temperature cycle phasing. A method is presented which allows designers and stress analyzers to check fatigue resistance of structural design details. A communicative rating system is presented which defines the relative fatigue quality of the detail so that the analyst can define cyclic-load capability of the design detail by entering constant-life charts for varying detail quality. If necessary then, this system allows the designer to determine ways to improve the fatigue quality for better life or to determine the operating stresses which will provide the required service life.
Detailed Hydrographic Feature Extraction from High-Resolution LiDAR Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danny L. Anderson
Detailed hydrographic feature extraction from high-resolution light detection and ranging (LiDAR) data is investigated. Methods for quantitatively evaluating and comparing such extractions are presented, including the use of sinuosity and longitudinal root-mean-square-error (LRMSE). These metrics are then used to quantitatively compare stream networks in two studies. The first study examines the effect of raster cell size on watershed boundaries and stream networks delineated from LiDAR-derived digital elevation models (DEMs). The study confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes generally yielded better stream network delineations, based on sinuosity and LRMSE. The second study demonstrates a new method of delineating a stream directly from LiDAR point clouds, without the intermediate step of deriving a DEM. Direct use of LiDAR point clouds could improve the efficiency and accuracy of hydrographic feature extractions. The direct delineation method developed herein, termed “mDn”, is an extension of the D8 method that has been used for several decades with gridded raster data. The method divides the region around a starting point into sectors, uses the LiDAR data points within each sector to determine an average slope, and selects the sector with the greatest downward slope to determine the direction of flow. An mDn delineation was compared with a traditional grid-based delineation, using TauDEM, and with other readily available, common stream data sets. Although the TauDEM delineation yielded a sinuosity that more closely matches the reference, the mDn delineation yielded a sinuosity that was higher than either the TauDEM method or the existing published stream delineations. Furthermore, stream delineation using the mDn method yielded the smallest LRMSE.
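The sector-based flow-direction idea behind mDn can be sketched as follows. The function, parameter names, and neighborhood radius here are illustrative assumptions, not taken from the dissertation; the sketch only demonstrates the step described in the abstract (sector partition, per-sector average slope, steepest-descent selection).

```python
import math

def mdn_flow_direction(points, x0, y0, z0, n_sectors=8, radius=2.0):
    """Pick a flow direction from a LiDAR-style point cloud.

    The neighbourhood of (x0, y0, z0) is divided into angular sectors;
    the signed slope to each neighbour is averaged per sector, and the
    sector with the steepest average *downward* slope wins (a point-cloud
    generalization of the grid-based D8 idea)."""
    sums = [0.0] * n_sectors
    counts = [0] * n_sectors
    for (x, y, z) in points:
        dx, dy = x - x0, y - y0
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > radius:
            continue  # skip the start point and points outside the radius
        sector = int((math.atan2(dy, dx) % (2 * math.pi))
                     / (2 * math.pi / n_sectors)) % n_sectors
        sums[sector] += (z - z0) / dist  # signed slope to this neighbour
        counts[sector] += 1
    slopes = [sums[i] / counts[i] if counts[i] else float("inf")
              for i in range(n_sectors)]
    best = min(range(n_sectors), key=lambda i: slopes[i])
    return best, slopes[best]

# Synthetic plane dipping toward +x: flow should head into sector 0.
pts = [(i * 0.5, j * 0.5, -0.1 * i * 0.5)
       for i in range(-3, 4) for j in range(-3, 4)]
sector, slope = mdn_flow_direction(pts, 0.0, 0.0, 0.0)
print(sector, slope)
```

Repeating this selection from each newly chosen point traces a stream path directly through the point cloud, with no intermediate DEM.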
Jorgenson, D B; Haynor, D R; Bardy, G H; Kim, Y
1995-02-01
A method for constructing and solving detailed patient-specific 3-D finite element models of the human thorax is presented for use in defibrillation studies. The method utilizes the patient's own X-ray CT scan and a simplified meshing scheme to quickly and efficiently generate a model typically composed of approximately 400,000 elements. A parameter sensitivity study on one human thorax model to examine the effects of variation in assigned tissue resistivity values, level of anatomical detail included in the model, and number of CT slices used to produce the model is presented. Of the seven tissue types examined, the average left ventricular (LV) myocardial voltage gradient was most sensitive to the values of myocardial and blood resistivity. Incorrectly simplifying the model, for example modeling the heart as a homogeneous structure by ignoring the blood in the chambers, caused the average LV myocardial voltage gradient to increase by 12%. The sensitivity of the model to variations in electrode size and position was also examined. Small changes (< 2.0 cm) in electrode position caused average LV myocardial voltage gradient values to increase by up to 12%. We conclude that patient-specific 3-D finite element modeling of human thoracic electric fields is feasible and may reduce the empiric approach to insertion of implantable defibrillators and improve transthoracic defibrillation techniques.
Algebraic methods in system theory
NASA Technical Reports Server (NTRS)
Brockett, R. W.; Willems, J. C.; Willsky, A. S.
1975-01-01
Investigations on problems of the type which arise in the control of switched electrical networks are reported. The main results concern the algebraic structure and stochastic aspects of these systems. Future reports will contain more detailed applications of these results to engineering studies.
Ubiquitous Creation of Bas-Relief Surfaces with Depth-of-Field Effects Using Smartphones.
Sohn, Bong-Soo
2017-03-11
This paper describes a new method to automatically generate digital bas-reliefs with depth-of-field effects from general scenes. Most previous methods for bas-relief generation take input in the form of 3D models. However, obtaining 3D models of real scenes or objects is often difficult, inaccurate, and time-consuming. From this motivation, we developed a method that takes as input a set of photographs that can be quickly and ubiquitously captured by ordinary smartphone cameras. A depth map is computed from the input photographs. The value range of the depth map is compressed and used as a base map representing the overall shape of the bas-relief. However, the resulting base map contains little information on details of the scene. Thus, we construct a detail map using pixel values of the input image to express the details. The base and detail maps are blended to generate a new depth map that reflects both overall depth and scene detail information. This map is selectively blurred to simulate the depth-of-field effects. The final depth map is converted to a bas-relief surface mesh. Experimental results show that our method generates a realistic bas-relief surface of general scenes with no expensive manual processing.
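The base/detail blending at the core of the pipeline above can be sketched as follows. The linear range compression, 3x3 local-mean high-pass for the detail map, and blending weight are illustrative assumptions, not the paper's exact operators.

```python
import numpy as np

def bas_relief_depth(depth, image, depth_range=1.0, detail_weight=0.2):
    """Blend a compressed depth map (overall shape) with an image-derived
    detail map (fine texture) into a bas-relief height field."""
    # Base map: compress the scene depth into a shallow value range.
    d = depth.astype(float)
    base = (d - d.min()) / (np.ptp(d) + 1e-12) * depth_range
    # Detail map: high-pass of image intensity (pixel minus 3x3 local mean),
    # recovering fine texture the compressed base map loses.
    img = image.astype(float)
    pad = np.pad(img, 1, mode="edge")
    local_mean = sum(pad[di:di + img.shape[0], dj:dj + img.shape[1]]
                     for di in range(3) for dj in range(3)) / 9.0
    detail = img - local_mean
    return base + detail_weight * detail

rng = np.random.default_rng(1)
depth = rng.uniform(0, 100, (8, 8))   # synthetic scene depth
image = rng.uniform(0, 1, (8, 8))     # synthetic photograph intensity
relief = bas_relief_depth(depth, image)
print(relief.shape, float(relief.max() - relief.min()))
```

The resulting height field would then be selectively blurred for depth-of-field effects and triangulated into the final relief mesh, as the abstract describes.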
Global/local stress analysis of composite panels
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Knight, Norman F., Jr.
1989-01-01
A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
Global/local stress analysis of composite structures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
1989-01-01
A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
A COMPARISON OF RESPONSE CONFIRMATION TECHNIQUES FOR AN ADJUNCTIVE SELF-STUDY PROGRAM.
ERIC Educational Resources Information Center
MEYER, DONALD E.
AN EXPERIMENT COMPARED THE EFFECTIVENESS OF FOUR METHODS OF CONFIRMING RESPONSES TO AN ADJUNCTIVE SELF-STUDY PROGRAM. THE PROGRAM WAS DESIGNED FOR AIR FORCE AIRCREWS UNDERTAKING A REFRESHER COURSE IN ENGINEERING. A SERIES OF SEQUENCED MULTIPLE CHOICE QUESTIONS EACH REFERRED TO A PAGE AND PARAGRAPH OF A PUBLICATION CONTAINING DETAILED INFORMATION…
ERIC Educational Resources Information Center
Nelson, Erin M.
2010-01-01
Scope and Method of Study: The purpose of this qualitative study was to conduct detailed personal interviews with aerospace industry executives/managers from both the private and military sectors from across Oklahoma to determine their perceptions of intellectual capital needs of the industry. Interviews with industry executives regarding…
The Child and Adolescent Mental Health Studies (CAMS) Minor at New York University
ERIC Educational Resources Information Center
Shatkin, Jess P.; Koplewicz, Harold S.
2008-01-01
Objective: The authors describe the Child and Adolescent Mental Health Studies (CAMS) undergraduate college minor at New York University. Methods: The authors detail the development, structure, and operation of the CAMS minor. They describe the importance of identifying program goals, building coalitions, creating an advisory board, selecting…
Applications of DNA-Stable Isotope Probing in Bioremediation Studies
NASA Astrophysics Data System (ADS)
Chen, Yin; Vohra, Jyotsna; Murrell, J. Colin
DNA-stable isotope probing, a method to identify active microorganisms without the prerequisite of cultivation, has been widely applied in the study of microorganisms involved in the degradation of environmental pollutants. Recent advances and technique considerations in applying DNA-SIP in bioremediation are highlighted. A detailed protocol of a DNA-SIP experiment is provided.
NASA Astrophysics Data System (ADS)
Thorsson, Solver I.
Foreign object impact on composite materials continues to be an active field due to its importance in the design of load bearing composite aerostructures. The problem has been studied by many through the decades. Extensive experimental studies have been performed to characterize the impact damage and failure mechanisms. Leaders in aerospace industry are pushing for reliable, robust and efficient computational methods for predicting impact response of composite structures. Experimental and numerical investigations on the impact response of fiber reinforced polymer matrix composite (FRPC) laminates are presented. A detailed face-on and edge-on impact experimental study is presented. A novel method for conducting coupon-level edge-on impact experiments is introduced. The research is focused on impact energy levels that are in the vicinity of the barely visible impact damage (BVID) limit of the material system. A detailed post-impact damage study is presented where non-destructive inspection (NDI) methods such as ultrasound scanning and computed tomography (CT) are used. Detailed fractography studies are presented for further investigation of the through-the-thickness damage due to the impact event. Following the impact study, specimens are subjected to compression after impact (CAI) to establish the effect of BVID on the compressive strength after impact (CSAI). A modified combined loading compression (CLC) test method is proposed for compression testing following an edge-on impact. Experimental work on the rate sensitivity of the mode I and mode II inter-laminar fracture toughness is also investigated. An improved wedge-insert fracture (WIF) method for conducting mode I inter-laminar fracture at elevated loading rates is introduced. Based on the experimental results, a computational modeling approach for capturing face-on impact and CAI is developed. The model is then extended to edge-on impact and CAI. 
Enhanced Schapery Theory (EST) is utilized for modeling the full field damage and failure present in a unidirectional (UD) lamina within a laminate. Schapery Theory (ST) is a thermodynamically based work potential material model which captures the pre-peak softening due to matrix micro-cracking such as hackling, micro fissures, etc. The Crack Band (CB) method is utilized to capture macroscopic matrix and fiber failure modes such as ply splitting and fiber rupture. Discrete Cohesive Zone Method (DCZM) elements are implemented for capturing inter-laminar delaminations, using discrete nodal traction-separation governed interactions. The model is verified against the impact experimental results and the associated CAI procedures. The model results are in good agreement with experimental findings. The model proved capable of predicting the representative experimental failure modes.
ERIC Educational Resources Information Center
Richings, Vicky Ann; Nishimuro, Masateru
2017-01-01
This paper reports on findings from a classroom study on the introduction and effectiveness of new methods of instruction using English literature in a Japanese high school setting. It is based on data compiled during a two-year research project. In this paper, we will detail the investigation and findings from an analysis of student questionnaire…
BrdsNBz: A Mixed Methods Study Exploring Adolescents' Use of a Sexual Health Text Message Service
ERIC Educational Resources Information Center
Willoughby, Jessica Fitts
2013-01-01
Sexual health text message services are becoming increasingly popular, but little is known about who uses such services and why. This project details the implementation of a campaign promoting a state-wide sexual health text message service that allows teens to text directly with a health educator and uses a mixed method design to assess who uses…
A Study on Project Priority Evaluation Method on Road Slope Disaster Prevention Management
NASA Astrophysics Data System (ADS)
Sekiguchi, Nobuyasu; Ohtsu, Hiroyasu; Izu, Ryuutarou
To improve the safety and security of driving while coping with today's stagnant economy and frequent natural disasters, road slopes should be appropriately managed. To achieve this goal, road managers should establish project priority evaluation methods for each stage of road slope management by clarifying the social losses that would result from drops in service levels. It is important that road managers evaluate project priority properly in order to manage road slopes effectively. From this viewpoint, this study proposed "project priority evaluation methods" for road slope disaster prevention, which use the slope information available at each stage of road slope management under limited funds. In addition, this study investigated the effect of managing slopes in descending order of priority by evaluating the risk of slope failure. In terms of the amount of available information, staged information provision is needed, ranging from macroscopic studies, which evaluate the entire route at each stage of decision making, to semi-microscopic and microscopic investigations for evaluating individual slopes. With limited funds, additional detailed surveys are difficult to perform. It is effective to use the slope risk assessment system, which was constructed to complement detailed data, to extract sites for precise investigation.
Calibrating Detailed Chemical Analysis of M dwarfs
NASA Astrophysics Data System (ADS)
Veyette, Mark; Muirhead, Philip Steven; Mann, Andrew; Brewer, John; Allard, France; Homeier, Derek
2018-01-01
The ability to perform detailed chemical analysis of Sun-like F-, G-, and K-type stars is a powerful tool with many applications, including studying the chemical evolution of the Galaxy, assessing membership in stellar kinematic groups, and constraining planet formation theories. Unfortunately, complications in modeling cooler stellar atmospheres have hindered similar analysis of M-dwarf stars. Large surveys of FGK abundances play an important role in developing methods to measure the compositions of M dwarfs by providing benchmark FGK stars that have widely separated M dwarf companions. These systems allow us to empirically calibrate metallicity-sensitive features in M dwarf spectra. However, current methods to measure metallicity in M dwarfs from moderate-resolution spectra are limited to measuring overall metallicity and largely rely on astrophysical abundance correlations in stellar populations. In this talk, I will discuss how large, homogeneous catalogs of precise FGK abundances are crucial to advancing chemical analysis of M dwarfs beyond overall metallicity to direct measurements of individual elemental abundances. I will present a new method to analyze high-resolution, NIR spectra of M dwarfs that employs an empirical calibration of synthetic M dwarf spectra to infer effective temperature, Fe abundance, and Ti abundance. This work is a step toward detailed chemical analysis of M dwarfs at a precision similar to that achieved for FGK stars.
NASA Astrophysics Data System (ADS)
Koca, Aliihsan; Acikgoz, Ozgen; Çebi, Alican; Çetin, Gürsel; Dalkilic, Ahmet Selim; Wongwises, Somchai
2018-02-01
Investigations of the heated-ceiling method can be considered a newer research area than the more common wall heating-cooling and cooled-ceiling methods. In this work, the heat transfer characteristics of a heated radiant ceiling system were investigated experimentally. Different configurations of a single room were used in order to determine the convective and radiative heat transfer rates. Nearly all details of the arrangement of the test chamber, the hydraulic circuit and radiant panels, the measurement equipment, and the experimental method, including an uncertainty analysis, are described with reference to the relevant international standards. The total heat transfer from the panels was calculated as the sum of radiation to the unheated surfaces, convection to the air, and conduction heat loss from the backside of the panels. The integral expressions for the view factors were evaluated numerically using a Matlab code. By means of this experimental chamber, the radiative, convective and total heat-transfer coefficients, along with the heat fluxes from the ceiling to the unheated surrounding surfaces, were calculated. Moreover, the convective, radiative and total heat flux and heat output results of 28 different experimental case studies are tabulated so that other researchers can validate their theoretical models and empirical correlations.
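The kind of numerical view-factor evaluation mentioned above (done in Matlab in the study) can be sketched in Python for a simple geometry. The directly opposed parallel-squares setup and midpoint rule here are illustrative, not the chamber's actual surfaces.

```python
import math

def view_factor_parallel_squares(size, h, n=20):
    """F(1->2) between two directly opposed parallel squares of side
    `size`, a distance h apart, by midpoint-rule evaluation of the
    double area integral F = (1/A1) * iint cos(t1) cos(t2) / (pi S^2) dA2 dA1."""
    dx = size / n
    dA = dx * dx
    cells = [(dx * (i + 0.5), dx * (j + 0.5))
             for i in range(n) for j in range(n)]
    total = 0.0
    for (x1, y1) in cells:
        for (x2, y2) in cells:
            s2 = (x2 - x1) ** 2 + (y2 - y1) ** 2 + h * h
            # Both surface normals lie along the separation axis, so
            # cos(t1) = cos(t2) = h / S, and the kernel is h^2 / (pi S^4).
            total += (h * h / s2) / (math.pi * s2) * dA * dA
    return total / (size * size)  # divide by the emitting area A1

# Sanity check against the far-field limit: for h >> size,
# F approaches A2 / (pi * h^2).
F = view_factor_parallel_squares(size=1.0, h=10.0)
print(F)
```

In a real chamber the same double integral is evaluated between each panel and each unheated surface, with the kernel's cosine factors taken from the actual surface orientations.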
Electronic states with nontrivial topology in Dirac materials
NASA Astrophysics Data System (ADS)
Turkevich, R. V.; Perov, A. A.; Protogenov, A. P.; Chulkov, E. V.
2017-08-01
The theoretical studies of phase states with a linear dispersion of the spectrum of low-energy electron excitations have been reviewed. Some main properties and methods of experimental study of these states in so-called Dirac materials have been discussed in detail. The results of modern studies of symmetry-protected electronic states with nontrivial topology have been reported. Combination of approaches based on geometry with homotopic topology methods and results of condensed matter physics makes it possible to clarify new features of topological insulators, as well as Dirac and Weyl semimetals.
Forty years of temporal analysis of products
Morgan, K.; Maguire, N.; Fushimi, R.; ...
2017-05-16
Detailed understanding of mechanisms and reaction kinetics is required in order to develop and optimize catalysts and catalytic processes. While steady state investigations are known to give a global view of the catalytic system, transient studies are invaluable since they can provide more detailed insight into elementary steps. For almost thirty years temporal analysis of products (TAP) has been successfully utilized for transient studies of gas phase heterogeneous catalysis, and there have been a number of advances in instrumentation and numerical modeling methods in that time. In the current work, the range of available TAP apparatus will be discussed, while detailed explanations of the types of TAP experiment, the information that can be determined from these experiments, and the analysis methods are also included. TAP is a complex methodology and is often viewed as a niche specialty. Here, part of the intention of this work is to highlight the significant contributions TAP can make to catalytic research, while also discussing the issues which will make TAP more relevant and approachable to a wider segment of the catalytic research community. With this in mind, an outlook is also disclosed for the technique in terms of what is needed to revitalize the field and make it more applicable to the recent advances in catalyst characterization (e.g. operando modes).
Detection, Occurrence and Fate of Emerging Contaminants in Agricultural Environments
Cassada, David A.; Bartelt–Hunt, Shannon L.; Li, Xu; D’Alessio, Matteo; Zhang, Yun; Zhang, Yuping; Sallach, J. Brett
2018-01-01
A total of 59 papers published in 2015 were reviewed, ranging from detailed descriptions of analytical methods, to fate and occurrence studies, to ecological effects and sampling techniques for a wide variety of emerging contaminants likely to occur in agricultural environments. New methods and studies on veterinary pharmaceuticals, steroids, and antibiotic resistance genes in agricultural environments continue to expand our knowledge base on the occurrence and potential impacts of these compounds. This review is divided into the following sections: Introduction, Analytical Methods, Steroid Hormones, Pharmaceutical Contaminants, Transformation Products, and “Antibiotic Resistance, Drugs, Bugs and Genes”. PMID:27620078
García-Estévez, Ignacio; Alcalde-Eon, Cristina; Escribano-Bailón, M Teresa
2017-08-09
The determination of the detailed flavanol composition in food matrices is not a simple task because of the structural similarities of monomers and, consequently, oligomers and polymers. The aim of this study was the development and validation of an HPLC-MS/MS-multiple reaction monitoring (MRM) method that would allow the accurate and precise quantification of catechins, gallocatechins, and oligomeric proanthocyanidins. The high correlation coefficients of the calibration curves (>0.993), the recoveries not statistically different from 100%, the good intra- and interday precisions (<5%), and the LOD and LOQ values, low enough to quantify flavanols in grapes, are good results from the method validation procedure. Its usefulness has also been tested by determining the detailed composition of Vitis vinifera L. cv. Rufete grapes. Seventy-two (38 nongalloylated and 34 galloylated) and 53 (24 procyanidins and 29 prodelphinidins) flavanols have been identified and quantified in grape seed and grape skin, respectively. The use of HCA and PCA on the detailed flavanol composition has allowed differentiation among Rufete clones.
Hudnutt, K.W.; Borsa, A.; Glennie, C.; Minster, J.-B.
2002-01-01
In order to document surface rupture associated with the Hector Mine earthquake, in particular, the area of maximum slip and the deformed surface of Lavic Lake playa, we acquired high-resolution data using relatively new topographic-mapping methods. We performed a raster-laser scan of the main surface breaks along the entire rupture zone, as well as along an unruptured portion of the Bullion fault. The image of the ground surface produced by this method is highly detailed, comparable to that obtained when geologists make particularly detailed site maps for geomorphic or paleoseismic studies. In this case, however, for the first time after a surface-rupturing earthquake, the detailed mapping is along the entire fault zone rather than being confined to selected sites. These data are geodetically referenced, using the Global Positioning System, thus enabling more accurate mapping of the rupture traces. In addition, digital photographs taken along the same flight lines can be overlaid onto the precise topographic data, improving terrain visualization. We demonstrate the potential of these techniques for measuring fault-slip vectors.
Zhang, Ying; Sun, Jin; Zhang, Yun-Jiao; Chai, Qian-Yun; Zhang, Kang; Ma, Hong-Li; Wu, Xiao-Ke; Liu, Jian-Ping
2016-10-21
Although Traditional Chinese Medicine (TCM) has been widely used in clinical settings, a major challenge that remains in TCM is to evaluate its efficacy scientifically. This randomized controlled trial aims to evaluate the efficacy and safety of berberine in the treatment of patients with polycystic ovary syndrome. In order to improve the transparency and research quality of this clinical trial, we prepared this statistical analysis plan (SAP). The trial design, primary and secondary outcomes, and safety outcomes were declared to reduce selection bias in data analysis and result reporting. We specified detailed methods for data management and statistical analyses. Statistics in corresponding tables, listings, and graphs were outlined. The SAP provides more detailed information than the trial protocol on data management and statistical analysis methods. Any post hoc analyses can be identified by referring to this SAP, and possible selection bias and performance bias will be reduced in the trial. This study is registered at ClinicalTrials.gov, NCT01138930 , registered on 7 June 2010.
Gray, B.A.; Zori, Roberto T.; McGuire, P.M.; Bonde, R.K.
2002-01-01
Detailed chromosome studies were conducted for the Florida manatee (Trichechus manatus latirostris) utilizing primary chromosome banding techniques (G- and Q-banding). Digital microscopic imaging methods were employed and a standard G-banded karyotype was constructed for both sexes. Based on chromosome banding patterns and measurements obtained in these studies, a standard karyotype and ideogram are proposed. Characterization of additional cytogenetic features of this species by supplemental chromosome banding techniques, C-banding (constitutive heterochromatin), Ag-NOR staining (nucleolar organizer regions), and DA/DAPI staining, was also performed. These studies provide detailed cytogenetic data for T. manatus latirostris, which could enhance future genetic mapping projects and interspecific and intraspecific genomic comparisons by techniques such as zoo-FISH.
Nebraska's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett
2011-01-01
The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data is...
Kansas's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; W. Keith Moser; Charles J. Barnett
2011-01-01
The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...
Hunter, Kendall S.; Lanning, Craig J.; Chen, Shiuh-Yung J.; Zhang, Yanhang; Garg, Ruchira; Ivy, D. Dunbar; Shandas, Robin
2014-01-01
Clinical imaging methods are highly effective in the diagnosis of vascular pathologies, but they do not currently provide enough detail to shed light on the cause or progression of such diseases, and would be hard pressed to foresee the outcome of surgical interventions. Greater detail of and prediction capabilities for vascular hemodynamics and arterial mechanics are obtained here through the coupling of clinical imaging methods with computational techniques. Three-dimensional, patient-specific geometric reconstructions of the pediatric proximal pulmonary vasculature were obtained from x-ray angiogram images and meshed for use with commercial computational software. Two such models from hypertensive patients, one with multiple septal defects, the other who underwent vascular reactivity testing, were each completed with two sets of suitable fluid and structural initial and boundary conditions and used to obtain detailed transient simulations of artery wall motion and hemodynamics in both clinically measured and predicted configurations. The simulation of septal defect closure, in which input flow and proximal vascular stiffness were decreased, exhibited substantial decreases in proximal velocity, wall shear stress (WSS), and pressure in the post-op state. The simulation of vascular reactivity, in which distal vascular resistance and proximal vascular stiffness were decreased, displayed negligible changes in velocity and WSS but a significant drop in proximal pressure in the reactive state. This new patient-specific technique provides much greater detail regarding the function of the pulmonary circuit than can be obtained with current medical imaging methods alone, and holds promise for enabling surgical planning. PMID:16813447
An equivalent domain integral method in the two-dimensional analysis of mixed mode crack problems
NASA Technical Reports Server (NTRS)
Raju, I. S.; Shivakumar, K. N.
1990-01-01
An equivalent domain integral (EDI) method for calculating J-integrals for two-dimensional cracked elastic bodies is presented. The details of the method and its implementation are presented for isoparametric elements. The EDI method gave accurate values of the J-integrals for two mode I and two mixed mode problems. Numerical studies showed that domains consisting of one layer of elements are sufficient to obtain accurate J-integral values. Two procedures for separating the individual modes from the domain integrals are presented.
Modeling Electromagnetic Scattering From Complex Inhomogeneous Objects
NASA Technical Reports Server (NTRS)
Deshpande, Manohar; Reddy, C. J.
2011-01-01
This software innovation is designed to develop a mathematical formulation to estimate the electromagnetic scattering characteristics of complex, inhomogeneous objects using the finite-element-method (FEM) and method-of-moments (MoM) concepts, as well as to develop a FORTRAN code called FEMOM3DS (Finite Element Method and Method of Moments for 3-Dimensional Scattering), which will implement the steps that are described in the mathematical formulation. Very complex objects can be easily modeled, and the operator of the code is not required to know the details of electromagnetic theory to study electromagnetic scattering.
Principled Approaches to Missing Data in Epidemiologic Studies
Perkins, Neil J; Cole, Stephen R; Harel, Ofer; Tchetgen Tchetgen, Eric J; Sun, BaoLuo; Mitchell, Emily M; Schisterman, Enrique F
2018-01-01
Principled methods with which to appropriately analyze missing data have long existed; however, broad implementation of these methods remains challenging. In this and 2 companion papers (Am J Epidemiol. 2018;187(3):576–584 and Am J Epidemiol. 2018;187(3):585–591), we discuss issues pertaining to missing data in the epidemiologic literature. We provide details regarding missing-data mechanisms and nomenclature and encourage the conduct of principled analyses through a detailed comparison of multiple imputation and inverse probability weighting. Data from the Collaborative Perinatal Project, a multisite US study conducted from 1959 to 1974, are used to create a masked data-analytical challenge with missing data induced by known mechanisms. We illustrate the deleterious effects of missing data with naive methods and show how principled methods can sometimes mitigate such effects. For example, when data were missing at random, naive methods showed a spurious protective effect of smoking on the risk of spontaneous abortion (odds ratio (OR) = 0.43, 95% confidence interval (CI): 0.19, 0.93), while implementation of principled methods multiple imputation (OR = 1.30, 95% CI: 0.95, 1.77) or augmented inverse probability weighting (OR = 1.40, 95% CI: 1.00, 1.97) provided estimates closer to the “true” full-data effect (OR = 1.31, 95% CI: 1.05, 1.64). We call for greater acknowledgement of and attention to missing data and for the broad use of principled missing-data methods in epidemiologic research. PMID:29165572
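The contrast between naive and principled estimates described above can be illustrated with a small simulation. This is an illustrative sketch, not the authors' analysis: the data, variable names, and missingness probabilities are invented, and it shows plain inverse probability weighting of a mean (rather than the augmented, odds-ratio version used in the paper) under a missing-at-random mechanism.

```python
# Sketch of inverse probability weighting (IPW) for data missing at random:
# each complete case is weighted by 1 / P(observed | covariates).
# All data here are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.binomial(1, 0.5, n)             # binary covariate
y = rng.normal(2.0 + 1.0 * x, 1.0)      # outcome; true population mean = 2.5
p_obs = np.where(x == 1, 0.9, 0.3)      # missingness depends on x (MAR)
observed = rng.binomial(1, p_obs).astype(bool)

# Naive complete-case mean is biased toward the over-observed x == 1 group.
naive = y[observed].mean()

# IPW reweights complete cases to recover the full-population mean.
w = 1.0 / p_obs[observed]
ipw = np.average(y[observed], weights=w)
```

Because x = 1 cases are observed three times as often as x = 0 cases, the naive mean lands near 2.75, while the weighted mean recovers the true value of about 2.5.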
A generic method for evaluating crowding in the emergency department.
Eiset, Andreas Halgreen; Erlandsen, Mogens; Møllekær, Anders Brøns; Mackenhauer, Julie; Kirkegaard, Hans
2016-06-14
Crowding in the emergency department (ED) has been studied intensively using complicated non-generic methods that may prove difficult to implement in a clinical setting. This study sought to develop a generic method to describe and analyse crowding from measurements readily available in the ED and to test the developed method empirically in a clinical setting. We conceptualised a model with ED patient flow divided into separate queues identified by timestamps for predetermined events. With temporal resolution of 30 min, queue lengths were computed as Q(t + 1) = Q(t) + A(t) - D(t), with A(t) = number of arrivals, D(t) = number of departures and t = time interval. Maximum queue lengths for each shift of each day were found and risks of crowding computed. All tests were performed using non-parametric methods. The method was applied in the ED of Aarhus University Hospital, Denmark utilising an open cohort design with prospectively collected data from a one-year observation period. By employing the timestamps already assigned to the patients while in the ED, a generic queuing model can be computed from which crowding can be described and analysed in detail. Depending on availability of data, the model can be extended to include several queues increasing the level of information. When applying the method empirically, 41,693 patients were included. The studied ED had a high risk of bed occupancy rising above 100 % during day and evening shift, especially on weekdays. Further, a 'carry over' effect was shown between shifts and days. The presented method offers an easy and generic way to get detailed insight into the dynamics of crowding in an ED.
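The queue recursion Q(t + 1) = Q(t) + A(t) - D(t) with 30-minute intervals can be sketched directly from arrival and departure timestamps. This is a minimal illustration of that recursion, not the authors' implementation; the function and variable names are invented for the example.

```python
# Compute queue lengths over 30-minute bins from event timestamps,
# following Q(t + 1) = Q(t) + A(t) - D(t). Illustrative sketch only.
from datetime import datetime

BIN_MINUTES = 30

def to_bin(ts: datetime) -> int:
    """Map a timestamp to its 30-minute interval index within the day."""
    return (ts.hour * 60 + ts.minute) // BIN_MINUTES

def queue_lengths(arrivals, departures, n_bins=48):
    """Return Q[0..n_bins], the queue length at the start of each interval."""
    A = [0] * n_bins   # arrivals per interval
    D = [0] * n_bins   # departures per interval
    for ts in arrivals:
        A[to_bin(ts)] += 1
    for ts in departures:
        D[to_bin(ts)] += 1
    Q = [0] * (n_bins + 1)
    for t in range(n_bins):
        Q[t + 1] = Q[t] + A[t] - D[t]
    return Q
```

With per-shift maxima of Q one can then tabulate, as the study does, how often occupancy exceeds capacity in each shift.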
Toxicogenomics is the study of changes in gene expression, protein, and metabolite profiles within cells and tissues, complementary to more traditional toxicological methods. Genomics tools provide detailed molecular data about the underlying biochemical mechanisms of toxicity, a...
Social insects and selfish genes.
Bourke, A F
2001-10-01
Sometimes science advances because of a new idea. Sometimes, it's because of a new technique. When both occur together, exciting times result. In the study of social insects, DNA-based methods for measuring relatedness now allow increasingly detailed tests of Hamilton's theory of kin selection.
Redefining the utility of the three-isotope method
NASA Astrophysics Data System (ADS)
Cao, Xiaobin; Bao, Huiming
2017-09-01
The equilibrium isotope fractionation factor αeq is a fundamental parameter in the study of stable isotope effects. Experimentally, it has been difficult to establish that a system has attained equilibrium. The three-isotope method, using the initial trajectory of changing isotope ratios (e.g. 16O, 17O, and 18O) to deduce the final equilibrium point of isotope exchange, has long been hailed as the most rigorous experimental approach. Over the years some researchers have cautioned about the limitations of this method, but its foundation has not been properly examined, and the method is still widely used in calibrating αeq for both traditional and, increasingly, non-traditional isotope systems today. Here, using water-water and dissolved CO2-water oxygen exchange as model systems, we conduct an isotopologue-specific kinetic analysis of the exchange processes and explore the underlying assumptions and validity of the three-isotope method. We demonstrate that without knowing the detailed exchange kinetics a priori the three-isotope method cannot lead to a reliable αeq. For a two-reservoir exchanging system, the α determined by this method may be αeq, a kinetic isotope effect, or an apparent kinetic isotope effect, which can all bear different values. When multiple reservoirs exist during exchange, the evolving trajectory can be complex and hard to predict. Instead of being a tool for αeq determination, the three-isotope method should be used as a tool for studying kinetic isotope effects, apparent kinetic isotope effects, and detailed exchange kinetics in diverse systems.
Methods for delineating flood-prone areas in the Great Basin of Nevada and adjacent states
Burkham, D.E.
1988-01-01
The Great Basin is a region of about 210,000 square miles having no surface drainage to the ocean; it includes most of Nevada and parts of Utah, California, Oregon, Idaho, and Wyoming. The area is characterized by many parallel mountain ranges and valleys trending north-south. Stream channels usually are well defined and steep within the mountains, but on reaching the alluvial fan at the canyon mouth, they may diverge into numerous distributary channels, be discontinuous near the apex of the fan, or be deeply entrenched in the alluvial deposits. Larger rivers normally have well-defined channels to or across the valley floors, but all terminate at lakes or playas. Major floods occur in most parts of the Great Basin and result from snowmelt, frontal-storm rainfall, and localized convective rainfall. Snowmelt floods typically occur during April-June. Floods resulting from frontal rain and frontal rain on snow generally occur during November-March. Floods resulting from convective-type rainfall during localized thunderstorms occur most commonly during the summer months. Methods for delineating flood-prone areas are grouped into five general categories: Detailed, historical, analytical, physiographic, and reconnaissance. The detailed and historical methods are comprehensive methods; the analytical and physiographic are intermediate; and the reconnaissance method is only approximate. Other than the reconnaissance method, each method requires determination of a T-year discharge (the peak rate of flow during a flood with long-term average recurrence interval of T years) and T-year profile and the development of a flood-boundary map. The procedure is different, however, for each method. Appraisal of the applicability of each method included consideration of its technical soundness, limitations and uncertainties, ease of use, and costs in time and money. Of the five methods, the detailed method is probably the most accurate, though most expensive. 
It is applicable to hydraulic and topographic conditions found in many parts of the Great Basin. The historical method is also applicable over a wide range of conditions and is less expensive than the detailed method. However, it requires more historical flood data than are usually available, and experience and judgement are needed to obtain meaningful results. The analytical method is also less expensive than the detailed method and can be used over a wide range of conditions in which the T-year discharge can be determined directly. Experience, good judgement, and thorough knowledge of hydraulic principles are required to obtain adequate results, and the method has limited application in other than rigid-channel situations. The physiographic method is applicable to rigid-boundary channels and is less accurate than the detailed method. The reconnaissance method is relatively imprecise, but it may be the most rational method to use on alluvial fans or valley floors with discontinuous channels. In general, a comprehensive method is most suitable for use with rigid-bank streams in urban areas; only an approximate method seems justified in undeveloped areas.
Using a Thematic Analysis of Literature to Survey Subfields within Communication Studies
ERIC Educational Resources Information Center
Garner, Johny T.; Ragland, J. Parker
2015-01-01
The activity described by the authors here is a thematic analysis of published articles in a broad area of study. Students search for articles relating to the topic of study in different academic journals that fall in a specific date range. Students record details about the topics covered and theories/methods used. The class then assembles to…
Burgert, James M; Johnson, Arthur D; Garcia-Blanco, Jose C; Craig, W John; O'Sullivan, Joseph C
2015-10-01
Transcutaneous electrical induction (TCEI) has been used to induce ventricular fibrillation (VF) in laboratory swine for physiologic and resuscitation research. Many studies do not describe the method of TCEI in detail, thus making replication by future investigators difficult. Here we describe a detailed method of electrically inducing VF that was used successfully in a prospective, experimental resuscitation study. Specifically, an electrical current was passed through the heart to induce VF in crossbred Yorkshire swine (n = 30); the current was generated by using two 22-gauge spinal needles, with one placed above and one below the heart, and three 9V batteries connected in series. VF developed in 28 of the 30 pigs (93%) within 10 s of beginning the procedure. In the remaining 2 swine, VF was induced successfully after medial redirection of the superior parasternal needle. The TCEI method is simple, reproducible, and cost-effective. TCEI may be especially valuable to researchers with limited access to funding, sophisticated equipment, or colleagues experienced in interventional cardiology techniques. The TCEI method might be most appropriate for pharmacologic studies requiring VF, VF resulting from the R-on-T phenomenon (as in prolonged QT syndrome), and VF arising from other ectopic or reentrant causes. However, the TCEI method does not accurately model the most common cause of VF, acute coronary occlusive disease. Researchers must consider the limitations of TCEI that may affect internal and external validity of collected data, when designing experiments using this model of VF.
Tau Oligomers as Pathogenic Seeds: Preparation and Propagation In Vitro and In Vivo.
Gerson, Julia E; Sengupta, Urmi; Kayed, Rakez
2017-01-01
Tau oligomers have been shown to be the main toxic tau species in a number of neurodegenerative disorders. In order to study tau oligomers both in vitro and in vivo, we have established methods for the reliable preparation, isolation, and detection of tau oligomers. Methods for the seeding of tau oligomers, isolation of tau oligomers from tissue, and detection of tau oligomers using tau oligomer-specific antibodies by biochemical and immunohistochemical methods are detailed below.
Modelling crystal growth: Convection in an asymmetrically heated ampoule
NASA Technical Reports Server (NTRS)
Alexander, J. Iwan D.; Rosenberger, Franz; Pulicani, J. P.; Krukowski, S.; Ouazzani, Jalil
1990-01-01
The objective was to develop and implement a numerical method capable of solving the nonlinear partial differential equations governing heat, mass, and momentum transfer in a 3-D cylindrical geometry in order to examine the character of convection in an asymmetrically heated cylindrical ampoule. The details of the numerical method, including verification tests involving comparison with results obtained from other methods, are presented. The results of the study of 3-D convection in an asymmetrically heated cylinder are described.
Simple methods to reduce patient exposure during scoliosis radiography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, P.F.; Thomas, A.W.; Thompson, W.E.
1986-05-01
Radiation exposure to the breasts of adolescent females can be reduced significantly through the use of one or all of the following methods: fast, rare-earth screen-film combinations; specially designed compensating filters; and breast shielding. The importance of exposure reduction during scoliosis radiography, as well as further details on the above methods, is discussed. In addition, the early results of a Center for Devices and Radiological Health study, which recorded exposure and technique data for scoliosis radiography, are presented.
Davis, Jenna L.; Green, B. Lee; Katz, Ralph V.
2013-01-01
Objectives To assess whether scary/alarming beliefs about details on the Tuskegee Syphilis Study (TSS) are associated with willingness and/or fear to participate in biomedical research. Methods Scary beliefs about TSS were examined for 565 Black and White adults who had heard of the TSS. Multivariate analyses by race were used to measure association. Results No association between scary beliefs and willingness or fear to participate in research was found (P>0.05). Conclusions These findings provide additional evidence that awareness or detailed knowledge about the TSS does not appear today to be a major factor influencing Blacks’ willingness to participate in research. PMID:22924230
Grazing-incidence small angle x-ray scattering studies of nanoscale polymer gratings
NASA Astrophysics Data System (ADS)
Doxastakis, Manolis; Suh, Hyo Seon; Chen, Xuanxuan; Rincon Delgadillo, Paulina A.; Wan, Lingshu; Williamson, Lance; Jiang, Zhang; Strzalka, Joseph; Wang, Jin; Chen, Wei; Ferrier, Nicola; Ramirez-Hernandez, Abelardo; de Pablo, Juan J.; Gronheid, Roel; Nealey, Paul
2015-03-01
Grazing-Incidence Small Angle X-ray Scattering (GISAXS) offers the ability to probe large sample areas, providing three-dimensional structural information at high detail in a thin film geometry. In this study we exploit the application of GISAXS to structures formed at one step of the LiNe (Liu-Nealey) flow using chemical patterns for directed self-assembly of block copolymer films. Experiments conducted at the Argonne National Laboratory provided scattering patterns probing film characteristics in directions both parallel and normal to the surface. We demonstrate the application of new computational methods to construct models based on the measured scattering. Such analysis allows for extraction of structural characteristics in unprecedented detail.
An Introduction to the BFS Method and Its Use to Model Binary NiAl Alloys
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Noebe, Ronald D.; Ferrante, J.; Amador, C.
1998-01-01
We introduce the Bozzolo-Ferrante-Smith (BFS) method for alloys as a computationally efficient tool for aiding in the process of alloy design. An intuitive description of the BFS method is provided, followed by a formal discussion of its implementation. The method is applied to the study of the defect structure of NiAl binary alloys. The groundwork is laid for a detailed progression to higher order NiAl-based alloys linking theoretical calculations and computer simulations based on the BFS method and experimental work validating each step of the alloy design process.
Study on the Application of TOPSIS Method to the Introduction of Foreign Players in CBA Games
NASA Astrophysics Data System (ADS)
Zhongyou, Xing
The TOPSIS method is a multiple attribute decision-making method. This paper introduces the current situation of the introduction of foreign players in CBA games, presents the principles and calculation steps of the TOPSIS method in detail, and applies it to the quantitative evaluation of the comprehensive competitive ability of candidate foreign players. Practical application shows that the TOPSIS method is highly rational and applicable when used to evaluate the comprehensive competitive ability of foreign players under consideration.
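The calculation steps of TOPSIS referred to above follow a standard sequence: normalize the decision matrix, weight it, find the ideal and anti-ideal solutions, and rank alternatives by relative closeness. A minimal sketch of those textbook steps (the example matrix, weights, and criteria are invented, not taken from the paper):

```python
# Textbook TOPSIS: rank alternatives by closeness to the ideal solution.
# Illustrative sketch; data and criteria are hypothetical.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: m alternatives x n criteria; weights sum to 1;
    benefit[j] is True when larger values are better for criterion j."""
    X = np.asarray(matrix, dtype=float)
    # 1. Vector-normalize each criterion column.
    norm = X / np.linalg.norm(X, axis=0)
    # 2. Apply the criterion weights.
    V = norm * np.asarray(weights)
    # 3. Ideal (best) and anti-ideal (worst) solutions per criterion.
    best = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 4. Euclidean distances to the ideal and anti-ideal points.
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    # 5. Relative closeness in [0, 1]; higher ranks better.
    return d_worst / (d_best + d_worst)
```

For player evaluation, each row would be a candidate and each column an attribute such as scoring or rebounding, with weights reflecting the team's priorities.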
Brown, Derick; Parvanta, Sarah; Dolina, Suzanne; Kelly, Bridget; Dever, Jill; Southwell, Brian G; Sanders, Amy; Augustson, Erik
2016-01-01
Background Text messaging (short message service, SMS) has been shown to be effective in delivering interventions for various diseases and health conditions, including smoking cessation. While there are many published studies regarding smoking cessation text messaging interventions, most do not provide details about the study’s operational methods. As a result, there is a gap in our understanding of how best to design studies of smoking cessation text messaging programs. Objective The purpose of this paper is to detail the operational methods used to conduct a randomized trial comparing three different versions of the National Cancer Institute’s SmokefreeText (SFTXT) program, designed for smokers 18 to 29 years of age. We detail our methods for recruiting participants from the Internet, reducing fraud, conducting online data collection, and retaining panel study participants. Methods Participants were recruited through website advertisements and market research online panels. Screening questions established eligibility for the study (eg, 18 to 29 years of age, current smoker). Antifraud measures screened out participants who could not meet the study requirements. After completing a baseline survey, participants were randomized to one of three study arms, which varied by type and timing of text message delivery. The study offered US $20 gift cards as incentives to complete each of four follow-up surveys. Automated email reminders were sent at designated intervals to increase response rates. Researchers also provided telephone reminders to those who had not completed the survey after multiple email reminders. We calculated participation rates across study arms and compared the final sample characteristics to the Current Population Survey to examine generalizability. Results Recruitment methods drove 153,936 unique visitors to the SFTXT Study landing page and 27,360 began the screener. 
Based on the screening questions, 15,462 out of 27,360 responders (56.51%) were eligible to participate. Of the 15,462 who were eligible, 9486 passed the antifraud measures that were implemented; however, 3882 failed to verify their email addresses or cell phone numbers, leaving 5604 who were invited to complete the baseline survey. Of the 5604 who were invited, 4432 completed the baseline survey, but only 4027 were retained for analysis because 405 did not receive the intervention. Conclusions Although antifraud measures helped to catch participants who failed study requirements and could have biased the data collected, it is possible that the email and cell phone verification check excluded some potentially eligible participants from the study. Future research should explore ways to implement verification methods without risking the loss of so many potential participants. Trial Registration ClinicalTrials.gov NCT01885052; https://clinicaltrials.gov/ct2/show/NCT01885052 (Archived by WebCite at http://www.webcitation.org/6iWzcmFdw) PMID:27349898
Edge enhancement algorithm for low-dose X-ray fluoroscopic imaging.
Lee, Min Seok; Park, Chul Hee; Kang, Moon Gi
2017-12-01
Low-dose X-ray fluoroscopy has continually evolved to reduce radiation risk to patients during clinical diagnosis and surgery. However, the reduction in dose exposure degrades the quality of the acquired images. In general, an X-ray device has a time-average pre-processor to remove the generated quantum noise, but this pre-processor causes blurring and artifacts within moving edge regions, and noise remains in the image. During high-pass filtering (HPF) to enhance edge detail, this residual noise is amplified. In this study, a 2D edge enhancement algorithm comprising region adaptive HPF with the transient improvement (TI) method, as well as artifacts and noise reduction (ANR), was developed for degraded X-ray fluoroscopic images. The proposed method was applied to a static scene pre-processed by a low-dose X-ray fluoroscopy device. First, the sharpness of the X-ray image was improved using region adaptive HPF with the TI method, which sharpens edge details without overshoot problems. Then, an ANR filter that uses an edge directional kernel was developed to remove the artifacts and noise that can occur during sharpening, while preserving edge details. Quantitative and qualitative comparisons with conventional edge enhancement techniques show that the proposed method outperforms existing methods in terms of both objective criteria and subjective visual perception of actual X-ray fluoroscopic images. The developed algorithm performed well on actual low-dose X-ray fluoroscopic images, not only improving sharpness but also removing artifacts and noise, including overshoot. Copyright © 2017 Elsevier B.V. All rights reserved.
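The core idea of sharpening without amplifying noise can be sketched with a one-dimensional unsharp mask whose high-pass band is soft-thresholded. This illustrates only the general principle, not the paper's region-adaptive HPF/TI/ANR algorithm; the gain, noise floor, and kernel size are invented for the sketch.

```python
import numpy as np

def sharpen_noise_gated(signal, gain=1.5, noise_floor=2.0, k=5):
    """Unsharp masking with a soft noise gate on the high-pass band.

    High-pass detail with magnitude below `noise_floor` is treated as
    noise and shrunk toward zero, so sharpening does not amplify it.
    """
    low = np.convolve(signal, np.ones(k) / k, mode="same")   # low-pass (box blur)
    high = signal - low                                      # high-pass band
    gated = np.sign(high) * np.maximum(np.abs(high) - noise_floor, 0.0)
    return low + gain * gated

# A step edge gets extra contrast; flat (noise-free) regions pass unchanged.
step = np.concatenate([np.zeros(10), np.full(10, 100.0)])
sharpened = sharpen_noise_gated(step)
```

Small noisy wiggles below the gate are suppressed rather than boosted, which is the behavior the TI/ANR combination is designed to achieve in a far more principled way.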
Evaluation of different flamelet tabulation methods for laminar spray combustion
NASA Astrophysics Data System (ADS)
Luo, Yujuan; Wen, Xu; Wang, Haiou; Luo, Kun; Fan, Jianren
2018-05-01
In this work, three different flamelet tabulation methods for spray combustion are evaluated. Major differences among these methods lie in the treatment of the temperature boundary conditions of the flamelet equations. Particularly, in the first tabulation method ("M1"), both the fuel and oxidizer temperature boundary conditions are set to be fixed. In the second tabulation method ("M2"), the fuel temperature boundary condition is varied while the oxidizer temperature boundary condition is fixed. In the third tabulation method ("M3"), both the fuel and oxidizer temperature boundary conditions are varied and set to be equal. The focus of this work is to investigate whether the heat transfer between the droplet phase and gas phase can be represented by the studied tabulation methods through a priori analyses. To this end, spray flames stabilized in a three-dimensional counterflow are first simulated with detailed chemistry. Then, the trajectory variables are calculated from the detailed chemistry solutions. Finally, the tabulated thermo-chemical quantities are compared to the corresponding values from the detailed chemistry solutions. The comparisons show that the gas temperature cannot be predicted by "M1" with only a mixture fraction and reaction progress variable being the trajectory variables. The gas temperature can be correctly predicted by both "M2" and "M3," in which the total enthalpy is introduced as an additional manifold. In "M2," variations of the oxidizer temperature are considered with a temperature modification technique, which is not required in "M3." Interestingly, it is found that the mass fractions of the reactants and major products are not sensitive to the representation of the interphase heat transfer in the flamelet chemtables, and they can be correctly predicted by all tabulation methods. By contrast, the intermediate species CO and H2 in the premixed flame reaction zone are over-predicted by all tabulation methods.
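For readers unfamiliar with flamelet tabulation, the lookup step shared by all of the methods above can be sketched as interpolation in a precomputed chemtable. In this minimal sketch the table is indexed only by mixture fraction and progress variable (as in "M1"), the tabulated values are synthetic, and the total-enthalpy dimension used by "M2" and "M3" is omitted.

```python
import numpy as np

def flamelet_lookup(table, z_grid, c_grid, Z, C):
    """Bilinear lookup of a tabulated quantity at mixture fraction Z and
    progress variable C (indices clipped to the table bounds)."""
    iz = np.clip(np.searchsorted(z_grid, Z) - 1, 0, len(z_grid) - 2)
    ic = np.clip(np.searchsorted(c_grid, C) - 1, 0, len(c_grid) - 2)
    tz = np.clip((Z - z_grid[iz]) / (z_grid[iz + 1] - z_grid[iz]), 0.0, 1.0)
    tc = np.clip((C - c_grid[ic]) / (c_grid[ic + 1] - c_grid[ic]), 0.0, 1.0)
    return ((1 - tz) * (1 - tc) * table[iz, ic]
            + tz * (1 - tc) * table[iz + 1, ic]
            + (1 - tz) * tc * table[iz, ic + 1]
            + tz * tc * table[iz + 1, ic + 1])

# Toy "chemtable": temperature as an invented function of (Z, C).
z_grid = np.linspace(0.0, 1.0, 5)
c_grid = np.linspace(0.0, 1.0, 4)
Zm, Cm = np.meshgrid(z_grid, c_grid, indexing="ij")
T_table = 300.0 + 1000.0 * Zm + 500.0 * Cm

T = flamelet_lookup(T_table, z_grid, c_grid, 0.37, 0.42)
```

Adding the enthalpy manifold of "M2"/"M3" turns this into a trilinear lookup over a third grid axis; the interpolation logic is otherwise unchanged.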
Patrício, João; Kalmykova, Yuliya; Berg, Per E O; Rosado, Leonardo; Åberg, Helena
2015-05-01
In this article, a new method based on Material Flow Accounting is proposed to study detailed material flows in battery consumption; it can be replicated for other countries. The method uses regularly available statistics on the import, industrial production, and export of batteries and battery-containing electric and electronic equipment (EEE). To facilitate use of the method by scholars without access to such data, several empirical results and their trends over time, for the occurrence of different battery types among the EEE types, are provided. The information provided by the method can be used to identify drivers of battery consumption and to study the dynamic behavior of battery flows due to technology development, policies, consumer behavior, and infrastructure. The method is exemplified by a study of battery flows in Sweden for the years 1996-2013. The batteries were accounted for, both in units and by weight, as primary and secondary batteries; loose and integrated; by electrochemical composition; and by share of battery use between different types of EEE. Results show that, despite a fivefold increase in the consumption of rechargeable batteries, they account for only about 14% of the total use of portable batteries. A recent increase in digital convergence has resulted in a sharp decline in the consumption of primary batteries, which has now stabilized at a fairly low level. Conversely, the consumption of integrated batteries has increased sharply. In 2013, 61% of the total weight of batteries sold in Sweden was collected; for the particular case of alkaline manganese dioxide batteries, the value reached 74%. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kasprzak, Marek; Jancewicz, Kacper; Michniewicz, Aleksandra
2017-11-01
The paper presents an example of using photographs taken by unmanned aerial vehicles (UAV) and processed using the structure from motion (SfM) procedure in a geomorphological study of rock relief. Subject to analysis is a small rock city in the West Sudetes (SW Poland), known as Starościńskie Skały and developed in coarse granite bedrock. The aims of this paper were, first, to compare UAV/SfM-derived data with the cartographical image based on traditional geomorphological field-mapping methods and the digital elevation model derived from airborne laser scanning (ALS), and second, to test whether the proposed combination of UAV and SfM methods may be helpful in recognizing the detailed structure of granite tors. As a result of the conducted UAV flights and digital image post-processing in AgiSoft software, it was possible to obtain datasets (dense point cloud, texture model, orthophotomap, bare-ground-type digital terrain model—DTM) that allowed the surface of the study area to be visualized in detail. In consequence, it was possible to distinguish even very small forms of rock surface microrelief: joints, aplite veins, rills and karren, weathering pits, etc., otherwise difficult to map and measure. The study also includes an assessment of the value of particular datasets concerning microtopography and discusses the clear advantages of using the UAV/SfM-based DTM in geomorphic studies of tors and rock cities, even those located within forest, as in the presented case study.
Jean-Christophe Domec; Ge Sun; Asko Noormets; Michael J. Gavazzi; Emrys A. Treasure; Erika Cohen; Jennifer J. Swenson; Steve G. McNulty; John S. King
2012-01-01
Increasing variability of rainfall patterns requires detailed understanding of the pathways of water loss from ecosystems to optimize carbon uptake and management choices. In the current study we characterized the usability of three alternative methods of different rigor for quantifying stand-level evapotranspiration (ET), partitioned ET into tree transpiration (T),...
ERIC Educational Resources Information Center
Spalding, Romalda Bishop; Spalding, Walter T.
A fully detailed teacher's manual for classroom, home, or tutorial use, this book is for the child in the beginning grades, the pupil who needs remedial work, or any adult studying English as a second language, who will find learning by the Spalding Method a practical, quick way of mastering verbal skills. The book notes that the Spalding Method…
Experiences in Eliciting Security Requirements
2006-12-01
(FODA) FODA is a domain analysis and engineering method that focuses on developing reusable assets [9]. By examining related software systems and...describe a trade-off analysis that we used to select a suitable requirements elicitation method and present detailed results from a case study of one...disaster planning, and how to improve Medicare. Eventually, technology-oriented problems may emerge from these soft problems, but much more analysis is
North Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; David E. Haugen; Charles J. Barnett
2011-01-01
The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...
South Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Ronald J. Piva; Charles J. Barnett
2011-01-01
The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...
Hydrochemical analysis to evaluate the seawater ingress in a small coral island of India.
Banerjee, Pallavi; Singh, V S; Singh, Ajay; Prasad, R K; Rangarajan, R
2012-06-01
The sustainable development of the limited groundwater resources of a tropical island requires a thorough understanding of the hydrogeological regime, including the hydrochemical behavior of groundwater. Detailed analysis of groundwater chemical data helps in assessing the different groundwater zones affected by the formation as well as by seawater. The interaction between groundwater and saline water is better understood using major-ion chemistry across an island aquifer. Multivariate methods are used to analyze the geochemical data and to understand the geochemical evolution of groundwater; these methods successfully group the data to evaluate the influence of various environs in the study area. Classification methods such as Piper diagrams, correlation analysis, and salinity hazard measurements are also employed for a critical study of the geochemical characteristics of groundwater and to identify vulnerable parts of the aquifer. These approaches have been used to evaluate the aquifer zones of a tiny island off the west coast of India. Most of the island is found to be safe for drinking water; however, some parts are affected by seawater ingress and dissolution of formation minerals. The analysis has led to the identification of the part of the aquifer that needs immediate attention for restoration to avoid further deterioration.
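The kind of quality screening such hydrochemical classification builds on can be illustrated with a charge-balance error check on major-ion data. The function and the ion concentrations below are illustrative assumptions, not data from the island study.

```python
# Charge-balance error (CBE) check, a standard QA step before classifying
# groundwater samples with Piper-type methods. Concentrations are in meq/L;
# the sample values are invented for the example.

def charge_balance_error(cations_meq, anions_meq):
    """Percent CBE; |CBE| <= 5% is the usual acceptance criterion."""
    sc = sum(cations_meq.values())
    sa = sum(anions_meq.values())
    return 100.0 * (sc - sa) / (sc + sa)

sample_cations = {"Na": 8.7, "Ca": 3.2, "Mg": 2.1, "K": 0.3}   # meq/L
sample_anions = {"Cl": 9.5, "HCO3": 3.4, "SO4": 1.2}           # meq/L
cbe = charge_balance_error(sample_cations, sample_anions)       # ~0.7%, acceptable
```

Samples passing this check can then be classified (e.g. on a Piper diagram); a high Na/Cl ratio paired with elevated Cl is a common first flag for seawater ingress.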
Liu, Yang; Wilson, W David
2010-01-01
Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely-used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
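The kinetic and steady-state analyses described are commonly built on the idealized 1:1 Langmuir interaction model for SPR sensorgrams; a minimal sketch follows, with rate constants and Rmax invented for illustration.

```python
import math

# Idealized 1:1 (Langmuir) binding model underlying kinetic and
# steady-state SPR sensorgram analysis. ka, kd, and Rmax are invented.

ka, kd, Rmax = 1.0e5, 1.0e-3, 100.0   # M^-1 s^-1, s^-1, response units (RU)

def association(t, conc):
    """Response during analyte injection at concentration conc (M)."""
    kobs = ka * conc + kd             # observed association rate constant
    req = Rmax * ka * conc / kobs     # steady-state response at this conc
    return req * (1.0 - math.exp(-kobs * t))

def dissociation(t, r0):
    """Response decay after the injection ends (buffer flow only)."""
    return r0 * math.exp(-kd * t)

KD = kd / ka   # equilibrium dissociation constant; here 10 nM
```

Kinetic analysis fits ka and kd to the two phases directly; steady-state analysis instead fits the plateau responses at several concentrations to the req expression, and the two estimates of KD should agree for a well-behaved 1:1 system.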
Extending methods: using Bourdieu's field analysis to further investigate taste
NASA Astrophysics Data System (ADS)
Schindel Dimick, Alexandra
2015-06-01
In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.
Marshall, N W
2001-06-01
This paper applies a published version of signal detection theory to x-ray image intensifier fluoroscopy data and compares the results with more conventional subjective image quality measures. An eight-bit digital framestore was used to acquire temporally contiguous frames of fluoroscopy data from which the modulation transfer function (MTF(u)) and noise power spectrum were established. These parameters were then combined to give detective quantum efficiency (DQE(u)) and used in conjunction with signal detection theory to calculate contrast-detail performance. DQE(u) was found to lie between 0.1 and 0.5 for a range of fluoroscopy systems. Two separate image quality experiments were then performed in order to assess the correspondence between the objective and subjective methods. First, image quality for a given fluoroscopy system was studied as a function of doserate using objective parameters and a standard subjective contrast-detail method. Following this, the two approaches were used to assess three different fluoroscopy units. Agreement between objective and subjective methods was good; doserate changes were modelled correctly while both methods ranked the three systems consistently.
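The objective pipeline described (MTF and noise power spectrum combined into DQE) follows the standard relation DQE(u) = MTF(u)^2 / (q * NNPS(u)). A sketch with synthetic arrays: the MTF shape, the fluence q, and the NNPS level are assumptions, chosen so the zero-frequency DQE lands in the 0.1 to 0.5 range the paper reports.

```python
import numpy as np

# Spatial-frequency DQE from MTF and normalized NPS (NNPS). The arrays
# below are synthetic stand-ins, not measurements from the paper.

u = np.linspace(0.0, 2.0, 50)        # spatial frequency (cycles/mm)
mtf = np.exp(-1.2 * u)               # synthetic MTF(u), unity at u = 0
nnps = np.full_like(u, 8.0e-5)       # synthetic white NNPS (mm^2)
q = 2.5e4                            # assumed photon fluence (photons/mm^2)

dqe = mtf**2 / (q * nnps)            # dimensionless; 0.5 at u = 0 here
```

A DQE(u) curve of this kind, fed into a signal detection model, yields the predicted contrast-detail performance that the paper compares against subjective contrast-detail scores.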
Reinforcing aluminum alloys with high strength fibers
NASA Technical Reports Server (NTRS)
Kolpashnikov, A. I.; Manuylov, V. F.; Chukhin, B. D.; Shiryayev, Y. V.; Shurygin, A. S.
1982-01-01
A study is made of the possibility of reinforcing aluminum and aluminum-based alloys with fibers made of high-strength steel wire. The method of introducing the fibers is described in detail. Additional strengthening by reinforcement of the high-alloy Al-Zn-Mg system was investigated.
This is a site assessment and feasibility study of incineration alternatives at the ACME Solvents Site at Rockford, Illinois. The document contains laboratory results that are reported to simulate incineration conditions but no details on test methods were provided. The d...
Study of the relationship between solar activity and terrestrial weather
NASA Technical Reports Server (NTRS)
Sturrock, P. A.; Brueckner, G. E.; Dickinson, R. E.; Fukuta, N.; Lanzerotti, L. J.; Lindzen, R. S.; Park, C. G.; Wilcox, J. M.
1976-01-01
Evidence for some connection between weather and solar related phenomena is presented. Historical data of world wide temperature variations with relationship to change in solar luminosity are examined. Several test methods for estimating the statistical significance of such phenomena are discussed in detail.
A TEACHER'S GUIDE FOR ADULT BASIC EDUCATION.
ERIC Educational Resources Information Center
BROWN, ANTRONETTE
Compiled as an idea and information guide for teachers of adult basic education, this document includes detailed teaching objectives, methods, and materials (films, filmstrips, books, transparencies). The course includes (1) reading and communication skills--phonics, vocabulary, reference, and so on, (2) social studies--government, geography,…
Gerson Therapy (PDQ®)—Health Professional Version
Gerson therapy is advocated by its supporters as a method of treating cancer patients based on changes in diet and nutrient intake. No results of laboratory or animal studies have been published in scientific journals. Get detailed information about Gerson therapy in this summary for clinicians.
Load rating and FRP retrofitting of bridge abutment timber piles.
DOT National Transportation Integrated Search
2016-05-01
This report details Phase II of the study titled "Strengthening of Bridge Wood Piling Retrofits for Moment Resistance." Phase I of the research (project R27-082) was focused on developing a load rating method for timber piles under eccentric load and...
Fast and Efficient Stochastic Optimization for Analytic Continuation
Bao, Feng; Zhang, Guannan; Webster, Clayton G; ...
2016-09-28
The analytic continuation of imaginary-time quantum Monte Carlo data to extract real-frequency spectra remains a key problem in connecting theory with experiment. Here we present a fast and efficient stochastic optimization method (FESOM) as a more accessible variant of the stochastic optimization method introduced by Mishchenko et al. [Phys. Rev. B 62, 6317 (2000)], and we benchmark the resulting spectra with those obtained by the standard maximum entropy method for three representative test cases, including data taken from studies of the two-dimensional Hubbard model. Generally, we find that our FESOM approach yields spectra similar to the maximum entropy results. In particular, while the maximum entropy method yields superior results when the quality of the data is high, we find that FESOM is able to resolve fine structure with more detail when the quality of the data is poor. In addition, because of its stochastic nature, the method provides detailed information on the frequency-dependent uncertainty of the resulting spectra, while the maximum entropy method does so only for the spectral weight integrated over a finite frequency region. Therefore, we believe that this variant of the stochastic optimization approach provides a viable alternative to the routinely used maximum entropy method, especially for data of poor quality.
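The essence of a stochastic-optimization fit, proposing a random change to a parametrized spectrum and keeping it when the misfit to G(tau) drops, can be caricatured in a few lines. This toy uses a greedy accept rule and a fixed frequency grid; FESOM's actual update moves, sampling, and averaging over fits are far more elaborate, and every number below is invented.

```python
import math
import random

# Toy stochastic-optimization fit for analytic continuation: represent the
# spectrum as weights on a fixed frequency grid and randomly transfer
# weight between grid points, keeping moves that reduce the misfit.

random.seed(0)
beta = 10.0
taus = [0.5 * i for i in range(1, 20)]
omegas = [0.2 * j for j in range(1, 11)]     # positive-frequency grid

def kernel(tau, w):
    """Fermionic kernel relating the spectrum A(w) to G(tau)."""
    return math.exp(-tau * w) / (1.0 + math.exp(-beta * w))

# "Measured" G(tau) generated from a known single peak at w = 1.0.
g_data = [kernel(t, 1.0) for t in taus]

def chi2(amps):
    """Sum-of-squares misfit between fitted and measured G(tau)."""
    err = 0.0
    for t, g in zip(taus, g_data):
        fit = sum(a * kernel(t, w) for a, w in zip(amps, omegas))
        err += (fit - g) ** 2
    return err

weights = [1.0 / len(omegas)] * len(omegas)  # flat initial spectrum
best = chi2(weights)
for _ in range(5000):
    i, j = random.randrange(len(omegas)), random.randrange(len(omegas))
    trial = list(weights)
    dm = random.uniform(0.0, trial[i])       # move weight from bin i to bin j
    trial[i] -= dm
    trial[j] += dm
    c = chi2(trial)
    if c < best:                             # greedy accept
        weights, best = trial, c
```

Averaging many such independent fits, rather than keeping a single greedy one, is what gives stochastic-optimization methods their frequency-resolved uncertainty estimates.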
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
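The descriptive and inferential quantities such reviews typically cover can be computed from a 2x2 diagnostic table; the counts below are invented for illustration.

```python
import math

# Illustrative 2x2 diagnostic-test calculations; the counts are invented.

tp, fn = 45, 5      # diseased subjects: test positive / test negative
fp, tn = 10, 90     # healthy subjects:  test positive / test negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
odds_ratio = (tp * tn) / (fp * fn)

# 95% CI for the odds ratio via the standard error of its logarithm
se = math.sqrt(1 / tp + 1 / fn + 1 / fp + 1 / tn)
ci = (math.exp(math.log(odds_ratio) - 1.96 * se),
      math.exp(math.log(odds_ratio) + 1.96 * se))
```

Logistic regression generalizes the odds-ratio calculation to several predictors at once, which is where its limitations (small-sample bias, sparse cells) become the practical concern the article discusses.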
Research as a Respectful Practice: An Exploration of the Practice of Respect in Qualitative Research
ERIC Educational Resources Information Center
O'Grady, Emmanuel
2016-01-01
This article explores the practice of respect within qualitative research methods. As interpersonal respect plays a significant role in the esteem felt within a relationship, it can also serve to cultivate trust between researchers and their participants in a research study. This article details the findings of a research study examining respect…
Project-Based Learning in Education: Integrating Business Needs and Student Learning
ERIC Educational Resources Information Center
Cho, Yonjoo; Brown, Catherine
2013-01-01
Purpose: The purpose of this case study was to investigate how project-based learning (PBL) is being practiced in Columbus Signature Academy (CSA), a high school located in Columbus, Indiana, USA. Design/methodology/approach: The authors used the case study method to provide qualitative details about CSA's use of PBL that is being practiced in a…
ERIC Educational Resources Information Center
Erez, Daniella Levy; Levy, Jacov; Friger, Michael; Aharoni-Mayer, Yael; Cohen-Iluz, Moran; Goldstein, Esther
2010-01-01
Aim: Individuals with congenital insensitivity to pain with anhidrosis (CIPA) are reported to have mental retardation but to our knowledge no detailed study on the subject has ever been published. The present study assessed and documented cognitive and adaptive behaviour among Arab Bedouin children with CIPA. Methods: Twenty-three Arab Bedouin…
Framing Prospective Elementary Teachers' Conceptions of Dissolving as a Ladder of Explanations
ERIC Educational Resources Information Center
Subramaniam, Karthigeyan; Esprivalo Harrell, Pamela
2013-01-01
The paper details an exploratory qualitative study that investigated 61 prospective teachers' conceptual understanding of dissolving salt and sugar in water respectively. The study was set within a 15-week elementary science methods course that included a 5E learning cycle lesson on dissolving, the instructional context. Oversby's…
ERIC Educational Resources Information Center
Tourangeau, Karen; Brick, Mike; Byrne, Lauren; Le, Thanh; Nord, Christine; West, Jerry; Hausken, Elvira Germino
2005-01-01
This methodology report provides technical information about the development, design, and conduct of the third grade data collection of the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 (ECLS-K). Detailed information on the development of the instruments, sample design, data collection methods, data preparation and editing,…
ERIC Educational Resources Information Center
Bates, Samantha; Ball, Annahita; Wilks, Scott
2016-01-01
Objective: Psychometric details of the Parental Press for Academic Achievement and Postsecondary Planning Scale (PPS), developed by the Consortium on Chicago School Research, are scarce. The purpose of this study was to reexamine the properties of this 7-item measure. Method: The study utilized cross-sectional, self-reported data from 100…
NASA Technical Reports Server (NTRS)
Newman, C. M.
1976-01-01
The constraints and limitations for STS Consumables Management are studied. Variables imposing constraints on the consumables-related subsystems are identified, and a method for determining constraint violations with the simplified consumables model in the Mission Planning Processor is presented.
Design Fixation and Cooperative Learning in Elementary Engineering Design Project: A Case Study
ERIC Educational Resources Information Center
Luo, Yi
2015-01-01
This paper presents a case study examining 3rd, 4th and 5th graders' design fixation and cooperative learning in an engineering design project. A mixed methods instrument, the Cooperative Learning Observation Protocol (CLOP), was adapted to record frequency and class observation on cooperative learning engagement through detailed field notes.…
Romanenko, A.; Bebeshko, V; Hatch, M; Bazyka, D; Finch, S.; Dyagil, I; Reiss, R.; Chumak, V; Bouville, A; Gudzenko, N; Zablotska, L; Pilinskaya, M.; Lyubarets, T.; Bakhanova, E.; Babkina, N.; Trotsiuk, N.; Ledoschuk, B.; Belayev, Y.; Dybsky, S.S.; Ron, E.; Howe, G.
2010-01-01
Thus far there are relatively few data on the risk of leukemia among those who were exposed to external radiation during cleanup operations following the Chornobyl nuclear accident, and results have not been consistent. To investigate this issue further, we assembled a cohort of 110,645 male cleanup workers from Ukraine and identified cases of leukemia occurring during the period 1986 to 2000. Detailed interviews were conducted and individual bone marrow doses were estimated using a new time-and-motion method known as RADRUE (Realistic Analytical Dose Reconstruction with Uncertainty Estimate). See companion paper II for a detailed description of the dosimetry. For the initial analyses we used a nested case-control approach with a minimum of five controls per case, matched for year of birth, oblast (region) of registration and residence. All identified cases were reviewed by an international panel of experts. The dose-response analysis and results are given in companion paper III. PMID:19138036
Numerical simulations of turbulent jet ignition and combustion
NASA Astrophysics Data System (ADS)
Validi, Abdoulahad; Irannejad, Abolfazl; Jaberi, Farhad
2013-11-01
The ignition and combustion of a homogeneous lean hydrogen-air mixture by a turbulent jet flow of hot combustion products injected into a colder gas mixture are studied by a high fidelity numerical model. Turbulent jet ignition can be considered as an efficient method for starting and controlling the reaction in homogeneously charged combustion systems used in advanced internal combustion and gas turbine engines. In this work, we study in detail the physics of turbulent jet ignition in a fundamental flow configuration. The flow and combustion are modeled with the hybrid large eddy simulation/filtered mass density function (LES/FMDF) approach, in which the filtered form of the compressible Navier-Stokes equations is solved with a high-order finite difference scheme for the turbulent velocity, and the FMDF transport equations are solved with a Lagrangian stochastic method to obtain the scalar (temperature and species mass fractions) field. The hydrogen oxidation is described by a detailed reaction mechanism with 37 elementary reactions and 9 species.
Visualization of chorioretinal vasculature in mice in vivo using a combined OCT/SLO imaging system
NASA Astrophysics Data System (ADS)
Goswami, Mayank; Zhang, Pengfei; Pugh, Edward N.; Zawadzki, Robert J.
2016-03-01
Chorioretinal blood vessel morphology in mice is of great interest to researchers studying eye disease mechanisms in animal models. Two leading retinal imaging modalities -- Optical Coherence Tomography (OCT) and Scanning Laser Ophthalmoscopy (SLO) -- have offered much insight into vascular morphology and blood flow. OCT "flow-contrast" methods have provided detailed mapping of vascular morphology with micrometer depth resolution, while OCT Doppler methods have enabled the measurement of local flow velocities. SLO remains indispensable in studying blood leakage, microaneurysms, and the clearance time of contrast agents of different sizes. In this manuscript we present results obtained with a custom OCT/SLO system applied to visualize the chorioretinal vascular morphology of pigmented C57Bl/6J and albino nude (Nu/Nu) mice. Blood perfusion maps of choroidal vessels and choriocapillaris created by OCT and SLO are presented, along with detailed evaluation of different OCT imaging parameters, including the use of the scattering contrast agent Intralipid. Future applications are discussed.
An a priori study of different tabulation methods for turbulent pulverised coal combustion
NASA Astrophysics Data System (ADS)
Luo, Yujuan; Wen, Xu; Wang, Haiou; Luo, Kun; Jin, Hanhui; Fan, Jianren
2018-05-01
In many practical pulverised coal combustion systems, different oxidiser streams exist, e.g. the primary- and secondary-air streams in the power plant boilers, which makes the modelling of these systems challenging. In this work, three tabulation methods for modelling pulverised coal combustion are evaluated through an a priori study. Pulverised coal flames stabilised in a three-dimensional turbulent counterflow, consisting of different oxidiser streams, are simulated with detailed chemistry first. Then, the thermo-chemical quantities calculated with different tabulation methods are compared to those from detailed chemistry solutions. The comparison shows that the conventional two-stream flamelet model with a fixed oxidiser temperature cannot predict the flame temperature correctly. The conventional two-stream flamelet model is then modified to set the oxidiser temperature equal to the fuel temperature, both of which are varied in the flamelets. By this means, the variations of oxidiser temperature can be considered. It is found that this modified tabulation method performs very well on prediction of the flame temperature. The third tabulation method is an extended three-stream flamelet model that was initially proposed for gaseous combustion. The results show that the reference gaseous temperature profile can be overall reproduced by the extended three-stream flamelet model. Interestingly, it is found that the predictions of major species mass fractions are not sensitive to the oxidiser temperature boundary conditions for the flamelet equations in the a priori analyses.
Convergence acceleration of the Proteus computer code with multigrid methods
NASA Technical Reports Server (NTRS)
Demuren, A. O.; Ibraheem, S. O.
1995-01-01
This report presents the results of a study to implement convergence acceleration techniques based on the multigrid concept in the two-dimensional and three-dimensional versions of the Proteus computer code. The first section presents a review of the relevant literature on the implementation of the multigrid methods in computer codes for compressible flow analysis. The next two sections present detailed stability analysis of numerical schemes for solving the Euler and Navier-Stokes equations, based on conventional von Neumann analysis and the bi-grid analysis, respectively. The next section presents details of the computational method used in the Proteus computer code. Finally, the multigrid implementation and applications to several two-dimensional and three-dimensional test problems are presented. The results of the present study show that the multigrid method always leads to a reduction in the number of iterations (or time steps) required for convergence. However, there is an overhead associated with the use of multigrid acceleration. The overhead is higher in 2-D problems than in 3-D problems, thus overall multigrid savings in CPU time are in general better in the latter. Savings of about 40-50 percent are typical in 3-D problems, but they are about 20-30 percent in large 2-D problems. The present multigrid method is applicable to steady-state problems and is therefore ineffective in problems with inherently unstable solutions.
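The multigrid concept the report implements can be reduced to a two-grid correction cycle: smooth, restrict the residual, solve the coarse problem, prolong the correction, smooth again. The following is a minimal 1-D Poisson sketch of that cycle, not the Proteus implementation; grid size, smoother weight, and sweep counts are illustrative choices.

```python
import numpy as np

# Two-grid correction scheme for -u'' = f on (0, 1), u(0) = u(1) = 0.
# A real multigrid code recurses on the coarse problem; here it is
# solved directly to keep the sketch short.

def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi sweeps; damps high-frequency error components."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def two_grid(u, f, h, nu=3):
    u = jacobi(u, f, h, nu)                                       # pre-smooth
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)  # residual
    rc = np.zeros((len(u) + 1) // 2)                              # full weighting
    rc[1:-1] = 0.25 * r[1:-3:2] + 0.5 * r[2:-2:2] + 0.25 * r[3:-1:2]
    nc, hc = len(rc) - 2, 2 * h
    A = (np.diag(np.full(nc, 2.0)) - np.diag(np.ones(nc - 1), 1)
         - np.diag(np.ones(nc - 1), -1)) / (hc * hc)
    ec = np.zeros_like(rc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])                       # coarse solve
    u += np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)  # prolong
    return jacobi(u, f, h, nu)                                    # post-smooth

n = 65
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)      # exact solution u(x) = sin(pi x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid(u, f, h)
err = float(np.max(np.abs(u - np.sin(np.pi * x))))
```

Each cycle contracts the algebraic error by a roughly grid-independent factor, which is why multigrid cuts iteration counts; the overhead of the extra grids is the trade-off the report quantifies for 2-D versus 3-D problems.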
Layer-Based Approach for Image Pair Fusion.
Son, Chang-Hwan; Zhang, Xiao-Ping
2016-04-20
Recently, image pairs, such as noisy and blurred images or infrared and noisy images, have been considered as a solution to provide high-quality photographs under low lighting conditions. In this paper, a new method for decomposing the image pairs into two layers, i.e., the base layer and the detail layer, is proposed for image pair fusion. In the case of infrared and noisy images, simple naive fusion leads to unsatisfactory results due to the discrepancies in brightness and image structures between the image pair. To address this problem, a local contrast-preserving conversion method is first proposed to create a new base layer of the infrared image, which can have visual appearance similar to another base layer such as the denoised noisy image. Then, a new way of designing three types of detail layers from the given noisy and infrared images is presented. To estimate the noise-free and unknown detail layer from the three designed detail layers, the optimization framework is modeled with residual-based sparsity and patch redundancy priors. To better suppress the noise, an iterative approach that updates the detail layer of the noisy image is adopted via a feedback loop. This proposed layer-based method can also be applied to fuse another noisy and blurred image pair. The experimental results show that the proposed method is effective for solving the image pair fusion problem.
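The base/detail split underlying this kind of layer-based fusion can be sketched generically. In the sketch below a simple box filter stands in for the paper's base-layer extraction and a mean shift stands in for the local contrast-preserving conversion; all parameters and signals are illustrative assumptions, not the proposed method.

```python
import numpy as np

def box_blur(img, radius=2):
    # Separable-equivalent box filter with edge padding: a crude
    # base-layer extractor (low-frequency brightness and structure).
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse(ir, noisy):
    base_ir = box_blur(ir)              # structure carrier (low noise)
    base_noisy = box_blur(noisy)
    detail_noisy = noisy - base_noisy   # detail layer of the noisy image
    # Brightness-matching stand-in: shift the IR base toward the noisy
    # base, then add back the detail layer.
    matched_base = base_ir - base_ir.mean() + base_noisy.mean()
    return matched_base + detail_noisy

rng = np.random.default_rng(0)
scene = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
ir = 0.5 * scene + 0.4                  # different brightness, low noise
noisy = scene + 0.05 * rng.standard_normal(scene.shape)
fused = fuse(ir, noisy)
print(float(np.abs(fused.mean() - noisy.mean())))
```

The actual method replaces each of these stand-ins with something stronger (three designed detail layers, sparsity and patch-redundancy priors, and an iterative noise-suppression loop), but the layer bookkeeping is the same.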
Van Oosten, John
1928-01-01
This study shows that the structural characters of the scales of the coregonid fishes of Lake Huron are so clearly recognizable as to permit their use by the scale method. It shows, further, that the fundamental assumptions underlying the scale method are warranted in so far as they apply to the lake herring (Leucichthys artedi Le Sueur). The scale method is therefore valid when applied in a study of the life history of the lake herring. The life history of the lake herring that occur in Lake Huron is described in detail in this paper for the first time.
Medical Representatives' Intention to Use Information Technology in Pharmaceutical Marketing
Kwak, Eun-Seon
2016-01-01
Objectives Electronic detailing (e-detailing), the use of electronic devices to facilitate sales presentations to physicians, has been adopted and expanded in the pharmaceutical industry. To maximize the potential outcome of e-detailing, it is important to understand medical representatives' (MRs') behavior and attitudes toward e-detailing. This study investigates how information technology devices such as laptop computers and tablet PCs are utilized in pharmaceutical marketing, and it analyzes the factors influencing MRs' intention to use such devices. Methods This study adopted and modified Rogers' diffusion of innovations model and the technology acceptance model. To test the model empirically, a questionnaire survey was conducted with 221 MRs who were working in three multinational or eleven domestic pharmaceutical companies in Korea. Results Overall, 28% and 35% of MRs had experience using laptop computers and tablet PCs in pharmaceutical marketing, respectively. However, the rates differed across groups of MRs categorized by age, education level, position, and career. The results showed that MRs' intention to use information technology devices was significantly influenced by perceived usefulness in general. Perceived ease of use, organizational and individual innovativeness, and several MR characteristics were also found to have significant impacts. Conclusions This study provides timely information about e-detailing devices to marketing managers and policy makers in the pharmaceutical industry for successful marketing strategy development based on an understanding of the drivers of MRs' intention to use information technology. Further in-depth studies should be conducted to understand obstacles and limitations and to improve strategies for better marketing tools. PMID:27895967
Super-Resolution Reconstruction of Remote Sensing Images Using Multifractal Analysis
Hu, Mao-Gui; Wang, Jin-Feng; Ge, Yong
2009-01-01
Satellite remote sensing (RS) is an important contributor to Earth observation, providing various kinds of imagery every day, but low spatial resolution remains a critical bottleneck in many applications, restricting higher spatial resolution analysis (e.g., intra-urban). In this study, a multifractal-based super-resolution reconstruction method is proposed to alleviate this problem. Multifractal characteristics are common in nature. The self-similarity or self-affinity present in an image can be used to estimate details at scales larger and smaller than the original. We first look for the presence of multifractal characteristics in the images. Then we estimate the parameters of the information transfer function and the noise of the low resolution image. Finally, a noise-free, spatial resolution-enhanced image is generated by a fractal coding-based denoising and downscaling method. The empirical case shows that the reconstructed super-resolution image performs well in detail enhancement. This method is not only useful for remote sensing in investigating Earth, but also for other images with multifractal characteristics. PMID:22291530
Topology optimization in acoustics and elasto-acoustics via a level-set method
NASA Astrophysics Data System (ADS)
Desai, J.; Faure, A.; Michailidis, G.; Parry, G.; Estevez, R.
2018-04-01
Optimizing the shape and topology (S&T) of structures to improve their acoustic performance is quite challenging. The exact position of the structural boundary is usually of critical importance, which dictates the use of geometric methods for topology optimization instead of standard density approaches. The goal of the present work is to investigate different possibilities for handling topology optimization problems in acoustics and elasto-acoustics via a level-set method. From a theoretical point of view, we detail two equivalent ways to perform the derivation of surface-dependent terms and propose a smoothing technique for treating problems of boundary condition optimization. In the numerical part, we examine the effect on the optimal designs of the surface-dependent term in the shape derivative, which was neglected in previous studies in the literature. Moreover, we test different mesh adaptation choices, as well as technical details related to the implicit surface definition in the level-set approach. We present results in two and three space dimensions.
Mission and system optimization of nuclear electric propulsion vehicles for lunar and Mars missions
NASA Technical Reports Server (NTRS)
Gilland, James H.
1991-01-01
The detailed mission and system optimization of low thrust electric propulsion missions is a complex, iterative process involving interaction between orbital mechanics and system performance. Through the use of appropriate approximations, initial system optimization and analysis can be performed for a range of missions. The intent of these calculations is to provide system and mission designers with simple methods to assess system design without requiring access to, or detailed knowledge of, numerical calculus-of-variations optimization codes and methods. Approximations for the mission/system optimization of Earth orbital transfer and Mars missions have been derived. Analyses include the variation of thruster efficiency with specific impulse. Optimum specific impulse, payload fraction, and power/payload ratios are calculated. The accuracy of these methods is tested and found to be reasonable for initial scoping studies. Results of optimization for Space Exploration Initiative lunar cargo and Mars missions are presented for a range of power system and thruster options.
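The kind of first-order trade described above can be sketched numerically: for a power-limited vehicle, higher exhaust velocity reduces propellant mass but increases power-plant mass, so payload fraction peaks at an intermediate specific impulse. The mission parameters below (delta-v, trip time, specific mass, efficiency) are invented illustrative values, not the report's.

```python
import numpy as np

g0 = 9.81
dv = 8000.0            # mission delta-v, m/s (assumed)
t = 300 * 86400.0      # thrust duration, s (assumed)
alpha = 0.030          # power-plant specific mass, kg/W (assumed)
eta = 0.6              # thruster efficiency (assumed constant here)

def payload_fraction(ve):
    # Rocket equation gives the propellant fraction.
    f_prop = 1.0 - np.exp(-dv / ve)
    # Jet power per unit initial mass, with propellant expelled over t.
    p_per_m0 = f_prop * ve ** 2 / (2.0 * eta * t)
    f_power = alpha * p_per_m0          # power-plant mass fraction
    return 1.0 - f_prop - f_power

ve = np.linspace(5e3, 2e5, 2000)        # exhaust velocity sweep, m/s
f = payload_fraction(ve)
best_isp = float(ve[np.argmax(f)] / g0)
print(best_isp, float(f.max()))         # optimum Isp (s), payload fraction
```

A fuller treatment would also let efficiency vary with specific impulse, as the report's analyses do, which shifts the optimum.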
Comparison of 15 evaporation methods applied to a small mountain lake in the northeastern USA
Rosenberry, D.O.; Winter, T.C.; Buso, D.C.; Likens, G.E.
2007-01-01
Few detailed evaporation studies exist for small lakes or reservoirs in mountainous settings. A detailed evaporation study was conducted at Mirror Lake, a 0.15 km² lake in New Hampshire, northeastern USA, as part of a long-term investigation of lake hydrology. Evaporation was determined using 14 alternate evaporation methods during six open-water seasons and compared with values from the Bowen-ratio energy-budget (BREB) method, considered the standard. Values from the Priestley-Taylor, deBruin-Keijman, and Penman methods compared most favorably with BREB-determined values. Differences from BREB values averaged 0.19, 0.27, and 0.20 mm d⁻¹, respectively, and results were within 20% of BREB values during more than 90% of the 37 monthly comparison periods. All three methods require measurement of net radiation, air temperature, change in heat stored in the lake, and vapor pressure, making them relatively data intensive. Several of the methods had substantial bias when compared with BREB values and were subsequently modified to eliminate bias. Methods that rely only on measurement of air temperature, or air temperature and solar radiation, were relatively cost-effective options for measuring evaporation at this small New England lake, outperforming some methods that require measurement of a greater number of variables. It is likely that the atmosphere above Mirror Lake was affected by occasional formation of separation eddies on the lee side of nearby high terrain, although those influences do not appear to be significant to measured evaporation from the lake when averaged over monthly periods. © 2007 Elsevier B.V. All rights reserved.
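As an illustration of why the Priestley-Taylor method needs relatively few inputs, a textbook form of the estimate can be sketched as follows. The coefficient of 1.26 and the Tetens-type slope parameterisation are standard literature values, not the paper's implementation, and the sample inputs are invented.

```python
import math

def svp_slope(t_c):
    """Slope of the saturation vapour-pressure curve, kPa/degC
    (Tetens-type form)."""
    es = 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))
    return 4098.0 * es / (t_c + 237.3) ** 2

def priestley_taylor(rn, heat_storage, t_c, alpha=1.26, gamma=0.066):
    """Evaporation in mm/day from net radiation and change in lake heat
    storage (both MJ m-2 day-1) and air temperature (degC)."""
    lam = 2.45                     # latent heat of vaporisation, MJ/kg
    s = svp_slope(t_c)
    return alpha * (s / (s + gamma)) * (rn - heat_storage) / lam

# Invented sample inputs for a warm open-water day.
e = priestley_taylor(rn=15.0, heat_storage=3.0, t_c=20.0)
print(e)
```

Note that the heat-storage term is what makes even this "simple" method data intensive for a lake: it requires a temperature profile of the water column, not just meteorological measurements.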
Characteristics and Impact of Drug Detailing for Gabapentin
Steinman, Michael A; Harper, G. Michael; Chren, Mary-Margaret; Landefeld, C. Seth; Bero, Lisa A
2007-01-01
Background Sales visits by pharmaceutical representatives (“drug detailing”) are common, but little is known about the content of these visits or about the impact of visit characteristics on prescribing behavior. In this study, we evaluated the content and impact of detail visits for gabapentin by analyzing market research forms completed by physicians after receiving a detail visit for this drug. Methods and Findings Market research forms that describe detail visits for gabapentin became available through litigation that alleged that gabapentin was promoted for “off-label” uses. Forms were available for 97 physicians reporting on 116 detail visits between 1995 and 1999. Three-quarters of recorded visits (91/116) occurred in 1996. Two-thirds of visits (72/107) were 5 minutes or less in duration, 65% (73/113) were rated of high informational value, and 39% (42/107) were accompanied by the delivery or promise of samples. During the period of this study, gabapentin was approved by the US Food and Drug Administration only for the adjunctive treatment of partial seizures, but in 38% of visits (44/115) the “main message” of the visit involved at least one off-label use. After receiving the detail visit, 46% (50/108) of physicians reported the intention to increase their prescribing or recommending of gabapentin in the future. In multivariable analysis, intent to increase future use or recommendation of gabapentin was associated with receiving the detail in a small group (versus one-on-one) setting and with low or absent baseline use of the drug, but not with other factors such as visit duration, discussion of “on-label” versus “off-label” content, and the perceived informational value of the presentation. Conclusions Detail visits for gabapentin were of high perceived informational value and often involved messages about unapproved uses. 
Despite their short duration, detail visits were frequently followed by physician intentions to increase their future recommending or prescribing of the drug. PMID:17455990
Application of the CRDS method to the study of gas-phase processes in hot CVD diamond thin films.
NASA Astrophysics Data System (ADS)
Buzaianumakarov, Vladimir; Hidalgo, Arturo; Morell, Gerardo; Weiner, Brad; Buzaianu, Madalina
2006-03-01
For a detailed analysis of problems related to the growth of hot CVD carbon-containing nano-materials, we have to detect the different intermediate species formed during the growth process, as well as investigate how the concentrations of these species depend on different experimental parameters (concentrations of the CH4 and H2S stable chemical compounds, and the distance from the filament system to the substrate surface). In the present study, the HS and CS radicals were detected using the Cavity Ring Down Spectroscopy (CRDS) method in the hot CVD diamond thin film for a CH4 (0.4%) + H2 mixture doped with H2S (400 ppm). The absolute absorption density spectra of the HS and CS radicals were obtained as a function of the different experimental parameters. This study proves that the HS and CS radicals are intermediates formed during the hot-filament CVD process. A kinetics approach was developed for detailed analysis of the experimental data obtained. The kinetics scheme includes homogeneous and heterogeneous processes, as well as processes of chemical species transport in the CVD chamber.
Busetto, Loraine; Luijkx, Katrien; Calciolari, Stefano; González-Ortiz, Laura G; Vrijhoef, Hubertus J M
2017-03-08
In this paper, we provide a detailed and explicit description of the processes and decisions underlying and shaping the emergent multimethod research design of our study on workforce changes in integrated chronic care. The study was originally planned as mixed method research consisting of a preliminary literature review and quantitative check of these findings via a Delphi panel. However, when the findings of the literature review were not appropriate for quantitative confirmation, we chose to continue our qualitative exploration of the topic via qualitative questionnaires and secondary analysis of two best practice case reports. The resulting research design is schematically described as an emergent and interactive multimethod design with multiphase combination timing. In doing so, we provide other researchers with a set of theory- and experience-based options to develop their own multimethod research and provide an example for more detailed and structured reporting of emergent designs. We argue that the terminology developed for the description of mixed methods designs should also be used for multimethod designs such as the one presented here.
Selection and application of microbial source tracking tools for water-quality investigations
Stoeckel, Donald M.
2005-01-01
Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.
NASA Technical Reports Server (NTRS)
Deepak, Adarsh; Wang, Pi-Huan
1985-01-01
This report documents the research program for developing space- and ground-based remote sensing techniques carried out during the period from December 15, 1977 to March 15, 1985. The program involved the application of sophisticated radiative transfer codes and inversion methods to various advanced remote sensing concepts for determining atmospheric constituents, particularly aerosols. It covers detailed discussions of the solar aureole technique for monitoring columnar aerosol size distribution, the multispectral limb scattered radiance and limb attenuated radiance (solar occultation) techniques, and the upwelling scattered solar radiance method for determining aerosol and gaseous characteristics. In addition, analytical models of aerosol size distribution and simulation studies of the limb solar aureole radiance technique and of the variability of ozone at high altitudes during satellite sunrise/sunset events are described in detail.
Mendieta-Moreno, Jesús I; Marcos-Alcalde, Iñigo; Trabada, Daniel G; Gómez-Puertas, Paulino; Ortega, José; Mendieta, Jesús
2015-01-01
Quantum mechanics/molecular mechanics (QM/MM) methods are excellent tools for the modeling of biomolecular reactions. Recently, we have implemented a new QM/MM method (Fireball/Amber), which combines an efficient density functional theory method (Fireball) and a well-recognized molecular dynamics package (Amber), offering an excellent balance between accuracy and sampling capabilities. Here, we present a detailed explanation of the Fireball method and Fireball/Amber implementation. We also discuss how this tool can be used to analyze reactions in biomolecules using steered molecular dynamics simulations. The potential of this approach is shown by the analysis of a reaction catalyzed by the enzyme triose-phosphate isomerase (TIM). The conformational space and energetic landscape for this reaction are analyzed without a priori assumptions about the protonation states of the different residues during the reaction. The results offer a detailed description of the reaction and reveal some new features of the catalytic mechanism. In particular, we find a new reaction mechanism that is characterized by the intramolecular proton transfer from O1 to O2 and the simultaneous proton transfer from Glu 165 to C2. Copyright © 2015 Elsevier Inc. All rights reserved.
Baldrian, Petr; López-Mondéjar, Rubén
2014-02-01
Molecular methods for the analysis of biomolecules have undergone rapid technological development in the last decade. The advent of next-generation sequencing methods and improvements in instrumental resolution enabled the analysis of complex transcriptome, proteome and metabolome data, as well as a detailed annotation of microbial genomes. The mechanisms of decomposition by model fungi have been described in unprecedented detail by the combination of genome sequencing, transcriptomics and proteomics. The increasing number of available genomes for fungi and bacteria shows that the genetic potential for decomposition of organic matter is widespread among taxonomically diverse microbial taxa, while expression studies document the importance of the regulation of expression in decomposition efficiency. Importantly, high-throughput methods of nucleic acid analysis used for the analysis of metagenomes and metatranscriptomes indicate the high diversity of decomposer communities in natural habitats and their taxonomic composition. Today, the metaproteomics of natural habitats is of interest. In combination with advanced analytical techniques to explore the products of decomposition and the accumulation of information on the genomes of environmentally relevant microorganisms, advanced methods in microbial ecophysiology should increase our understanding of the complex processes of organic matter transformation.
Griffiths, Frances; Sidebotham, Peter
2016-01-01
Objectives Improvements in our understanding of the role of modifiable risk factors for sudden infant death syndrome (SIDS) mean that previous reassurance to parents that these deaths were unpreventable may no longer be appropriate. This study aimed to learn of bereaved parents' and healthcare professionals' experiences of understanding causes of death following detailed sudden unexpected death in infancy (SUDI) investigations. The research questions were: How do bereaved parents understand the cause of death and risk factors identified during detailed investigation following a sudden unexpected infant death? What is the association between bereaved parents' mental health and this understanding? What are healthcare professionals' experiences of sharing such information with families? Design This was a mixed-methods study using a Framework Approach. Setting Specialist paediatric services. Participants Bereaved parents were recruited following detailed multiagency SUDI investigations; 21/113 eligible families and 27 professionals participated giving theoretical saturation of data. Data collection We analysed case records from all agencies, interviewed professionals and invited parents to complete the Hospital Anxiety and Depression Scale (HADS) and questionnaires or in-depth interviews. Results Nearly all bereaved parents were able to understand the cause of death and several SIDS parents had a good understanding of the relevant modifiable risk factors even when these related directly to their actions. Paediatricians worried that discussing risk factors with parents would result in parental self-blame and some deliberately avoided these discussions. Over half the families did not mention blame or blamed no one. The cause of death of the infants of these families varied. 3/21 mothers expressed overwhelming feelings of self-blame and had clinically significant scores on HADS. Conclusions Bereaved parents want detailed information about their child's death. 
Our study suggests parents want health professionals to explain the role of risk factors in SIDS. We found no evidence that sharing this information is a direct cause of parental self-blame. PMID:27198994
A method for detergent-free isolation of membrane proteins in their local lipid environment.
Lee, Sarah C; Knowles, Tim J; Postis, Vincent L G; Jamshad, Mohammed; Parslow, Rosemary A; Lin, Yu-Pin; Goldman, Adrian; Sridhar, Pooja; Overduin, Michael; Muench, Stephen P; Dafforn, Timothy R
2016-07-01
Despite the great importance of membrane proteins, structural and functional studies of these proteins present major challenges. A significant hurdle is the extraction of the functional protein from its natural lipid membrane. Traditionally achieved with detergents, purification procedures can be costly and time consuming. A critical flaw with detergent approaches is the removal of the protein from the native lipid environment required to maintain functionally stable protein. This protocol describes the preparation of styrene maleic acid (SMA) co-polymer to extract membrane proteins from prokaryotic and eukaryotic expression systems. Successful isolation of membrane proteins into SMA lipid particles (SMALPs) allows the proteins to remain with native lipid, surrounded by SMA. We detail procedures for obtaining 25 g of SMA (4 d); explain the preparation of protein-containing SMALPs using membranes isolated from Escherichia coli (2 d) and control protein-free SMALPS using E. coli polar lipid extract (1-2 h); investigate SMALP protein purity by SDS-PAGE analysis and estimate protein concentration (4 h); and detail biophysical methods such as circular dichroism (CD) spectroscopy and sedimentation velocity analytical ultracentrifugation (svAUC) to undertake initial structural studies to characterize SMALPs (∼2 d). Together, these methods provide a practical tool kit for those wanting to use SMALPs to study membrane proteins.
NASA Astrophysics Data System (ADS)
Sangeetha, M.; Mathammal, R.
2018-02-01
The ionic cocrystals of 5-amino-2-naphthalene sulfonate · ammonium ions (ANSA-·NH4+) were grown by the slow evaporation method and examined in detail for pharmaceutical applications. The crystal structure and intermolecular interactions were studied from single-crystal X-ray diffraction analysis and the Hirshfeld surfaces. The 2D fingerprint plots displayed the inter-contacts possible in the ionic crystal. A computational DFT method was used to determine the structural, physical and chemical properties. The molecular geometries obtained from the X-ray studies were compared with the optimized geometrical parameters calculated using the DFT/6-31 + G(d,p) method. The band gap energy calculated from the UV-Visible spectral analysis and the HOMO-LUMO energy gap are compared. The theoretical UV-Visible calculations helped in determining the type of electronic transition taking place in the title molecule. The maximum absorption bands and the transitions involved in the molecule indicated the possible drug reactions. Non-linear optical properties were characterized experimentally from SHG efficiency measurements, and the NLO parameters were also calculated from the optimized structure. The reactive sites within the molecule are detailed from the MEP surface maps. The molecular docking studies evidence the structure-activity of the ionic cocrystal as a potential anti-cancer drug.
Comparative evaluation of RetCam vs. gonioscopy images in congenital glaucoma.
Azad, Raj V; Chandra, Parijat; Chandra, Anuradha; Gupta, Aparna; Gupta, Viney; Sihota, Ramanjit
2014-02-01
To compare the clarity, exposure and quality of anterior chamber angle visualization in congenital glaucoma patients using RetCam and indirect gonioscopy images, a prospective, consecutive, cross-sectional pilot study was conducted in congenital glaucoma patients older than 5 years. The methods used were indirect gonioscopy and RetCam imaging. Clarity of the image, extent of angle visible and details of angle structures seen were graded for both methods, on digitally recorded images, in each eye, by two masked observers; the outcome measures were image clarity and interobserver agreement. 40 eyes of 25 congenital glaucoma patients were studied. The RetCam image had excellent clarity in 77.5% of patients versus 47.5% by gonioscopy. The extent of angle seen was similar with both methods. Agreement between RetCam and gonioscopy images regarding details of angle structures was 72.50% for observer 1 and 65.00% for observer 2. There was good agreement between RetCam and indirect gonioscopy images in detecting angle structures of congenital glaucoma patients. However, RetCam provided greater clarity, with better quality and higher magnification images. RetCam can be a useful alternative to gonioscopy in infants and small children without the need for general anesthesia.
Studying the Microanatomy of the Heart in Three Dimensions: A Practical Update
Jarvis, Jonathan C.; Stephenson, Robert
2013-01-01
The structure and function of the heart needs to be understood in three dimensions. We give a brief historical summary of the methods by which such an understanding has been sought, and some practical details of the relatively new technique of micro-CT with iodine contrast enhancement in samples from rat and rabbit. We discuss how the improved anatomical detail available in fixed cadaveric hearts will enhance our ability to model and to understand the integrated function of the cardiomyocytes, conducting tissues, and fibrous supporting structures that generate the pumping function of the heart. PMID:24400272
Comparative study of inversion methods of three-dimensional NMR and sensitivity to fluids
NASA Astrophysics Data System (ADS)
Tan, Maojin; Wang, Peng; Mao, Keyu
2014-04-01
Three-dimensional nuclear magnetic resonance (3D NMR) logging can simultaneously measure transverse relaxation time (T2), longitudinal relaxation time (T1), and diffusion coefficient (D). These parameters can be used to distinguish fluids in porous reservoirs. For 3D NMR logging, the relaxation mechanism and mathematical model, a Fredholm equation, are introduced, and the inversion methods including Singular Value Decomposition (SVD), Butler-Reeds-Dawson (BRD), and Global Inversion (GI) are studied in detail. In one simulation test, a multi-echo CPMG sequence activation is designed first, echo trains of the ideal fluid models are synthesized, then an inversion algorithm is run on these synthetic echo trains, and finally the T2-T1-D map is built. Furthermore, the SVD, BRD, and GI methods are each applied to the same fluid model, and their computing speed and inversion accuracy are compared and analyzed. When the optimal inversion method and matrix dimension are applied, the inversion results are in good agreement with the assumed fluid model, which indicates that the inversion method of 3D NMR is applicable for fluid typing of oil and gas reservoirs. Additionally, forward modeling and inversion tests are made in oil-water and gas-water models, respectively, and the sensitivity to the fluids in different magnetic field gradients is examined in detail. The effect of the magnetic gradient on fluid typing in 3D NMR logging is studied and the optimal magnetic gradient is chosen.
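The SVD inversion named above can be illustrated on a 1-D analogue: recovering a T2 distribution from a synthetic CPMG echo train by discretising the Fredholm kernel and truncating small singular values. The grid sizes, truncation level, and two-peak test model below are arbitrary assumptions, not the paper's setup.

```python
import numpy as np

t = np.linspace(0.5e-3, 1.0, 500)             # echo times, s
T2 = np.logspace(-3, 0, 60)                   # relaxation-time grid, s
K = np.exp(-t[:, None] / T2[None, :])         # discretised Fredholm kernel

# Synthetic two-peak "fluid" model plus measurement noise.
f_true = np.exp(-0.5 * ((np.log10(T2) + 2.0) / 0.15) ** 2) \
    + 0.5 * np.exp(-0.5 * ((np.log10(T2) + 0.5) / 0.15) ** 2)
rng = np.random.default_rng(1)
echo = K @ f_true + 0.001 * rng.standard_normal(len(t))

# Truncated-SVD inversion: keep only singular values above a noise cutoff.
U, s, Vt = np.linalg.svd(K, full_matrices=False)
keep = s > 1e-3 * s[0]
f_tsvd = Vt[keep].T @ ((U[:, keep].T @ echo) / s[keep])
resid = float(np.linalg.norm(K @ f_tsvd - echo) / np.linalg.norm(echo))
f_est = np.clip(f_tsvd, 0.0, None)            # crude non-negativity
print(resid, int(keep.sum()))
```

The small number of retained singular values is the signature of this ill-posed exponential kernel, and it is why the choice of truncation level (or of regularisation in the BRD and GI methods) matters so much for accuracy.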
NASA Astrophysics Data System (ADS)
Leitão, J. P.; de Sousa, L. M.
2018-06-01
Newly available, more detailed and accurate elevation data sets, such as Digital Elevation Models (DEMs) generated on the basis of imagery from terrestrial LiDAR (Light Detection and Ranging) systems or Unmanned Aerial Vehicles (UAVs), can be used to improve flood-model input data and consequently increase the accuracy of flood modelling results. This paper presents the first application of the MBlend merging method and assesses the impact of combining different DEMs on flood modelling results. It was demonstrated that different raster merging methods can have different and substantial impacts on these results. In addition to the influence associated with the method used to merge the original DEMs, the magnitude of the impact also depends on (i) the systematic horizontal and vertical differences of the DEMs, and (ii) the orientation between the DEM boundary and the terrain slope. The largest water depth and flow velocity differences between the flood modelling results obtained using the reference DEM and the merged DEMs ranged from -9.845 to 0.002 m, and from 0.003 to 0.024 m s⁻¹, respectively; these differences can have a significant impact on flood hazard estimates. In most of the cases investigated in this study, the differences from the reference DEM results were smaller for the MBlend method than for the two conventional methods. This study highlighted the importance of DEM merging when conducting flood modelling and provided hints on the best DEM merging methods to use.
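Why the choice of merging method matters can be illustrated with a toy example (MBlend itself is not reproduced here): a hard switch between two DEMs with a systematic vertical offset leaves a step at the seam, while a feathered blend ramps the offset out over an overlap zone. All sizes and offsets below are invented.

```python
import numpy as np

rows, cols = 40, 60
x = np.arange(cols, dtype=float)
dem_a = np.tile(x * 0.1, (rows, 1))       # coarse DEM of a gentle slope
dem_b = dem_a + 0.5                       # newer DEM with a +0.5 m bias
seam = 30

# Naive merge: hard switch at the seam column.
naive = np.where(np.arange(cols) < seam, dem_a, dem_b)

# Feathered merge: linear weight ramp over a 10-cell overlap zone.
w = np.clip((np.arange(cols) - (seam - 5)) / 10.0, 0.0, 1.0)
feathered = (1 - w) * dem_a + w * dem_b

# Largest cell-to-cell elevation jump along the flow direction.
step_naive = float(np.abs(np.diff(naive, axis=1)).max())
step_feathered = float(np.abs(np.diff(feathered, axis=1)).max())
print(step_naive, step_feathered)
```

A step like the naive one acts as an artificial wall or drop in a hydraulic model, which is exactly the kind of artefact that produces large water-depth differences near DEM boundaries.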
2014-01-01
Background There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. Methods This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. Results The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes, and changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. Conclusions The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. Trial registration: PROSPERO registration number CRD42013004037. PMID:24885751
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
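The Monte Carlo propagation step described above can be sketched as follows. The isentropic Mach relation is standard, but the nominal pressures and elemental uncertainty magnitudes are invented placeholders, not the facility's values; a full analysis would also draw systematic offsets once per Monte Carlo replicate rather than per sample, as done here for brevity.

```python
import numpy as np

# Hedged sketch of Monte Carlo uncertainty propagation: sample the elemental
# uncertainties of the measured pressures, push each sample through the
# data-reduction equation, and read the spread of the result.
rng = np.random.default_rng(42)
n = 100_000
gamma = 1.4  # ratio of specific heats for air

# Nominal measurements (Pa) with assumed random and systematic elemental
# uncertainties; both contributions are folded into per-sample draws here.
p_total = 200_000.0 + rng.normal(0.0, 300.0, n) + rng.normal(0.0, 150.0, n)
p_static = 101_325.0 + rng.normal(0.0, 200.0, n) + rng.normal(0.0, 100.0, n)

# Isentropic relation: M = sqrt( (2/(g-1)) * ( (pt/ps)^((g-1)/g) - 1 ) )
mach = np.sqrt((2.0 / (gamma - 1.0))
               * ((p_total / p_static) ** ((gamma - 1.0) / gamma) - 1.0))

mach_mean = mach.mean()      # propagated nominal value
mach_std = mach.std()        # combined (random + systematic) uncertainty
```

The same machinery extends to any calculated variable of interest: replace the data-reduction equation and re-sample, which is the main appeal of the Monte Carlo method over analytic Taylor-series propagation.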
Lessons in Reading Reform: Finding What Works. Technical Appendix
ERIC Educational Resources Information Center
Betts, Julian R.; Zau, Andrew C.; Koedel, Cory
2010-01-01
This technical appendix provides more detail on the reading reforms implemented under the Blueprint for Student Success project in the San Diego Unified School District (SDUSD) between 2000 and 2005. It provides details on the dataset, the econometric methods the authors employed, and the results, which are also detailed and discussed in the main…
Zhao, Zi-Fang; Li, Xue-Zhu; Wan, You
2017-12-01
The local field potential (LFP) is a signal reflecting the electrical activity of neurons surrounding the electrode tip. Synchronization between LFP signals provides important details about how neural networks are organized. Synchronization between two distant brain regions is hard to detect using linear synchronization algorithms like correlation and coherence. Synchronization likelihood (SL) is a non-linear synchronization-detecting algorithm widely used in studies of neural signals from two distant brain areas. One drawback of non-linear algorithms is the heavy computational burden. In the present study, we proposed a graphic processing unit (GPU)-accelerated implementation of an SL algorithm with optional 2-dimensional time-shifting. We tested the algorithm with both artificial data and raw LFP data. The results showed that this method revealed detailed information from original data with the synchronization values of two temporal axes, delay time and onset time, and thus can be used to reconstruct the temporal structure of a neural network. Our results suggest that this GPU-accelerated method can be extended to other algorithms for processing time-series signals (like EEG and fMRI) using similar recording techniques.
Conceptual design of a Bitter-magnet toroidal-field system for the ZEPHYR Ignition Test Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, J.E.C.; Becker, H.D.; Bobrov, E.S.
1981-05-01
The following problems are described and discussed: (1) parametric studies - these studies examine among other things the interdependence of throat stresses, plasma parameters (margins of ignition) and stored energy. The latter is a measure of cost and is minimized in the present design; (2) magnet configuration - the shape of the plates is considered in detail, including standard turns, turns located at beam ports, diagnostic and closure flanges; (3) ripple computation - this section describes the codes by which ripple is computed; (4) field diffusion and nuclear heating - the effect of magnetic field diffusion on heating is considered along with neutron heating. Current, field and temperature profiles are computed; (5) finite element analysis - the two and three dimensional finite element codes are described and the results discussed in detail; (6) structures engineering - this considers the calculation of critical stresses due to toroidal and overturning forces and discusses the method of constraint of these forces. The Materials Testing Program is also discussed; (7) fabrication - the methods available for the manufacture of the constituent parts of the Bitter plates, the method of assembly and remote maintenance are summarized.
NASA Astrophysics Data System (ADS)
Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi
2017-01-01
Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.
This compilation of field collection standard operating procedures (SOPs) was assembled for the U.S. Environmental Protection Agency’s (EPA) Pilot Study add-on to the Green Housing Study (GHS). A detailed description of this add-on study can be found in the peer reviewed research...
Image superresolution by midfrequency sparse representation and total variation regularization
NASA Astrophysics Data System (ADS)
Xu, Jian; Chang, Zhiguo; Fan, Jiulun; Zhao, Xiaoqiang; Wu, Xiaomin; Wang, Yanzi
2015-01-01
Machine learning has provided many good tools for superresolution, whereas existing methods still need to be improved in many aspects. On one hand, the memory and time cost should be reduced. On the other hand, the step edges of the results obtained by the existing methods are not clear enough. We address both issues. First, we propose a method to extract the midfrequency features for dictionary learning. This method brings the benefit of a reduction of the memory and time complexity without sacrificing the performance. Second, we propose a detailed wiping-off total variation (DWO-TV) regularization model to reconstruct the sharp step edges. This model adds a novel constraint on the downsampling version of the high-resolution image to wipe off the details and artifacts and sharpen the step edges. Finally, step edges produced by the DWO-TV regularization and the details provided by learning are fused. Experimental results show that the proposed method offers a desirable compromise between low time and memory cost and the reconstruction quality.
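A hedged sketch of the edge-preserving total-variation machinery the DWO-TV model builds on: gradient descent on a smoothed TV objective, applied to a noisy step edge. The DWO-TV downsampling constraint itself is not reproduced; parameters are illustrative.

```python
import numpy as np

def tv_denoise(y, lam=0.15, eps=1e-3, step=0.2, iters=200):
    """Minimize 0.5*||x-y||^2 + lam*TV(x) with a smoothed TV term."""
    x = y.copy()
    for _ in range(iters):
        gx = np.diff(x, axis=1, append=x[:, -1:])   # horizontal gradient
        gy = np.diff(x, axis=0, append=x[-1:, :])   # vertical gradient
        mag = np.sqrt(gx**2 + gy**2 + eps)          # smoothed gradient norm
        px, py = gx / mag, gy / mag
        # divergence of the normalized gradient field
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        x -= step * ((x - y) - lam * div)           # gradient step
    return x

rng = np.random.default_rng(1)
step_edge = np.zeros((32, 32))
step_edge[:, 16:] = 1.0                             # a sharp step edge
noisy = step_edge + rng.normal(0.0, 0.2, step_edge.shape)
denoised = tv_denoise(noisy)

err_noisy = np.mean((noisy - step_edge) ** 2)
err_denoised = np.mean((denoised - step_edge) ** 2)
```

TV regularization penalizes total gradient magnitude rather than squared gradients, so it smooths flat regions while leaving a single sharp step largely intact; this is the property the paper exploits and then sharpens further with its downsampling constraint.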
X-ray phase-contrast tomography for high-spatial-resolution zebrafish muscle imaging
NASA Astrophysics Data System (ADS)
Vågberg, William; Larsson, Daniel H.; Li, Mei; Arner, Anders; Hertz, Hans M.
2015-11-01
Imaging of muscular structure with cellular or subcellular detail in whole-body animal models is of key importance for understanding muscular disease and assessing interventions. Classical histological methods for high-resolution imaging methods require excision, fixation and staining. Here we show that the three-dimensional muscular structure of unstained whole zebrafish can be imaged with sub-5 μm detail with X-ray phase-contrast tomography. Our method relies on a laboratory propagation-based phase-contrast system tailored for detection of low-contrast 4-6 μm subcellular myofibrils. The method is demonstrated on 20 days post fertilization zebrafish larvae and comparative histology confirms that we resolve individual myofibrils in the whole-body animal. X-ray imaging of healthy zebrafish show the expected structured muscle pattern while specimen with a dystrophin deficiency (sapje) displays an unstructured pattern, typical of Duchenne muscular dystrophy. The method opens up for whole-body imaging with sub-cellular detail also of other types of soft tissue and in different animal models.
Liu, M; Wei, L; Zhang, J
2006-01-01
Missing data in clinical trials are inevitable. We highlight the ICH guidelines and CPMP points to consider on missing data. Specifically, we outline how we should consider missing data issues when designing, planning and conducting studies to minimize missing data impact. We also go beyond the coverage of the above two documents, provide a more detailed review of the basic concepts of missing data and frequently used terminologies, and examples of the typical missing data mechanism, and discuss technical details and literature for several frequently used statistical methods and associated software. Finally, we provide a case study where the principles outlined in this paper are applied to one clinical program at protocol design, data analysis plan and other stages of a clinical trial.
Holzhauser, Thomas; Ree, Ronald van; Poulsen, Lars K; Bannon, Gary A
2008-10-01
There is detailed guidance on how to perform bioinformatic analyses and enzymatic degradation studies for genetically modified crops under consideration for approval by regulatory agencies; however, there is no consensus in the scientific community on the details of how to perform IgE serum studies. IgE serum studies are an important safety component to acceptance of genetically modified crops when the introduced protein is novel, the introduced protein is similar to known allergens, or the crop is allergenic. In this manuscript, we describe the characteristics of the reagents, validation of assay performance, and data analysis necessary to optimize the information obtained from serum testing of novel proteins and genetically modified (GM) crops and to make results more accurate and comparable between different investigations.
Applying inoculation theory to the study of recidivism reduction in criminal prison inmates.
Matusitz, Jonathan; Breen, Gerald-Mark
2013-10-01
The purpose of the authors through this study is to establish inoculation theory as a viable method in the prevention or reduction of recidivism in criminal prison inmate populations in the United States. The authors begin with a detailed literature review on inoculation. They also describe, in detail, recidivism in prisons. In doing so, they provide a series of interconnected topics, such as the total number of inmates in U.S. prisons, statistical displays of repeat offenders or subjects of recidivism, and the types of crimes often repeated by convicted criminals. What follows is an explication of how inoculation theory can be applied in the context of reducing prisoner recidivism. The authors conclude this study with a discussion section that offers suggestions for future research.
Information Based Numerical Practice.
1987-02-01
characterization by comparative computational studies of various benchmark problems. See e.g. [MacNeal, Harder (1985)], [Robinson, Blackham (1981)] ... FOR NONADAPTIVE METHODS 2.1. THE QUADRATURE FORMULA The simplest example studied in detail in the literature is the problem of the optimal quadrature ... formulae and the functional analytic prerequisites for the study of optimal formulae, we refer to the large monograph (808 pp.) of [Sobolev (1974)]. Let us
Calysto: Risk Management for Commercial Manned Spaceflight
NASA Technical Reports Server (NTRS)
Dillaman, Gary
2012-01-01
The Calysto: Risk Management for Commercial Manned Spaceflight study analyzes risk management in large enterprises and how to effectively communicate risks across organizations. The Calysto Risk Management tool developed by NASA's Kennedy Space Center's SharePoint team is used and referenced throughout the study. Calysto is a web-based tool built on Microsoft's SharePoint platform. The risk management process at NASA is examined and incorporated in the study. Using risk management standards from industry and specific organizations at the Kennedy Space Center, three methods of communicating and elevating risk are examined. Each method is assessed for its effectiveness and the plausibility of its use in the Calysto Risk Management Tool. At the end of the study, suggestions are made for future versions of Calysto.
Spaceflight and Immune Responses of Rhesus Monkeys
NASA Technical Reports Server (NTRS)
Sonnenfeld, Gerald
1997-01-01
In the grant period, we perfected techniques for determination of interleukin production and leukocyte subset analysis of rhesus monkeys. These results are outlined in detail in publication number 2, appended to this report. Additionally, we participated in the ARRT restraint test to determine if restraint conditions for flight in the Space Shuttle could contribute to any effects of space flight on immune responses. All immunological parameters listed in the methods section were tested. Evaluation of the data suggests that the restraint conditions had minimal effects on the results observed, but handling of the monkeys could have had some effect. These results are outlined in detail in manuscript number 3, appended to this report. Additionally, to help us develop our rhesus monkey immunology studies, we carried out preliminary studies in mice to determine the effects of stressors on immunological parameters. We were able to show that there were gender-based differences in the response of immunological parameters to a stressor. These results are outlined in detail in manuscript number 4, appended to this report.
Roberts-Ashby, Tina; Ashby, Brandon N.
2016-01-01
This paper demonstrates a geospatial modification of the USGS methodology for assessing geologic CO2 storage resources, applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin. The study provides detailed evaluation of porous intervals within these reservoirs and utilizes GIS to evaluate the potential spatial distribution of reservoir parameters and the volume of CO2 that can be stored. This study also shows that incorporating spatial variation of parameters using detailed and robust datasets may improve estimates of storage resources when compared to applying uniform values, derived from small datasets, across the study area, as many assessment methodologies do. Geospatially derived estimates of storage resources presented here (Pre-Punta Gorda Composite = 105,570 MtCO2; Dollar Bay = 24,760 MtCO2) were greater than previous assessments, which was largely attributed to the fact that detailed evaluation of these reservoirs resulted in higher estimates of porosity and net-porous thickness, and areas of high porosity and thick net-porous intervals were incorporated into the model, likely increasing the calculated volume of storage space available for CO2 sequestration. The geospatial method for evaluating CO2 storage resources also provides the ability to identify areas that potentially contain higher volumes of storage resources, as well as areas that might be less favorable.
Cylinder expansion test and gas gun experiment comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrier, Danielle
This summer internship presentation by the Hydro Working Group at Los Alamos National Laboratory (LANL) compares the cylinder expansion test with the gas gun experiment. It describes each method and its applications, compares the two methods (including their pros, cons, and limitations), details the summer project, and discusses future work.
Perceptual Learning of Acoustic Noise by Individuals with Dyslexia
ERIC Educational Resources Information Center
Agus, Trevor R.; Carrión-Castillo, Amaia; Pressnitzer, Daniel; Ramus, Franck
2014-01-01
Purpose: A phonological deficit is thought to affect most individuals with developmental dyslexia. The present study addresses whether the phonological deficit is caused by difficulties with perceptual learning of fine acoustic details. Method: A demanding test of nonverbal auditory memory, "noise learning," was administered to both…
Computer-Based Instruction in Dietetics Education.
ERIC Educational Resources Information Center
Schroeder, Lois; Kent, Phyllis
1982-01-01
Details the development and system design of a computer-based instruction (CBI) program designed to provide tutorial training in diet modification as part of renal therapy and provides the results of a study that compared the effectiveness of the CBI program with the traditional lecture/laboratory method. (EAO)
Communicative Language Teaching in the Chinese Environment
ERIC Educational Resources Information Center
Hu, Wei
2010-01-01
In order to explore effective ways to develop Chinese English learners' communicative competence, this study first briefly reviews the advantages of the communicative language teaching (CLT) method, which is widely practiced in Western countries, and analyzes in detail the obstacles it faces in the Chinese classroom context. Then it offers guidelines for…
Singh, Rina; Singh, Jagjit; Singh, Ramanpreet; Nanda, Sonia
2013-01-01
Objective: To determine the effect of different retraction cord medicaments on surface detail reproduction of polyvinyl siloxane impression materials and to compare this effect between two brands of commercially available polyvinyl siloxane impression materials. Materials and methods: Four stainless steel dies were made according to ADA specification no. 19. Three dies were treated with aluminium chloride (5%), ferric sulphate (13.3%) and epinephrine (0.1%), while the fourth was left untreated to serve as control. Two impression materials (Dentsply and 3M ESPE) were used. Results: All three medicaments adversely affected the surface detail reproduction of both brands of polyvinyl siloxane impression material. These effects were statistically significant compared to the untreated control. Impressions made with the 3M ESPE material showed better surface detail reproduction than those made with the Dentsply material. Conclusion: Surface detail reproduction of polyvinyl siloxane impression materials is adversely affected by retraction cord medicaments. Moisture or any traces of the medicaments should be removed from the tooth surface to provide a dry field for correct reproduction of surface detail. Key words: Polyvinyl siloxane, retraction cord medicaments, surface detail reproduction. PMID:24455069
NASA Astrophysics Data System (ADS)
Rao, K. H. S.; Shah, A. v.; Ruedi, B.
1982-11-01
The importance of ovulation time detection in the practice of Natural Birth Control (NBC) as a contraceptive tool, and for natural/artificial insemination among women with infertility, is well known. The simple Basal Body Temperature (BBT) method of ovulation detection is so far unreliable. A newly proposed Differential Skin Temperature (DST) method may help minimize disturbing physiological effects and improve reliability. This paper presents preliminary results of a detailed correlative study on the DST method, using Infrared Thermography (IRT) imaging and computer analysis techniques. Results obtained with five healthy, normally menstruating women volunteers will be given.
Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...
2016-09-18
This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.
A generalization of random matrix theory and its application to statistical physics.
Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H
2017-02-01
To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples, inflation rates and air pressure data for 95 US cities.
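A hedged sketch of the standard RMT baseline that ARRMT generalizes: for N uncorrelated series of length T, the correlation-matrix eigenvalues should fall inside the Marchenko-Pastur band. Autocorrelated series (an AR(1) process below, with an illustrative coefficient) push eigenvalues outside the band even with no true cross-correlation, which is exactly the effect ARRMT is designed to account for. ARRMT's own correction is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 50, 2000
q = N / T
lam_max = (1 + np.sqrt(q)) ** 2          # Marchenko-Pastur upper edge

def top_eigenvalue(series):
    """Largest eigenvalue of the sample cross-correlation matrix."""
    series = series - series.mean(axis=1, keepdims=True)
    series = series / series.std(axis=1, keepdims=True)
    corr = series @ series.T / T
    return np.linalg.eigvalsh(corr)[-1]  # eigvalsh returns ascending order

# N independent white-noise series: spectrum stays near the MP band.
white = rng.normal(size=(N, T))

# N independent AR(1) series: auto-correlation alone widens the spectrum.
phi = 0.9
ar1 = np.zeros((N, T))
shocks = rng.normal(size=(N, T))
for t in range(1, T):
    ar1[:, t] = phi * ar1[:, t - 1] + shocks[:, t]

top_white = top_eigenvalue(white)
top_ar1 = top_eigenvalue(ar1)
```

Comparing `top_ar1` against `lam_max` shows why naive RMT filtering would flag spurious "cross-correlation" modes in autocorrelated data.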
Functional phosphoproteomic mass spectrometry-based approaches
2012-01-01
Mass Spectrometry (MS)-based phosphoproteomics tools are crucial for understanding the structure and dynamics of signaling networks. Approaches such as affinity purification followed by MS have also been used to elucidate relevant biological questions in health and disease. The study of proteomes and phosphoproteomes as linked systems, rather than research studies of individual proteins, are necessary to understand the functions of phosphorylated and un-phosphorylated proteins under spatial and temporal conditions. Phosphoproteome studies also facilitate drug target protein identification which may be clinically useful in the near future. Here, we provide an overview of general principles of signaling pathways versus phosphorylation. Likewise, we detail chemical phosphoproteomic tools, including pros and cons with examples where these methods have been applied. In addition, basic clues of electrospray ionization and collision induced dissociation fragmentation are detailed in a simple manner for successful phosphoproteomic clinical studies. PMID:23369623
2018-02-12
Fuel stability and performance problems are often due to the presence of trace levels of contaminants or other minor changes in composition. Detailed compositional analyses of suspect fuels are often critical to the determination of the cause(s) of the problem(s) at hand. Sensitive methods to compare fuel compositions via GC-MS methods are available, but the detailed compositional analyses of
NASA Technical Reports Server (NTRS)
Rees, T. H.; Suttles, J. T.
1972-01-01
A computer study was conducted to compare the numerical behavior of two approaches to describing the thermodynamic properties of oxygen near the critical point. Data on the relative differences between values of specific heat at constant pressure (c_p), density, and isotherm and isochor derivatives of the equation of state are presented for selected supercritical pressures at temperatures in the range 100 to 300 K. The results of a more detailed study of the c_p representations afforded by the two methods are also presented.
NASA Astrophysics Data System (ADS)
Slathia, Goldy; Raina, Bindu; Gupta, Rashmi; Bamzai, K. K.
2018-05-01
The synthesis of a samarium chloride coordination single crystal was carried out at room temperature by the slow evaporation method. The crystal possesses a well-defined hexagonal morphology with six symmetrically equivalent growth sectors separated by growth boundaries. The theoretical morphology has been established by a structural approach using the Bravais-Friedel-Donnay-Harker (BFDH) law. Fourier transform infrared spectroscopy was carried out in order to study the geometry and structure of the crystal. A detailed thermogravimetric analysis elucidates the thermal stability of the complex.
Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations
NASA Astrophysics Data System (ADS)
Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.
2018-02-01
The method called "PVI" (Partial Variance of Increments) has been increasingly used in analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading to the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper will summarize key features of the method and provide a synopsis of the main results obtained by various groups using the method. This will enable new users or those considering methods of this type to find details and background collected in one place.
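The PVI index itself is compact enough to sketch: PVI(t, tau) = |dB(t, tau)| / sqrt(<|dB|^2>), where dB is the vector increment of the field at lag tau and <.> averages over the interval. The synthetic "magnetic field" below is a random walk with one embedded discontinuity, purely for illustration.

```python
import numpy as np

def pvi(b, tau):
    """PVI series for a (n, 3) vector time series at lag tau."""
    db = b[tau:] - b[:-tau]                        # vector increments
    mag = np.linalg.norm(db, axis=1)               # |dB(t, tau)|
    return mag / np.sqrt(np.mean(mag ** 2))        # normalize by rms increment

rng = np.random.default_rng(3)
n = 10_000
b = np.cumsum(rng.normal(size=(n, 3)), axis=0)     # random-walk turbulence proxy
b[5000] += np.array([40.0, 0.0, 0.0])              # embedded sharp discontinuity

series = pvi(b, tau=1)
peak_index = int(np.argmax(series))                # flags the discontinuity
```

Thresholding the PVI series (e.g. PVI > 3) is the usual way candidate coherent structures such as current sheets are selected for further study.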
Neonatal Atlas Construction Using Sparse Representation
Shi, Feng; Wang, Li; Wu, Guorong; Li, Gang; Gilmore, John H.; Lin, Weili; Shen, Dinggang
2014-01-01
Atlas construction generally includes first an image registration step to normalize all images into a common space and then an atlas building step to fuse the information from all the aligned images. Although numerous atlas construction studies have been performed to improve the accuracy of the image registration step, an unweighted or simply weighted average is often used in the atlas building step. In this article, we propose a novel patch-based sparse representation method for atlas construction after all images have been registered into the common space. By taking advantage of local sparse representation, more anatomical details can be recovered in the built atlas. To make the anatomical structures spatially smooth in the atlas, anatomical feature constraints on the group structure of representations and also the overlapping of neighboring patches are imposed to ensure the anatomical consistency between neighboring patches. The proposed method has been applied to 73 neonatal MR images with poor spatial resolution and low tissue contrast, for constructing a neonatal brain atlas with sharp anatomical details. Experimental results demonstrate that the proposed method can significantly enhance the quality of the constructed atlas by discovering more anatomical details, especially in the highly convoluted cortical regions. The resulting atlas demonstrates superior performance when applied to spatially normalizing three different neonatal datasets, compared with other state-of-the-art neonatal brain atlases. PMID:24638883
Mapping protein-RNA interactions by RCAP, RNA-cross-linking and peptide fingerprinting.
Vaughan, Robert C; Kao, C Cheng
2015-01-01
RNA nanotechnology constructs often feature protein-RNA complexes. The interactions between proteins and large RNAs are difficult to study using traditional structure-based methods like NMR or X-ray crystallography. RCAP, an approach that couples a reversible cross-linking affinity purification method with mass spectrometry, has been developed to map regions within proteins that contact RNA. This chapter details how RCAP is applied to map protein-RNA contacts within virions.
NASA Technical Reports Server (NTRS)
Chaney, William S.
1961-01-01
A theoretical study has been made of molybdenum dioxide and molybdenum trioxide in order to extend the knowledge of factors involved in the oxidation of molybdenum. New methods were developed for calculating the lattice energies based on electrostatic valence theory, and the coulombic, polarization, Van der Waals, and repulsion energies were calculated. The crystal structure was examined and structural details were correlated with lattice energy.
Pseudo-color coding method for high-dynamic single-polarization SAR images
NASA Astrophysics Data System (ADS)
Feng, Zicheng; Liu, Xiaolin; Pei, Bingzhi
2018-04-01
A raw synthetic aperture radar (SAR) image usually has a 16-bit or higher bit depth, which cannot be directly visualized on 8-bit displays. In this study, we propose a pseudo-color coding method for high-dynamic single-polarization SAR images. The method considers the characteristics of both SAR images and human perception. In HSI (hue, saturation and intensity) color space, the method carries out high-dynamic range tone mapping and pseudo-color processing simultaneously in order to avoid loss of details and to improve object identifiability. It is a highly efficient global algorithm.
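A hedged sketch of the two steps the method combines: a logarithmic tone mapping of 16-bit amplitudes into [0, 1], followed by a pseudo-color assignment in a hue-based color space. The paper works in HSI space with its own (unpublished-here) mapping; the HSV variant and the hue ramp below are illustrative stand-ins.

```python
import colorsys
import numpy as np

def tone_map(raw):
    """Compress 16-bit dynamic range logarithmically into [0, 1]."""
    raw = raw.astype(np.float64)
    return np.log1p(raw) / np.log1p(65535.0)

def pseudo_color(mapped):
    """Assign colors in HSV space: hue sweeps blue (dark) to red (bright)."""
    h, w = mapped.shape
    rgb = np.empty((h, w, 3))
    for i in range(h):
        for j in range(w):
            v = mapped[i, j]
            rgb[i, j] = colorsys.hsv_to_rgb(0.66 * (1.0 - v), 1.0, v)
    return rgb

# A tiny 16-bit "SAR" tile spanning the full dynamic range.
raw = np.array([[0, 255], [4095, 65535]], dtype=np.uint16)
mapped = tone_map(raw)
image = pseudo_color(mapped)
```

Coupling hue to the tone-mapped intensity is what lets low-amplitude detail stay distinguishable after the dynamic range is compressed, rather than collapsing into a narrow band of grays.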
Nie, Yifan; Liang, Chaoping; Cha, Pil-Ryung; Colombo, Luigi; Wallace, Robert M; Cho, Kyeongjae
2017-06-07
Controlled growth of crystalline solids is critical for device applications, and atomistic modeling methods have been developed for bulk crystalline solids. The kinetic Monte Carlo (KMC) simulation method captures detailed atomic-scale processes during solid growth over realistic time scales, but its application to the growth modeling of van der Waals (vdW) heterostructures has not yet been developed. Specifically, the growth of single-layered transition metal dichalcogenides (TMDs) is currently facing tremendous challenges, and a detailed understanding based on KMC simulations would provide critical guidance to enable controlled growth of vdW heterostructures. In this work, a KMC simulation method is developed for the growth modeling of the vdW epitaxy of TMDs. The KMC method incorporates the full set of material parameters for TMDs in bottom-up synthesis: metal and chalcogen adsorption/desorption/diffusion on the substrate and the grown TMD surface, TMD stacking sequence, chalcogen/metal ratio, flake edge diffusion and vacancy diffusion. The KMC processes result in multiple kinetic behaviors associated with various growth behaviors observed in experiments. Different phenomena observed during the vdW epitaxy process are analysed in terms of complex competitions among multiple kinetic processes. The KMC method is used in the investigation and prediction of growth mechanisms, which provides qualitative suggestions to guide experimental study.
A comparative study of new and current methods for dental micro-CT image denoising
Lashgari, Mojtaba; Qin, Jie; Swain, Michael
2016-01-01
Objectives: The aim of the current study was to evaluate the application of two advanced noise-reduction algorithms for dental micro-CT images and to implement a comparative analysis of the performance of new and current denoising algorithms. Methods: Denoising was performed using Gaussian and median filters as the current filtering approaches and the block-matching and three-dimensional (BM3D) method and total variation method as the proposed new filtering techniques. The performance of the denoising methods was evaluated quantitatively using contrast-to-noise ratio (CNR), edge preserving index (EPI) and blurring indexes, as well as qualitatively using the double-stimulus continuous quality scale procedure. Results: The BM3D method had the best performance with regard to preservation of fine textural features (CNR-edge), non-blurring of the whole image (blurring index), the clinical visual score in images with very fine features and the overall visual score for all types of images. On the other hand, the total variation method provided the best results with regard to smoothing of images in texture-free areas (CNR-tex-free) and in preserving the edges and borders of image features (EPI). Conclusions: The BM3D method is the most reliable technique for denoising dental micro-CT images with very fine textural details, such as shallow enamel lesions, in which the preservation of the texture and fine features is of the greatest importance. On the other hand, the total variation method is the technique of choice for denoising images without very fine textural details in which the clinician or researcher is interested mainly in anatomical features and structural measurements. PMID:26764583
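The contrast-to-noise ratio used to score the filters can be sketched directly; a common definition is CNR = |mean(feature) - mean(background)| / std(background). The synthetic image, ROIs, and the crude 3x3 mean filter below are illustrative stand-ins for the real micro-CT slices and the denoisers compared in the study.

```python
import numpy as np

def cnr(image, feature_mask, background_mask):
    """Contrast-to-noise ratio between a feature ROI and a background ROI."""
    fg = image[feature_mask]
    bg = image[background_mask]
    return abs(fg.mean() - bg.mean()) / bg.std()

rng = np.random.default_rng(5)
img = rng.normal(100.0, 10.0, (64, 64))         # noisy background
img[20:40, 20:40] += 30.0                       # a brighter "enamel" feature

feature = np.zeros((64, 64), dtype=bool)
feature[20:40, 20:40] = True
background = ~feature

noisy_cnr = cnr(img, feature, background)

# A crude 3x3 mean filter as a stand-in for the denoisers under comparison.
kernel = np.ones((3, 3)) / 9.0
pad = np.pad(img, 1, mode="edge")
smoothed = sum(pad[di:di + 64, dj:dj + 64] * kernel[di, dj]
               for di in range(3) for dj in range(3))
denoised_cnr = cnr(smoothed, feature, background)
```

Any denoiser that suppresses background noise faster than it erodes feature contrast raises the CNR; the study's point is that BM3D and total variation achieve this trade-off differently depending on how much fine texture the image contains.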
Resolution studies with the DATURA beam telescope
NASA Astrophysics Data System (ADS)
Jansen, H.
2016-12-01
Detailed studies of the resolution of an EUDET-type beam telescope are carried out using the DATURA beam telescope as an example. The EUDET-type beam telescopes make use of CMOS MIMOSA 26 pixel detectors for particle tracking, allowing for precise characterisation of particle-sensing devices. A profound understanding of the performance of the beam telescope as a whole is obtained from a detailed characterisation of the sensors themselves. The differential intrinsic resolution measured in a MIMOSA 26 sensor is extracted using an iterative pull method, and various quantities that depend on the size of the cluster produced by a traversing charged particle are discussed: the residual distribution, the intra-pixel residual-width distribution and the intra-pixel density distribution of track incident positions.
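A minimal sketch of the relation underlying such resolution studies: assuming the measured residual width is the sensor's intrinsic resolution and the telescope's track pointing resolution added in quadrature, the intrinsic term follows by quadrature subtraction. The iterative pull method in the paper determines these contributions self-consistently rather than in a single subtraction; this function is illustrative only.

```python
import math

def intrinsic_resolution(sigma_residual, sigma_track):
    """Recover the intrinsic sensor resolution from the measured unbiased
    residual width, assuming sigma_residual^2 = sigma_int^2 + sigma_track^2
    (all quantities in the same units, e.g. micrometres)."""
    return math.sqrt(sigma_residual**2 - sigma_track**2)
```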
NASA Astrophysics Data System (ADS)
Abidin, Nurul Hafizah Zainal; Mokhtar, Nor Fadzillah Mohd; Majid, Zanariah Abdul; Ghani, Siti Salwa Abd
2017-11-01
Temperature-dependent viscosity and the Coriolis force are applied to steady Benard-Marangoni convection in a horizontal layer of a binary mixture heated from below and cooled from above. The purpose of this paper is to study in detail the onset of convection under these effects. Several combinations of boundary conditions are studied, namely rigid-rigid, rigid-free and free-free for the lower and upper boundaries. A detailed numerical calculation of the marginal stability curves is performed using the Galerkin method, and it is shown that temperature-dependent viscosity and the Soret number destabilize the binary fluid layer system, while the Taylor number acts in the opposite, stabilizing sense.
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.
Dark and bright-state polaritons in triple- Λ EIT system
NASA Astrophysics Data System (ADS)
Selvan, Karthick
2018-04-01
Properties of polaritons in the triple-Λ EIT system are investigated using the Sawada-Brout-Chong method. The role of dark- and bright-state polaritons in the dynamics of the system is studied in detail, including the decay of the excited atomic levels. The time evolution of the entanglement between the single- and three-photon EIT modes within the system is also investigated to complement this study.
Servant-Leader Development in an Adult Accelerated Degree Completion Program: A Mixed-Methods Study
ERIC Educational Resources Information Center
Anderson, Angela R.
2009-01-01
Although many private Christian liberal arts programs exist today that seek to foster servant-leader (SL) development within their students, there is a void of both literature and data that details how servant-leadership development occurs and what contexts may be appropriate or necessary for this development. The purpose of this study was to…
Energetics of feeding, social behavior, and life history in non-human primates.
Emery Thompson, Melissa
2017-05-01
Energy is a variable of key importance to a wide range of research in primate behavioral ecology, life history, and conservation. However, obtaining detailed data on variation in energetic condition, and its biological consequences, has been a considerable challenge. In the past 20 years, tremendous strides have been made towards non-invasive methods for monitoring the physiology of animals in their natural environment. These methods provide detailed, individualized data about energetic condition, as well as energy allocations to growth, reproduction, and somatic health. In doing so, they add much-needed resolution by which to move beyond correlative studies to research programs that can discriminate causes from effects and disaggregate multiple correlated features of the social and physical environment. In this review, I describe the conceptual and methodological approaches for studying primate energetics. I then discuss the core questions about primate feeding ecology, social behavior, and life history that can benefit from physiological studies, highlighting the ways in which recent research has done so. Among these are studies that test, and often refute, common assumptions about how feeding ecology shapes primate biology, and those that reveal proximate associations between energetics and reproductive strategies. Copyright © 2016 Elsevier Inc. All rights reserved.
Squiers, Linda; Brown, Derick; Parvanta, Sarah; Dolina, Suzanne; Kelly, Bridget; Dever, Jill; Southwell, Brian G; Sanders, Amy; Augustson, Erik
2016-06-27
Text messaging (short message service, SMS) has been shown to be effective in delivering interventions for various diseases and health conditions, including smoking cessation. While there are many published studies of smoking cessation text messaging interventions, most do not provide details about their operational methods. As a result, there is a gap in our understanding of how best to design studies of smoking cessation text messaging programs. The purpose of this paper is to detail the operational methods used to conduct a randomized trial comparing three different versions of the National Cancer Institute's SmokefreeText (SFTXT) program, designed for smokers 18 to 29 years of age. We detail our methods for recruiting participants from the Internet, reducing fraud, conducting online data collection, and retaining panel study participants. Participants were recruited through website advertisements and market research online panels. Screening questions established eligibility for the study (eg, 18 to 29 years of age, current smoker). Antifraud measures screened out participants who could not meet the study requirements. After completing a baseline survey, participants were randomized to one of three study arms, which varied by type and timing of text message delivery. The study offered US $20 gift cards as incentives to complete each of four follow-up surveys. Automated email reminders were sent at designated intervals to increase response rates. Researchers also provided telephone reminders to those who had not completed the survey after multiple email reminders. We calculated participation rates across study arms and compared the final sample characteristics to the Current Population Survey to examine generalizability. Recruitment methods drove 153,936 unique visitors to the SFTXT Study landing page, and 27,360 began the screener. Based on the screening questions, 15,462 of the 27,360 responders (56.51%) were eligible to participate.
Of the 15,462 who were eligible, 9486 passed the antifraud measures that were implemented; however, 3882 failed to verify their email addresses or cell phone numbers, leaving 5604 who were invited to complete the baseline survey. Of the 5604 who were invited, 4432 completed the baseline survey, but only 4027 were retained for analysis because 405 did not receive the intervention. Although the antifraud measures helped to catch participants who failed study requirements and could have biased the data collected, it is possible that the email and cell phone verification check excluded some potentially eligible participants from the study. Future research should explore ways to implement verification methods without risking the loss of so many potential participants. ClinicalTrials.gov NCT01885052; https://clinicaltrials.gov/ct2/show/NCT01885052 (Archived by WebCite at http://www.webcitation.org/6iWzcmFdw).
A new automated passive capillary lysimeter for logging real-time drainage water fluxes
USDA-ARS?s Scientific Manuscript database
Effective monitoring of chemical transport through the soil profile requires accurate and appropriate instrumentation to measure drainage water fluxes below the root zone of cropping system. The objectives of this study were to methodically describe in detail the construction and installation of a n...
ERIC Educational Resources Information Center
Redmond, Sean M.; Ash, Andrea C.; Hogan, Tiffany P.
2015-01-01
Purpose: Co-occurring attention-deficit/hyperactivity disorder (ADHD) and communication disorders represent a frequently encountered challenge for school-based practitioners. The purpose of the present study was to examine in more detail the clinical phenomenology of co-occurring ADHD and language impairments (LIs). Method: Measures of nonword…
NASA Technical Reports Server (NTRS)
Nusinov, M. D.; Kochnev, V. A.; Chernyak, Y. B.; Kuznetsov, A. V.; Kosolapov, A. I.; Yakovlev, O. I.
1974-01-01
Study of evaporation, condensation, and sputtering on the Moon can provide information on the same processes on other planets and reveal details of the formation of the lunar regolith. Simulation methods include vacuum evaporation, laser evaporation, and bubbling gas through melts.
ERIC Educational Resources Information Center
Burkholder, Jessica Reno
2014-01-01
A modified version of Moustakas' (1994) method of analyzing phenomenological data was used to illuminate how full-time, single, Turkish international graduate students conceptualized their experiences as international students. The participants detailed common and salient aspects of their experience: personal growth, decisions regarding…
Abusive Head Trauma: A Perpetrator Confesses
ERIC Educational Resources Information Center
Bell, Erica; Shouldice, Michelle; Levin, Alex V.
2011-01-01
Objectives: To present a detailed confession from a perpetrator of Shaken Baby syndrome. Methods: Case study. Results: We present a confession of Shaken Baby syndrome describing how the perpetrator severely injured a 3 year old with repeated bursts of acceleration-deceleration (shaking). The child sustained retinal and intracranial hemorrhage.…
The synthesis of brassinosteroids, a new class of plant hormones
NASA Astrophysics Data System (ADS)
Lakhvich, Fedor A.; Khripach, Vladimir A.; Zhabinskii, Vladimir N.
1991-06-01
Data on methods for the synthesis of brassinosteroids are collated. In view of their extremely low reported content in nature, chemical synthesis from other natural steroids is assessed as the main route to obtaining these hormones for detailed study and practical use. The bibliography contains 224 references.
School Quality and Learning Gains in Rural Guatemala
ERIC Educational Resources Information Center
Marshall, Jeffery H.
2009-01-01
I use unusually detailed data on schools, teachers and classrooms to explain student achievement growth in rural Guatemala. Several variables that have received little attention in previous studies--including the number of school days, teacher content knowledge and pedagogical methods--are robust predictors of achievement. A series of…
Estimating forest characteristics using NAIP imagery and ArcObjects
John S Hogland; Nathaniel M. Anderson; Woodam Chung; Lucas Wells
2014-01-01
Detailed, accurate, efficient, and inexpensive methods of estimating basal area, trees, and aboveground biomass per acre across broad extents are needed to effectively manage forests. In this study we present such a methodology using readily available National Agriculture Imagery Program imagery, Forest Inventory Analysis samples, a two stage classification and...
Language Policy and Language Planning in Cyprus
ERIC Educational Resources Information Center
Hadjioannou, Xenia; Tsiplakou, Stavroula; Kappler, Matthias
2011-01-01
The aim of this monograph is to provide a detailed account of language policy and language planning in Cyprus. Using both historical and synchronic data and adopting a mixed-methods approach (archival research, ethnographic tools and insights from sociolinguistics and Critical Discourse Analysis), this study attempts to trace the origins and the…
Temporal Patterns of Communication in the Workplace
ERIC Educational Resources Information Center
Su, Norman Makoto
2009-01-01
In this dissertation, we report on results of an in-depth observational study to understand the temporal dimension of communication in the workplace. By employing the "shadowing" method for in situ to-the-second data gathering of information workers' behaviors, we gained a detailed snapshot of informants' workdays, "warts and all." Our…
Homework Practices: Role Conflicts Concerning Parental Involvement
ERIC Educational Resources Information Center
Bräu, Karin; Harring, Marius; Weyl, Christin
2017-01-01
This article discusses the results of an ethnographic study that aims to give a detailed description of the practices of doing homework in a domestic environment. Based on the international state of research, the research question and the methodical approach are explained first, followed by the role conflicts and stress ratios developed while…
Depth image enhancement using perceptual texture priors
NASA Astrophysics Data System (ADS)
Bang, Duhyeon; Shim, Hyunjung
2015-03-01
A depth camera is widely used in various applications because it provides a depth image of the scene in real time. However, because of its limited power consumption, a depth camera suffers from severe noise and cannot provide high-quality 3D data. Although a smoothness prior is often employed to suppress the depth noise, it discards geometric details, degrading the distance resolution and hindering realism in 3D content. In this paper, we propose a perceptually based depth-image enhancement technique that automatically recovers the depth details of various textures, using a statistical framework inspired by the human mechanism of perceiving surface details through texture priors. We construct a database composed of high-quality normals. Based on recent studies in human visual perception (HVP), we select pattern density as the primary feature for classifying textures. Based on the classification results, we match and substitute the noisy input normals with high-quality normals from the database. As a result, our method provides a high-quality depth image that preserves surface details. We expect our work to be effective in enhancing the details of depth images from 3D sensors and in providing a high-fidelity virtual-reality experience.
NASA Astrophysics Data System (ADS)
Doolittle, Amity A.
2010-01-01
The study of human-environmental relations is complex and by nature draws on theories and practices from multiple disciplines. There is no single research strategy or universal set of methods to which researchers must adhere. Particularly for scholars interested in a political ecology approach to understanding human-environmental relationships, very little has been written examining the details of “how to” design a project, develop appropriate methods, produce data, and, finally, integrate multiple forms of data into an analysis. A great deal of attention has been paid, appropriately, to the theoretical foundations of political ecology, and numerous scholarly articles and books have been published recently. But beyond Andrew Vayda’s “progressive contextualization” and Piers Blaikie and Harold Brookfield’s “chains of explanation,” remarkably little is written that provides a research model to follow, modify, and expand. Perhaps one of the reasons for this gap in scholarship is that, as expected in interdisciplinary research, researchers use a variety of methods that are suitable (and perhaps unique) to the questions they are asking. To start a conversation on the methods available for researchers interested in adopting a political ecology perspective to human-environmental interactions, I use my own research project as a case study. This research is by no means flawless or inclusive of all possible methods, but by using the details of this particular research process as a case study I hope to provide insights into field research that will be valuable for future scholarship.
van Smeden, Jeroen; Boiten, Walter A; Hankemeier, Thomas; Rissmann, Robert; Bouwstra, Joke A; Vreeken, Rob J
2014-01-01
Ceramides (CERs), cholesterol, and free fatty acids (FFAs) are the main lipid classes in human stratum corneum (SC, outermost skin layer), but no studies report on the detailed analysis of these classes in a single platform. The primary aims of this study were to 1) develop an LC/MS method for (semi-)quantitative analysis of all main lipid classes present in human SC; and 2) use this method to study in detail the lipid profiles of human skin substitutes and compare them to human SC lipids. By applying two injections of 10μl, the developed method detects all major SC lipids using RPLC and negative ion mode APCI-MS for detection of FFAs, and NPLC using positive ion mode APCI-MS to analyze CERs and cholesterol. Validation showed this lipid platform to be robust, reproducible, sensitive, and fast. The method was successfully applied on ex vivo human SC, human SC obtained from tape strips and human skin substitutes (porcine SC and human skin equivalents). In conjunction with FFA profiles, clear differences in CER profiles were observed between these different SC sources. Human skin equivalents more closely mimic the lipid composition of human stratum corneum than porcine skin does, although noticeable differences are still present. These differences gave biologically relevant information on some of the enzymes that are probably involved in SC lipid processing. For future research, this provides an excellent method for (semi-)quantitative, 'high-throughput' profiling of SC lipids and can be used to advance the understanding of skin lipids and the biological processes involved. © 2013.
Moskowitz, Debbie S.; Young, Simon N.
2006-01-01
Current methods of assessment in clinical psychopharmacology have several serious disadvantages, particularly for the study of social functioning. We aimed to review the strengths and weaknesses of current methods used in clinical psychopharmacology and to compare them with a group of methods, developed by personality/social psychologists, termed ecological momentary assessment (EMA), which permit the research participant to report on symptoms, affect and behaviour close in time to experience and which sample many events or time periods. EMA has a number of advantages over more traditional methods for the assessment of patients in clinical psychopharmacological studies. It can both complement and, in part, replace existing methods. EMA methods will permit more sensitive assessments and will enable more wide-ranging and detailed measurements of mood and behaviour. These types of methods should be adopted more widely by clinical psychopharmacology researchers. PMID:16496031
NASA Astrophysics Data System (ADS)
Shi, Min; Niu, Zhong-Ming; Liang, Haozhao
2018-06-01
We have combined the complex momentum representation (CMR) method with the Green's function (GF) method in the relativistic mean-field (RMF) framework to establish the RMF-CMR-GF approach. This new approach is applied to study the halo structure of 74Ca. The continuum level densities of the resonant states of interest are calculated accurately without introducing any unphysical parameters, and the results are independent of the choice of the integration contour. The single-particle wave functions and densities important for the halo phenomenon in 74Ca are discussed in detail.
NASA Astrophysics Data System (ADS)
Ma, Jinlei; Zhou, Zhiqiang; Wang, Bo; Zong, Hua
2017-05-01
The goal of infrared (IR) and visible image fusion is to produce a more informative image for human observation or some other computer vision tasks. In this paper, we propose a novel multi-scale fusion method based on visual saliency map (VSM) and weighted least square (WLS) optimization, aiming to overcome some common deficiencies of conventional methods. Firstly, we introduce a multi-scale decomposition (MSD) using the rolling guidance filter (RGF) and Gaussian filter to decompose input images into base and detail layers. Compared with conventional MSDs, this MSD can achieve the unique property of preserving the information of specific scales and reducing halos near edges. Secondly, we argue that the base layers obtained by most MSDs would contain a certain amount of residual low-frequency information, which is important for controlling the contrast and overall visual appearance of the fused image, and the conventional "averaging" fusion scheme is unable to achieve desired effects. To address this problem, an improved VSM-based technique is proposed to fuse the base layers. Lastly, a novel WLS optimization scheme is proposed to fuse the detail layers. This optimization aims to transfer more visual details and less irrelevant IR details or noise into the fused image. As a result, the fused image details would appear more naturally and be suitable for human visual perception. Experimental results demonstrate that our method can achieve a superior performance compared with other fusion methods in both subjective and objective assessments.
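The base/detail split at the heart of this class of fusion methods can be illustrated with a toy 1-D version. Here a box filter stands in for the paper's Gaussian/rolling-guidance decomposition, plain averaging stands in for the VSM-based base fusion, and a max-absolute rule stands in for the WLS detail optimization; all function names are illustrative, not the authors' code.

```python
def decompose(signal, radius=2):
    """Split a 1-D signal into a smooth base layer (box filter) and a
    detail layer (residual), mimicking a one-level MSD."""
    n = len(signal)
    base = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        base.append(sum(signal[lo:hi]) / (hi - lo))
    detail = [s - b for s, b in zip(signal, base)]
    return base, detail

def fuse(sig_a, sig_b, radius=2):
    """Toy two-input fusion: average the base layers and keep the detail
    coefficient with the larger magnitude at each sample."""
    base_a, det_a = decompose(sig_a, radius)
    base_b, det_b = decompose(sig_b, radius)
    fused_base = [(a + b) / 2 for a, b in zip(base_a, base_b)]
    fused_det = [a if abs(a) >= abs(b) else b for a, b in zip(det_a, det_b)]
    return [b + d for b, d in zip(fused_base, fused_det)]
```

The paper's contribution is precisely in replacing the naive "averaging" base rule and max-abs detail rule used here with saliency-weighted and WLS-optimized variants.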
Sustainable Design Approach: A case study of BIM use
NASA Astrophysics Data System (ADS)
Abdelhameed, Wael
2017-11-01
Achieving sustainable design in areas such as energy-efficient design depends largely on the accuracy of the analysis performed after the design is completed with all its components and material details. Different analysis approaches and methods predict relevant values and metrics such as the U-value, energy use and energy savings. Although certain differences in the accuracy of these approaches and methods have been recorded, this paper does not focus on that matter, since determining the reason for discrepancies between approaches and methods is difficult when all error sources act simultaneously. Instead, the paper introduces an approach through which BIM, building information modelling, can be utilised during the initial phases of the design process, by analysing the values and metrics of sustainable design before going into the design details of a building. Managing all of the project drawings in a single file, BIM is well known as a digital platform that offers a multidisciplinary detailed design, the AEC model (Barison and Santos, 2010; Welle et al., 2011). The paper first presents BIM use in the early phases of the design process in general, in order to achieve certain required areas of sustainable design. It then introduces BIM use in specific areas such as site selection, wind velocity and building orientation, with the aim of reaching the most sustainable solution possible. In the initial phases of designing, material details and building components are not yet fully specified or selected; the designer usually focuses on zoning, topology, circulation, and other design requirements. The proposed approach employs BIM strategies and analysis during those initial design phases so that the analysis and results of each solution or alternative design are available.
The stakeholders and designers would then have a more effective decision-making process, with full clarity about the consequences of each alternative, and the architect could proceed with the alternative design showing the best sustainability analysis. In later design stages, using sustainable types of materials such as insulation, cladding, etc., and applying sustainable building components such as doors, windows, etc., would add further improvements in reaching better values and metrics. The paper describes the methodology of this design approach through the BIM strategies adopted in design creation. Case studies of architectural designs are used to highlight the details and benefits of the proposed approach.
Coherent Waves in Seismic Researches
NASA Astrophysics Data System (ADS)
Emanov, A.; Seleznev, V. S.
2013-05-01
The development of digital processing algorithms that pick useful events from seismic wave fields, in order to study the environment and other objects, is the basis for establishing new seismic techniques. The present paper relies on a fundamental property of seismic wave fields: coherence. The authors extend the notion of coherence types of observed wave fields and devise a technique for selecting coherent components from an observed wave field. Time coherence and space coherence are widely known; here the concept of "parameter coherence" is added. The parameter with respect to which a wave field is coherent can be very diverse, because the wave field is a multivariate process described by a set of parameters. Coherence here primarily means the independence of a linear connection in the wave field from the parameter. Time-coherent standing waves form in seismic wave fields recorded in confined spaces, in building blocks, and in stratified media. In prospecting seismology, with observation systems with multiple overlapping, head waves are coherent along the parallel correlation course or, in other words, with respect to one measurement on the generalized plane of the observation system. For detailed prospecting seismology with observation systems with multiple overlapping, algorithms based on this one-measurement coherence property have been developed that convert seismic records into head-wave time sections containing neither reflected nor other types of waves. The conversion into a time section can be executed on any specified observation base. Within the area of head-wave recording, head-wave energy is accumulated relative to noise according to the multiplicity of the observation system. Conversion on a base below the wave-tracking area is performed with a loss in signal-to-noise ratio relative to the maximum of this ratio attainable with the observation system.
The construction of head-wave time sections and dynamic plots forms the basis of an automatic processing scheme that has been developed, similar to the CDP procedure in the reflected-wave method. Using the developed algorithms for converting head waves into time sections, studies of refracting boundaries in Siberia have been carried out. Beyond refraction surveys proper, applying the head-wave conversion to seismograms from the reflected-wave method yields information about refracting horizons in the upper part of the section in addition to the reflecting-horizon data. The method of recovering the coherent components of a wave field also underlies engineering seismology at the required level of accuracy and detail. In seismic microzoning, the resonance frequencies of the upper part of the section are determined on the basis of this method, and maps of oscillation amplification, together with estimates of result accuracy, are constructed for each frequency. The same method makes it possible to study standing-wave fields in buildings and constructions with high accuracy and detail, enabling diagnostics of their physical state from the set of natural frequencies and mode shapes examined in high detail. The standing-wave method permits the seismic stability of a structure to be estimated at a new level of accuracy.
Launch Vehicle Design and Optimization Methods and Priority for the Advanced Engineering Environment
NASA Technical Reports Server (NTRS)
Rowell, Lawrence F.; Korte, John J.
2003-01-01
NASA's Advanced Engineering Environment (AEE) is a research and development program that will improve collaboration among design engineers for launch vehicle conceptual design and provide the infrastructure (methods and framework) necessary to enable that environment. In this paper, three major technical challenges facing the AEE program are identified, and three specific design problems are selected to demonstrate how advanced methods can improve current design activities. References are made to studies that demonstrate these design problems and methods, and these studies will provide the detailed information and check cases to support incorporation of these methods into the AEE. This paper provides background and terminology for discussing the launch vehicle conceptual design problem so that the diverse AEE user community can participate in prioritizing the AEE development effort.
Scharf, Deborah M.; Setodji, Claude M.; Shadel, William G.
2012-01-01
Introduction: The aims of this study were to validate ecological momentary assessment (EMA) as a method for measuring exposure to tobacco-related marketing and media and to use this method to provide detailed descriptive data on college students’ exposure to protobacco marketing and media. Methods: College students (n = 134; ages 18–24 years) recorded their exposures to protobacco marketing and media on handheld devices for 21 consecutive days. Participants also recalled exposures to various types of protobacco marketing and media at the end of the study period. Results: Retrospectively recalled and EMA-based estimates of protobacco marketing exposure captured different information. The correlation between retrospectively recalled and EMA-logged exposures to tobacco marketing and media was moderate (r = .37, p < .001), and EMA-logged exposures were marginally associated with the intention to smoke at the end of the study, whereas retrospective recall of exposure was not. EMA data showed that college students were exposed to protobacco marketing through multiple channels in a relatively short period: Exposures (M = 8.24, SD = 7.85) occurred primarily in the afternoon (42%), on weekends (35%), and at point-of-purchase locations (68%) or in movies/TV (20%), and exposures to Marlboro, Newport, and Camel represented 56% of all exposures combined and 70% of branded exposures. Conclusions: Findings support the validity of EMA as a method for capturing detailed information about youth exposure to protobacco marketing and media that are not captured through other existing methods. Such data have the potential to highlight areas for policy change and prevention in order to reduce the impact of tobacco marketing on youth. PMID:22039076
A hybrid 3D SEM reconstruction method optimized for complex geologic material surfaces.
Yan, Shang; Adegbule, Aderonke; Kibbey, Tohren C G
2017-08-01
Reconstruction methods are widely used to extract three-dimensional information from scanning electron microscope (SEM) images. This paper presents a new hybrid reconstruction method that combines stereoscopic reconstruction with shape-from-shading calculations to generate highly-detailed elevation maps from SEM image pairs. The method makes use of an imaged glass sphere to determine the quantitative relationship between observed intensity and angles between the beam and surface normal, and the detector and surface normal. Two specific equations are derived to make use of image intensity information in creating the final elevation map. The equations are used together, one making use of intensities in the two images, the other making use of intensities within a single image. The method is specifically designed for SEM images captured with a single secondary electron detector, and is optimized to capture maximum detail from complex natural surfaces. The method is illustrated with a complex structured abrasive material, and a rough natural sand grain. Results show that the method is capable of capturing details such as angular surface features, varying surface roughness, and surface striations. Copyright © 2017 Elsevier Ltd. All rights reserved.
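For the stereoscopic part of such reconstructions, one classic relation converts the parallax between a symmetrically tilted image pair into height. This textbook formula for eucentric tilt is given here for orientation only; the paper's hybrid method additionally folds in shape-from-shading intensity equations, which are not reproduced here.

```python
import math

def height_from_parallax(parallax, tilt_deg):
    """Classic stereo-SEM relation for a eucentric tilt pair:
    height = parallax / (2 * sin(tilt_angle / 2)),
    with parallax and height in the same length units and tilt_deg the
    total tilt angle between the two images, in degrees."""
    return parallax / (2.0 * math.sin(math.radians(tilt_deg) / 2.0))
```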
Sach, Tracey H; Desborough, James; Houghton, Julie; Holland, Richard
2014-11-06
Economic methods are underutilised within pharmacy research resulting in a lack of quality evidence to support funding decisions for pharmacy interventions. The aim of this study is to illustrate the methods of micro-costing within the pharmacy context in order to raise awareness and use of this approach in pharmacy research. Micro-costing methods are particularly useful where a new service or intervention is being evaluated and for which no previous estimates of the costs of providing the service exist. This paper describes the rationale for undertaking a micro-costing study before detailing and illustrating the process involved. The illustration relates to a recently completed trial of multi-professional medication reviews as an intervention provided in care homes. All costs are presented in UK£2012. In general, costing methods involve three broad steps (identification, measurement and valuation); when using micro-costing, closer attention to detail is required within all three stages of this process. The mean (standard deviation; 95% confidence interval (CI)) cost per resident of the multi-professional medication review intervention was £104.80 (50.91; 98.72 to 109.45), such that the overall cost of providing the intervention to all intervention home residents was £36,221.29 (95% CI, £32,810.81 to £39,631.77). This study has demonstrated that micro-costing can be a useful method, not only for estimating the cost of a pharmacy intervention to feed into a pharmacy economic evaluation, but also as a source of information to help inform those designing pharmacy services about the potential time and costs involved in delivering such services. © 2014 Royal Pharmaceutical Society.
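The per-resident summary statistics reported above (mean, standard deviation, 95% CI) can be reproduced for any list of costs with a normal-approximation sketch like the following. The example cost values are hypothetical, not the trial data:

```python
import math
import statistics

def cost_summary(costs, z=1.96):
    """Mean cost per resident with a normal-approximation 95% CI,
    the form of summary used when micro-costing an intervention."""
    n = len(costs)
    mean = statistics.mean(costs)
    sd = statistics.stdev(costs)      # sample standard deviation
    se = sd / math.sqrt(n)            # standard error of the mean
    return mean, sd, (mean - z * se, mean + z * se)

# Hypothetical per-resident costs in GBP.
mean, sd, ci = cost_summary([90.0, 110.0, 105.0, 120.0, 95.0])
```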
Array tomography of physiologically-characterized CNS synapses.
Valenzuela, Ricardo A; Micheva, Kristina D; Kiraly, Marianna; Li, Dong; Madison, Daniel V
2016-08-01
The ability to correlate plastic changes in synaptic physiology with changes in synaptic anatomy has been very limited in the central nervous system because of shortcomings in existing methods for recording the activity of specific CNS synapses and then identifying and studying the same individual synapses on an anatomical level. We introduce here a novel approach that combines two existing methods: paired neuron electrophysiological recording and array tomography, allowing for the detailed molecular and anatomical study of synapses with known physiological properties. The complete mapping of a neuronal pair allows determining the exact number of synapses in the pair and their location. We have found that the majority of close appositions between the presynaptic axon and the postsynaptic dendrite in the pair contain synaptic specializations. The average release probability of the synapses between the two neurons in the pair is low, below 0.2, consistent with previous studies of these connections. Other questions, such as receptor distribution within synapses, can be addressed more efficiently by identifying only a subset of synapses using targeted partial reconstructions. In addition, time sensitive events can be captured with fast chemical fixation. Compared to existing methods, the present approach is the only one that can provide detailed molecular and anatomical information of electrophysiologically-characterized individual synapses. This method will allow for addressing specific questions about the properties of identified CNS synapses, even when they are buried within a cloud of millions of other brain circuit elements. Copyright © 2016. Published by Elsevier B.V.
Methods of space radiation dose analysis with applications to manned space systems
NASA Technical Reports Server (NTRS)
Langley, R. W.; Billings, M. P.
1972-01-01
The full potential of state-of-the-art space radiation dose analysis for manned missions has not been exploited. Point doses have been overemphasized, and the critical dose to the bone marrow has been only crudely approximated, despite the existence of detailed man models and computer codes for dose integration in complex geometries. The method presented makes it practical to account for the geometrical detail of the astronaut as well as the vehicle. Discussed are the major assumptions involved and the concept of applying the results of detailed proton dose analysis to the real-time interpretation of on-board dosimetric measurements.
ERIC Educational Resources Information Center
Ali, Takbir
2018-01-01
This study documented in detail teachers' voices about their working conditions, professional development needs and opportunities to cater to these needs. The study reported in this paper was conducted as part of a large-scale study that used mixed methods to assess teachers' professional development needs. The qualitative data reported in this…
Abnormalities of Object Visual Processing in Body Dysmorphic Disorder
Feusner, Jamie D.; Hembacher, Emily; Moller, Hayley; Moody, Teena D.
2013-01-01
Background Individuals with body dysmorphic disorder may have perceptual distortions for their appearance. Previous studies suggest imbalances in detailed relative to configural/holistic visual processing when viewing faces. No study has investigated the neural correlates of processing non-symptom-related stimuli. The objective of this study was to determine whether individuals with body dysmorphic disorder have abnormal patterns of brain activation when viewing non-face/non-body object stimuli. Methods Fourteen medication-free participants with DSM-IV body dysmorphic disorder and 14 healthy controls participated. We performed functional magnetic resonance imaging while participants matched photographs of houses that were unaltered, contained only high spatial frequency (high detail) information, or only low spatial frequency (low detail) information. The primary outcome was group differences in blood oxygen level-dependent signal changes. Results The body dysmorphic disorder group showed lesser activity in the parahippocampal gyrus, lingual gyrus, and precuneus for low spatial frequency images. There were greater activations in medial prefrontal regions for high spatial frequency images, although no significant differences when compared to a low-level baseline. Greater symptom severity was associated with lesser activity in dorsal occipital cortex and ventrolateral prefrontal cortex for normal and high spatial frequency images. Conclusions Individuals with body dysmorphic disorder have abnormal brain activation patterns when viewing objects. Hypoactivity in visual association areas for configural and holistic (low detail) elements and abnormal allocation of prefrontal systems for details is consistent with a model of imbalances in global vs. local processing. This may occur not only for appearance but also for general stimuli unrelated to their symptoms. PMID:21557897
Chen, Shaoqiang; Zhu, Lin; Yoshita, Masahiro; Mochizuki, Toshimitsu; Kim, Changsu; Akiyama, Hidefumi; Imaizumi, Mitsuru; Kanemitsu, Yoshihiko
2015-01-01
World-wide studies on multi-junction (tandem) solar cells have led to record-breaking improvements in conversion efficiencies year after year. To obtain detailed and proper feedback for solar-cell design and fabrication, it is necessary to establish standard methods for diagnosing subcells in fabricated tandem devices. Here, we propose a potential standard method to quantify the detailed subcell properties of multi-junction solar cells based on absolute measurements of electroluminescence (EL) external quantum efficiency in addition to the conventional solar-cell external-quantum-efficiency measurements. We demonstrate that the absolute-EL-quantum-efficiency measurements provide I–V relations of individual subcells without the need for referencing measured I–V data, which is in stark contrast to previous works. Moreover, our measurements quantify the absolute rates of junction loss, non-radiative loss, radiative loss, and luminescence coupling in the subcells, which constitute the “balance sheets” of tandem solar cells. PMID:25592484
NASA Astrophysics Data System (ADS)
Chatelain, P.; Duponcheel, M.; Caprace, D.-G.; Marichal, Y.; Winckelmans, G.
2016-09-01
A Vortex Particle-Mesh (VPM) method with immersed lifting lines has been developed and validated. Based on the vorticity-velocity formulation of the Navier-Stokes equations, it combines the advantages of a particle method and of a mesh-based approach. The immersed lifting lines handle the creation of vorticity from the blade elements and its early development. LES of Vertical Axis Wind Turbine (VAWT) flows are performed. The complex wake development is captured in detail and over very long distances: from the blades to the near-wake coherent vortices, then through the transitional ones to the fully developed turbulent far wake (beyond 10 rotor diameters). The statistics and topology of the mean flow are studied. The computational sizes also allow insights into the detailed unsteady vortex dynamics, including some unexpected topological flow features.
Open Vehicle Sketch Pad Aircraft Modeling Strategies
NASA Technical Reports Server (NTRS)
Hahn, Andrew S.
2013-01-01
Geometric modeling of aircraft during the Conceptual design phase is very different from that needed for the Preliminary or Detailed design phases. The Conceptual design phase is characterized by the rapid, multi-disciplinary analysis of many design variables by a small engineering team. The designer must walk a line between fidelity and productivity, picking tools and methods with the appropriate balance of characteristics to achieve the goals of the study, while staying within the available resources. Identifying geometric details that are important, and those that are not, is critical to making modeling and methodology choices. This is true for both the low-order analysis methods traditionally used in Conceptual design as well as the highest-order analyses available. This paper will highlight some of Conceptual design's characteristics that drive the designer's choices as well as modeling examples for several aircraft configurations using the open source version of the Vehicle Sketch Pad (Open VSP) aircraft Conceptual design geometry modeler.
NASA Astrophysics Data System (ADS)
Bensaida, K.; Alie, Colin; Elkamel, A.; Almansoori, A.
2017-08-01
This paper presents a novel techno-economic optimization model for assessing the effectiveness of CO2 mitigation options for the electricity generation sub-sector that includes renewable energy generation. The optimization problem was formulated as a MINLP model using the GAMS modeling system. The model seeks the minimization of the power generation costs under CO2 emission constraints by dispatching power from low CO2 emission-intensity units. The model considers the detailed operation of the electricity system to effectively assess the performance of GHG mitigation strategies and integrates load balancing, carbon capture and carbon taxes as methods for reducing CO2 emissions. Two case studies are discussed to analyze the benefits and challenges of the CO2 reduction methods in the electricity system. The proposed mitigation options would not only benefit the environment, but would also improve the marginal cost of producing energy, which represents an advantage for stakeholders.
Advanced composite elevator for Boeing 727 aircraft, volume 2
NASA Technical Reports Server (NTRS)
Chovil, D. V.; Grant, W. D.; Jamison, E. S.; Syder, H.; Desper, O. E.; Harvey, S. T.; Mccarty, J. E.
1980-01-01
Preliminary design activity consisted of developing and analyzing alternate design concepts and selecting the optimum elevator configuration. This included trade studies in which durability, inspectability, producibility, repairability, and customer acceptance were evaluated. Preliminary development efforts consisted of evaluating and selecting material, identifying ancillary structural development test requirements, and defining full scale ground and flight test requirements necessary to obtain Federal Aviation Administration (FAA) certification. After selection of the optimum elevator configuration, detail design was begun and included basic configuration design improvements resulting from manufacturing verification hardware, the ancillary test program, weight analysis, and structural analysis. Detail and assembly tools were designed and fabricated to support a full-scope production program, rather than a limited run. The producibility development programs were used to verify tooling approaches, fabrication processes, and inspection methods for the production mode. Quality parts were readily fabricated and assembled with a minimum rejection rate, using prior inspection methods.
Velocity Statistics and Spectra in Three-Stream Jets
NASA Technical Reports Server (NTRS)
Ecker, Tobias; Lowe, K. Todd; Ng, Wing F.; Henderson, Brenda; Leib, Stewart
2016-01-01
Velocimetry measurements were obtained in three-stream jets at the NASA Glenn Research Center Nozzle Acoustics Test Rig using the time-resolved Doppler global velocimetry technique. These measurements afford exceptional frequency response, to 125 kHz bandwidth, in order to study the detailed dynamics of turbulence in developing shear flows. Mean stream-wise velocity is compared to measurements acquired using particle image velocimetry for validation. Detailed results for convective velocity distributions throughout an axisymmetric plume and the thick side of a plume with an offset third-stream duct are provided. The convective velocity results show that, as expected, the eddy speeds are reduced on the thick side of the plume compared to the axisymmetric case. The results indicate that the time-resolved Doppler global velocimetry method holds promise for obtaining results valuable to the implementation and refinement of jet noise prediction methods being developed for three-stream jets.
NASA Astrophysics Data System (ADS)
Barker, J. R.; Pasternack, G. B.; Bratovich, P.; Massa, D.; Reedy, G.; Johnson, T.
2010-12-01
Two-dimensional (depth-averaged) hydrodynamic models have existed for decades and are used to study a variety of hydrogeomorphic processes as well as to design river rehabilitation projects. Rapid computer and coding advances are revolutionizing the size and detail of 2D models. Meanwhile, advances in topo mapping and environmental informatics are providing the data inputs to drive large, detailed simulations. Million-element computational meshes are in hand. With simulations of this size and detail, the primary challenge has shifted to finding rapid and inexpensive means for testing model predictions against observations. Standard methods for collecting velocity data include boat-mounted ADCP and point-based sensors on boats or wading rods. These methods are labor intensive and often limited to a narrow flow range. Also, they generate small datasets at a few cross-sections, which is inadequate to characterize the statistical structure of the relation between predictions and observations. Drawing on the long-standing oceanographic method of using drogues to track water currents, previous studies have demonstrated the potential of small dGPS units to obtain surface velocity in rivers. However, dGPS is too inaccurate to test 2D models. Also, there is financial risk in losing drogues in rough currents. In this study, an RTK GPS unit was mounted onto a manned whitewater kayak. The boater positioned himself into the current and used floating debris to maintain a speed and heading consistent with the ambient surface flow field. RTK GPS measurements were taken every 5 s. From these positions, a 2D velocity vector was obtained. The method was tested over ~20 km of the lower Yuba River in California in flows ranging from 500-5000 cfs, yielding 5816 observations. To compare velocity magnitude against the 2D model-predicted depth-averaged value, kayak-based surface values were scaled down by an optimized constant (0.72), which had no negative effect on regression analysis.
The r2 value for speed was 0.78 by this method, compared with 0.57 based on 199 points from traditional measurements. The r2 value for velocity direction was 0.77. Although it is not ideal to rely on observed surface velocity to evaluate depth-averaged velocity predictions, all available velocity-measurement methods have a suite of assumptions and complications. Using this method, the availability of 10-100x more data was so beneficial that the resulting model performance was among the highest reported in the literature.
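The basic step of converting successive RTK GPS fixes into a surface velocity vector, then scaling by the optimized constant (0.72) to approximate a depth-averaged value, can be sketched as follows. The coordinates in the usage example are hypothetical:

```python
import math

def surface_velocity(p0, p1, dt=5.0, scale=0.72):
    """Velocity from two successive RTK GPS fixes (easting, northing
    in metres) taken dt seconds apart. scale converts the kayak-based
    surface speed to an approximate depth-averaged value, as in the
    study's optimized constant of 0.72."""
    vx = (p1[0] - p0[0]) / dt
    vy = (p1[1] - p0[1]) / dt
    speed = math.hypot(vx, vy) * scale          # scaled speed, m/s
    heading = math.degrees(math.atan2(vx, vy)) % 360  # compass heading
    return speed, heading

# Hypothetical pair of fixes 5 s apart.
speed, heading = surface_velocity((0.0, 0.0), (3.0, 4.0))
```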
Comparison of Filters Dedicated to Speckle Suppression in SAR Images
NASA Astrophysics Data System (ADS)
Kupidura, P.
2016-06-01
This paper presents the results of research on the effectiveness of different filtering methods dedicated to speckle suppression in SAR images. The tests were performed on RadarSat-2 images and on an artificial image treated with simulated speckle noise. The research analysed the performance of particular filters related to the effectiveness of speckle suppression and to the ability to preserve image details and edges. Speckle is a phenomenon inherent to radar images - a deterministic noise connected with land cover type, but also causing significant changes in digital numbers of pixels. As a result, it may affect interpretation, classification and other processes concerning radar images. Speckle, resembling "salt and pepper" noise, has the form of a set of relatively small groups of pixels with values markedly different from those of other pixels representing the same type of land cover. Suppression of this noise may also cause suppression of small image details; therefore, the ability to preserve the important parts of an image was analysed as well. In the present study, selected filters were tested: methods dedicated particularly to speckle noise suppression (Frost, Gamma-MAP, Lee, Lee-Sigma, Local Region), general filtering methods which might be effective in this respect (Mean, Median), and morphological filters (alternate sequential filters with multiple structuring elements and by reconstruction). The analysis presented in this paper compared the effectiveness of different filtering methods. It proved that some of the dedicated radar filters are efficient tools for speckle suppression, but also demonstrated a significant efficiency of the morphological approach, especially its ability to preserve image details.
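Of the general-purpose filters compared, the median filter is the simplest to illustrate: it replaces each pixel with the median of its neighbourhood, which suppresses isolated "salt and pepper"-like speckle while preserving edges better than a mean filter. A minimal 3x3 sketch over a 2D list of pixel values (edges left unfiltered for brevity; illustrative only, not the implementation tested in the study):

```python
def median_filter_3x3(img):
    """Apply a 3x3 median filter to a 2D list of pixel values.
    Border pixels are copied unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]   # median of the 9 window values
    return out

# A single bright speckle pixel is removed; uniform surroundings survive.
filtered = median_filter_3x3([[1, 1, 1], [1, 9, 1], [1, 1, 1]])
```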
Investigation of mixed saliva by optoelectronic methods
NASA Astrophysics Data System (ADS)
Savchenko, Ekaterina; Nepomnyashchaya, Elina; Baranov, Maksim; Velichko, Elena; Aksenov, Evgenii; Bogomaz, Tatyana
2018-04-01
At present, saliva and its properties are being actively studied. Human saliva is a unique biological material that has potential in clinical practice. A detailed analysis of the characteristics and properties of saliva is relevant for diagnostic purposes. In this paper, the properties and characteristics of saliva are studied using optoelectronic methods: dynamic light scattering, electrophoretic light scattering and optical microscopy. Mixed saliva from a healthy patient and from a patient with type 2 diabetes mellitus was used as the object of the study. The results obtained reveal the differing dynamic behavior of the saliva of the healthy patient and of the patient with type 2 diabetes mellitus. All three methods confirm the hypothesis of structural changes in mixed saliva in type 2 diabetes mellitus.
Martino, Steven C; Scharf, Deborah M; Setodji, Claude M; Shadel, William G
2012-04-01
The aims of this study were to validate ecological momentary assessment (EMA) as a method for measuring exposure to tobacco-related marketing and media and to use this method to provide detailed descriptive data on college students' exposure to protobacco marketing and media. College students (n = 134; ages 18-24 years) recorded their exposures to protobacco marketing and media on handheld devices for 21 consecutive days. Participants also recalled exposures to various types of protobacco marketing and media at the end of the study period. Retrospectively recalled and EMA-based estimates of protobacco marketing exposure captured different information. The correlation between retrospectively recalled and EMA-logged exposures to tobacco marketing and media was moderate (r = .37, p < .001), and EMA-logged exposures were marginally associated with the intention to smoke at the end of the study, whereas retrospective recall of exposure was not. EMA data showed that college students were exposed to protobacco marketing through multiple channels in a relatively short period: Exposures (M = 8.24, SD = 7.85) occurred primarily in the afternoon (42%), on weekends (35%), and at point-of-purchase locations (68%) or in movies/TV (20%), and exposures to Marlboro, Newport, and Camel represented 56% of all exposures combined and 70% of branded exposures. Findings support the validity of EMA as a method for capturing detailed information about youth exposure to protobacco marketing and media that are not captured through other existing methods. Such data have the potential to highlight areas for policy change and prevention in order to reduce the impact of tobacco marketing on youth.
Beyramysoltan, Samira; Abdollahi, Hamid; Rajkó, Róbert
2014-05-27
Analytical self-modeling curve resolution (SMCR) methods resolve data sets to a range of feasible solutions using only non-negative constraints. The Lawton-Sylvestre method was the first direct method to analyze a two-component system. It was generalized as a Borgen plot for determining the feasible regions in three-component systems. It seems that a geometrical view is required for considering curve resolution methods, because the complicated (only algebraic) conceptions caused a stop in the general study of Borgen's work for 20 years. Rajkó and István revised and elucidated the principles of existing theory in SMCR methods and subsequently introduced computational geometry tools for developing an algorithm to draw Borgen plots in three-component systems. These developments are theoretical inventions, and the formulations cannot always be given in closed form or in a regularized formalism, especially for geometric descriptions; this is why several algorithms had to be developed and provided even for the theoretical deductions and determinations. In this study, analytical SMCR methods are revised and described using simple concepts. The details of a drawing algorithm for a developmental type of Borgen plot are given. Additionally, for the first time in the literature, equality and unimodality constraints are successfully implemented in the Lawton-Sylvestre method. To this end, a new state-of-the-art procedure is proposed to impose the equality constraint in Borgen plots. Two- and three-component HPLC-DAD data sets were simulated and analyzed by the new analytical curve resolution methods with and without additional constraints. Detailed descriptions and explanations are given based on the obtained abstract spaces. Copyright © 2014 Elsevier B.V. All rights reserved.
Numerical comparisons of ground motion predictions with kinematic rupture modeling
NASA Astrophysics Data System (ADS)
Yuan, Y. O.; Zurek, B.; Liu, F.; deMartin, B.; Lacasse, M. D.
2017-12-01
Recent advances in large-scale wave simulators allow for the computation of seismograms at unprecedented levels of detail and for areas sufficiently large to be relevant to small regional studies. In some instances, detailed information of the mechanical properties of the subsurface has been obtained from seismic exploration surveys, well data, and core analysis. Using kinematic rupture modeling, this information can be used with a wave propagation simulator to predict the ground motion that would result from an assumed fault rupture. The purpose of this work is to explore the limits of wave propagation simulators for modeling ground motion in different settings, and in particular, to explore the numerical accuracy of different methods in the presence of features that are challenging to simulate such as topography, low-velocity surface layers, and shallow sources. In the main part of this work, we use a variety of synthetic three-dimensional models and compare the relative costs and benefits of different numerical discretization methods in computing the seismograms of realistic-size models. The finite-difference method, the discontinuous-Galerkin method, and the spectral-element method are compared for a range of synthetic models having different levels of complexity such as topography, large subsurface features, low-velocity surface layers, and the location and characteristics of fault ruptures represented as an array of seismic sources. While some previous studies have already demonstrated that unstructured-mesh methods can sometimes tackle complex problems (Moczo et al.), we investigate the trade-off between unstructured-mesh methods and regular-grid methods for a broad range of models and source configurations. Finally, for comparison, our direct simulation results are briefly contrasted with those predicted by a few phenomenological ground-motion prediction equations, and a workflow for accurately predicting ground motion is proposed.
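As an illustration of the finite-difference approach compared in the study, a minimal second-order scheme for the 1D scalar wave equation u_tt = c^2 u_xx can be sketched as follows. The grid, time step, and source are hypothetical, and a stable run requires the CFL condition c*dt/dx <= 1:

```python
def fd_wave_1d(c, dx, dt, nt, src):
    """Explicit second-order finite-difference time stepping for the
    1D scalar wave equation u_tt = c^2 u_xx on a grid with velocity
    model c (one value per node), spacing dx, time step dt, nt steps,
    and a point source src(t) injected at the grid centre."""
    n = len(c)
    u_prev = [0.0] * n
    u = [0.0] * n
    for it in range(nt):
        u_next = [0.0] * n
        for i in range(1, n - 1):               # fixed (zero) boundaries
            lap = (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2
            u_next[i] = 2 * u[i] - u_prev[i] + (c[i] * dt) ** 2 * lap
        u_next[n // 2] += src(it * dt)          # point source at centre
        u_prev, u = u, u_next
    return u

# Hypothetical uniform medium, impulsive source, CFL number 0.5.
wavefield = fd_wave_1d([1.0] * 51, 1.0, 0.5, 20,
                       lambda t: 1.0 if t == 0.0 else 0.0)
```

In a uniform medium with a centred source, the computed wavefield stays symmetric about the source, which is a quick sanity check on the stencil.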
Baxter, Susan K; Blank, Lindsay; Woods, Helen Buckley; Payne, Nick; Rimmer, Melanie; Goyder, Elizabeth
2014-05-10
There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes and also interventions related to changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. PROSPERO registration number: CRD42013004037.
Inspection of Piezoceramic Transducers Used for Structural Health Monitoring
Mueller, Inka; Fritzen, Claus-Peter
2017-01-01
The use of piezoelectric wafer active sensors (PWAS) for structural health monitoring (SHM) purposes is state of the art for acousto-ultrasonic-based methods. For system reliability, detailed information about the PWAS itself is necessary. This paper gives an overview of frequent PWAS faults and presents the effects of these faults on the wave propagation used for active acousto-ultrasonics-based SHM. The analysis of the wave field is based on velocity measurements using a laser Doppler vibrometer (LDV). New and established methods of PWAS inspection are explained in detail, listing advantages and disadvantages. The electro-mechanical impedance spectrum, as the basis for these methods, is discussed for different sensor faults. In this way, this contribution focuses on a detailed analysis of PWAS and the need for their inspection for an increased reliability of SHM systems. PMID:28772431
Levels of detail analysis of microwave scattering from human head models for brain stroke detection
2017-01-01
In this paper, we have presented a microwave scattering analysis from multiple human head models. This study incorporates different levels of detail in the human head models and its effect on the microwave scattering phenomenon. Two levels of detail are taken into account: (i) a simplified ellipse-shaped head model and (ii) an anatomically realistic head model, implemented using 2-D geometry. In addition, the heterogeneous and frequency-dispersive behavior of the brain tissues has also been incorporated in our head models. It is identified during this study that the microwave scattering phenomenon changes significantly once the complexity of the head model is increased by incorporating more details using a magnetic resonance imaging database. It is also found that the microwave scattering results match in both types of head model (i.e., geometrically simple and anatomically realistic) once the measurements are made in the structurally simplified regions. However, the results diverge considerably in the complex areas of the brain due to the arbitrarily shaped interfaces of tissue layers in the anatomically realistic head model. After incorporating the various levels of detail, the solution of the subject microwave scattering problem and the measurement of transmitted and backscattered signals were obtained using the finite element method. A mesh convergence analysis was also performed to achieve error-free results with a minimum number of mesh elements and fewer degrees of freedom in a fast computational time. The results were promising, and the E-field values converged for both the simple and complex geometrical models. However, the E-field difference between the two types of head model at the same reference point differed considerably in magnitude. At the complex location, a high difference value of 0.04236 V/m was measured, compared to the simple location, where it was 0.00197 V/m.
This study also contributes a comparative analysis between direct and iterative solvers, so as to find the solution of the subject microwave scattering problem in a minimum computational time and with minimal memory resource requirements. It is seen from this study that microwave imaging may effectively be utilized for the detection, localization and differentiation of different types of brain stroke. The simulation results verified that microwave imaging can be efficiently exploited to study the significant contrast between the electric field values of normal and abnormal brain tissues for the investigation of brain anomalies. In the end, a specific absorption rate analysis was carried out to compare the effects of microwave signals on the different types of head model, using a factor of safety for brain tissues. After a careful study of various inversion methods in practice for microwave head imaging, it is also suggested that the contrast source inversion method may be more suitable and computationally efficient for such problems. PMID:29177115
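The point quantity underlying such a specific absorption rate analysis follows the standard definition SAR = sigma * |E|^2 / rho. A one-line sketch, with hypothetical tissue values in the usage example:

```python
def specific_absorption_rate(sigma, e_rms, rho):
    """Point SAR in W/kg from tissue conductivity sigma (S/m),
    RMS electric field magnitude e_rms (V/m), and mass density
    rho (kg/m^3), per the standard definition SAR = sigma*|E|^2/rho."""
    return sigma * e_rms ** 2 / rho

# Hypothetical tissue: sigma = 1.0 S/m, E = 10 V/m RMS, rho = 1000 kg/m^3.
sar = specific_absorption_rate(1.0, 10.0, 1000.0)
```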
Origin and transport of high energy particles in the galaxy
NASA Technical Reports Server (NTRS)
Wefel, John P.
1987-01-01
The origin, confinement, and transport of cosmic ray nuclei in the galaxy was studied. The work involves interpretations of the existing cosmic ray physics database derived from both balloon and satellite measurements, combined with an effort directed towards defining the next generation of instruments for the study of cosmic radiation. The shape and the energy dependence of the cosmic ray pathlength distribution in the galaxy was studied, demonstrating that the leaky box model is not a good representation of the detailed particle transport over the energy range covered by the database. Alternative confinement methods were investigated, analyzing the confinement lifetime in these models based upon the available data for radioactive secondary isotopes. The source abundances of several isotopes were studied using compiled nuclear physics data and the detailed transport calculations. The effects of distributed particle acceleration on the secondary to primary ratios were investigated.
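The simple leaky-box model that the study argues against predicts an exponential pathlength distribution, P(x) = exp(-x/lambda)/lambda, where lambda is the mean escape pathlength; deviations of the observed distribution from this form are what motivate the alternative confinement models. A one-line sketch (lambda value hypothetical, in g/cm^2):

```python
import math

def leaky_box_pld(x, mean_path):
    """Normalized exponential pathlength distribution predicted by the
    leaky-box model: P(x) = exp(-x/lambda)/lambda, with x and lambda
    in g/cm^2. The lambda used below is hypothetical."""
    return math.exp(-x / mean_path) / mean_path

p_zero = leaky_box_pld(0.0, 7.0)   # density at zero pathlength, 1/lambda
```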
NASA Astrophysics Data System (ADS)
Munoto; Sondang, Meini; Satriana, FMS
2018-04-01
This study aims to determine the characteristics of students who were uninterested in attending practicum classes. The study applied naturalistic qualitative research methods using participatory observation and interviews. Data validity was ensured by triangulation, detailed description, a lengthy observation period, and detailed and thorough observation. The data were analyzed using domain analysis, followed by taxonomic, componential, and thematic analyses. The results of the study indicate that faineant students show negative behavior while attending laboratory practicums: they lack motivation, effective interaction, and attention, while their cognitive abilities vary from low to high. Other causes of low interest were identified as well; addressing these aspects can raise students' interest in practicum classes. The intended impact is to create vocational teachers well skilled in conducting practicum classes in vocational schools and to make graduates better prepared for the workforce.
NASA Technical Reports Server (NTRS)
Leibfried, T. F., Jr.; Davari, Sadegh; Natarajan, Swami; Zhao, Wei
1992-01-01
Two categories were chosen for study. The first was the use of a preprocessor on Ada code of application programs that would interface with the Run-Time Object Data Base Standard Services (RODB STSV); the intent was to catch and correct any mis-registration errors by the program coder between the user-declared objects, their types, their addresses, and the corresponding RODB definitions. The second was RODB STSV performance issues and the identification of problems with the planned methods for accessing primitive object attributes; this included the study of an alternate storage scheme to the 'store objects by attribute' scheme in the current design of the RODB. The study resulted in essentially three separate documents: an interpretation of the system requirements, an assessment of the preliminary design, and a detailing of the components of a detailed design.
Transition Studies on a Swept-Wing Model
NASA Technical Reports Server (NTRS)
Saric, William S.
1996-01-01
The present investigation contributes to the understanding of boundary-layer stability and transition by providing detailed measurements of carefully-produced stationary crossflow vortices. It is clear that a successful prediction of transition in swept-wing flows must include an understanding of the detailed physics involved. Receptivity and nonlinear effects must not be ignored. Linear stability theory correctly predicts the expected wavelengths and mode shapes for stationary crossflow, but fails to predict the growth rates, even for low amplitudes. As new computational and analytical methods are developed to deal with three-dimensional boundary layers, the data provided by this experiment will serve as a useful benchmark for comparison.
Development of TPS flight test and operational instrumentation
NASA Technical Reports Server (NTRS)
Carnahan, K. R.; Hartman, G. J.; Neuner, G. J.
1975-01-01
Thermal and flow sensor instrumentation was developed for use as an integral part of the space shuttle orbiter reusable thermal protection system. The effort was performed in three tasks: a study to determine the optimum instruments and instrument installations for the space shuttle orbiter RSI and RCC TPS; tests and/or analysis to determine the instrument installations to minimize measurement errors; and analysis using data from the test program for comparison to analytical methods. A detailed review of existing state of the art instrumentation in industry was performed to determine the baseline for the departure of the research effort. From this information, detailed criteria for thermal protection system instrumentation were developed.
Densitometry By Acoustic Levitation
NASA Technical Reports Server (NTRS)
Trinh, Eugene H.
1989-01-01
"Static" and "dynamic" methods developed for measuring mass density of acoustically levitated solid particle or liquid drop. In "static" method, unknown density of sample found by comparison with another sample of known density. "Dynamic" method practiced with or without gravitational field. Advantages over conventional density-measuring techniques: sample does not have to make contact with container or other solid surface, size and shape of samples do not affect measurement significantly, sound field does not have to be known in detail, and sample can be smaller than microliter.
Chebyshev polynomials in the spectral Tau method and applications to Eigenvalue problems
NASA Technical Reports Server (NTRS)
Johnson, Duane
1996-01-01
Chebyshev spectral methods have received much attention recently as a technique for the rapid solution of ordinary differential equations. This technique also works well for solving linear eigenvalue problems. Specific attention is given to the properties and algebra of Chebyshev polynomials, the use of Chebyshev polynomials in spectral methods, and the recurrence relationships that are developed. These formulas and equations are then applied to several examples, which are worked out in detail. The appendix contains an example FORTRAN program used in solving an eigenvalue problem.
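The three-term recurrence at the heart of such methods, T0(x) = 1, T1(x) = x, T(k+1)(x) = 2x·Tk(x) − T(k−1)(x), can be sketched in a few lines (a minimal illustration, not the report's FORTRAN program):

```python
def chebyshev_T(n, x):
    """Evaluate the Chebyshev polynomial T_n(x) by the three-term recurrence."""
    t_prev, t_curr = 1.0, x  # T_0 and T_1
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        # T_{k+1} = 2x*T_k - T_{k-1}
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr
```

For example, T3(0.5) = 4(0.5)^3 − 3(0.5) = −1, which the recurrence reproduces.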
Astrometric "Core-shifts" at the Highest Frequencies
NASA Technical Reports Server (NTRS)
Rioja, Maria; Dodson, Richard
2010-01-01
We discuss the application of a new VLBI astrometric method named "Source/Frequency Phase Referencing" to measurements of "core-shifts" in radio sources used for geodetic observations. We detail the reasons that astrometrical observations of 'core-shifts' have become critical in the era of VLBI2010. We detail how this new method allows the problem to be addressed at the highest frequencies and outline its superior compensation of tropospheric errors.
The Development of a Hollow Blade for Exhaust Gas Turbines
NASA Technical Reports Server (NTRS)
Kohlmann, H
1950-01-01
The subject of the development of German hollow turbine blades for use with internal cooling is discussed in detail. The development of a suitable blade profile from cascade theory is described. Also a discussion of the temperature distribution and stresses in a turbine blade is presented. Various methods of manufacturing hollow blades and the methods by which they are mounted in the turbine rotor are presented in detail.
PET-CT image fusion using random forest and à-trous wavelet transform.
Seal, Ayan; Bhattacharjee, Debotosh; Nasipuri, Mita; Rodríguez-Esparragón, Dionisio; Menasalvas, Ernestina; Gonzalo-Martin, Consuelo
2018-03-01
New image fusion rules for multimodal medical images are proposed in this work. The image fusion rules are defined by a random forest learning algorithm and a translation-invariant à-trous wavelet transform (AWT). The proposed method is threefold. First, the source images are decomposed into approximation and detail coefficients using the AWT. Second, a random forest is used to choose pixels from the approximation and detail coefficients to form the approximation and detail coefficients of the fused image. Lastly, the inverse AWT is applied to reconstruct the fused image. All experiments have been performed on 198 slices of both computed tomography and positron emission tomography images of a patient. A traditional fusion method based on the Mallat wavelet transform has also been implemented on these slices. A new image fusion performance measure, along with 4 existing measures, is presented, which helps to compare the performance of the two pixel-level fusion methods. The experimental results clearly indicate that the proposed method outperforms the traditional method in terms of visual and quantitative qualities and that the new measure is meaningful. Copyright © 2017 John Wiley & Sons, Ltd.
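The coefficient-selection step can be illustrated with a simplified pixel-level rule. Here a max-absolute-value selector stands in for the paper's learned random-forest chooser, and plain averaging stands in for its approximation-band rule; this is a sketch under those assumptions, not the published method:

```python
def fuse_details(d_ct, d_pet):
    # For each pixel, keep the detail coefficient with larger absolute value
    # (a common stand-in rule; the paper instead trains a random forest).
    return [[a if abs(a) >= abs(b) else b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(d_ct, d_pet)]

def fuse_approx(a_ct, a_pet):
    # Average the low-frequency (approximation) bands of the two modalities.
    return [[(a + b) / 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(a_ct, a_pet)]
```

The fused approximation and detail planes would then be passed to the inverse transform to reconstruct the fused image.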
Web-based data collection: detailed methods of a questionnaire and data gathering tool
Cooper, Charles J; Cooper, Sharon P; del Junco, Deborah J; Shipp, Eva M; Whitworth, Ryan; Cooper, Sara R
2006-01-01
There have been dramatic advances in the development of web-based data collection instruments. This paper outlines a systematic web-based approach to facilitate this process through locally developed code and to describe the results of using this process after two years of data collection. We provide a detailed example of a web-based method that we developed for a study in Starr County, Texas, assessing high school students' work and health status. This web-based application includes data instrument design, data entry and management, and data tables needed to store the results that attempt to maximize the advantages of this data collection method. The software also efficiently produces a coding manual, web-based statistical summary and crosstab reports, as well as input templates for use by statistical packages. Overall, web-based data entry using a dynamic approach proved to be a very efficient and effective data collection system. This data collection method expedited data processing and analysis and eliminated the need for cumbersome and expensive transfer and tracking of forms, data entry, and verification. The code has been made available for non-profit use only to the public health research community as a free download [1]. PMID:16390556
Mojsiewicz-Pieńkowska, Krystyna; Jamrógiewicz, Marzena; Zebrowska, Maria; Sznitowska, Małgorzata; Centkowska, Katarzyna
2011-08-25
Silicone polymers possess unique properties which make them suitable for many different applications, for example in the pharmaceutical and medical industries. To create an adhesive silicone film, the appropriate silicone components have to be chosen first. From these components two layers were made: an adhesive elastomer applied to the skin, and a non-adhesive elastomer on the other side of the film. The aim of this study was to identify a set of analytical methods that can be used for detailed characterization of the elastomer layers, as needed when designing new silicone films. More specifically, the following methods were combined for detailed identification of the silicone components: Fourier transform infrared spectroscopy (FTIR), proton nuclear magnetic resonance (¹H NMR) and size exclusion chromatography with an evaporative light scattering detector (SEC-ELSD). It was demonstrated that these methods, together with rheological analysis, are suitable for controlling the cross-linking reaction and thus obtaining the desired properties of the silicone film. Adhesive silicone films can be used as universal materials for medical use, particularly for effective treatment of scars and keloids or as drug carriers in transdermal therapy.
Modeling and visualizing borehole information on virtual globes using KML
NASA Astrophysics Data System (ADS)
Zhu, Liang-feng; Wang, Xi-feng; Zhang, Bing
2014-01-01
Advances in virtual globes and Keyhole Markup Language (KML) are providing the Earth scientists with the universal platforms to manage, visualize, integrate and disseminate geospatial information. In order to use KML to represent and disseminate subsurface geological information on virtual globes, we present an automatic method for modeling and visualizing a large volume of borehole information. Based on a standard form of borehole database, the method first creates a variety of borehole models with different levels of detail (LODs), including point placemarks representing drilling locations, scatter dots representing contacts and tube models representing strata. Subsequently, the level-of-detail based (LOD-based) multi-scale representation is constructed to enhance the efficiency of visualizing large numbers of boreholes. Finally, the modeling result can be loaded into a virtual globe application for 3D visualization. An implementation program, termed Borehole2KML, is developed to automatically convert borehole data into KML documents. A case study of using Borehole2KML to create borehole models in Shanghai shows that the modeling method is applicable to visualize, integrate and disseminate borehole information on the Internet. The method we have developed has potential use in societal service of geological information.
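A borehole record maps to KML with only a few elements; the sketch below shows the point-placemark level of detail (the function name and fields are illustrative, not the Borehole2KML schema):

```python
def borehole_placemark(name, lon, lat, elev=0.0):
    # Minimal KML Placemark for a drilling location.
    # KML orders coordinates as longitude,latitude[,altitude].
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        "<Point>"
        f"<coordinates>{lon},{lat},{elev}</coordinates>"
        "</Point>"
        "</Placemark>"
    )
```

A converter would emit one such placemark per borehole into a KML document, with separate folders (or network links) for the coarser and finer LOD representations.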
NASA Astrophysics Data System (ADS)
Hilburn, Guy Louis
Results from several studies are presented which detail explorations of the physical and spectral properties of low luminosity active galactic nuclei. An initial Sagittarius A* general relativistic magnetohydrodynamic simulation and Monte Carlo radiation transport model suggests accretion rate changes as the dominant flaring mechanism. A similar study on M87 introduces new methods to the Monte Carlo model for increased consistency in highly energetic sources. Again, accretion rate variation seems most appropriate to explain spectral transients. To more closely resolve the mechanisms of particle energization in active galactic nuclei accretion disks, a series of localized shearing box simulations explores the effect of numerical resolution on the development of current sheets. A particular focus on numerically describing converged current sheet formation will provide new methods for the consideration of turbulence in accretion disks.
Control of Technology Transfer at JPL
NASA Technical Reports Server (NTRS)
Oliver, Ronald
2006-01-01
Controlled Technology: 1) Design: preliminary or critical design data, schematics, technical flow charts, SNV code/diagnostics, logic flow diagrams, wirelists, ICDs, detailed specifications or requirements. 2) Development: constraints, computations, configurations, technical analyses, acceptance criteria, anomaly resolution, detailed test plans, detailed technical proposals. 3) Production: process or how-to: assemble, operate, repair, maintain, modify. 4) Manufacturing: technical instructions, specific parts, specific materials, specific qualities, specific processes, specific flow. 5) Operations: how to operate, contingency or standard operating plans, Ops handbooks. 6) Repair: repair instructions, troubleshooting schemes, detailed schematics. 7) Test: specific procedures, data, analysis, detailed test and retest plans, detailed anomaly resolutions, detailed failure causes and corrective actions, troubleshooting, trended test data, flight readiness data. 8) Maintenance: maintenance schedules and plans, methods for regular upkeep, overhaul instructions. 9) Modification: modification instructions, upgrade kit parts, including software
A study for development of aerothermodynamic test model materials and fabrication technique
NASA Technical Reports Server (NTRS)
Dean, W. G.; Connor, L. E.
1972-01-01
A literature survey, materials reformulation and tailoring, fabrication problems, and materials selection and evaluation for fabricating models to be used with the phase-change technique for obtaining quantitative aerodynamic heat transfer data are presented. The study resulted in the selection of the two best materials: Stycast 2762 FT and an alumina ceramic. Characteristics of these materials and detailed fabrication methods are presented.
ERIC Educational Resources Information Center
Lisovskiy, V.; Yegorenkov, V.
2009-01-01
In this paper, we propose a simple method of observing the collision-dominated Child-Langmuir law in the course of an undergraduate laboratory work devoted to studying the properties of gas discharges. To this end we employ the dc gas discharge whose properties are studied in sufficient detail. The undergraduate laboratory work itself is reduced…
ERIC Educational Resources Information Center
Taylor, Liz
2011-01-01
This article demonstrates how a set of complementary qualitative methods can be used to construct a detailed picture not only of the nature of young people's representations of a distant place but the processes of learning by which such representations develop over the medium term. The analysis is based on an interpretive case study of a class of…
ERIC Educational Resources Information Center
Aydin, Miraç
2016-01-01
An important stage in any research inquiry is the development of research questions that need to be answered. The strategies to develop research questions should be defined and described, but few studies have considered this process in greater detail. This study explores pre-service science teachers' research questions and the strategies they can…
2011-01-01
Background Educators in allied health and medical education programs utilize instructional multimedia to facilitate psychomotor skill acquisition in students. This study examines the effects of instructional multimedia on student and instructor attitudes and student study behavior. Methods Subjects consisted of 45 student physical therapists from two universities. Two skill sets were taught during the course of the study. Skill set one consisted of knee examination techniques and skill set two consisted of ankle/foot examination techniques. For each skill set, subjects were randomly assigned to either a control group or an experimental group. The control group was taught with live demonstration of the examination skills, while the experimental group was taught using multimedia. A cross-over design was utilized so that subjects in the control group for skill set one served as the experimental group for skill set two, and vice versa. During the last week of the study, students and instructors completed written questionnaires to assess attitudes toward the teaching methods, and students answered questions regarding study behavior. Results There were no differences between the two instructional groups in attitudes, but students in the experimental group for skill set two reported greater study time alone compared to other groups. Conclusions Multimedia provides an efficient method to teach psychomotor skills to students entering the health professions. Both students and instructors identified advantages and disadvantages of both instructional techniques. Responses relative to instructional multimedia emphasized efficiency, processing level, autonomy, and detail of instruction compared to live presentation. Students and instructors identified conflicting views of instructional detail and control of the content. PMID:21693058
Practical application of a patient satisfaction survey.
Margo, K L; Margo, G M
1990-01-01
Patient satisfaction surveys are a practical method for studying one aspect of quality of care in an HMO. This report details the use of a patient satisfaction survey for this purpose. The focus is on the instrument chosen and the type of analyses carried out. Despite the interest and potential usefulness of the data obtained, the organizational response to the study can be the rate-limiting factor for using the findings. In general, the method can be used for periodic monitoring and as a valid method for detecting or confirming suspected trouble spots in the system. To be successful, the data must be interpreted in the spirit of a shared commitment to quality care.
Detailed and reduced chemical-kinetic descriptions for hydrocarbon combustion
NASA Astrophysics Data System (ADS)
Petrova, Maria V.
Numerical and theoretical studies of autoignition processes of fuels such as propane are in need of realistic simplified chemical-kinetic descriptions that retain the essential features of the detailed descriptions. These descriptions should be computationally feasible and cost-effective. Such descriptions are useful for investigating ignition processes that occur, for example, in homogeneous-charge compression-ignition engines, for studying the structures and dynamics of detonations, and in fields such as multi-dimensional Computational Fluid Dynamics (CFD). Reduced chemistry has previously been developed successfully for a number of other hydrocarbon fuels; however, propane has not been considered in this manner. This work focuses on the fuels propane, propene, allene and propyne, for several reasons. The ignition properties of propane resemble those of other higher hydrocarbons but are different from those of the lower hydrocarbons (e.g. ethylene and acetylene). Propane, therefore, may be the smallest hydrocarbon that is representative of higher hydrocarbons in ignition and detonation processes. Since the overall activation energy and ignition times for propane are similar to those of other higher hydrocarbons, including liquid fuels that are suitable for many applications, propane has been used as a model fuel for several numerical and experimental studies. The reason for studying the elementary chemistry of propene and C3H4 (allene or propyne) is that during the combustion process, propane breaks down to propene and C3H4 before proceeding to products. Similarly, propene combustion includes C3H4 chemistry. In studying propane combustion, it is therefore necessary to understand the underlying combustion chemistry of propene as well as C3H4.
The first part of this thesis focuses on obtaining and testing a detailed chemical-kinetic description for autoignition of propane, propene and C3H4, by comparing predictions obtained with this detailed mechanism against numerous experimental data available from shock-tube studies and flame-speed measurements. To keep the detailed mechanism small, attention is restricted to pressures below about 100 atm, temperatures above about 1000 K and equivalence ratios less than about 3. Based on this detailed chemistry description, short (or skeletal) mechanisms are then obtained for each of the three fuels by eliminating reactions that are unimportant for the autoignition process under the conditions presented above. This was achieved by utilizing tools such as sensitivity and reaction-pathway analyses. Two distinct methodologies were then used to obtain a reduced mechanism for autoignition from the short mechanisms. A systematic-reduction approach is taken first, introducing steady-state approximations for as many species as analytically possible. To avoid resorting to numerical methods, the analysis for obtaining ignition times for heptane presented by Peters and co-workers is followed in order to obtain a rough estimate for an expression of the propane ignition time. The results from this expression are then compared to the ignition times obtained computationally with the detailed mechanism. The second method is an empirical approach in which the chemistry is not derived formally but rather postulated empirically on the basis of experimental, computational and theoretical observations. As a result, generalized reduced mechanisms are proposed for autoignition of propane, propene and C3H4. Expressions for ignition times obtained via this empirical approach are compared to the computational results obtained from the detailed mechanism.
Extending rule-based methods to model molecular geometry and 3D model resolution.
Hoard, Brittany; Jacobson, Bruna; Manavi, Kasra; Tapia, Lydia
2016-08-01
Computational modeling is an important tool for the study of complex biochemical processes associated with cell signaling networks. However, it is challenging to simulate processes that involve hundreds of large molecules due to the high computational cost of such simulations. Rule-based modeling is a method that can be used to simulate these processes with reasonably low computational cost, but traditional rule-based modeling approaches do not include details of molecular geometry. The incorporation of geometry into biochemical models can more accurately capture details of these processes, and may lead to insights into how geometry affects the products that form. Furthermore, geometric rule-based modeling can be used to complement other computational methods that explicitly represent molecular geometry in order to quantify binding site accessibility and steric effects. We propose a novel implementation of rule-based modeling that encodes details of molecular geometry into the rules and binding rates. We demonstrate how rules are constructed according to the molecular curvature. We then perform a study of antigen-antibody aggregation using our proposed method. We simulate the binding of antibody complexes to binding regions of the shrimp allergen Pen a 1 using a previously developed 3D rigid-body Monte Carlo simulation, and we analyze the aggregate sizes. Then, using our novel approach, we optimize a rule-based model according to the geometry of the Pen a 1 molecule and the data from the Monte Carlo simulation. We use the distances between the binding regions of Pen a 1 to optimize the rules and binding rates. We perform this procedure for multiple conformations of Pen a 1 and analyze the impact of conformation and resolution on the optimal rule-based model. We find that the optimized rule-based models provide information about the average steric hindrance between binding regions and the probability that antibodies will bind to these regions. 
These optimized models quantify the variation in aggregate size that results from differences in molecular geometry and from model resolution.
NASA Technical Reports Server (NTRS)
Succi, G. P.
1983-01-01
The techniques of helicopter rotor noise prediction attempt to describe precisely the details of the noise field and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The Farassat noise prediction technique was studied, and high speed helicopter noise prediction using more detailed representations of the thickness and loading noise sources was investigated. These predictions were based on the measured blade surface pressures on an AH-1G rotor and compared to the measured sound field. Although refinements in the representation of the thickness and loading noise sources improve the calculation, there are still discrepancies between the measured and predicted sound fields. Analysis of the blade surface pressure data indicates shocks on the blades, which are probably responsible for these discrepancies.
Collins, Brian D.; Jibson, Randall W.
2015-07-28
This report provides a detailed account of assessments performed in May and June 2015 and focuses on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. First, we provide a seismological background of Nepal and then detail the methods used for both external and in-country data collection and interpretation. Our results consist of an overview of landsliding extent, a characterization of all valley-blocking landslides identified during our work, and a description of video resources that provide high resolution coverage of approximately 1,000 kilometers (km) of river valleys and surrounding terrain affected by the Gorkha earthquake sequence. This is followed by a description of site-specific landslide-hazard assessments conducted while in Nepal and includes detailed descriptions of five noteworthy case studies. Finally, we assess the expectation for additional landslide hazards during the 2015 summer monsoon season.
Informed consent and the readability of the written consent form.
Sivanadarajah, N; El-Daly, I; Mamarelis, G; Sohail, M Z; Bates, P
2017-11-01
Introduction The aim of this study was to objectively ascertain the level of readability of standardised consent forms for orthopaedic procedures. Methods Standardised consent forms (both in summary and detailed formats) endorsed by the British Orthopaedic Association (BOA) were retrieved from orthoconsent.com and assessed for readability. This involved using an online tool to calculate the validated Flesch reading ease score (FRES). This was compared with the FRES for the National Health Service (NHS) Consent Form 1. Data were analysed and interpreted according to the FRES grading table. Results The FRES for Consent Form 1 was 55.6, relating to the literacy expected of an A level student. The mean FRES for the BOA summary consent forms (n=27) was 63.6 (95% confidence interval [CI]: 61.2-66.0) while for the detailed consent forms (n=32), it was 68.9 (95% CI: 67.7-70.0). All BOA detailed forms scored >60, correlating to the literacy expected of a 13-15-year-old. The detailed forms had a higher FRES than the summary forms (p<0.001). Conclusions This study demonstrates that the BOA endorsed standardised consent forms are much easier to read and understand than the NHS Consent Form 1, with the detailed BOA forms being the easiest to read. Despite this, owing to varying literacy levels, a significant proportion of patients may struggle to give informed consent based on the written information provided to them.
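The Flesch reading ease score used in this study is a fixed linear formula over word, sentence, and syllable counts; a minimal sketch follows (reliable syllable counting needs a dictionary, so the counts are taken as inputs here):

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    # FRES = 206.835 - 1.015*(words per sentence) - 84.6*(syllables per word).
    # Higher scores indicate easier text; roughly 60-70 corresponds to the
    # literacy expected of a 13-15-year-old.
    return (206.835
            - 1.015 * (total_words / total_sentences)
            - 84.6 * (total_syllables / total_words))
```

For example, a 100-word passage in 5 sentences with 150 syllables scores about 59.6.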
SCALE Continuous-Energy Eigenvalue Sensitivity Coefficient Calculations
Perfetti, Christopher M.; Rearden, Bradley T.; Martin, William R.
2016-02-25
Sensitivity coefficients describe the fractional change in a system response that is induced by changes to system parameters and nuclear data. The Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, including quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Tracklength importance CHaracterization (CLUTCH) and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE-KENO framework of the SCALE code system to enable TSUNAMI-3D to perform eigenvalue sensitivity calculations using continuous-energy Monte Carlo methods. This work provides a detailed description of the theory behind the CLUTCH method and describes its implementation in detail. This work explores the improvements in eigenvalue sensitivity coefficient accuracy that can be gained through the use of continuous-energy sensitivity methods and also compares several sensitivity methods in terms of computational efficiency and memory requirements.
NASA Astrophysics Data System (ADS)
Śledziewski, Krzysztof
2018-01-01
Material fatigue is one of the most frequent causes of steel bridge failures, particularly in existing bridges. Thus, fatigue life assessment is one of the most relevant procedures in a comprehensive assessment of the load-carrying capacity and service life of a structure. A reliable assessment of the fatigue life is decisive for estimating the remaining service life. Hitherto, calculation methods for welded joints took into account only stresses occurring in the cross sections of whole elements and neglected the stress concentration occurring in the vicinity of the weld, caused by the geometry of the detail. At present, the use of finite element analysis makes possible a more accurate approach to the fatigue design of steel structures. The geometrical stress method is just such an approach, based on the definition of stresses that take into account the geometry of the detail. The study presents a fatigue assessment of a representative type of welded joint in welded bridge structures. The testing covered longitudinal attachments. The main analyses were carried out on the basis of FEM and the method of local stresses, the so-called "hot-spot" stresses. The obtained stress values were compared with the values obtained in accordance with the nominal stress method.
NASA Astrophysics Data System (ADS)
Yu, H.; Russell, A. G.; Mulholland, J. A.
2017-12-01
In air pollution epidemiologic studies with spatially resolved air pollution data, exposures are often estimated using the home locations of individual subjects. Due primarily to lack of data or logistic difficulties, the spatiotemporal mobility of subjects is mostly neglected, which is expected to result in exposure misclassification errors. In this study, we applied detailed cell phone location data to characterize potential exposure misclassification errors associated with home-based exposure estimation of air pollution. The cell phone data sample consists of 9,886 unique simcard IDs collected on one mid-week day in October 2013 from Shenzhen, China. The Community Multiscale Air Quality model was used to simulate hourly ambient concentrations of six chosen pollutants at 3 km spatial resolution, which were then fused with observational data to correct for potential modeling biases and errors. Air pollution exposure for each simcard ID was estimated by matching hourly pollutant concentrations with the detailed location data for the corresponding IDs. Finally, the results were compared with exposure estimates obtained using the home-location method to assess potential exposure misclassification errors. Our results show that the home-based method is likely to have substantial exposure misclassification errors, over-estimating exposures for subjects with higher exposure levels and under-estimating exposures for those with lower exposure levels. This has the potential to lead to a bias toward the null in the health effect estimates. Our findings suggest that the use of cell phone data has the potential to improve the characterization of exposure and exposure misclassification in air pollution epidemiology studies.
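The matching step described above amounts to averaging gridded hourly concentrations over each subject's hourly locations; a minimal sketch under assumed data layouts (the structure and names are illustrative, not the study's actual pipeline):

```python
def time_weighted_exposure(hourly_locations, conc):
    """Average pollutant concentration over a subject's hourly grid cells.

    hourly_locations: list of (hour, cell_id) pairs from the location trace.
    conc: dict mapping (hour, cell_id) -> modeled ambient concentration.
    A home-based estimate would instead use one fixed cell_id for all hours.
    """
    values = [conc[(hour, cell)] for hour, cell in hourly_locations]
    return sum(values) / len(values)
```

Comparing this mobility-based estimate with the fixed-home version for the same subject exposes the misclassification the study quantifies.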
Are insular populations of the Philippine falconet (Microhierax erythrogenys) steps in a cline?
Todd E. Katzner; Nigel J. Collar
2013-01-01
Founder effects, new environments, and competition often produce changes in species colonizing islands, although the resulting endemism sometimes requires molecular identification. One method to identify fruitful areas for more detailed genetic study is through comparative morphological analyses. We measured 210 museum specimens to evaluate the potential morphological...
Oral Health Patterns among Schoolchildren in Mafraq Governorate, Jordan
ERIC Educational Resources Information Center
ALBashtawy, Mohammed
2012-01-01
Little is known about the oral hygiene patterns among schoolchildren in Jordan. A school-based cross-sectional study was performed from January to March 2010. A simple random sampling method was used. Each student participant completed a detailed questionnaire regarding oral hygiene habits. Data were coded and analyzed using SPSS software version…
Program of Studies: Trade and Industrial: Grades 9-12.
ERIC Educational Resources Information Center
Fairfax County Schools, VA.
Part 1 of the trade and industrial education curriculum guide for grades 9-12 contains a brief program overview and Vocational Industrial Clubs of America (VICA) description, more detailed descriptions of in-school and out-of-school programs and program classification methods, a list of references, and charts of various programs and training…
High-resolution solution-state NMR of unfractionated plant cell walls
John Ralph; Fachuang Lu; Hoon Kim; Dino Ress; Daniel J. Yelle; Kenneth E. Hammel; Sally A. Ralph; Bernadette Nanayakkara; Armin Wagner; Takuya Akiyama; Paul F. Schatz; Shawn D. Mansfield; Noritsugu Terashima; Wout Boerjan; Bjorn Sundberg; Mattias Hedenstrom
2009-01-01
Detailed structural studies on the plant cell wall have traditionally been difficult. NMR is one of the preeminent structural tools, but obtaining high-resolution solution-state spectra has typically required fractionation and isolation of components of interest. With recent methods for dissolution of, admittedly, finely divided plant cell wall material, the wall can...
When Worlds Collide: Identity, Culture and the Lived Experiences of Research When "Teaching-Led"
ERIC Educational Resources Information Center
Sharp, John G.; Hemmings, Brian; Kay, Russell; Callinan, Carol
2015-01-01
This article presents detailed findings from the qualitative or interpretive phase of a mixed-methods case study focusing on the professional identities and lived experiences of research among six lecturers working in different capacities across the field of education in a "teaching-led" higher education institution. Building upon the…
Grammatical Processing of Spoken Language in Child and Adult Language Learners
ERIC Educational Resources Information Center
Felser, Claudia; Clahsen, Harald
2009-01-01
This article presents a selective overview of studies that have investigated auditory language processing in children and late second-language (L2) learners using online methods such as event-related potentials (ERPs), eye-movement monitoring, or the cross-modal priming paradigm. Two grammatical phenomena are examined in detail, children's and…
Writing Professor as Adult Learner: An Autoethnography of Online Professional Development
ERIC Educational Resources Information Center
Henning, Teresa Beth
2012-01-01
This paper is a study of the author's experiences taking a six-week, asynchronous, online, faculty development class for educators at the secondary and postsecondary levels. Using autoethnography methods, the author details her learning and the ways her experiences support adult learning theories. Implications of this research suggest that adult…
World Percussion Approaches in Collegiate Percussion Programs: A Mixed-Methods Study
ERIC Educational Resources Information Center
Hernly, Patrick
2012-01-01
As world percussion has grown in popularity in American colleges and universities, two main problems have emerged. The first problem is that no known source exists detailing how percussion instructors have incorporated world percussion into their collegiate teaching. A review of the literature has highlighted four main approaches to incorporating…
Parent Reactions to a School-Based Body Mass Index Screening Program
ERIC Educational Resources Information Center
Johnson, Suzanne Bennett; Pilkington, Lorri L.; Lamp, Camilla; He, Jianghua; Deeb, Larry C.
2009-01-01
Background: This study assessed parent reactions to school-based body mass index (BMI) screening. Methods: After a K-8 BMI screening program, parents were sent a letter detailing their child's BMI results. Approximately 50 parents were randomly selected for interview from each of 4 child weight-classification groups (overweight, at risk of…
Educating Prospective Teachers of Biology: Introduction and Research Methods.
ERIC Educational Resources Information Center
Hewson, Peter W.; Tabachnick, B. Robert; Zeichner, Kenneth M.; Blomker, Kathryn B.; Meyer, Helen; Lemberger, John; Marion, Robin; Park, Hyun-Ju; Toolin, Regina
1999-01-01
Introduces an issue that details a complex study of a science-teacher-education program whose goal was to graduate teachers who held conceptual change conceptions of teaching science and were disposed to put them into practice. Presents a conceptual framework for science-teacher education, and describes the context and major questions of the…
The Impact of Local Labor Market Factors on Army Reserve Accessions.
1986-06-01
Borack et al., 1985: p. 16]. Conversely, non-wage benefits accruing on the moonlighting job will have the opposite effect. Primary wage. The effect of a... Methods which have been used to overcome this problem are discussed in more detail below. 1. Active Force Supply Studies. The large number of recent
Extending Methods: Using Bourdieu's Field Analysis to Further Investigate Taste
ERIC Educational Resources Information Center
Dimick, Alexandra Schindel
2015-01-01
In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article "Signs of taste for science," I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution…
Seeding Success: Schools That Work for Aboriginal Students
ERIC Educational Resources Information Center
Munns, Geoff; O'Rourke, Virginia; Bodkin-Andrews, Gawaian
2013-01-01
This article reports on a large mixed methods research project that investigated the conditions of success for Aboriginal school students. The article presents the qualitative case study component of the research. It details the work of four schools identified as successful for Aboriginal students with respect to social and academic outcomes, and…
ERIC Educational Resources Information Center
Krausert, Christopher R.; Ying, Di; Zhang, Yu; Jiang, Jack J.
2011-01-01
Purpose: Digital kymography and vocal fold curve fitting are blended with detailed symmetry analysis of kymograms to provide a comprehensive characterization of the vibratory properties of injured vocal folds. Method: Vocal fold vibration of 12 excised canine larynges was recorded under uninjured, unilaterally injured, and bilaterally injured…
Does Civic Education Matter?: The Power of Long-Term Observation and the Experimental Method
ERIC Educational Resources Information Center
Claassen, Ryan L.; Monson, J. Quin
2015-01-01
Despite consensus regarding the civic shortcomings of American citizens, no such scholarly consensus exists regarding the effectiveness of civic education addressing political apathy and ignorance. Accordingly, we report the results of a detailed study of students enrolled in introductory American politics courses on the campuses of two large…
An analysis of numerical convergence in discrete velocity gas dynamics for internal flows
NASA Astrophysics Data System (ADS)
Sekaran, Aarthi; Varghese, Philip; Goldstein, David
2018-07-01
The Discrete Velocity Method (DVM) for solving the Boltzmann equation has significant advantages in the modeling of non-equilibrium and near-equilibrium flows compared to other methods, in terms of reduced statistical noise, faster solutions, and the ability to handle transient flows. However, the performance of the DVM for rarefied flow in complex, small-scale geometries, for instance in microelectromechanical systems (MEMS) devices, has yet to be studied in detail. The present study focuses on the performance of the DVM for locally large Knudsen number flows of argon around sharp corners and other sources of discontinuities in the distribution function. Our analysis details the nature of the solution for some benchmark cases and introduces the concept of solution convergence for the transport terms in the discrete velocity Boltzmann equation. The limiting effects of the velocity space discretization are also investigated, and the constraints on obtaining a robust, consistent solution are derived. We propose techniques to maintain solution convergence and demonstrate the implementation of a specific strategy and its effect on the fidelity of the solution for some benchmark cases.
Timing the warm absorber in NGC4051
NASA Astrophysics Data System (ADS)
Silva, C.; Uttley, P.; Costantini, E.
2015-07-01
In this work we have combined spectral and timing analysis to characterize highly ionized outflows in Seyfert galaxies, the so-called warm absorbers. Here, we present our results on the extensive ~600 ks of XMM-Newton archival observations of the bright and highly variable Seyfert 1 galaxy NGC4051, whose spectrum has revealed a complex multi-component wind. Working simultaneously with RGS and PN data, we have performed a detailed analysis using a time-dependent photoionization code in combination with spectral and Fourier timing techniques. This method allows us to study in detail the response of the gas to variations in the ionizing flux of the central source. As a result, we will show the contribution of the recombining gas to the time delays of the most highly absorbed energy bands relative to the continuum (Silva, Uttley & Costantini in prep.), which is also vital information for interpreting the continuum lags associated with propagation and reverberation effects in the inner emitting regions. Furthermore, we will illustrate how this powerful method can be applied to other sources and warm-absorber configurations, allowing for a wide range of studies.
NASA Astrophysics Data System (ADS)
Li, Zhe; Feng, Jinchao; Liu, Pengyu; Sun, Zhonghua; Li, Gang; Jia, Kebin
2018-05-01
Temperature is usually considered a nuisance fluctuation in near-infrared spectral measurement, and chemometric methods have been extensively studied to correct for the effect of temperature variations. However, temperature can also be treated as a constructive parameter that provides detailed chemical information when systematically varied during the measurement. Our group has researched the relationship between temperature-induced spectral variation (TSVC) and normalized squared temperature. In this study, we focused on the influence of the temperature distribution in the calibration set. A multi-temperature calibration set selection (MTCS) method was proposed to improve prediction accuracy by considering the temperature distribution of the calibration samples. Furthermore, a double-temperature calibration set selection (DTCS) method was proposed based on the MTCS method and the relationship between TSVC and normalized squared temperature. We compared the prediction performance of PLS models based on the random sampling method and the proposed methods. Results from experimental studies showed that prediction performance was improved by the proposed methods. MTCS and DTCS are therefore alternative methods for improving prediction accuracy in near-infrared spectral measurement.
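The core of MTCS, as described above, is choosing calibration samples so that their temperatures cover the measurement range rather than sampling at random. A minimal sketch of that selection idea follows; the sample data and the nearest-temperature rule are illustrative assumptions, not the paper's actual criterion:

```python
# Hypothetical sketch: build a calibration set whose samples cover the
# target temperatures evenly, instead of drawing samples at random.

def select_by_temperature(samples, temps_wanted):
    """For each target temperature, pick the sample measured closest to it."""
    return [min(samples, key=lambda s: abs(s["temp"] - t)) for t in temps_wanted]

# Five illustrative samples measured at scattered temperatures (deg C).
samples = [{"id": i, "temp": t} for i, t in enumerate([20.1, 24.9, 30.2, 35.0, 40.3])]

calib = select_by_temperature(samples, [20, 30, 40])
ids = [s["id"] for s in calib]  # one sample near each target temperature
```

The selected subset spans the low, middle, and high ends of the temperature range, which is the property the study argues improves PLS prediction under temperature variation.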
Ovchinnikov, Victor; Karplus, Martin; Vanden-Eijnden, Eric
2011-01-01
A set of techniques developed under the umbrella of the string method is used in combination with all-atom molecular dynamics simulations to analyze the conformation change between the prepowerstroke (PPS) and rigor (R) structures of the converter domain of myosin VI. The challenges specific to the application of these techniques to such a large and complex biomolecule are addressed in detail. These challenges include (i) identifying a proper set of collective variables to apply the string method, (ii) finding a suitable initial string, (iii) obtaining converged profiles of the free energy along the transition path, (iv) validating and interpreting the free energy profiles, and (v) computing the mean first passage time of the transition. A detailed description of the PPS↔R transition in the converter domain of myosin VI is obtained, including the transition path, the free energy along the path, and the rates of interconversion. The methodology developed here is expected to be useful more generally in studies of conformational transitions in complex biomolecules. PMID:21361558
NASA Astrophysics Data System (ADS)
Zahid, F.; Paulsson, M.; Polizzi, E.; Ghosh, A. W.; Siddiqui, L.; Datta, S.
2005-08-01
We present a transport model for molecular conduction involving an extended Hückel theoretical treatment of the molecular chemistry combined with a nonequilibrium Green's function treatment of quantum transport. The self-consistent potential is approximated by CNDO (complete neglect of differential overlap) method and the electrostatic effects of metallic leads (bias and image charges) are included through a three-dimensional finite element method. This allows us to capture spatial details of the electrostatic potential profile, including effects of charging, screening, and complicated electrode configurations employing only a single adjustable parameter to locate the Fermi energy. As this model is based on semiempirical methods it is computationally inexpensive and flexible compared to ab initio models, yet at the same time it is able to capture salient qualitative features as well as several relevant quantitative details of transport. We apply our model to investigate recent experimental data on alkane dithiol molecules obtained in a nanopore setup. We also present a comparison study of single molecule transistors and identify electronic properties that control their performance.
A short introduction to cytogenetic studies in mammals with reference to the present volume.
Graphodatsky, A; Ferguson-Smith, M A; Stanyon, R
2012-01-01
Genome diversity has long been studied from the comparative cytogenetic perspective. Early workers documented differences between species in diploid chromosome number and fundamental number. Banding methods allowed more detailed descriptions of between-species rearrangements and classes of differentially staining chromosome material. The infusion of molecular methods into cytogenetics provided a third revolution, which is still not exhausted. Chromosome painting has provided a global view of the translocation history of mammalian genome evolution, well summarized in the contributions to this special volume. More recently, FISH of cloned DNA has provided details on defining breakpoint and intrachromosomal marker order, which have helped to document inversions and centromere repositioning. The most recent trend in comparative molecular cytogenetics is to integrate sequencing information in order to formulate and test reconstructions of ancestral genomes and phylogenomic hypotheses derived from comparative cytogenetics. The integration of comparative cytogenetics and sequencing promises to provide an understanding of what drives chromosome rearrangements and genome evolution in general. We believe that the contributions in this volume, in no small way, point the way to the next phase in cytogenetic studies. Copyright © 2012 S. Karger AG, Basel.
The geography of spatial synchrony.
Walter, Jonathan A; Sheppard, Lawrence W; Anderson, Thomas L; Kastens, Jude H; Bjørnstad, Ottar N; Liebhold, Andrew M; Reuman, Daniel C
2017-07-01
Spatial synchrony, defined as correlated temporal fluctuations among populations, is a fundamental feature of population dynamics, but many aspects of synchrony remain poorly understood. Few studies have examined detailed geographical patterns of synchrony; instead most focus on how synchrony declines with increasing linear distance between locations, making the simplifying assumption that distance decay is isotropic. By synthesising and extending prior work, we show how geography of synchrony, a term which we use to refer to detailed spatial variation in patterns of synchrony, can be leveraged to understand ecological processes including identification of drivers of synchrony, a long-standing challenge. We focus on three main objectives: (1) showing conceptually and theoretically four mechanisms that can generate geographies of synchrony; (2) documenting complex and pronounced geographies of synchrony in two important study systems; and (3) demonstrating a variety of methods capable of revealing the geography of synchrony and, through it, underlying organism ecology. For example, we introduce a new type of network, the synchrony network, the structure of which provides ecological insight. By documenting the importance of geographies of synchrony, advancing conceptual frameworks, and demonstrating powerful methods, we aim to help elevate the geography of synchrony into a mainstream area of study and application. © 2017 John Wiley & Sons Ltd/CNRS.
The use of geoscience methods for terrestrial forensic searches
NASA Astrophysics Data System (ADS)
Pringle, J. K.; Ruffell, A.; Jervis, J. R.; Donnelly, L.; McKinley, J.; Hansen, J.; Morgan, R.; Pirrie, D.; Harrison, M.
2012-08-01
Geoscience methods are increasingly being utilised in criminal, environmental and humanitarian forensic investigations, and the use of such methods is supported by a growing body of experimental and theoretical research. Geoscience search techniques can complement traditional methodologies in the search for buried objects, including clandestine graves, weapons, explosives, drugs, hazardous waste and vehicles. This paper details recent advances in search and detection methods, with case studies and reviews. Relevant examples are given, together with a generalised workflow for search and a table of suggested detection techniques. Forensic geoscience techniques are continuing to evolve rapidly to assist search investigators in detecting hitherto difficult-to-locate forensic targets.
NASA Technical Reports Server (NTRS)
1974-01-01
Major resource management missions to be performed by the TERSSE are examined in order to develop an understanding of the form and function of a system designed to perform an operational mission. Factors discussed include: resource manager (user) functions, methods of performing their function, the information flows and information requirements embodied in their function, and the characteristics of the observation system which assists in the management of the resource involved. The missions selected for study are: world crop survey and land resources management. These missions are found to represent opposite ends of the TERSSE spectrum and to support the conclusion that different missions require different systems and must be analyzed in detail to permit proper system development decisions.
Anatomical exploration of a dicephalous goat kid using sheet plastination (E12).
Elnady, Fawzy; Sora, Mircea-Constantin
2009-06-01
A dicephalous, 1-day-old, female goat kid was presented for anatomical study. Epoxy plastination slices (E12) were used successfully to explore this condition. They provided excellent anatomic and bone detail, demonstrating organ position, shared structures, and vascular anatomy. Sheet plastination (E12) was used as an optimal method to clarify how the two heads were united, especially the neuroanatomy. The plastinated transparent slices allowed detailed study of the anatomical structures, in a non-collapsed and non-dislocated state. Thus, we anatomically explored this rare condition without traditional dissection. The advantages of plastination extended to the preservation at room temperature of this case for further topographical investigation. To the authors' best knowledge, this is the first published report of plastination of a dicephalous goat.
Comparison of the spatial landmark scatter of various 3D digitalization methods.
Boldt, Florian; Weinzierl, Christian; Hertrich, Klaus; Hirschfelder, Ursula
2009-05-01
The aim of this study was to compare four different three-dimensional digitalization methods on the basis of the complex anatomical surface of a cleft lip and palate plaster cast, and to ascertain their accuracy when positioning 3D landmarks. A cleft lip and palate plaster cast was digitalized with the SCAN3D photo-optical scanner, the OPTIX 400S laser-optical scanner, the Somatom Sensation 64 computed tomography system and the MicroScribe MLX 3-axis articulated-arm digitizer. First, four examiners appraised, by individual visual inspection, the surface detail reproduction of the three non-tactile digitalization methods in comparison to the reference plaster cast. The four examiners then localized the landmarks five times at intervals of 2 weeks, copying, or spatially tracing, the landmarks from the reference plaster cast to each model digitally reproduced by each digitalization method. Statistical analysis of the method-specific landmark distribution was performed on the 3D coordinates of the positioned landmarks. Visual evaluation of surface detail conformity assigned the photo-optical digitalization method an average score of 1.5, the highest subjectively determined conformity, surpassing the computed tomographic and laser-optical methods. The tactile scanning method showed the lowest mean 3D landmark scatter, 0.12 mm, and, at 1.01 mm, the lowest maximum 3D landmark scatter, followed by the computed tomographic, photo-optical and laser-optical methods, in that order. This study demonstrates that landmark precision and reproducibility are determined by the complexity of the reference-model surface as well as by the digital surface quality and each evaluator's ability to capture 3D spatial relationships. The differences in 3D landmark scatter and in lowest maximum 3D landmark scatter between the best and worst methods were minor. The measurement results in this study reveal that it is not the method's precision but rather the complexity of the planned object analysis that should determine which method is ultimately employed.
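One simple way to quantify the "3D landmark scatter" reported above is the mean distance of repeated landmark placements from their centroid. The study's exact statistic is not specified here, so this is an illustrative sketch with made-up coordinates:

```python
import math

def landmark_scatter(points):
    """Mean Euclidean distance of repeated 3D landmark placements from
    their centroid -- one plausible measure of placement 'scatter'."""
    n = len(points)
    centroid = (sum(p[0] for p in points) / n,
                sum(p[1] for p in points) / n,
                sum(p[2] for p in points) / n)
    return sum(math.dist(p, centroid) for p in points) / n

# Five hypothetical placements (mm) of one landmark by one examiner.
reps = [(10.0, 5.0, 2.0), (10.1, 5.0, 2.0), (9.9, 5.0, 2.0),
        (10.0, 5.1, 2.0), (10.0, 4.9, 2.0)]

scatter = landmark_scatter(reps)  # small value => high reproducibility
```

A scatter near 0.1 mm for these tightly clustered placements is on the same order as the 0.12 mm value the study reports for the tactile digitizer.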
Royle, J. Andrew; Chandler, Richard B.; Sollmann, Rahel; Gardner, Beth
2013-01-01
Spatial Capture-Recapture provides a revolutionary extension of traditional capture-recapture methods for studying animal populations using data from live trapping, camera trapping, DNA sampling, acoustic sampling, and related field methods. This book is a conceptual and methodological synthesis of spatial capture-recapture modeling. As a comprehensive how-to manual, this reference contains detailed examples of a wide range of relevant spatial capture-recapture models for inference about population size and spatial and temporal variation in demographic parameters. Practicing field biologists studying animal populations will find this book to be a useful resource, as will graduate students and professionals in ecology, conservation biology, and fisheries and wildlife management.
Ride quality research techniques: Section on general techniques
NASA Technical Reports Server (NTRS)
1977-01-01
Information is gathered about the methods currently used for the study of ride quality in a variety of transportation modes by a variety of research organizations, including universities, Federal agencies, contracting firms, and private industries. Detailed descriptions of these techniques, their strengths and weaknesses, and the organizations using them are presented. The specific efforts of the Group's participants, as well as a variety of feasible approaches not currently in use, are presented as methodological alternatives under the three basic factors which must be considered in ride quality studies: research techniques, research environments, and choice of subjects.
Facebook or Twitter?: Effective recruitment strategies for family caregivers.
Herbell, Kayla; Zauszniewski, Jaclene A
2018-06-01
This brief details recent recruitment insights from a large all-online study of family caregivers that aimed to develop a measure to assess how family caregivers manage daily stresses. Online recruitment strategies included the use of Twitter and Facebook. Overall, 800 individuals responded to the recruitment strategy; 230 completed all study procedures. The most effective online recruitment strategy for targeting family caregivers was Facebook, yielding 86% of the sample. Future researchers may find the use of social media recruitment methods appealing because they are inexpensive, simple, and efficient methods for obtaining national samples. Copyright © 2018 Elsevier Inc. All rights reserved.
Ding, Xiuhua; Su, Shaoyong; Nandakumar, Kannabiran; Wang, Xiaoling; Fardo, David W
2014-01-01
Large-scale genetic studies are often composed of related participants, and utilizing familial relationships can be cumbersome and computationally challenging. We present an approach to efficiently handle sequencing data from complex pedigrees that incorporates information from rare variants as well as common variants. Our method employs a 2-step procedure that sequentially regresses out correlation from familial relatedness and then uses the resulting phenotypic residuals in a penalized regression framework to test for associations with variants within genetic units. The operating characteristics of this approach are detailed using simulation data based on a large, multigenerational cohort.
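The 2-step procedure described above can be caricatured as follows: step 1 removes familial correlation from the phenotype, and step 2 passes the residuals to a penalized regression on variant genotypes. In this sketch the family-mean centering is a crude stand-in for the authors' regression on relatedness, and all data are illustrative:

```python
# Step 1 of the 2-step idea (illustrative only): remove family-level
# correlation from the phenotype by centering each family on its mean.
# The residuals would then feed a penalized (e.g. lasso) regression on
# genetic variants in step 2.

def family_residuals(phenotypes, family_ids):
    """Subtract each family's mean phenotype from its members."""
    groups = {}
    for fid, y in zip(family_ids, phenotypes):
        groups.setdefault(fid, []).append(y)
    means = {fid: sum(ys) / len(ys) for fid, ys in groups.items()}
    return [y - means[fid] for fid, y in zip(family_ids, phenotypes)]

# Two hypothetical families of two members each.
resid = family_residuals([2.0, 4.0, 1.0, 3.0], ["A", "A", "B", "B"])
# Family A mean = 3.0, family B mean = 2.0
```

After centering, the residuals carry no between-family differences, so step 2's association test is not confounded by familial relatedness, which is the point of the sequential design.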
Supermarket sales data: feasibility and applicability in population food and nutrition monitoring.
Tin, Sandar Tin; Mhurchu, Cliona Ni; Bullen, Chris
2007-01-01
Population food and nutrition monitoring plays a critical role in understanding suboptimal nutrition at the population level, yet current monitoring methods such as national surveys are not practical to undertake on a continuous basis. Supermarket sales data potentially address this gap by providing detailed, timely, and inexpensive monitoring data for informing policies and anticipating trends. This paper reviews 22 studies that used supermarket sales data to examine food purchasing patterns. Despite some methodological limitations, feasibility studies showed promising results. The potential and limitations of using supermarket sales data to supplement food and nutrition monitoring methods are discussed.
An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle
NASA Astrophysics Data System (ADS)
Wang, Yue; Gao, Dan; Mao, Xuming
2018-03-01
A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and validation method for the requirements are introduced in detail in the study, with the hope of providing this experience to other civil jet product design efforts.
NASA Astrophysics Data System (ADS)
Majka, Marcin; Gadda, Giacomo; Taibi, Angelo; Gałązka, Mirosław; Zieliński, Piotr
2017-03-01
We have developed a numerical simulation method for predicting the time dependence (wave form) of pressure at any location in the human systemic arterial system. The method uses the MATLAB-Simulink environment. The input data explicitly include the geometry of the arterial tree, treated up to an arbitrary bifurcation level, the elastic properties of the arteries, and the rheological parameters of blood. Thus, the impact of the anatomic details of an individual subject can be studied. The method is applied here to reveal the earliest stages of the mechanical reaction of the pressure profiles to sudden local blockages (thromboses or embolisms) of selected arteries. The results obtained with a purely passive model provide reference data indispensable for studies of longer-term effects due to neural and humoral mechanisms. The reliability of the results has been checked by comparing two available sets of anatomic, elastic, and rheological data involving (i) 55 and (ii) 138 arterial segments. The remaining arteries have been replaced with appropriate resistive elements. Both models are efficient in predicting an overall shift of pressure, whereas the accuracy of the 55-segment model in reproducing the detailed wave forms and stabilization times turns out to be dependent on the location of the blockage and the observation point.
Mechanisms of small molecule–DNA interactions probed by single-molecule force spectroscopy
Almaqwashi, Ali A.; Paramanathan, Thayaparan; Rouzina, Ioulia; Williams, Mark C.
2016-01-01
There is a wide range of applications for non-covalent DNA binding ligands, and optimization of such interactions requires detailed understanding of the binding mechanisms. One important class of these ligands is that of intercalators, which bind DNA by inserting aromatic moieties between adjacent DNA base pairs. Characterizing the dynamic and equilibrium aspects of DNA-intercalator complex assembly may allow optimization of DNA binding for specific functions. Single-molecule force spectroscopy studies have recently revealed new details about the molecular mechanisms governing DNA intercalation. These studies can provide the binding kinetics and affinity as well as determining the magnitude of the double helix structural deformations during the dynamic assembly of DNA–ligand complexes. These results may in turn guide the rational design of intercalators synthesized for DNA-targeted drugs, optical probes, or integrated biological self-assembly processes. Herein, we survey the progress in experimental methods as well as the corresponding analysis framework for understanding single molecule DNA binding mechanisms. We discuss briefly minor and major groove binding ligands, and then focus on intercalators, which have been probed extensively with these methods. Conventional mono-intercalators and bis-intercalators are discussed, followed by unconventional DNA intercalation. We then consider the prospects for using these methods in optimizing conventional and unconventional DNA-intercalating small molecules. PMID:27085806
Klieve, Helen; Sveticic, Jerneja; De Leo, Diego
2009-01-01
Background The 1996 Australian National Firearms Agreement introduced strict access limitations. However, reports on the effectiveness of the new legislation are conflicting. This study, accessing all cases of suicide 1997-2004, explores factors which may impact on the choice of firearms as a suicide method, including current licence possession and previous history of legal access. Methods Detailed information on all Queensland suicides (1997-2004) was obtained from the Queensland Suicide Register, with additional details of firearm licence history accessed from the Firearm Registry (Queensland Police Service). Cases were compared against licence history and method choice (firearms or other method). Odds ratios (OR) assessed the risk of firearms suicide and suicide by any method against licence history. A logistic regression was undertaken identifying factors significant in those most likely to use firearms in suicide. Results The rate of suicide using firearms among those with a current licence (10.92 per 100,000) far exceeded the rate among those with no licence history (1.03 per 100,000). Those with a licence history had a far higher rate of suicide (30.41 per 100,000) than the rate for all suicides (15.39 per 100,000). Additionally, a history of a firearms licence (current or past) was found to more than double the risk of suicide by any means (OR = 2.09, P < 0.001). The group at highest risk of selecting firearms as a suicide method was older males from rural locations. Conclusion Accessibility and familiarity with firearms represent critical elements in determining the choice of method. Further licensing restrictions and the implementation of more stringent secure storage requirements are likely to reduce the overall familiarity with firearms in the community and contribute to reductions in rates of suicide. PMID:19778414
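The odds ratios reported above come from standard 2x2-table calculations; a minimal sketch with hypothetical counts (not the Queensland data):

```python
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table:
                 outcome+  outcome-
    exposed         a         b
    unexposed       c         d
    """
    return (a * d) / (b * c)

# Hypothetical counts: suicides (outcome+) and living controls (outcome-)
# cross-classified by firearm-licence history (exposed vs unexposed).
or_licence = odds_ratio(200, 900, 1000, 9000)  # (200*9000)/(900*1000) = 2.0
```

An OR of 2.0 in this toy table would be read the same way as the study's OR of 2.09: licence history roughly doubles the odds of the outcome.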
Hill, Marilyn H E; Bradley, Angela; Mushtaq, Sohail; Williams, Elizabeth A; Powers, Hilary J
2009-07-01
Riboflavin status is usually measured as the in vitro stimulation, with flavin adenine dinucleotide (FAD), of the erythrocyte enzyme glutathione reductase, and expressed as an erythrocyte glutathione reductase activation coefficient (EGRAC). This method is used for the National Diet and Nutrition Surveys (NDNS) of the UK. In the period between the 1990 and 2003 surveys of UK adults, the estimated prevalence of riboflavin deficiency, expressed as an EGRAC value ≥1.30, increased from 2 to 46% in males and from 1 to 34% in females. We hypothesised that subtle but important differences in the detail of the methodology between the two NDNS accounted for this difference. We carried out an evaluation of the performance of the methods used in the two NDNS and compared them against an 'in-house' method, using blood samples collected from a riboflavin intervention study. Results indicated that the method used for the 1990 NDNS gave a significantly lower mean EGRAC value than both the 2003 NDNS method and the 'in-house' method (P < 0.0001). The key differences between the methods relate to the concentration of FAD used in the assay and the duration of the incubation of FAD with the enzyme. The details of the EGRAC method should be standardised for use in different laboratories and over time. Additionally, it is proposed that consideration be given to re-evaluating the basis of the EGRAC threshold for riboflavin deficiency.
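The EGRAC statistic described above is simply the ratio of FAD-stimulated to basal glutathione reductase activity, with ≥1.30 as the deficiency cut-off used in the NDNS; a minimal sketch with illustrative activity values:

```python
def egrac(fad_stimulated_activity, basal_activity):
    """Erythrocyte glutathione reductase activation coefficient:
    ratio of FAD-stimulated to basal enzyme activity."""
    return fad_stimulated_activity / basal_activity

def riboflavin_deficient(coefficient, threshold=1.30):
    """EGRAC >= 1.30 is the deficiency cut-off used in the NDNS."""
    return coefficient >= threshold

# Illustrative enzyme activities (arbitrary units), not survey data.
ratio = egrac(2.6, 2.0)          # = 1.3, right at the cut-off
deficient = riboflavin_deficient(ratio)
```

Because the classification hinges on this single ratio, the assay details the study highlights (FAD concentration, incubation time) shift measured EGRAC values and hence the apparent prevalence of deficiency.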