The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data
NASA Technical Reports Server (NTRS)
Brown, E. N.; Czeisler, C. A.
1992-01-01
Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
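As a rough illustration of the harmonic-regression step described above, the following sketch fits a mean plus two harmonics of an assumed 24.2 h period to a core-temperature series by ordinary least squares and reads off the amplitude and phase of the fundamental. It is a minimal stand-in, not the authors' procedure: the correlated-noise term and the Bayesian Monte Carlo uncertainty computation are omitted, and the data are synthetic.

```python
import numpy as np

def fit_harmonic_regression(t_hours, temp, period_h=24.2, n_harmonics=2):
    """Least-squares fit of temp ~ mean + sum_k [a_k cos(k*w*t) + b_k sin(k*w*t)].

    Returns the fitted mean and the amplitude/phase of the fundamental component.
    This is a simplified stand-in for the harmonic-regression-plus-correlated-noise
    model (no autoregressive noise, no Bayesian uncertainty step).
    """
    w = 2.0 * np.pi / period_h
    cols = [np.ones_like(t_hours)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * w * t_hours), np.sin(k * w * t_hours)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
    a1, b1 = beta[1], beta[2]
    amplitude = np.hypot(a1, b1)                    # deg C
    phase_h = (np.arctan2(b1, a1) / w) % period_h   # time of fitted maximum, hours past t=0
    return beta[0], amplitude, phase_h

# Synthetic example: a 40 h constant-routine series sampled every 6 minutes.
t = np.arange(0, 40, 0.1)
rng = np.random.default_rng(0)
temp = 37.0 + 0.4 * np.cos(2 * np.pi / 24.2 * (t - 18.0)) + 0.05 * rng.standard_normal(t.size)
mean, amp, phase = fit_harmonic_regression(t, temp)
print(f"mean={mean:.2f} C  amplitude={amp:.2f} C  phase (h past start)={phase:.1f}")
```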
Dobbins, T J; Ida, K; Suzuki, C; Yoshinuma, M; Kobayashi, T; Suzuki, Y; Yoshida, M
2017-09-01
A new Motional Stark Effect (MSE) analysis routine has been developed for improved spatial resolution in the core of the Large Helical Device (LHD). The routine was developed to reduce the dependency of the analysis on the Pfirsch-Schlüter (PS) current in the core. The technique uses the change in the polarization angle as a function of flux to find the derivative of the rotational transform with respect to flux (diota/dflux) at each measurement location; integrating inward from the edge then recovers the iota profile. This reduces the results' dependency on the PS current because the effect of the PS current on the MSE measurement is almost constant as a function of flux in the core; therefore, the uncertainty in the PS current has a minimal effect on the calculated iota profile. In addition, the VMEC database was remapped from flux into r/a space by interpolating in mode space in order to improve the database core resolution. These changes resulted in a much smoother iota profile, conforming more closely to the physics expectations of standard discharge scenarios in the core of the LHD.
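A minimal numerical sketch of the inward-integration step described above: given values of diota/dflux on a flux grid and a known edge value of iota, the profile is recovered by cumulative integration from the edge toward the core. The flux grid, gradient profile, and edge value below are illustrative assumptions, not LHD measurements or the actual MSE routine.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def recover_iota(flux, diota_dflux, iota_edge):
    """Integrate diota/dflux inward from the edge to recover iota(flux).

    Assumes `flux` increases monotonically toward the edge; iota is known at
    the last grid point (the edge) and reconstructed inward from there.
    """
    # partial[i] = integral of diota/dflux from flux[0] to flux[i]
    partial = np.concatenate(([0.0], cumulative_trapezoid(diota_dflux, flux)))
    integral_to_edge = partial[-1] - partial     # integral from flux[i] to the edge
    return iota_edge - integral_to_edge

# Illustrative example (not LHD data): constant gradient of 0.8 over the profile.
flux = np.linspace(0.0, 1.0, 51)                 # normalized flux
diota_dflux = 0.8 * np.ones_like(flux)
iota = recover_iota(flux, diota_dflux, iota_edge=1.0)
print(iota[0], iota[-1])                         # ~0.2 in the core, 1.0 at the edge
```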
Role of CT scanning in formation evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergosh, J.L.; Dibona, B.G.
1988-01-01
The use of the computerized tomographic (CT) scanner in formation evaluation of difficult-to-analyze core samples has moved from the research and development phase to daily, routine use in the core-analysis laboratory. The role of the CT scanner has become increasingly important as geologists try to obtain more representative core material for accurate formation evaluation. The most common problem facing the core analyst when preparing to measure petrophysical properties is the selection of representative and unaltered core samples for routine and special core testing. Recent data have shown that heterogeneous reservoir rock can be very difficult, if not impossible, to assess correctly when using standard core examination procedures, because many features, such as fractures, are not visible on the core surface. Another problem is the invasion of drilling mud into the core sample. Flushing formation oil and water from the core can greatly alter the saturation and distribution of fluids and lead to serious formation evaluation problems. Because the quality and usefulness of the core data are directly tied to proper sample selection, it has become imperative that the CT scanner be used whenever possible.
2013-01-01
Background: Studies that have looked at the effect of polio eradication efforts in India on routine immunization programs have provided mixed findings. One polio eradication project, funded by the US Agency for International Development (USAID) and carried out by the CORE Group Polio Project (CGPP) in the state of Uttar Pradesh of India, has included the strengthening of routine immunization systems as a core part of its polio eradication strategy. This paper explores the performance of routine immunization services in the CGPP intervention areas concurrent with intensive polio eradication activities. The paper also explores determinants of routine immunization performance such as caretaker characteristics and CGPP activities to strengthen routine immunization services. Methods: We conduct secondary data analysis of the latest project household immunization survey in 2011 and compare these findings to reports of past surveys in the CGPP program area and at the Uttar Pradesh state level (as measured by children’s receipt of DPT vaccinations). This is done to judge if there is any evidence that routine immunization services are being disrupted. We also model characteristics of survey respondents and respondents’ exposure to CGPP communication activities against their children’s receipt of key vaccinations in order to identify determinants of routine immunization coverage. Results: Routine immunization coverage has increased between the first survey (2005 for state level estimates, 2008 for the CGPP program) and the latest (2011 for both state level and CGPP areas), as measured by children’s receipt of DPT vaccination. This increase occurred concurrent with polio eradication efforts intensive enough to result in interruption of transmission. In addition, a mother’s exposure to specific communication materials, her religion and her education were associated with whether or not her children received one or more doses of DPT. Conclusions: A limitation of the analysis is the absence of a controlled comparison. It is possible routine immunization coverage would have increased even more in the absence of polio eradication efforts. At the same time, however, there is no evidence that routine immunization services were disrupted by polio eradication efforts. Targeted health communications are helpful in improving routine immunization performance. Strategies to address other determinants of routine immunization, such as religion and education, are also needed to maximize coverage. PMID:23680228
Santana, Márcia Rosane Moreira; da Silva, Marília Marques; de Moraes, Danielle Souza; Fukuda, Cláudia Cristina; Freitas, Lucia Helena; Ramos, Maria Eveline Cascardo; Fleury, Heloísa Junqueira; Evans, Chris
2015-01-01
The Clinical Outcomes in Routine Evaluation - Outcome Measure (CORE-OM) was developed in the 1990s with the aim of assessing the efficacy and effectiveness of mental health treatments. The aim of this study was to adapt the CORE-OM for use in the Brazilian population. The instrument was translated and adapted based on the international protocol developed by the CORE System Trust, which comprises seven steps, including translation, semantic equivalence analysis, synthesis of the translated versions, pre-testing in the target population, data analysis and back-translation. After semantic analysis, modifications were necessary in seven of the 34 original items. Changes were made to avoid repetition of words and the use of terms that are difficult to understand. Internal consistency analysis showed evidence of score stability in the CORE-OM adapted to Brazilian Portuguese. The instrument was successfully adapted to Brazilian Portuguese, and its semantic and conceptual properties were equivalent to those of the original instrument.
Brazier, John E.; Rowen, Donna; Barkham, Michael
2013-01-01
Background. The Clinical Outcomes in Routine Evaluation–Outcome Measure (CORE-OM) is used to evaluate the effectiveness of psychological therapies in people with common mental disorders. The objective of this study was to estimate a preference-based index for this population using CORE-6D, a health state classification system derived from the CORE-OM consisting of a 5-item emotional component and a physical item, and to demonstrate a novel method for generating states that are not orthogonal. Methods. Rasch analysis was used to identify 11 emotional health states from CORE-6D that were frequently observed in the study population and are, thus, plausible (in contrast, conventional statistical design might generate implausible states). Combined with the 3 response levels of the physical item of CORE-6D, they generate 33 plausible health states, 18 of which were selected for valuation. A valuation survey of 220 members of the public in South Yorkshire, United Kingdom, was undertaken using the time tradeoff (TTO) method. Regression analysis was subsequently used to predict values for all possible states described by CORE-6D. Results. A number of multivariate regression models were built to predict values for the 33 health states of CORE-6D, using the Rasch logit value of the emotional state and the response level of the physical item as independent variables. A cubic model with high predictive value (adjusted R2 = 0.990) was selected to predict TTO values for all 729 CORE-6D health states. Conclusion. The CORE-6D preference-based index will enable the assessment of cost-effectiveness of interventions for people with common mental disorders using existing and prospective CORE-OM data sets. The new method for generating states may be useful for other instruments with highly correlated dimensions. PMID:23178639
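As a small illustration of the valuation-modelling step described above (not the authors' exact specification), the sketch below fits a cubic polynomial in the emotional-state Rasch logit, plus a term for the physical-item response level, to TTO values by ordinary least squares and then predicts a value for an arbitrary state. The data and the precise functional form are assumptions for the example.

```python
import numpy as np

def fit_cubic_value_model(logit, physical, tto):
    """OLS fit of TTO ~ 1 + logit + logit^2 + logit^3 + physical (illustrative form)."""
    X = np.column_stack([np.ones_like(logit), logit, logit**2, logit**3, physical])
    beta, *_ = np.linalg.lstsq(X, tto, rcond=None)
    return beta

def predict(beta, logit, physical):
    return beta[0] + beta[1]*logit + beta[2]*logit**2 + beta[3]*logit**3 + beta[4]*physical

# Synthetic valuation data (placeholders, not the South Yorkshire survey values):
rng = np.random.default_rng(2)
logit = rng.uniform(-3, 3, 18)           # Rasch logit of the 18 valued emotional states
physical = rng.integers(0, 3, 18).astype(float)   # physical-item response level (0-2)
tto = 0.9 - 0.08*logit - 0.01*logit**3 - 0.05*physical + 0.02*rng.standard_normal(18)

beta = fit_cubic_value_model(logit, physical, tto)
print(predict(beta, logit=np.array([0.0]), physical=np.array([1.0])))
```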
Jagust, William J.; Landau, Susan M.; Koeppe, Robert A.; Reiman, Eric M.; Chen, Kewei; Mathis, Chester A.; Price, Julie C.; Foster, Norman L.; Wang, Angela Y.
2015-01-01
Introduction: This paper reviews the work done in the ADNI PET core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. Methods: The PET Core has utilized [18F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of FDG-PET in clinical trials, and relationships between different biomarkers and cognition. Results: Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. Conclusion: The PET Core has demonstrated a variety of methods for standardization of biomarkers such as florbetapir PET in a multicenter setting. PMID:26194311
BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.
1981-06-01
This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.
NASA Astrophysics Data System (ADS)
Ja'fari, Ahmad; Hamidzadeh Moghadam, Rasoul
2012-10-01
Routine core analysis provides useful information for petrophysical study of hydrocarbon reservoirs. Effective porosity and fluid conductivity (permeability) can be obtained from core analysis in the laboratory, but coring hydrocarbon-bearing intervals and analyzing the cores in the laboratory is expensive and time consuming. In this study an improved method is proposed to make a quantitative correlation between porosity and permeability obtained from core and conventional well log data by integrating different artificial intelligence systems. The proposed method combines the results of adaptive neuro-fuzzy inference system (ANFIS) and neural network (NN) algorithms for overall estimation of core data from conventional well log data, multiplying the output of each algorithm by a weight factor. Simple averaging and weighted averaging were used for determining the weight factors; in the weighted averaging method the genetic algorithm (GA) is used to determine the weight factors. The overall algorithm was applied to one of SW Iran's oil fields with two cored wells. One-third of the data were used as the test dataset and the rest were used for training the networks. Results show that the output of the GA averaging method provided the lowest mean square error and the best correlation coefficient with real core data.
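A minimal sketch of the combination step described above: the outputs of two trained estimators are merged with a weight chosen to minimize mean squared error against core measurements. A simple one-dimensional grid search stands in for the genetic algorithm, and the two sets of "model predictions" are synthetic placeholders rather than real ANFIS/NN outputs.

```python
import numpy as np

def best_weights(pred_a, pred_b, target, step=0.01):
    """Search w in [0, 1] minimizing MSE of w*pred_a + (1-w)*pred_b against target.

    A plain grid search stands in here for the genetic algorithm used in the
    study; with two models and weights constrained to sum to one, the problem
    is one-dimensional.
    """
    ws = np.arange(0.0, 1.0 + step, step)
    mse = np.array([np.mean((w * pred_a + (1 - w) * pred_b - target) ** 2) for w in ws])
    i = int(np.argmin(mse))
    return ws[i], float(mse[i])

# Synthetic placeholder predictions (e.g., log-derived permeability estimates, mD):
rng = np.random.default_rng(1)
core = rng.uniform(1, 500, 200)                       # "measured" core permeability
pred_anfis = core * (1 + 0.10 * rng.standard_normal(200))
pred_nn = core * (1 + 0.20 * rng.standard_normal(200))

w, mse = best_weights(pred_anfis, pred_nn, core)
combined = w * pred_anfis + (1 - w) * pred_nn
print(f"weight on ANFIS output: {w:.2f}, combined MSE: {mse:.1f}")
```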
EDITSPEC: System Manual. Volume IV. Data Handler.
1980-11-01
... prints and aborts, or returns without saying anything.
DKFBF (fill buffer routine): the BT entry at IBTAD is in D; gets block NBL of data set NSW in and waits for read completion.
DKFND: routine to locate block NBL, segment NSG, of data set NSW; searches the BTs first, then reads into core; returns IBTAD = the BT entry ... which is returned in NBL.
DKMIC: routine to search the in-core buffer tables for one with data set NOS, filename FILNM, and return the one with the most ...
Allen, Brett A; Hannon, James C; Burns, Ryan D; Williams, Skip M
2014-07-01
Trunk and core muscular development has been advocated to increase athletic performance and for maintenance of musculoskeletal health, especially related to the prevention of low back pain (LBP). The purpose of this study was to examine the effects of a simple core conditioning routine on tests of trunk and core muscular endurance in school-aged children. Participants included 164 students (86 girls, 78 boys; mean age, 11.5 ± 2.5 years) recruited from a grade school in a metropolitan area located in the southwestern United States. Students performed an equipment-free, moderate-to-high intensity, dynamic core conditioning warm-up routine once a week for a period of 6 weeks during the start of their physical education classes. The intervention consisted of 10 different dynamic core conditioning exercises performed for 30 seconds per exercise, totaling 5 minutes per session. Pre- and post-assessments of muscular endurance consisted of 5 different trunk and core muscular endurance tests: Parallel Roman Chair Dynamic Back Extension, Prone Plank, Lateral Plank, Dynamic Curl-Up, and Static Curl-Up. A generalized estimating equation was used to analyze differences in pre- and post-intervention muscular fitness assessments controlling for gender and grade level. Analysis of the data revealed significant increases in muscular fitness test performance for each of the 5 measured outcomes (p < 0.001). Because risk factors of LBP are thought to commence during childhood, results of this study suggest that it may be desirable for children and adolescents to perform moderate-to-high intensity dynamic core exercises during physical education warm-up to improve trunk and core muscular endurance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D.G.; Eubanks, L.
1998-03-01
This software assists the engineering designer in characterizing the statistical uncertainty in the performance of complex systems as a result of variations in manufacturing processes, material properties, system geometry or operating environment. The software is composed of a graphical user interface that provides the user with easy access to Cassandra uncertainty analysis routines. Together, this interface and the Cassandra routines are referred to as CRAX (CassandRA eXoskeleton). The software is flexible enough that, with minor modification, it is able to interface with large modeling and analysis codes such as heat transfer or finite element analysis software. The current version permits the user to manually input a performance function, the number of random variables and their associated statistical characteristics: density function, mean, and coefficients of variation. Additional uncertainty analysis modules are continuously being added to the Cassandra core.
GaiaGrid : Its Implications and Implementation
NASA Astrophysics Data System (ADS)
Ansari, S. G.; Lammers, U.; Ter Linden, M.
2005-12-01
Gaia is an ESA space mission to determine positions of 1 billion objects in the Galaxy at micro-arcsecond precision. The data analysis and processing requirements of the mission involve about 20 institutes across Europe, each providing specific algorithms for specific tasks, ranging from relativistic effects on positional determination, classification, and astrometric binary star detection to photometric and spectroscopic analysis. In an initial phase, a study has been ongoing over the past three years to determine the complexity of Gaia's data processing. Two processing categories have materialised: core and shell. While core deals with routine data processing, shell tasks are algorithms to carry out data analysis, which involves the Gaia Community at large. For this latter category, we are currently experimenting with the use of Grid paradigms to allow access to the core data and to augment processing power to simulate and analyse the data in preparation for the actual mission. We present preliminary results and discuss the sociological impact of distributing the tasks amongst the community.
Three-dimensional pin-to-pin analyses of VVER-440 cores by the MOBY-DICK code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehmann, M.; Mikolas, P.
1994-12-31
Nuclear design for the VVER-440 units of the Dukovany (EDU) nuclear power plant is routinely performed with the MOBY-DICK system. After its implementation on Hewlett Packard series 700 workstations, the system is able to routinely perform three-dimensional pin-to-pin core analyses. For purposes of code validation, a benchmark prepared from EDU operational data was solved.
The Alzheimer's Disease Neuroimaging Initiative 2 PET Core: 2015.
Jagust, William J; Landau, Susan M; Koeppe, Robert A; Reiman, Eric M; Chen, Kewei; Mathis, Chester A; Price, Julie C; Foster, Norman L; Wang, Angela Y
2015-07-01
This article reviews the work done in the Alzheimer's Disease Neuroimaging Initiative positron emission tomography (ADNI PET) core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. The PET Core has used [(18)F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of [(18)F]fluorodeoxyglucose (FDG)-PET in clinical trials, and relationships between different biomarkers and cognition. Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. The PET Core has demonstrated a variety of methods for the standardization of biomarkers such as florbetapir PET in a multicenter setting. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
2013-01-01
Background: The primary strategy to interrupt transmission of wild poliovirus in India is to improve supplemental immunization activities (SIAs) and routine immunization coverage in priority districts. The CORE Group, part of the Social Mobilization Network (SM Net), has been successful in improving SIA coverage in high-risk areas of Uttar Pradesh (UP). The SM Net works through community-level mobilisers (from the CORE Group and UNICEF) and covers more than 2 million children under the age of five. In this paper, we examine why the CORE Group has been successful by exploring which of its social mobilization activities predicted better SIA performance. Methods: We carried out a secondary data analysis of routine monitoring information collected by the CORE Group and the Government of India for SIAs. These data included information about vaccination outcomes of SIAs in CORE Group areas and non-CORE Group areas within the districts where the CORE Group operates, along with information about the number of various social mobilization activities carried out for each SIA. We employed Generalized Linear Latent and Mixed Model (GLLAMM) statistical analysis methods to identify which social mobilization activities predicted SIA performance, and to account for the intra-class correlation (ICC) between multiple observations within the same geographic areas over time. Results: The number of mosque announcements carried out was the most consistent determinant of improved SIA performance across various performance measures. The number of Bullawa Tollies carried out also appeared to be an important determinant of improved SIA performance. The number of times other social mobilization activities were carried out did not appear to determine better SIA performance. Conclusions: Social mobilization activities can improve the performance of mass vaccination campaigns. In the CORE Group areas, the number of mosque announcements and Bullawa Tollies carried out were important determinants of desired SIA outcomes. The CORE Group and SM Net should conduct sufficient numbers of these activities in support of each SIA. It is likely, however, that the quality of social mobilization activities (not studied here) is as important as, or more important than, the quantity of activities; quality measures of social mobilization activities should be investigated in the future as to how they determine vaccination performance. PMID:23327427
Ferreira, Daniel; Perestelo-Pérez, Lilisbeth; Westman, Eric; Wahlund, Lars-Olof; Sarría, Antonio; Serrano-Aguilar, Pedro
2014-01-01
Background: Current research criteria for Alzheimer’s disease (AD) include cerebrospinal fluid (CSF) biomarkers in the diagnostic algorithm. However, spreading their use to the clinical routine is still questionable. Objective: To provide an updated, systematic and critical review of the diagnostic utility of the CSF core biomarkers for AD. Data sources: MEDLINE, PreMedline, EMBASE, PsycInfo, CINAHL, Cochrane Library, and CRD. Eligibility criteria: (1a) systematic reviews with meta-analysis; (1b) primary studies published after the new revised diagnostic criteria; (2) evaluation of the diagnostic performance of at least one CSF core biomarker. Results: The diagnostic performance of CSF biomarkers is generally satisfactory. They are optimal for discriminating AD patients from healthy controls. Their combination may also be suitable for mild cognitive impairment (MCI) prognosis. However, CSF biomarkers fail to distinguish AD from other forms of dementia. Limitations: (1) use of clinical diagnosis as the standard instead of pathological postmortem confirmation; (2) variability of methodological aspects; (3) insufficiently long follow-up periods in MCI studies; and (4) lower diagnostic accuracy in primary care compared with memory clinics. Conclusion: Additional work needs to be done to validate the application of CSF core biomarkers as proposed in the new revised diagnostic criteria. The use of CSF core biomarkers in clinical routine is more likely if these limitations are overcome. Early diagnosis will be of utmost importance once effective pharmacological treatment becomes available, and the CSF core biomarkers can also be implemented in clinical trials for drug development. PMID:24715863
Mahesh, S; Lekha, V; Manipadam, John Mathew; Venugopal, A; Ramesh, H
2016-11-01
The aim of this study was to analyze the outcomes of patients with chronic pancreatitis who underwent the Frey procedure and who had histologic evidence of adenocarcinoma in the cored-out specimen. The analysis was retrospective. Out of 523 patients who underwent the Frey procedure for chronic pancreatitis, seven (five males and two females; age range 42 to 54 years) had histologically proven adenocarcinoma. In the first four cases, intraoperative frozen section was not done; the diagnosis was made on routine histopathology, and only one of the four could undergo attempted curative therapy. In the remaining three cases, intraoperative frozen section confirmation was available, and curative resection was performed. Only four of the seven had a clear-cut mass lesion. We conclude that (a) cancer can occur in chronic pancreatitis in the absence of a mass lesion and (b) intraoperative frozen section of the cored specimen is crucial to exercising curative therapeutic options and must be performed routinely. If frozen section is reported as adenocarcinoma, a head resection with repeat frozen section of the margins of resection is appropriate. If the adenocarcinoma is reported on regular histopathology after several days, then a total pancreatectomy may be more appropriate.
Core strengthening and synchronized swimming: TRX® suspension training in young female athletes.
Tinto, Amalia; Campanella, Marta; Fasano, Milena
2017-06-01
Developing muscle strength and full-body stability is essential for the efficient execution of technical moves in synchronized swimming. However, many swimmers find it difficult to control body stability while executing particular figures in water. We evaluated the effects of TRX® suspension training (2 sessions weekly for 6 months) on core strength and core stability in young female athletes. Twenty synchronized swimmers (Beginners A category, mean age 10±1 years) were divided into an experimental group (EG; N.=10 athletes) and a control group (CG; N.=10 athletes). EG received suspension training twice weekly (each session lasting about 15 min) as dryland exercise for 6 months in addition to routine training. CG completed routine training with conventional dryland exercises. Before (T1) and after (T2) completion of the study, oblique and transversus abdominis muscle force was measured using a Stabilizer Pressure Biofeedback unit, in prone and supine positions, and isotonic muscle endurance was evaluated with the McGill Test. Non-parametric statistical analysis showed a significant increase (P<0.0001) in the majority of the parameters in the experimental group. The study results provide evidence for the benefit of integrating TRX® suspension training into dryland exercises for muscle strengthening in young athletes practicing synchronized swimming, and more generally reiterate the importance of strengthening the core area to ensure stability and specific adaptations, improve the quality of movement and protect against injury.
Core-core and core-valence correlation energy atomic and molecular benchmarks for Li through Ar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ranasinghe, Duminda S.; Frisch, Michael J.; Petersson, George A., E-mail: gpetersson@wesleyan.edu
2015-12-07
We have established benchmark core-core, core-valence, and valence-valence absolute coupled-cluster singles and doubles with perturbative triples [CCSD(T)] correlation energies (±0.1%) for 210 species covering the first and second rows of the periodic table. These species provide 194 energy differences (±0.03 mE_h), including ionization potentials, electron affinities, and total atomization energies. These results can be used for calibration of less expensive methodologies for practical routine determination of core-core and core-valence correlation energies.
Akhtar, Muhammad Waseem; Karimi, Hossein; Gilani, Syed Amir
2017-01-01
Low back pain is a frequent problem faced by the majority of people at some point in their lifetime. Exercise therapy has been advocated as an effective treatment for chronic low back pain. However, there is a lack of consensus on the best exercise treatment, and numerous studies are underway. Conclusive studies are lacking, especially in this part of the world. This study was designed to compare the effectiveness of specific stabilization exercises with routine physical therapy exercises provided to patients with nonspecific chronic mechanical low back pain. This single-blinded randomized controlled trial was conducted at the department of physical therapy, Orthopedic and Spine Institute, Johar Town, Lahore, in which 120 subjects with nonspecific chronic low back pain participated. Subjects between 20 and 60 years of age with a primary complaint of chronic low back pain were recruited after giving informed consent. Participants were randomly assigned to two treatment groups, A and B, which were treated with core stabilization exercise and routine physical therapy exercise, respectively. TENS and ultrasound were given as therapeutic modalities to both treatment groups. Outcomes were recorded using the Visual Analogue Scale (VAS) before treatment and at the 2nd, 4th and 6th week after treatment. The results of this study illustrate that the clinical and therapeutic effects of the core stabilization exercise program over the period of six weeks are more effective in terms of reduction in pain compared with routine physical therapy exercise of similar duration. This study found significant reductions in pain across the two groups at the 2nd, 4th and 6th week of treatment, with p values less than 0.05. There was a mean reduction of 3.08 and 1.71 on the VAS in the core stabilization group and the routine physical therapy exercise group, respectively. Core stabilization exercise is more effective than routine physical therapy exercise in terms of greater reduction in pain in patients with non-specific low back pain.
Núñez, Eutimio Gustavo Fernández; Faintuch, Bluma Linkowski; Teodoro, Rodrigo; Wiecek, Danielle Pereira; da Silva, Natanael Gomes; Papadopoulos, Minas; Pelecanou, Maria; Pirmettis, Ioannis; de Oliveira Filho, Renato Santos; Duatti, Adriano; Pasqualini, Roberto
2011-04-01
The objective of this study was the development of a statistical approach for radiolabeling optimization of cysteine-dextran conjugates with the Tc-99m tricarbonyl core. This strategy has been applied to the labeling of 2-propylene-S-cysteine-dextran in the attempt to prepare a new class of tracers for sentinel lymph node detection, and can be extended to other radiopharmaceuticals for different targets. The statistical routine was based on a three-level factorial design. Best labeling conditions were achieved. The specific activity reached was 5 MBq/μg. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
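For illustration, a three-level full factorial design such as the one mentioned above can be enumerated directly; the factors and levels below (conjugate amount, pH, heating time) are hypothetical stand-ins, not the labeling conditions studied in the paper.

```python
from itertools import product

# Hypothetical factors and levels for a 3^3 full factorial design (27 runs).
factors = {
    "conjugate_ug": [5, 10, 20],    # amount of cysteine-dextran conjugate (assumed)
    "pH": [6.0, 7.0, 8.0],
    "heating_min": [10, 20, 30],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} runs in the 3^{len(factors)} design")
for run in runs[:3]:
    print(run)
```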
Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messer, Bronson; Sewell, Christopher; Heitmann, Katrin
2015-01-01
Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.
Who Can You Turn to? Tie Activation within Core Business Discussion Networks
ERIC Educational Resources Information Center
Renzulli, Linda A.; Aldrich, Howard
2005-01-01
We examine the connection between personal network characteristics and the activation of ties for access to resources during routine times. We focus on factors affecting business owners' use of their core network ties to obtain legal, loan, financial and expert advice. Owners rely more on core business ties when their core networks contain a high…
2011-01-01
Background: The primary strategy to interrupt transmission of wild poliovirus in India is to improve supplemental immunization activities and routine immunization coverage in priority districts, with a focus on 107 high-risk blocks of western Uttar Pradesh and central Bihar. Villages or urban areas with a history of wild poliovirus transmission, or hard-to-reach or resistant populations, are categorized as high-risk areas within blocks. The Social Mobilization Network (SM Net) was formed in Uttar Pradesh in 2003 to support polio eradication efforts through improved planning, implementation and monitoring of social mobilization activities in those high-risk areas. In this paper, we examine the vaccination outcomes in districts of SM Net where the CORE Group works. Methods: We carried out a secondary data analysis of routine monitoring information collected by the SM Net and the Government of India. These data include information about vaccination outcomes in SM Net areas and non-SM Net areas within the districts where the CORE Group operates. Statistical analysis was used to compare, between SM Net and non-SM Net areas, vaccination outcomes considered sensitive to the social mobilization efforts of the SM Net. We employed the Generalized Estimating Equations (GEE) statistical method to account for intra-cluster correlation (ICC), and used the 'quasi-likelihood under the independence model criterion (QIC)' as the model selection method. Results: Vaccination outcomes in SM Net areas were as high as or higher than in non-SM Net areas. There was considerable variation in vaccination outcomes between districts. Conclusions: While not conclusive, the results suggest that the social mobilization efforts of the SM Net and the CORE Group are helping to increase vaccination levels in high-risk areas of Uttar Pradesh. Vaccination outcomes in CORE Group areas were equal to or higher than in non-CORE, non-SM Net areas. This occurred even though SM Net areas are those with more community resistance to polio vaccination and/or have harder-to-reach populations than non-SM Net areas. Other likely explanations for the relatively good vaccination performance in SM Net areas are not apparent. PMID:21569256
Palmieri, Gaspare; Evans, Chris; Hansen, Vidje; Brancaleoni, Greta; Ferrari, Silvia; Porcelli, Piero; Reitano, Francesco; Rigatelli, Marco
2009-01-01
The Clinical Outcomes in Routine Evaluation--Outcome Measure (CORE-OM) was translated into Italian and tested in non-clinical (n = 263) and clinical (n = 647) samples. The translation showed good acceptability, internal consistency and convergent validity in both samples. There were large and statistically significant differences between clinical and non-clinical datasets on all scores. The reliable change criteria were similar to those for the UK referential data. Some of the clinically significant change criteria, particularly for the men, were moderately different from the UK cutting points. The Italian version of the CORE-OM showed respectable psychometric parameters. However, it seemed plausible that non-clinical and clinical distributions of self-report scores on psychopathology and functioning measures may differ by language and culture. A good-quality Italian translation of the CORE-OM, and hence of the GP-CORE, CORE-10 and CORE-5 measures, is now available for use by practitioners and anyone surveying or exploring general psychological state. The measures can be obtained from CORE-IMS, and practitioners are encouraged to share anonymised data so that good clinical and non-clinical referential databases can be established for Italy.
Brenner, Stephan; De Allegri, Manuela; Gabrysch, Sabine; Chinkhumba, Jobiba; Sarker, Malabika; Muula, Adamson S
2015-01-01
A variety of clinical process indicators exists to measure the quality of care provided by maternal and neonatal health (MNH) programs. To allow comparison across MNH programs in low- and middle-income countries (LMICs), a core set of essential process indicators is needed. Although such a core set is available for emergency obstetric care (EmOC), the 'EmOC signal functions', a similar approach is currently missing for MNH routine care evaluation. We describe a strategy for identifying core process indicators for routine care and illustrate their usefulness in a field example. We first developed an indicator selection strategy by combining epidemiological and programmatic aspects relevant to MNH in LMICs. We then identified routine care process indicators meeting our selection criteria by reviewing existing quality of care assessment protocols. We grouped these indicators into three categories based on their main function in addressing risk factors of maternal or neonatal complications. We then tested this indicator set in a study assessing MNH quality of clinical care in 33 health facilities in Malawi. Our strategy identified 51 routine care processes: 23 related to initial patient risk assessment, 17 to risk monitoring, 11 to risk prevention. During the clinical performance assessment a total of 82 cases were observed. Birth attendants' adherence to clinical standards was lowest in relation to risk monitoring processes. In relation to major complications, routine care processes addressing fetal and newborn distress were performed relatively consistently, but there were major gaps in the performance of routine care processes addressing bleeding, infection, and pre-eclampsia risks. The identified set of process indicators could identify major gaps in the quality of obstetric and neonatal care provided during the intra- and immediate postpartum period. We hope our suggested indicators for essential routine care processes will contribute to streamlining MNH program evaluations in LMICs.
Brenner, Stephan; De Allegri, Manuela; Gabrysch, Sabine; Chinkhumba, Jobiba; Sarker, Malabika; Muula, Adamson S.
2015-01-01
Background: A variety of clinical process indicators exists to measure the quality of care provided by maternal and neonatal health (MNH) programs. To allow comparison across MNH programs in low- and middle-income countries (LMICs), a core set of essential process indicators is needed. Although such a core set is available for emergency obstetric care (EmOC), the ‘EmOC signal functions’, a similar approach is currently missing for MNH routine care evaluation. We describe a strategy for identifying core process indicators for routine care and illustrate their usefulness in a field example. Methods: We first developed an indicator selection strategy by combining epidemiological and programmatic aspects relevant to MNH in LMICs. We then identified routine care process indicators meeting our selection criteria by reviewing existing quality of care assessment protocols. We grouped these indicators into three categories based on their main function in addressing risk factors of maternal or neonatal complications. We then tested this indicator set in a study assessing MNH quality of clinical care in 33 health facilities in Malawi. Results: Our strategy identified 51 routine care processes: 23 related to initial patient risk assessment, 17 to risk monitoring, 11 to risk prevention. During the clinical performance assessment a total of 82 cases were observed. Birth attendants’ adherence to clinical standards was lowest in relation to risk monitoring processes. In relation to major complications, routine care processes addressing fetal and newborn distress were performed relatively consistently, but there were major gaps in the performance of routine care processes addressing bleeding, infection, and pre-eclampsia risks. Conclusion: The identified set of process indicators could identify major gaps in the quality of obstetric and neonatal care provided during the intra- and immediate postpartum period. We hope our suggested indicators for essential routine care processes will contribute to streamlining MNH program evaluations in LMICs. PMID:25875252
NASA Astrophysics Data System (ADS)
Reilly, B. T.; Stoner, J. S.; Wiest, J.
2017-08-01
Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down core profiles. These quantitative data are generated through the attenuation of X-rays, which are sensitive to sediment density and atomic number, and are stored in pixels as relative gray scale values or Hounsfield units (HU). We present a suite of MATLAB™ tools specifically designed for routine sediment core analysis as a means to standardize and better quantify the products of CT data collected on medical CT scanners. SedCT uses a graphical interface to process Digital Imaging and Communications in Medicine (DICOM) files, stitch overlapping scanned intervals, and create down core HU profiles in a manner robust to normal coring imperfections. Utilizing a random sampling technique, SedCT reduces data size and allows for quick processing on typical laptop computers. SedCTimage uses a graphical interface to create quality tiff files of CT slices that are scaled to a user-defined HU range, preserving the quantitative nature of CT images and easily allowing for comparison between sediment cores with different HU means and variance. These tools are presented along with examples from lacustrine and marine sediment cores to highlight the robustness and quantitative nature of this method.
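As a sketch of the kind of processing SedCT performs (here in Python rather than MATLAB), the snippet below reads a stack of DICOM slices with pydicom, converts stored pixel values to Hounsfield units using each slice's rescale slope and intercept, and averages a central region of interest into a down-core HU profile. The directory layout, ROI size, slice ordering, and slice spacing are assumptions; stitching of overlapping scan intervals, which SedCT handles, is omitted.

```python
import glob
import numpy as np
import pydicom

def downcore_hu_profile(dicom_dir, half_width=20):
    """Mean HU of a central square ROI for each slice, ordered along the core axis.

    Assumes one DICOM file per slice and that slice order along the core is
    given by InstanceNumber; overlapping scan intervals are not stitched here.
    """
    files = sorted(
        glob.glob(f"{dicom_dir}/*.dcm"),
        key=lambda f: int(pydicom.dcmread(f, stop_before_pixels=True).InstanceNumber),
    )
    profile = []
    for f in files:
        ds = pydicom.dcmread(f)
        # Convert stored values to Hounsfield units.
        hu = ds.pixel_array * float(ds.RescaleSlope) + float(ds.RescaleIntercept)
        cy, cx = (s // 2 for s in hu.shape)
        roi = hu[cy - half_width:cy + half_width, cx - half_width:cx + half_width]
        profile.append(roi.mean())
    return np.asarray(profile)

# Example usage (hypothetical path and slice spacing):
# hu_profile = downcore_hu_profile("core_scan/section_01")
# depth_cm = np.arange(hu_profile.size) * 0.0625   # assumed 0.625 mm slice spacing
```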
From play to problem solving to Common Core: The development of fluid reasoning.
Prince, Pauline
2017-01-01
How and when does fluid reasoning develop and what does it look like at different ages, from a neurodevelopmental and functional perspective? The goal of this article is to discuss the development of fluid reasoning from a practical perspective of our children's lives: from play to problem solving to Common Core Curriculum. A review of relevant and current literature supports a connection between movement, including movement through free play, and the development of novel problem solving. As our children grow and develop, motor routines can become cognitive routines and can be evidenced not only in games, such as chess, but also in the acquisition and demonstration of academic skills. Finally, this article describes the connection between novel problem solving and the demands of the Common Core Curriculum.
Barkham, M; Margison, F; Leach, C; Lucock, M; Mellor-Clark, J; Evans, C; Benson, L; Connell, J; Audin, K; McGrath, G
2001-04-01
To complement the evidence-based practice paradigm, the authors argued for a core outcome measure to provide practice-based evidence for the psychological therapies. Utility requires instruments that are acceptable scientifically, as well as to service users, and a coordinated implementation of the measure at a national level. The development of the Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM) is summarized. Data are presented across 39 secondary-care services (n = 2,710) and within an intensively evaluated single service (n = 1,455). Results suggest that the CORE-OM is a valid and reliable measure for multiple settings and is acceptable to users and clinicians as well as policy makers. Baseline data levels of patient presenting problem severity, including risk, are reported in addition to outcome benchmarks that use the concept of reliable and clinically significant change. Basic quality improvement in outcomes for a single service is considered.
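The reliable and clinically significant change benchmarks mentioned above follow the Jacobson-Truax approach; the sketch below computes a reliable change criterion and a clinical/non-clinical cutoff from summary statistics. The numerical inputs are placeholders for illustration, not CORE-OM referential values.

```python
import math

def reliable_change_criterion(sd_pre, reliability):
    """Minimum pre-post difference exceeding measurement error at p < .05
    (Jacobson & Truax): 1.96 * SE_diff, with SE_diff = SD * sqrt(2) * sqrt(1 - r)."""
    return 1.96 * sd_pre * math.sqrt(2.0) * math.sqrt(1.0 - reliability)

def clinical_cutoff(mean_clin, sd_clin, mean_nonclin, sd_nonclin):
    """Jacobson-Truax criterion c: the point between the clinical and
    non-clinical score distributions, weighted by their standard deviations."""
    return (sd_clin * mean_nonclin + sd_nonclin * mean_clin) / (sd_clin + sd_nonclin)

# Placeholder summary statistics (illustrative only, not CORE-OM norms):
rci = reliable_change_criterion(sd_pre=0.75, reliability=0.94)
cutoff = clinical_cutoff(mean_clin=1.86, sd_clin=0.75, mean_nonclin=0.76, sd_nonclin=0.59)
print(f"reliable change if |pre - post| >= {rci:.2f}; clinical cutoff = {cutoff:.2f}")
```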
Genotoxic Changes to Rodent Cells Exposed in Vitro to Tungsten, Nickel, Cobalt and Iron
2014-03-10
... 37 °C in a humidified atmosphere of 5% CO2 in air. Cultures were routinely fed every 3-4 days and passaged weekly. Experiments were conducted ... Samples were stored at −80 °C until needed. Microarray analysis: RNA samples were sent to the Genomics Core of the Lerner Research Institute at the ...
Coenen, Michaela; Rudolf, Klaus-Dieter; Kus, Sandra; Dereskewitz, Caroline
2018-05-24
The International Classification of Functioning, Disability and Health (ICF) provides a standardized language of almost 1500 ICF categories for coding information about functioning and contextual factors. Short lists (ICF Core Sets) are helpful tools to support the implementation of the ICF in clinical routine. In this paper we report on the implementation of ICF Core Sets in clinical routine, using the "ICF Core Sets for Hand Conditions" and the "Lighthouse Project Hand" as an example. Based on the ICF categories of the "Brief ICF Core Set for Hand Conditions", the ICF-based assessment tool (ICF HandA) was developed, aiming to guide the assessment and treatment of patients with injuries and diseases of the hand. The ICF HandA facilitates the standardized assessment of functioning, taking into consideration a holistic view of the patient, along the continuum of care ranging from acute care to rehabilitation and return to work. Reference points for the assessment of the ICF HandA are determined in treatment guidelines for selected injuries and diseases of the hand, along with recommendations for acute treatment and care, and procedures and interventions of subsequent treatment and rehabilitation. The assessment of the ICF HandA according to the defined reference points can be done using electronic clinical assessment tools and allows for automatic generation of a timely medical report of a patient's functioning. In the future, the ICF HandA can be used to inform the coding of functioning in ICD-11.
ERIC Educational Resources Information Center
Spillane, James P.; Parise, Leigh Mesler; Sherer, Jennifer Zoltners
2011-01-01
The institutional environment of America's schools has changed substantially as government regulation has focused increasingly on the core technical work of schools--instruction. The authors explore the school administrative response to this changing environment, describing how government regulation becomes embodied in the formal structure of four…
A Psychometric Evaluation of the Core Bereavement Items
ERIC Educational Resources Information Center
Holland, Jason M.; Nam, Ilsung; Neimeyer, Robert A.
2013-01-01
Despite being a routinely administered assessment of grieving, few studies have empirically examined the psychometric properties of the Core Bereavement Items (CBI). The present study investigated the factor structure, internal reliability, and concurrent validity of the CBI in a large, diverse sample of bereaved young adults (N = 1,366).…
NASA Technical Reports Server (NTRS)
Gander, Philippa H.; Connell, Linda J.; Graeber, R. Curtis
1986-01-01
Experiments were conducted to estimate the magnitude of the masking effect produced in humans by alternate periods of physical activity and rest or sleep on the circadian rhythms of heart rate and core temperature. The heart rate, rectal temperature, and nondominant wrist activity were monitored in 12 male subjects during 6 days of normal routine at home and during 6 days of controlled bed-rest regimen. The comparisons of averaged waveforms for the activity, heart rate, and temperature indicated that about 45 percent of the range of the circadian heart rate rhythm during normal routine and about 14 percent of the range of the circadian temperature rhythm were attributable to the effects of activity. The smaller effect of activity on the temperature rhythm may be partially attributable to the fact that core temperature is being more rigorously conserved than heart rate, at least during moderate exercise.
Gergov, Vera; Lahti, Jari; Marttunen, Mauri; Lipsanen, Jari; Evans, Chris; Ranta, Klaus; Laitila, Aarno; Lindberg, Nina
2017-05-01
An increasing need exists for suitable measures to evaluate treatment outcome in adolescents. The YP-CORE is a pan-theoretical brief questionnaire developed for this purpose, but it lacks studies in different cultures and languages. The aim was to explore the acceptability, factor structure, reliability, validity, and sensitivity to change of the Finnish translation of the YP-CORE. The study was conducted at the Department of Adolescent Psychiatry, Helsinki University Central Hospital. A Finnish translation was prepared by a team of professionals and adolescents. A clinical sample of 104 patients was asked to complete the form together with the BDI-21 and BAI, and 92 of them filled in the forms again after 3 months of treatment. Analysis included acceptability, confirmatory factor analysis (CFA), internal and test-retest reliability, concurrent validity, influence of gender and age, and criteria for reliable change. The YP-CORE was well accepted, and the rate of missing values was low. Internal consistency (α = 0.83-0.92) and test-retest reliability (r = 0.69) were good, and the results of the CFA supported a one-factor model. The YP-CORE showed good concurrent validity against two widely used symptom-specific measures (r = 0.62-0.87). Gender had a moderately strong effect on the scores (d = 0.67), but the effect of age was not as evident. The measure was sensitive to change, showing a larger effect size (d = 0.55) than the BDI-21 and BAI (d = 0.31-0.50). The results show that the translation of the YP-CORE into Finnish has been successful, the YP-CORE has good psychometric properties, and the measure could be taken into wider use in clinical settings for outcome measurement in adolescents.
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
A statistical model of the human core-temperature circadian rhythm
NASA Technical Reports Server (NTRS)
Brown, E. N.; Choe, Y.; Luithardt, H.; Czeisler, C. A.
2000-01-01
We formulate a statistical model of the human core-temperature circadian rhythm in which the circadian signal is modeled as a van der Pol oscillator, the thermoregulatory response is represented as a first-order autoregressive process, and the evoked effect of activity is modeled with a function specific for each circadian protocol. The new model directly links differential equation-based simulation models and harmonic regression analysis methods and permits statistical analysis of both static and dynamical properties of the circadian pacemaker from experimental data. We estimate the model parameters by using numerically efficient maximum likelihood algorithms and analyze human core-temperature data from forced desynchrony, free-run, and constant-routine protocols. By representing explicitly the dynamical effects of ambient light input to the human circadian pacemaker, the new model can estimate with high precision the correct intrinsic period of this oscillator ( approximately 24 h) from both free-run and forced desynchrony studies. Although the van der Pol model approximates well the dynamical features of the circadian pacemaker, the optimal dynamical model of the human biological clock may have a harmonic structure different from that of the van der Pol oscillator.
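A minimal simulation sketch of the model structure described above: a van der Pol oscillator stands in for the circadian signal and a first-order autoregressive process for the thermoregulatory response. The parameter values (stiffness, noise level, amplitude scaling, mean temperature) are illustrative assumptions, and the evoked activity effect and light input are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

def van_der_pol(t, y, mu=0.15, period_h=24.2):
    """van der Pol oscillator; for small mu the limit-cycle period is ~period_h."""
    w = 2.0 * np.pi / period_h
    x, v = y
    return [v, mu * w * (1.0 - x * x) * v - w * w * x]

dt = 0.1                                     # hours
t = np.arange(0.0, 120.0, dt)
sol = solve_ivp(van_der_pol, (t[0], t[-1]), [1.0, 0.0], t_eval=t, rtol=1e-8)

# First-order autoregressive (AR(1)) thermoregulatory noise: e[k] = phi*e[k-1] + w[k]
rng = np.random.default_rng(0)
phi, sigma = 0.9, 0.02
noise = np.zeros(t.size)
for k in range(1, t.size):
    noise[k] = phi * noise[k - 1] + sigma * rng.standard_normal()

core_temp = 37.0 + 0.35 * sol.y[0] + noise   # simulated core temperature, degrees C
print(core_temp[:5])
```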
ERIC Educational Resources Information Center
Marshall, Keith; Willoughby-Booth, Simon
2007-01-01
There are few reliable self-report measures suitable for people with a learning disability in reporting psychological distress. This study examines the modification of the Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM), exploring its reliability, using two different presentation styles. One style included a sequencing task then…
NASA Astrophysics Data System (ADS)
Syvorotka, Ihor I.; Pavlyk, Lyubomyr P.; Ubizskii, Sergii B.; Buryy, Oleg A.; Savytskyy, Hrygoriy V.; Mitina, Nataliya Y.; Zaichenko, Oleksandr S.
2017-04-01
A method for determining the magnetic moment and size of superparamagnetic nanoparticles (SPNP) from measurements of the dependence of the nonlinear magnetic susceptibility on magnetic field is proposed, substantiated and tested for nanoparticles of the "magnetic core-polymer shell" type, which are widely used in biomedical technologies. A model of the induction response of the SPNP ensemble to the combined action of a harmonic magnetic excitation field and a permanent bias field is built, and the analysis of possible ways to determine the magnetic moment and size of the nanoparticles, as well as the parameters of the distribution of these variables, is performed. Experimental verification of the proposed method was carried out on samples of SPNP with a maghemite core, both in dry form and in colloidal systems. The results have been compared with data obtained by other methods. Advantages of the proposed method are analyzed and discussed, particularly in terms of its suitability for routine express testing of SPNP for biomedical technology.
Particle shape accounts for instrumental discrepancy in ice core dust size distributions
NASA Astrophysics Data System (ADS)
Folden Simonsen, Marius; Cremonesi, Llorenç; Baccolo, Giovanni; Bosch, Samuel; Delmonte, Barbara; Erhardt, Tobias; Kjær, Helle Astrid; Potenza, Marco; Svensson, Anders; Vallelonga, Paul
2018-05-01
The Klotz Abakus laser sensor and the Coulter counter are both used for measuring the size distribution of insoluble mineral dust particles in ice cores. While the Coulter counter measures particle volume accurately, the equivalent Abakus instrument measurement deviates substantially from the Coulter counter. We show that the difference between the Abakus and the Coulter counter measurements is mainly caused by the irregular shape of dust particles in ice core samples. The irregular shape means that a new calibration routine based on standard spheres is necessary for obtaining fully comparable data. This new calibration routine gives an increased accuracy to Abakus measurements, which may improve future ice core record intercomparisons. We derived an analytical model for extracting the aspect ratio of dust particles from the difference between Abakus and Coulter counter data. For verification, we measured the aspect ratio of the same samples directly using a single-particle extinction and scattering instrument. The results demonstrate that the model is accurate enough to discern between samples of aspect ratio 0.3 and 0.4 using only the comparison of Abakus and Coulter counter data.
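A minimal sketch of the sphere-based calibration idea described above: Abakus-reported sizes for sphere standards of known diameter define a mapping that is then applied to sample data by interpolation. The calibration values are invented for illustration, and the particle-shape correction derived in the paper is not included.

```python
import numpy as np

# Hypothetical calibration points: nominal sphere diameter (um) vs. the mean size
# the Abakus reports for that standard (um). Values are illustrative only.
sphere_true = np.array([1.0, 2.0, 5.0, 10.0, 15.0])
abakus_reported = np.array([1.2, 2.3, 5.6, 10.9, 16.1])

def calibrate(abakus_sizes):
    """Map Abakus-reported sizes onto sphere-equivalent diameters by linear
    interpolation between calibration points (values outside the range are clamped)."""
    return np.interp(abakus_sizes, abakus_reported, sphere_true)

sample = np.array([1.5, 3.0, 8.0, 12.0])     # um, as reported by the Abakus
print(calibrate(sample))
```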
OSCAR4: a flexible architecture for chemical text-mining.
Jessop, David M; Adams, Sam E; Willighagen, Egon L; Hawizy, Lezan; Murray-Rust, Peter
2011-10-14
The Open-Source Chemistry Analysis Routines (OSCAR) software, a toolkit for the recognition of named entities and data in chemistry publications, has been developed since 2002. Recent work has resulted in the separation of the core OSCAR functionality and its release as the OSCAR4 library. This library features a modular API (based on reduction of surface coupling) that permits client programmers to easily incorporate it into external applications. OSCAR4 offers a domain-independent architecture upon which chemistry specific text-mining tools can be built, and its development and usage are discussed.
Developing, implementing and disseminating a core outcome set for neonatal medicine.
Webbe, James; Brunton, Ginny; Ali, Shohaib; Duffy, James Mn; Modi, Neena; Gale, Chris
2017-01-01
In high resource settings, 1 in 10 newborn babies require admission to a neonatal unit. Research evaluating neonatal care involves recording and reporting many different outcomes and outcome measures. Such variation limits the usefulness of research as studies cannot be compared or combined. To address these limitations, we aim to develop, disseminate and implement a core outcome set for neonatal medicine. A steering group that includes parents and former patients, healthcare professionals and researchers has been formed to guide the development of the core outcome set. We will review neonatal trials systematically to identify previously reported outcomes. Additionally, we will specifically identify outcomes of importance to parents, former patients and healthcare professionals through a systematic review of qualitative studies. Outcomes identified will be entered into an international, multi-perspective eDelphi survey. All key stakeholders will be invited to participate. The Delphi method will encourage individual and group stakeholder consensus to identify a core outcome set. The core outcome set will be mapped to existing, routinely recorded data where these exist. Use of a core set will ensure outcomes of importance to key stakeholders, including former patients and parents, are recorded and reported in a standard fashion in future research. Embedding the core outcome set within future clinical studies will extend the usefulness of research to inform practice, enhance patient care and ultimately improve outcomes. Using routinely recorded electronic data will facilitate implementation with minimal additional burden. Core Outcome Measures in Effectiveness Trials (COMET) database: 842 (www.comet-initiative.org/studies/details/842).
ERIC Educational Resources Information Center
Cahill, Jane; Barkham, Michael; Stiles, William B.; Twigg, Elspeth; Hardy, Gillian E.; Rees, Anne; Evans, Chris
2006-01-01
Clients (N = 77) undergoing cognitive therapy for depression were assessed before treatment with the Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM), which encompasses domains of subjective well-being, problems, functioning, and risk of harming self or others, along with the Beck Depression Inventory-II (BDI-II), the Hamilton…
Yield of Routine Image-Guided Biopsy of Renal Mass Thermal Ablation Zones: 11-Year Experience.
Wasnik, Ashish P; Higgins, Ellen J; Fox, Giovanna A; Caoili, Elaine M; Davenport, Matthew S
2018-06-19
To determine the yield of routine image-guided core biopsy of renal cell carcinoma (RCC) thermal ablation zones. Institutional review board approval was obtained for this Health Insurance Portability and Accountability Act-compliant quality improvement effort. Routine core biopsy of RCC ablation zones was performed 2 months postablation from July 2003 to December 2014. Routine nicotinamide adenine dinucleotide staining was performed by specialized genitourinary pathologists to assess cell viability. The original purpose of performing routine postablation biopsy was to verify, in addition to imaging, whether the mass was completely treated. Imaging was stratified as negative, indeterminate, or positive for viable malignancy. Histology was stratified as negative, indeterminate, positive, or nondiagnostic for viable malignancy. Histology results were compared to prebiopsy imaging findings. Routine ablation zone biopsy was performed after 50% (146/292) of index ablations (24 cryoablations, 122 radiofrequency ablations), and postablation imaging was performed more often with multiphasic computed tomography than magnetic resonance imaging (100 vs 46, p < 0.0001). When imaging was negative (n = 117), biopsy added no additional information (92% [n = 108] negative, 0.9% [n = 1] indeterminate, 7% [n = 8] nondiagnostic). When imaging was indeterminate (n = 19), 11% (n = 2) of biopsies had viable RCC and 89% (n = 17) were negative. When imaging was positive, biopsy detected viable neoplasm in only 10% (1/10) of cases; 80% (8/10) were negative and 10% (1/10) were nondiagnostic. Routine biopsy of renal ablation zones to validate postablation imaging results was not value-added and therefore was discontinued at the study institution. Copyright © 2018. Published by Elsevier Inc.
Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas
2014-01-01
Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system with image and statistical analysis tools. Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed with the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Our solution enabled monitoring of IHC multi-tissue control staining by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, a less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools.
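The derived intensity variables defined above can be computed directly from per-core image-analysis output. The sketch below builds MeanBrownBlue and DiffBrownBlue exactly as stated and runs a two-factor analysis on them together with a tissue-area variable; the column names and the random placeholder data are assumptions and do not reproduce the study's LIS/SAS pipeline.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical per-core IA output: mean DAB (brown) and haematoxylin (blue)
# intensities plus detected tissue area for consecutive control slides.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "BrownIntensity": rng.normal(120, 8, 200),
    "BlueIntensity":  rng.normal(90, 10, 200),
    "TissueArea_px":  rng.normal(5.0e5, 6.0e4, 200),
})

# Derived variables as defined in the abstract
df["MeanBrownBlue"] = (df["BrownIntensity"] + df["BlueIntensity"]) / 2.0
df["DiffBrownBlue"] = df["BrownIntensity"] - df["BlueIntensity"]

# Two-factor model to separate absolute intensity from colour balance / sample size
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(df[["MeanBrownBlue", "DiffBrownBlue", "TissueArea_px"]])
print(pd.DataFrame(fa.components_,
                   columns=["MeanBrownBlue", "DiffBrownBlue", "TissueArea_px"],
                   index=["factor1", "factor2"]).round(2))
```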
Large Data at Small Universities: Astronomical processing using a computer classroom
NASA Astrophysics Data System (ADS)
Fuller, Nathaniel James; Clarkson, William I.; Fluharty, Bill; Belanger, Zach; Dage, Kristen
2016-06-01
The use of large computing clusters for astronomy research is becoming more commonplace as datasets expand, but access to these required resources is sometimes difficult for research groups working at smaller Universities. As an alternative to purchasing processing time on an off-site computing cluster, or purchasing dedicated hardware, we show how one can easily build a crude on-site cluster by utilizing idle cycles on instructional computers in computer-lab classrooms. Since these computers are maintained as part of the educational mission of the University, the resource impact on the investigator is generally low. By using open source Python routines, it is possible to have a large number of desktop computers working together via a local network to sort through large data sets. By running traditional analysis routines in an “embarrassingly parallel” manner, gains in speed are accomplished without requiring the investigator to learn how to write routines using highly specialized methodology. We demonstrate this concept here applied to (1) photometry of large-format images and (2) statistical significance tests for X-ray lightcurve analysis. In these scenarios, we see a speed-up factor which scales almost linearly with the number of cores in the cluster. Additionally, we show that the usage of the cluster does not severely limit performance for a local user, and indeed the processing can be performed while the computers are in use for classroom purposes.
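A minimal sketch of the embarrassingly parallel pattern described above, using Python's standard multiprocessing pool on a single multi-core machine rather than the networked classroom cluster; the analyze_image routine and the data directory are hypothetical stand-ins for an existing per-image analysis routine.

```python
from multiprocessing import Pool
import glob

def analyze_image(path):
    """Placeholder per-image analysis routine (e.g., aperture photometry).

    In the classroom-cluster setup each worker runs an existing, unmodified
    analysis routine on one file; nothing inside it needs to know about
    the parallelism.
    """
    # ... open the file, measure sources, return a small result record ...
    return {"file": path, "n_sources": 0}   # stub result

if __name__ == "__main__":
    files = sorted(glob.glob("images/*.fits"))      # hypothetical data directory
    with Pool(processes=8) as pool:                 # one worker per available core
        results = pool.map(analyze_image, files)    # embarrassingly parallel map
    print(f"processed {len(results)} frames")
```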
The State of Software for Evolutionary Biology
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-01-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
Unsupervised daily routine and activity discovery in smart homes.
Jie Yin; Qing Zhang; Karunanithi, Mohan
2015-08-01
The ability to accurately recognize daily activities of residents is a core premise of smart homes to assist with remote health monitoring. Most of the existing methods rely on a supervised model trained from a preselected and manually labeled set of activities, which are often time-consuming and costly to obtain in practice. In contrast, this paper presents an unsupervised method for discovering daily routines and activities for smart home residents. Our proposed method first uses a Markov chain to model a resident's locomotion patterns at different times of day and discover clusters of daily routines at the macro level. For each routine cluster, it then drills down to further discover room-level activities at the micro level. The automatic identification of daily routines and activities is useful for understanding indicators of functional decline of elderly people and suggesting timely interventions.
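A minimal sketch of the macro-level step, estimating a first-order Markov transition matrix from one day's room-level location events; the room list and the example sequence are hypothetical, and the routine clustering and micro-level activity discovery are only indicated in a comment.

```python
import numpy as np

ROOMS = ["bedroom", "bathroom", "kitchen", "lounge"]
IDX = {r: i for i, r in enumerate(ROOMS)}

def transition_matrix(room_sequence):
    """Row-normalised first-order Markov transition matrix estimated from one
    day's sequence of room-level location events."""
    counts = np.zeros((len(ROOMS), len(ROOMS)))
    for a, b in zip(room_sequence[:-1], room_sequence[1:]):
        counts[IDX[a], IDX[b]] += 1
    counts += 1e-6                                   # avoid empty rows
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical day: motion-sensor firings mapped to rooms
day = ["bedroom", "bathroom", "kitchen", "lounge", "kitchen", "lounge", "bedroom"]
P = transition_matrix(day)
print(np.round(P, 2))
# Days can then be clustered (e.g., k-means on the flattened matrices) to
# discover macro-level routines before drilling down to room-level activities.
```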
Common Graphics Library (CGL). Volume 2: Low-level user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Theophilos, Pauline M.
1989-01-01
The intent is to instruct the users of the Low-Level routines of the Common Graphics Library (CGL). The Low-Level routines form an application-independent graphics package enabling the user community to construct and design scientific charts conforming to the publication and/or viewgraph process. The Low-Level routines allow the user to design unique or unusual report-quality charts from a set of graphics utilities. The features of these routines can be used stand-alone or in conjunction with other packages to enhance or augment their capabilities. This library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine-independent, providing support for centralized and/or distributed computer systems.
AGAMA: Action-based galaxy modeling framework
NASA Astrophysics Data System (ADS)
Vasiliev, Eugene
2018-05-01
The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).
STARS: A general-purpose finite element computer program for analysis of engineering structures
NASA Technical Reports Server (NTRS)
Gupta, K. K.
1984-01-01
STARS (Structural Analysis Routines) is primarily an interactive, graphics-oriented, finite-element computer program for analyzing the static, stability, free vibration, and dynamic responses of damped and undamped structures, including rotating systems. The element library consists of one-dimensional (1-D) line elements, two-dimensional (2-D) triangular and quadrilateral shell elements, and three-dimensional (3-D) tetrahedral and hexahedral solid elements. These elements enable the solution of structural problems that include truss, beam, space frame, plane, plate, shell, and solid structures, or any combination thereof. Zero, finite, and interdependent deflection boundary conditions can be implemented by the program. The associated dynamic response analysis capability provides for initial deformation and velocity inputs, whereas the transient excitation may be either forces or accelerations. An effective in-core or out-of-core solution strategy is automatically employed by the program, depending on the size of the problem. Data input may be at random within a data set, and the program offers certain automatic data-generation features. Input data are formatted as an optimal combination of free and fixed formats. Interactive graphics capabilities enable convenient display of nodal deformations, mode shapes, and element stresses.
Carpenter, G.B.; Cardinell, A.P.; Francois, D.K.; Good, L.K.; Lewis, R.L.; Stiles, N.T.
1982-01-01
Analysis of high-resolution geophysical data collected over 540 blocks tentatively selected for leasing in proposed OCS Oil and Gas Lease Sale 52 (Georges Bank) revealed a number of potential geologic hazards to oil and gas exploration and development activities: evidence of mass movements and shallow gas deposits on the continental slope. No potential hazards were observed on the continental shelf or rise. Other geology-related problems, termed constraints because they pose a relatively low degree of risk and can be routinely dealt with by the use of existing technology, have been observed on the continental shelf. Constraints identified in the proposed sale area are erosion, sand waves, filled channels and deep faults. Piston cores were collected for geotechnical analysis at selected locations on the continental slope in the proposed lease sale area. The core locations were selected to provide information on slope stability and to establish the general geotechnical properties of the sediments. Preliminary results of a testing program suggest that the surficial sediment cover is stable with respect to mass movement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Khalik, Hany S.; Zhang, Qiong
2014-05-20
The development of hybrid Monte-Carlo-Deterministic (MC-DT) approaches, taking place over the past few decades, has primarily focused on shielding and detection applications where the analysis requires a small number of responses, i.e. at the detector location(s). This work further develops a recently introduced global variance reduction approach, denoted the SUBSPACE approach, which is designed to allow the use of MC simulation, currently limited to benchmarking calculations, for routine engineering calculations. By way of demonstration, the SUBSPACE approach is applied to assembly-level calculations used to generate the few-group homogenized cross-sections. These models are typically expensive and need to be executed on the order of 10^3 - 10^5 times to properly characterize the few-group cross-sections for downstream core-wide calculations. Applicability to k-eigenvalue core-wide models is also demonstrated in this work. Given the favorable results obtained in this work, we believe the applicability of the MC method for reactor analysis calculations could be realized in the near future.
NASA Astrophysics Data System (ADS)
Kim, Hyun-Tae; Romanelli, M.; Voitsekhovitch, I.; Koskela, T.; Conboy, J.; Giroud, C.; Maddison, G.; Joffrin, E.; contributors, JET
2015-06-01
A consistent deterioration of global confinement in H-mode experiments has been observed in JET [1] following the replacement of all carbon plasma facing components (PFCs) with an all metal (‘ITER-like’) wall (ILW). This has been correlated to the observed degradation of the pedestal confinement, as lower electron temperature (Te) values are routinely measured at the top of the edge barrier region. A comparative investigation of core heat transport in JET-ILW and JET-CW (carbon wall) discharges has been performed, to assess whether core confinement has also been affected by the wall change. The results presented here have been obtained by analysing a set of discharges consisting of high density JET-ILW H-mode plasmas and comparing them against their counterpart discharges in JET-CW having similar global operational parameters. The set contains 10 baseline (βN = 1.5∼2) discharge-pairs with 2.7 T toroidal magnetic field, 2.5 MA plasma current, and 14 to 17 MW of neutral beam injection (NBI) heating. Based on a Te profile analysis using high resolution Thomson scattering (HRTS) data, the Te profile peaking (i.e. core Te (ρ = 0.3) / edge Te (ρ = 0.7)) is found to be similar, and weakly dependent on edge Te, for both JET-ILW and JET-CW discharges. When ILW discharges are seeded with N2, core and edge Te both increase to maintain a similar peaking factor. The change in core confinement is addressed with interpretative TRANSP simulations. It is found that JET-ILW H-mode plasmas have higher NBI power deposition to electrons and lower NBI power deposition to ions as compared to the JET-CW counterparts. This is an effect of the lower electron temperature at the top of the pedestal. As a result, the core electron energy confinement time is reduced in JET-ILW discharges, but the core ion energy confinement time is not decreased. Overall, the core energy confinement is found to be the same in the JET-ILW discharges compared to the JET-CW counterparts.
NASA Astrophysics Data System (ADS)
Benaouda, D.; Wadge, G.; Whitmarsh, R. B.; Rothwell, R. G.; MacLeod, C.
1999-02-01
In boreholes with partial or no core recovery, interpretations of lithology in the remainder of the hole are routinely attempted using data from downhole geophysical sensors. We present a practical neural net-based technique that greatly enhances lithological interpretation in holes with partial core recovery by using downhole data to train classifiers to give a global classification scheme for those parts of the borehole for which no core was retrieved. We describe the system and its underlying methods of data exploration, selection and classification, and present a typical example of the system in use. Although the technique is equally applicable to oil industry boreholes, we apply it here to an Ocean Drilling Program (ODP) borehole (Hole 792E, Izu-Bonin forearc, a mixture of volcaniclastic sandstones, conglomerates and claystones). The quantitative benefits of quality-control measures and different subsampling strategies are shown. Direct comparisons between a number of discriminant analysis methods and the use of neural networks with back-propagation of error are presented. The neural networks perform better than the discriminant analysis techniques both in terms of performance rates with test data sets (2-3 per cent better) and in qualitative correlation with non-depth-matched core. We illustrate with the Hole 792E data how vital it is to have a system that permits the number and membership of training classes to be changed as analysis proceeds. The initial classification for Hole 792E evolved from a five-class to a three-class and then to a four-class scheme with resultant classification performance rates for the back-propagation neural network method of 83, 84 and 93 per cent respectively.
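The supervised step can be sketched with a generic multilayer perceptron trained by back-propagation on downhole log variables from cored intervals and then scored on a held-out set; the placeholder random data, the choice of logs and the network size below are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training table: one row per depth with core recovery,
# columns are downhole log measurements and a lithology class label.
rng = np.random.default_rng(1)
logs = rng.normal(size=(600, 5))           # e.g. gamma ray, resistivity, density, sonic, porosity
litho = rng.integers(0, 4, size=600)       # 4 lithology classes (cf. the final four-class scheme)

X_train, X_test, y_train, y_test = train_test_split(logs, litho, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
clf.fit(X_train, y_train)                  # back-propagation training on cored intervals
print(f"test performance rate: {clf.score(X_test, y_test):.2f}")
# The trained classifier is then applied to log depths with no core recovery.
```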
Characterizing Drainage Multiphase Flow in Heterogeneous Sandstones
NASA Astrophysics Data System (ADS)
Jackson, Samuel J.; Agada, Simeon; Reynolds, Catriona A.; Krevor, Samuel
2018-04-01
In this work, we analyze the characterization of drainage multiphase flow properties on heterogeneous rock cores using a rich experimental data set and mm-m scale numerical simulations. Along with routine multiphase flow properties, 3-D submeter scale capillary pressure heterogeneity is characterized by combining experimental observations and numerical calibration, resulting in a 3-D numerical model of the rock core. The uniqueness and predictive capability of the numerical models are evaluated by accurately predicting the experimentally measured relative permeability of N2—DI water and CO2—brine systems in two distinct sandstone rock cores across multiple fractional flow regimes and total flow rates. The numerical models are used to derive equivalent relative permeabilities, which are upscaled functions incorporating the effects of submeter scale capillary pressure. The functions are obtained across capillary numbers which span four orders of magnitude, representative of the range of flow regimes that occur in subsurface CO2 injection. Removal of experimental boundary artifacts allows the derivation of equivalent functions which are characteristic of the continuous subsurface. We also demonstrate how heterogeneities can be reorientated and restructured to efficiently estimate flow properties in rock orientations differing from the original core sample. This analysis shows how combined experimental and numerical characterization of rock samples can be used to derive equivalent flow properties from heterogeneous rocks.
Beck, Alison; Burdett, Mark; Lewis, Helen
2015-06-01
To investigate the impact of waiting for psychological therapy on client well-being as measured by the Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM) global distress (GD) score. Global distress scores were retrieved for all clients referred for psychological therapy in a secondary care mental health service between November 2006 and May 2013 who had completed a CORE-OM at assessment and first session. GD scores for a subgroup of 103 clients who had completed a CORE-OM during the last therapy session were also reviewed. The study sample experienced a median wait of 41.14 weeks between assessment and first session. The relationship between wait time from referral acceptance to assessment and assessment GD score was not significant. During the period between assessment and first session, no significant difference in GD score was observed. Nevertheless, 29.1% of the sample experienced reliable change; 16.0% of clients reliably improved and 13.1% reliably deteriorated whilst waiting for therapy. Demographic factors were not found to have a significant effect on the change in GD score between assessment and first session. Waiting time was associated with post-therapy outcomes but not to a degree which was meaningful. The majority of individuals (54.4%), regardless of whether they improved or deteriorated whilst waiting for therapy, showed reliable improvement at end of therapy as measured by the CORE-OM. The majority of GD scores remained stable while waiting for therapy; however, 29.1% of secondary care clients experienced either reliable improvement or deterioration. Irrespective of whether they improved, deteriorated or remained unchanged whilst waiting for therapy, most individuals who had a complete end of therapy assessment showed reliable improvements following therapy. There was no significant difference in GD score between assessment and first session recordings. A proportion of clients (29.1%) showed reliable change, either improvement or deterioration, as measured by the GD score while waiting for therapy. Of the individuals with last-session CORE-OMs, 54.4% showed significant improvement following therapy regardless of whether or not they experienced change while waiting for therapy. Limitations include problems of data quality: the data were from a routine data set, and data were lost at each stage of the analysis. A focus on the CORE-OM limits exploration of the subjective experience of waiting for psychotherapy and the impact this has on psychological well-being. © 2014 The British Psychological Society.
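Reliable change on a distress measure such as the CORE-OM GD score is conventionally assessed with the Jacobson-Truax reliable change index; the sketch below shows the calculation for one pre/post pair, with placeholder reliability and baseline standard deviation values that are not taken from this study.

```python
import math

def reliable_change_index(score_pre, score_post, sd_baseline, reliability):
    """Jacobson-Truax reliable change index for a pre/post score pair.

    |RCI| > 1.96 is conventionally taken as reliable improvement (negative)
    or reliable deterioration (positive) on a distress measure.
    """
    se_measure = sd_baseline * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * se_measure
    return (score_post - score_pre) / s_diff

# Placeholder psychometric values, not the values from this study
rci = reliable_change_index(score_pre=1.8, score_post=1.2, sd_baseline=0.55, reliability=0.94)
print(f"RCI = {rci:.2f} -> {'reliable improvement' if rci < -1.96 else 'no reliable change'}")
```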
PARALLELISATION OF THE MODEL-BASED ITERATIVE RECONSTRUCTION ALGORITHM DIRA.
Örtenberg, A; Magnusson, M; Sandborg, M; Alm Carlsson, G; Malusek, A
2016-06-01
New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelisation of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelisation of the model-based iterative reconstruction algorithm DIRA with the aim of significantly shortening the code's execution time. Selected routines were parallelised using OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelisation of the code with OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelisation with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause was explained. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
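As a back-of-the-envelope check on the reported OpenMP result, Amdahl's law implies that a speedup of 15 on 16 cores requires nearly the entire runtime to be parallelised; the short calculation below (not from the paper) makes that explicit.

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: overall speedup for a given parallelisable fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

def parallel_fraction_from_speedup(speedup, n_cores):
    """Invert Amdahl's law to find the parallel fraction implied by a measured speedup."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n_cores)

p = parallel_fraction_from_speedup(speedup=15.0, n_cores=16)
print(f"implied parallel fraction = {p:.3f}")          # roughly 0.996
print(f"ceiling with unlimited cores = {1/(1-p):.0f}x")
```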
Effects of 8-Week Core Exercises on Free Style Swimming Performance of Female Swimmers Aged 9-12
ERIC Educational Resources Information Center
Gencer, Yildirim Gökhan
2018-01-01
With this study, it is aimed to review the effects of 8-week core exercises, which are scheduled before the routine exercises, on changes over certain physical and motoric attributes and freestyle swimming performance of female athletes of the youngest age group, which is 9-12. For the study, a group of 12 female licensed swimmers who had a…
The circadian rhythm of core temperature: origin and some implications for exercise performance.
Waterhouse, Jim; Drust, Barry; Weinert, Dietmar; Edwards, Benjamin; Gregson, Warren; Atkinson, Greg; Kao, Shaoyuan; Aizawa, Seika; Reilly, Thomas
2005-01-01
This review first examines reliable and convenient ways of measuring core temperature for studying the circadian rhythm, concluding that measurements of rectal and gut temperature fulfil these requirements, but that insulated axilla temperature does not. The circadian rhythm of core temperature arises mainly from circadian changes in the rate of loss of heat through the extremities, mediated by vasodilatation of the cutaneous vasculature. Difficulties arise when the rhythm of core temperature is used as a marker of the body clock, since it is also affected by the sleep-wake cycle. This masking effect can be overcome directly by constant routines and indirectly by "purification" methods, several of which are described. Evidence supports the value of purification methods to act as a substitute when constant routines cannot be performed. Since many of the mechanisms that give rise to the circadian rhythm of core temperature are the same as those that occur during thermoregulation in exercise, there is an interaction between the two. This interaction is manifest in the initial response to spontaneous activity and to mild exercise, body temperature rising more quickly and thermoregulatory reflexes being recruited less quickly around the trough and rising phase of the resting temperature rhythm, in comparison with the peak and falling phase. There are also implications for athletes, who need to exercise maximally and with minimal risk of muscle injury or heat exhaustion in a variety of ambient temperatures and at different times of the day. Understanding the circadian rhythm of core temperature may reduce potential hazards due to the time of day when exercise is performed.
Analysis of the SL-1 Accident Using RELAP5-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francisco, A.D. and Tomlinson, E. T.
2007-11-08
On January 3, 1961, at the National Reactor Testing Station, in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people and destroying the reactor core. The SL-1 reactor, a 3 MW(t) boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).
SUTIL: system utilities routines programmer's reference manual
NASA Technical Reports Server (NTRS)
Harper, D.
1976-01-01
A package of FORTRAN callable subroutines which allows efficient communication of data between users and programs is described. Proper utilization of the SUTIL package to reduce program core requirements and expedite program development is emphasized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohm, P., E-mail: bohm@ipp.cas.cz; Bilkova, P.; Melich, R.
2014-11-15
The core Thomson scattering diagnostic (TS) on the COMPASS tokamak was put in operation and reported earlier. Implementation of edge TS, with spatial resolution along the laser beam up to ∼1/100 of the tokamak minor radius, is presented now. The procedure for spatial calibration and alignment of both core and edge systems is described. Several further upgrades of the TS system, like a triggering unit and piezo motor driven vacuum window shutter, are introduced as well. The edge TS system, together with the core TS, is now in routine operation and provides electron temperature and density profiles.
Jian, Yu-Tao; Tang, Tian-Yu; Swain, Michael V; Wang, Xiao-Dong; Zhao, Ke
2016-12-01
The aim of this in vitro study was to evaluate the effect of core ceramic grinding on the fracture behaviour of bilayered zirconia under two loading schemes. Interfacial surfaces of sandblasted zirconia disks (A) were ground with 80 (B), 120 (C) and 220 (D) grit diamond discs, respectively. Surface roughness and topographic analysis were performed using a confocal scanning laser microscope (CSLM) and scanning electron microscopy (SEM). Relative monoclinic content was evaluated using X-ray diffraction analysis (XRD) and then reevaluated after simulated veneer firing. Biaxial fracture strength (σ) and Weibull modulus (m) were calculated either with core in compression (subgroup Ac-Dc) or in tension (subgroup At-Dt). Fracture surfaces were examined by SEM and energy dispersive X-ray spectroscopy (EDS). Maximum tensile stress at fracture was estimated by finite element analysis. Statistical data analysis was performed using Kruskal-Wallis and one-way ANOVA at a significance level of 0.05. As grit size of the diamond disc increased, zirconia surface roughness decreased (p<0.001). Thermal veneering treatment reversed the transformation of monoclinic phase observed after initial grinding. No difference in initial (p=0.519 for subgroups Ac-Dc) and final fracture strength (p=0.699 for subgroups Ac-Dc; p=0.328 for subgroups At-Dt) was found among the four groups for both loading schemes. Coarse grinding, however, slightly increased final fracture strength reliability (m) for subgroups Ac-Dc. Two different modes of fracture were observed according to which material was on the bottom surface. Components of the liner porcelain remained on the zirconia surface after fracture for all groups. Technician grinding changed surface topography of zirconia ceramic material, but was not detrimental to the bilayered system strength after veneer application. Coarse grinding slightly improved the fracture strength reliability of the bilayered system tested with core in compression. It is recommended that veneering porcelain be applied directly after routine lab grinding of zirconia ceramic, and its application on rough zirconia cores may be preferred to enhance bond strength. Copyright © 2016. Published by Elsevier Ltd.
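Weibull modulus estimation of the kind reported above is commonly done by linear regression on the linearised Weibull plot; the sketch below shows that standard procedure on hypothetical strength values, not the study's data.

```python
import numpy as np

def weibull_fit(strengths_mpa):
    """Estimate the Weibull modulus m and characteristic strength sigma0
    from a set of biaxial fracture strengths (least-squares fit to the
    linearised Weibull plot with the median-rank probability estimator)."""
    s = np.sort(np.asarray(strengths_mpa, dtype=float))
    n = s.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)        # median-rank failure probability
    x = np.log(s)
    y = np.log(-np.log(1.0 - F))                       # ln(-ln(1-F)) = m*ln(s) - m*ln(sigma0)
    m, c = np.polyfit(x, y, 1)                          # slope = Weibull modulus
    sigma0 = np.exp(-c / m)
    return m, sigma0

# Hypothetical strengths for one subgroup (MPa)
strengths = [612, 655, 688, 701, 724, 745, 760, 790, 815, 842]
m, sigma0 = weibull_fit(strengths)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
```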
NASA Technical Reports Server (NTRS)
Svalbonas, V.; Levine, H.; Ogilvie, P.
1975-01-01
Engineering programming information is presented for the STARS-2P (shell theory automated for rotational structures-2P (plasticity)) digital computer program, and FORTRAN 4 was used in writing the various subroutines. The execution of this program requires the use of thirteen temporary storage units. The program was initially written and debugged on the IBM 370-165 computer and converted to the UNIVAC 1108 computer, where it utilizes approximately 60,000 words of core. Only basic FORTRAN library routines are required by the program: sine, cosine, absolute value, and square root.
Mallory, Melanie A; Lucic, Danijela; Ebbert, Mark T W; Cloherty, Gavin A; Toolsie, Dan; Hillyard, David R
2017-05-01
HCV genotyping remains a critical tool for guiding initiation of therapy and selecting the most appropriate treatment regimen. Current commercial genotyping assays may have difficulty identifying 1a, 1b and genotype 6. To evaluate the concordance for identifying 1a, 1b, and genotype 6 between two methods: the PLUS assay and core/NS5B sequencing. This study included 236 plasma and serum samples previously genotyped by core/NS5B sequencing. Of these, 25 samples were also previously tested by the Abbott RealTime HCV GT II Research Use Only (RUO) assay and yielded ambiguous results. The remaining 211 samples were routine genotype 1 (n=169) and genotype 6 (n=42). Genotypes obtained from sequence data were determined using a laboratory-developed HCV sequence analysis tool and the NCBI non-redundant database. Agreement between the PLUS assay and core/NS5B sequencing for genotype 1 samples was 95.8% (162/169), with 96% (127/132) and 95% (35/37) agreement for 1a and 1b samples respectively. PLUS results agreed with core/NS5B sequencing for 83% (35/42) of unselected genotype 6 samples, with the remaining seven "not detected" by the PLUS assay. Among the 25 samples with ambiguous GT II results, 15 were concordant by PLUS and core/NS5B sequencing, nine were not detected by PLUS, and one sample had an internal control failure. The PLUS assay is an automated method that identifies 1a, 1b and genotype 6 with good agreement with gold-standard core/NS5B sequencing and can aid in the resolution of certain genotype samples with ambiguous GT II results. Copyright © 2017 Elsevier B.V. All rights reserved.
Common Graphics Library (CGL). Volume 1: LEZ user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Hofler, Alicia S.; Miner, David L.
1988-01-01
Users are introduced to and instructed in the use of the Langley Easy (LEZ) routines of the Common Graphics Library (CGL). The LEZ routines form an application independent graphics package which enables the user community to view data quickly and easily, while providing a means of generating scientific charts conforming to the publication and/or viewgraph process. A distinct advantage for using the LEZ routines is that the underlying graphics package may be replaced or modified without requiring the users to change their application programs. The library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine independent, providing support for centralized and/or distributed computer systems.
Omer-Mizrahi, Melany; Margel, Shlomo
2009-01-15
Core polystyrene microspheres of narrow size distribution were prepared by dispersion polymerization of styrene in a mixture of ethanol and 2-methoxy ethanol. Uniform polyglycidyl methacrylate/polystyrene core-shell micrometer-sized particles were prepared by emulsion polymerization at 73 degrees C of glycidyl methacrylate in the presence of the core polystyrene microspheres. Core-shell particles with different properties (size, surface morphology and composition) have been prepared by changing various parameters belonging to the above seeded emulsion polymerization process, e.g., volumes of the monomer glycidyl methacrylate and the crosslinker monomer ethylene glycol dimethacrylate. Magnetic Fe(3)O(4)/polyglycidyl methacrylate/polystyrene micrometer-sized particles were prepared by coating the former core-shell particles with magnetite nanoparticles via a nucleation and growth mechanism. Characterization of the various particles has been accomplished by routine methods such as light microscopy, SEM, FTIR, BET and magnetic measurements.
Yuan, Jiangbei; Wang, Chengjian; Sun, Yujiao; Huang, Linjuan; Wang, Zhongfu
2014-10-01
A novel strategy is proposed, using cost-saving chemical reactions to generate intact free reducing N-glycans and their fluorescent derivatives from glycoproteins for subsequent analysis. N-Glycans without core α-1,3-linked fucose are released in reducing form by selective hydrolysis of the N-type carbohydrate-peptide bond of glycoproteins under a set of optimized mild alkaline conditions and are comparable to those released by commonly used peptide-N-glycosidase (PNGase) F in terms of yield without any detectable side reaction (peeling or deacetylation). The obtained reducing glycans can be routinely derivatized with 2-aminobenzoic acid (2-AA), 1-phenyl-3-methyl-5-pyrazolone (PMP), and potentially some other fluorescent reagents for comprehensive analysis. Alternatively, the core α-1,3-fucosylated N-glycans are released in mild alkaline medium and derivatized with PMP in situ, and their yields are comparable to those obtained using commonly used PNGase A without conspicuous peeling reaction or any detectable deacetylation. Using this new technique, the N-glycans of a series of purified glycoproteins and complex biological samples were successfully released and analyzed by electrospray ionization mass spectrometry (ESI-MS) and tandem mass spectrometry (MS/MS), demonstrating its general applicability to glycomic studies. Copyright © 2014 Elsevier Inc. All rights reserved.
Getting through to circadian oscillators: why use constant routines?
NASA Technical Reports Server (NTRS)
Duffy, Jeanne F.; Dijk, Derk-Jan
2002-01-01
Overt 24-h rhythmicity is composed of both exogenous and endogenous components, reflecting the product of multiple (periodic) feedback loops with a core pacemaker at their center. Researchers attempting to reveal the endogenous circadian (near 24-h) component of rhythms commonly conduct their experiments under constant environmental conditions. However, even under constant environmental conditions, rhythmic changes in behavior, such as food intake or the sleep-wake cycle, can contribute to observed rhythmicity in many physiological and endocrine variables. Assessment of characteristics of the core circadian pacemaker and its direct contribution to rhythmicity in different variables, including rhythmicity in gene expression, may be more reliable when such periodic behaviors are eliminated or kept constant across all circadian phases. This is relevant for the assessment of the status of the circadian pacemaker in situations in which the sleep-wake cycle or food intake regimes are altered because of external conditions, such as in shift work or jet lag. It is also relevant for situations in which differences in overt rhythmicity could be due to changes in either sleep oscillatory processes or circadian rhythmicity, such as advanced or delayed sleep phase syndromes, in aging, or in particular clinical conditions. Researchers studying human circadian rhythms have developed constant routine protocols to assess the status of the circadian pacemaker in constant behavioral and environmental conditions, whereas this technique is often thought to be unnecessary in the study of animal rhythms. In this short review, the authors summarize constant routine methodology and what has been learned from constant routines and argue that animal and human circadian rhythm researchers should (continue to) use constant routines as a step on the road to getting through to central and peripheral circadian oscillators in the intact organism.
Multitasking 3-D forward modeling using high-order finite difference methods on the Cray X-MP/416
DOE Office of Scientific and Technical Information (OSTI.GOV)
Terki-Hassaine, O.; Leiss, E.L.
1988-01-01
The CRAY X-MP/416 was used to multitask 3-D forward modeling by the high-order finite difference method. Flowtrace analysis reveals that the most expensive operation in the unitasked program is a matrix vector multiplication. The in-core and out-of-core versions of a reentrant subroutine can perform any fraction of the matrix vector multiplication independently, a pattern compatible with multitasking. The matrix vector multiplication routine can be distributed over two to four processors. The rest of the program utilizes the microtasking feature that lets the system treat independent iterations of DO-loops as subtasks to be performed by any available processor. The availability of the Solid-State Storage Device (SSD) meant the I/O wait time was virtually zero. A performance study determined a theoretical speedup, taking into account the multitasking overhead. Multitasking programs utilizing both macrotasking and microtasking features obtained actual speedups that were approximately 80% of the ideal speedup.
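The row-partitioned matrix-vector multiplication pattern described above can be mimicked on a modern multi-core machine; the sketch below splits the matrix into horizontal blocks and computes each block's product in a separate process, as an analogy to the Cray macrotasking scheme rather than a reconstruction of it.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def partial_matvec(args):
    """Multiply one horizontal slice of the matrix by the full vector."""
    A_rows, x = args
    return A_rows @ x

def parallel_matvec(A, x, n_tasks=4):
    """Row-partitioned matrix-vector product: each task computes an independent
    block, mirroring how a reentrant routine could compute any fraction of the
    product on a separate processor."""
    blocks = np.array_split(A, n_tasks, axis=0)
    with ProcessPoolExecutor(max_workers=n_tasks) as ex:
        parts = list(ex.map(partial_matvec, [(b, x) for b in blocks]))
    return np.concatenate(parts)

if __name__ == "__main__":
    A = np.random.rand(4000, 4000)
    x = np.random.rand(4000)
    y = parallel_matvec(A, x)
    assert np.allclose(y, A @ x)      # same result as the serial product
```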
2013-01-01
Background The Clinical Outcomes in Routine Evaluation - Outcome Measure (CORE-OM) is a 34-item instrument developed to monitor clinically significant change in out-patients. The CORE-OM covers four domains: well-being, problems/symptoms, functioning and risk, and is summarized in two total scores: the mean of All items, and the mean of All non-risk items. The aim of this study was to examine the psychometric properties of the Norwegian translation of the CORE-OM. Methods A clinical sample of 527 out-patients from North Norwegian specialist psychiatric services, and a non-clinical sample of 464 persons were obtained. The non-clinical sample was a convenience sample consisting of friends and family of health personnel, and of students of medicine and clinical psychology. Students also reported psychological stress. Exploratory factor analysis (EFA) was employed in half the clinical sample. Confirmatory (CFA) factor analyses modelling the theoretical sub-domains were performed in the remaining half of the clinical sample. Internal consistency, means, and gender and age differences were studied by comparing the clinical and non-clinical samples. Stability, and the effects of language (Norwegian versus English) and of psychological stress, were studied in the sub-sample of students. Finally, cut-off scores were calculated, and distributions of scores were compared between clinical and non-clinical samples, and between students reporting stress or no stress. Results The results indicate that the CORE-OM measures both general (g) psychological distress and sub-domains, of which risk of harm separates most clearly from the g factor. Internal consistency, stability and cut-off scores compared well with the original English version. No, or only negligible, language effects were found. Gender differences were only found for the well-being domain in the non-clinical sample and for the risk domain in the clinical sample. Current patient status explained differences between clinical and non-clinical samples, even when gender and age were controlled for. Students reporting psychological distress during the last week scored significantly higher than students reporting no stress. These results further validate the recommended cut-off point of 1 between clinical and non-clinical populations. Conclusions The CORE-OM in Norwegian has psychometric properties at the same level as the English original, and could be recommended for general clinical use. A cut-off point of 1 is recommended for both genders. PMID:23521746
YODA++: A proposal for a semi-automatic space mission control
NASA Astrophysics Data System (ADS)
Casolino, M.; de Pascale, M. P.; Nagni, M.; Picozza, P.
YODA++ is a proposal for a semi-automated data handling and analysis system for the PAMELA space experiment. The core of the routines has been developed to process a stream of raw data downlinked from the Resurs DK1 satellite (housing PAMELA) to the ground station in Moscow. Raw data consist of scientific data and are complemented by housekeeping information. Housekeeping information will be analyzed within a short time from download (1 h) in order to monitor the status of the experiment and to inform mission acquisition planning. A prototype for the data visualization will run on an APACHE TOMCAT web application server, providing an off-line analysis tool accessible through a browser and part of the code for system maintenance. Development of the data retrieval is in the production phase, while a GUI for human-friendly monitoring, as well as a JavaServerPages/JavaServerFaces (JSP/JSF) web application facility, is in a preliminary phase. On a longer timescale (1-3 h from download) scientific data are analyzed. The data storage core will be a mix of CERN's ROOT file structure and MySQL as a relational database. YODA++ is currently being used in the on-ground integration and testing of PAMELA data.
NASA Astrophysics Data System (ADS)
Zhang, Wenyan; Chen, Jiahua; Wang, Wei; Lu, GongXuan; Hao, Lingyun; Ni, Yaru; Lu, Chunhua; Xu, Zhongzi
2017-03-01
Super-paramagnetic core-shell microspheres were synthesized by an ultrasonic-assisted routine under low ultrasonic irradiation powers. Compared with the conventional routine, the ultrasonic treatment not only improved the uniformity of the core-shell structure of Fe3O4@SiO2 but also shortened the synthesis time at large scale. Owing to their hydrophilicity and high surface charge, the Fe3O4@SiO2 microspheres could be dispersed well in distilled water to form homogeneous colloidal suspensions. The suspensions have a favorable magneto-chromatic ability, in that they sensitively exhibit brilliant colorful ribbons under magnetic attraction. The colorful ribbons, which are distributed along the magnetic field lines, make the morphology of the magnetic field "visible" to the naked eye. Those colorful ribbons originate from the strong magnetic interaction between the microspheres and the magnetic field. Furthermore, the magneto-chromatic performance is reversible, as the colorful ribbons vanish rapidly when the magnetic field is removed. The silica layer effectively enhanced the acid resistance and surface-oxidation resistance of the Fe3O4@SiO2 microspheres, so they exhibit a stable magnetic nature and a robust magneto-chromatic property in acidic environments.
Image processing and machine learning in the morphological analysis of blood cells.
Rodellar, J; Alférez, S; Acevedo, A; Molina, A; Merino, A
2018-05-01
This review focuses on how image processing and machine learning can be useful for the morphological characterization and automatic recognition of cell images captured from peripheral blood smears. The basics of the 3 core elements (segmentation, quantitative features, and classification) are outlined, and recent literature is discussed. Although red blood cells are a significant part of this context, this study focuses on malignant lymphoid cells and blast cells. There is no doubt that these technologies may help the cytologist to perform efficient, objective, and fast morphological analysis of blood cells. They may also help in the interpretation of some morphological features and may serve as learning and survey tools. Although research is still needed, it is important to define screening strategies to exploit the potential of image-based automatic recognition systems integrated in the daily routine of laboratories along with other analysis methodologies. © 2018 John Wiley & Sons Ltd.
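The three core elements outlined above can be strung together in a minimal sketch: Otsu segmentation of a grey-scale smear image, a few region-level quantitative features, and a generic classifier. The feature set, class labels and placeholder training data are illustrative assumptions and do not correspond to any specific system covered by the review.

```python
import numpy as np
from skimage import filters, measure
from sklearn.ensemble import RandomForestClassifier

def cell_features(gray_image):
    """Segment nucleated cells by Otsu thresholding and extract simple
    quantitative features (area, eccentricity, mean intensity) per region."""
    mask = gray_image < filters.threshold_otsu(gray_image)   # nuclei darker than background
    labels = measure.label(mask)
    feats = [(r.area, r.eccentricity, r.mean_intensity)
             for r in measure.regionprops(labels, intensity_image=gray_image)
             if r.area > 50]                                  # drop small debris
    return np.array(feats)

# Classification stage: a generic classifier trained on labelled feature vectors
# (placeholder random data stands in for an annotated smear-image training set).
rng = np.random.default_rng(2)
X_train = rng.normal(size=(300, 3))
y_train = rng.integers(0, 3, size=300)     # e.g. normal lymphocyte / malignant lymphoid / blast
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
```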
Mallory, Melanie A; Lucic, Danijela X; Sears, Mitchell T; Cloherty, Gavin A; Hillyard, David R
2014-05-01
HCV genotyping is a critical tool for guiding initiation of therapy and selecting the most appropriate treatment regimen. To evaluate the concordance between the Abbott GT II assay and genotyping by sequencing subregions of the HCV 5'UTR, core and NS5B. The Abbott assay was used to genotype 127 routine patient specimens and 35 patient specimens with unusual subtypes and mixed infection. Abbott results were compared to genotyping by 5'UTR, core and NS5B sequencing. Sequences were genotyped using the NCBI non-redundant database and the online genotyping tool COMET. Among routine specimens, core/NS5B sequencing identified 93 genotype 1s, 13 genotype 2s, 15 genotype 3s, three genotype 4s, two genotype 6s and one recombinant specimen. Genotype calls by 5'UTR, core, NS5B sequencing and the Abbott assay were 97.6% concordant. Core/NS5B sequencing identified two discrepant samples as genotype 6 (subtypes 6l and 6u) while Abbott and 5'UTR sequencing identified these samples as genotype 1 with no subtype. The Abbott assay subtyped 91.4% of genotype 1 specimens. Among the 35 rare specimens, the Abbott assay inaccurately genotyped 3k, 6e, 6o, 6q and one genotype 4 variant; gave indeterminate results for 3g, 3h, 4r, 6m, 6n, and 6q specimens; and agreed with core/NS5B sequencing for mixed specimens. The Abbott assay is an automated HCV genotyping method with improved accuracy over 5'UTR sequencing. Samples identified by the Abbott assay as genotype 1 with no subtype may be rare subtypes of other genotypes and thus require confirmation by another method. Copyright © 2014 Elsevier B.V. All rights reserved.
Finning, Kirstin; Bhandari, Radhika; Sellers, Fiona; Revelli, Nicoletta; Villa, Maria Antonietta; Muñiz-Díaz, Eduardo; Nogués, Núria
2016-03-01
High-throughput genotyping platforms enable simultaneous analysis of multiple polymorphisms for blood group typing. BLOODchip® ID is a genotyping platform based on Luminex® xMAP technology for simultaneous determination of 37 red blood cell (RBC) antigens (ID CORE XT) and 18 human platelet antigens (HPA) (ID HPA XT) using the BIDS XT software. In this international multicentre study, the performance of ID CORE XT and ID HPA XT, using the centres' current genotyping methods as the reference for comparison, and the usability and practicality of these systems, were evaluated under working laboratory conditions. DNA was extracted from whole blood in EDTA with Qiagen methodologies. Ninety-six previously phenotyped/genotyped samples were processed per assay: 87 testing samples plus five positive controls and four negative controls. Results were available for 519 samples: 258 with ID CORE XT and 261 with ID HPA XT. There were three "no calls" that were either caused by human error or resolved after repeating the test. Agreement between the tests and reference methods was 99.94% for ID CORE XT (9,540/9,546 antigens determined) and 100% for ID HPA XT (all 4,698 alleles determined). There were six discrepancies in antigen results in five RBC samples, four of which (in VS, N, S and Do(a)) could not be investigated due to lack of sufficient sample to perform additional tests and two of which (in S and C) were resolved in favour of ID CORE XT (100% accuracy). The total hands-on time was 28-41 minutes for a batch of 16 samples. Compared with the reference platforms, ID CORE XT and ID HPA XT were considered simpler to use and had shorter processing times. ID CORE XT and ID HPA XT genotyping platforms for RBC and platelet systems were accurate and user-friendly in working laboratory settings.
Holmes, Lisa; Landsverk, John; Ward, Harriet; Rolls-Reutz, Jennifer; Saldana, Lisa; Wulczyn, Fred; Chamberlain, Patricia
2014-04-01
Estimating costs in child welfare services is critical as new service models are incorporated into routine practice. This paper describes a unit costing estimation system developed in England (cost calculator) together with a pilot test of its utility in the United States where unit costs are routinely available for health services but not for child welfare services. The cost calculator approach uses a unified conceptual model that focuses on eight core child welfare processes. Comparison of these core processes in England and in four counties in the United States suggests that the underlying child welfare processes generated from England were perceived as very similar by child welfare staff in California county systems with some exceptions in the review and legal processes. Overall, the adaptation of the cost calculator for use in the United States child welfare systems appears promising. The paper also compares the cost calculator approach to the workload approach widely used in the United States and concludes that there are distinct differences between the two approaches with some possible advantages to the use of the cost calculator approach, especially in the use of this method for estimating child welfare costs in relation to the incorporation of evidence-based interventions into routine practice.
Is seminal vesiculectomy necessary in all patients with biopsy Gleason score 6?
Gofrit, Ofer N; Zorn, Kevin C; Shikanov, Sergey A; Zagaja, Gregory P; Shalhav, Arieh L
2009-04-01
Radiotherapists are excluding the seminal vesicles (SVs) from their target volume in cases of low-risk prostate cancer. However, these glands are routinely removed in every radical prostatectomy. Dissection of the SVs can damage the pelvic plexus, compromise trigonal, bladder neck, and cavernosal innervation, and contribute to delayed gain of continence and erectile function. In this study we evaluated the oncological benefit of routine removal of the SVs in currently operated patients. A total of 1003 patients (mean age, 59.7 years) with prostate cancer underwent robot-assisted radical prostatectomy between February 2003 and July 2007. Seminal vesicle invasion (SVI) was found in 46 of the operated patients (4.6%). Biopsy Gleason score (BGS), preoperative serum PSA, clinical tumor stage, percent of positive cores, and maximal percentage of cancer in a core all had a significant impact on the risk of SVI. Only 4/634 patients (0.6%) with BGS ≤6 suffered from SVI, as opposed to 42/369 (11.4%) with higher Gleason scores. Seminal vesiculectomy does not benefit more than 99% of patients with BGS ≤6. Considering the potential neural and vascular damage associated with seminal vesiculectomy, we suggest that routine removal of these glands during radical prostatectomy in these cases is not necessary.
Porosity characterization for heterogeneous shales using integrated multiscale microscopy
NASA Astrophysics Data System (ADS)
Rassouli, F.; Andrew, M.; Zoback, M. D.
2016-12-01
Pore size distribution analysis plays a critical role in characterizing the gas storage capacity and fluid transport of shales. Study of the diverse distribution of pore sizes and structures in such low-permeability rocks is hindered by the lack of tools to visualize the microstructural properties of shale rocks. In this paper we use multiple techniques to investigate the full pore size range at different sample scales. Modern imaging techniques are combined with routine analytical investigations (x-ray diffraction, thin section analysis and mercury porosimetry) to describe the pore size distribution of shale samples from the Haynesville Formation in East Texas and to generate a more holistic understanding of the porosity structure in shales, ranging from the standard core plug down to nm scales. Standard 1" diameter core plug samples were first imaged using a Versa 3D x-ray microscope at lower resolutions. We then pick several regions of interest (ROIs) with various micro-features (such as micro-cracks and high organic matter content) in the rock samples and run higher resolution CT scans using non-destructive interior tomography. After this step, we cut the samples and drill 5 mm diameter cores out of the selected ROIs, then rescan the samples to measure the porosity distribution of the 5 mm cores. We repeat this step for samples 1 mm in diameter cut out of the 5 mm cores using a laser cutting machine. After comparing the pore structure and distribution of the samples measured from micro-CT analysis, we move to nano-scale imaging to capture the ultra-fine pores within the shale samples. At this stage, the diameter of the 1 mm samples will be milled down to 70 microns using the laser beam. We scan these samples in a nano-CT Ultra x-ray microscope and calculate the porosity of the samples by image segmentation methods. Finally, we use images collected from focused ion beam scanning electron microscopy (FIB-SEM) to compare the results of the porosity measurements from all the different imaging techniques. These multi-scale characterization results are then compared with traditional analytical techniques such as mercury porosimetry.
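The porosity-by-image-segmentation step above can be made concrete with a minimal sketch. Assuming a reconstructed grayscale CT slice in which low-attenuation voxels correspond to pore space, Otsu thresholding gives a simple global segmentation; the synthetic array below is a stand-in for real reconstruction data and is not from the study.

```python
import numpy as np
from skimage.filters import threshold_otsu

def porosity_from_ct(slice_gray):
    """Estimate porosity of one grayscale CT slice: voxels darker than a
    global Otsu threshold are treated as pores, the rest as mineral matrix."""
    t = threshold_otsu(slice_gray)
    pore_mask = slice_gray < t          # low attenuation -> pore space
    return pore_mask.mean()             # pore fraction of the slice

# Synthetic stand-in for a reconstructed micro-/nano-CT slice.
rng = np.random.default_rng(1)
matrix = rng.normal(0.7, 0.05, (256, 256))
pores = rng.random((256, 256)) < 0.08   # roughly 8% pore voxels
slice_gray = np.where(pores, rng.normal(0.2, 0.05, (256, 256)), matrix)

print(f"estimated porosity: {porosity_from_ct(slice_gray):.3f}")
```

In practice the same thresholding would be applied slice by slice (or in 3-D) to the reconstructed volume, and the resulting pore masks compared across the plug, 5 mm, 1 mm and 70 micron scales.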
Fused-data transrectal EIT for prostate cancer imaging.
Murphy, Ethan K; Wu, Xiaotian; Halter, Ryan J
2018-05-25
Prostate cancer is a significant problem affecting 1 in 7 men. Unfortunately, the diagnostic gold standard of ultrasound-guided biopsy misses 10%-30% of all cancers. The objective of this study was to develop an electrical impedance tomography (EIT) approach that has the potential to image the entire prostate using multiple impedance measurements recorded between electrodes integrated onto an end-fired transrectal ultrasound (TRUS) device and a biopsy probe (BP). Simulations and sensitivity analyses were used to investigate the best combination of electrodes, and measured tank experiments were used to evaluate a fused-data transrectal EIT (fd-TREIT) and BP approach. Simulations and sensitivity analysis revealed that (1) TREIT measurements are not sufficiently sensitive to image the whole prostate, (2) the combination of TREIT + BP measurements increases the sensitive region of TREIT-only measurements by 12×, and (3) the fusion of multiple TREIT + BP measurements collected during a routine or customized 12-core biopsy procedure can cover up to 76.1% or 94.1%, respectively, of a nominal 50 cm³ prostate. Three measured tank experiments of the fd-TREIT + BP approach successfully and accurately recovered the positions of 2-3 metal or plastic inclusions. The measured tank experiments represent important steps in the development of an algorithm that can combine EIT from multiple locations and from multiple probes, data that could be collected during a routine TRUS-guided 12-core biopsy. Overall, this result is a step towards a clinically deployable impedance imaging approach to scanning the entire prostate, which could significantly help to improve prostate cancer diagnosis.
Blainey, Sarah H; Rumball, Freya; Mercer, Louise; Evans, Lauren Jayne; Beck, Alison
2017-11-01
To investigate the effectiveness of psychological therapy in reducing psychological distress for adults with autism spectrum conditions (ASC) and co-morbid mental health conditions in routine clinical practice, and to explore the effect of individual characteristics and service factors on change in general distress. In a specialist psychological therapies service for adults with ASC, the Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM) self-report questionnaire of psychological distress is completed by clients at the start and end of therapy. Change over time, and reliable and clinical change, were assessed for 81 of a total of 122 clients (66.4%). Factors which may influence change over time were explored using available clinical information. Overall, there was a significant reduction in CORE-OM score during therapy with a small effect size. Most clients showed an improvement in psychological distress over therapy (75.4% improved, with 36.9% of these showing reliable changes). Significant and comparable reductions from pre-therapy to post-therapy were seen across the sample, showing that individual differences did not mediate therapy effectiveness. CORE-OM scores mediate the association between age of ASD diagnosis and hours of therapeutic input required, with greater age at diagnosis and higher distress associated with longer therapy duration. Our preliminary findings suggest that psychological therapy may be effective in reducing general distress for clients with ASC and co-morbid mental health conditions and should be routinely offered. Individuals who are diagnosed with ASD in adulthood are likely to require a longer course of therapy when their general distress scores are high. Copyright © 2017 John Wiley & Sons, Ltd.
Armentano, Antonio; Summa, Simona; Magro, Sonia Lo; D’Antini, Pasquale; Palermo, Carmen; Muscarella, Marilena
2016-01-01
A C18 column packed with core-shell particles was used for the chromatographic separation of sulphonamides in feed and meat by a conventional high performance liquid chromatography system coupled with a diode array detector. Two analytical methods already used in our laboratory were modified without any changes in the extraction and clean-up steps or in the liquid chromatography instrumentation. Chromatographic conditions applied on a traditional 5-µm column were optimized on a column packed with 2.6 µm core-shell particles. A binary mobile phase [acetate buffer solution at pH 4.50 and a mixture of methanol-acetonitrile 50:50 (v/v)] was employed in gradient mode at a flow rate of 1.2 mL/min with an injection volume of 6 µL. These chromatographic conditions allow the separation of 13 sulphonamides with a total run time of 13 minutes. Preliminary studies were carried out comparing blank and spiked samples of feed and meat. Good resolution and the absence of interferences were achieved in the chromatograms for both matrices. Since no change was made to the sample preparation, the optimized method does not require complete revalidation and can be used to make routine analysis faster. PMID:28217560
Kilfedder, Catherine; Power, Kevin; Karatzias, Thanos; McCafferty, Aileen; Niven, Karen; Chouliara, Zoë; Galloway, Lisa; Sharp, Stephen
2010-09-01
The aim of the present study was to compare the effectiveness and acceptability of three interventions for occupational stress. A total of 90 National Health Service employees were randomized to face-to-face counselling, telephone counselling or bibliotherapy. Outcomes were assessed at post-intervention and 4-month follow-up. The Clinical Outcomes in Routine Evaluation (CORE), General Health Questionnaire (GHQ-12), and Perceived Stress Scale (PSS-10) were used to evaluate intervention outcomes. An intention-to-treat analysis was performed. Repeated measures analysis revealed significant time effects on all measures with the exception of CORE Risk. No significant group effects were detected on any of the outcome measures. No significant time-by-group interaction effects were detected on any of the outcome measures with the exception of CORE Functioning and GHQ total. With regard to acceptability of interventions, participants expressed a preference for face-to-face counselling over the other two modalities. Overall, it was concluded that the three intervention groups are equally effective. Given that bibliotherapy is the least costly of the three, results from the present study might be considered in relation to a stepped care approach to occupational stress management, with bibliotherapy as the first line of intervention, followed by telephone and face-to-face counselling as required.
The development of optimal lightweight truss-core sandwich panels
NASA Astrophysics Data System (ADS)
Langhorst, Benjamin Robert
Sandwich structures effectively provide lightweight stiffness and strength by sandwiching a low-density core between stiff face sheets. The performance of lightweight truss-core sandwich panels is enhanced through the design of novel truss arrangements and the development of methods by which the panels may be optimized. An introduction to sandwich panels is presented along with an overview of previous research of truss-core sandwich panels. Three alternative truss arrangements are developed and their corresponding advantages, disadvantages, and optimization routines are discussed. Finally, performance is investigated by theoretical and numerical methods, and it is shown that the relative structural efficiency of alternative truss cores varies with panel weight and load-carrying capacity. Discrete truss core sandwich panels can be designed to serve bending applications more efficiently than traditional pyramidal truss arrangements at low panel weights and load capacities. Additionally, discrete-truss cores permit the design of heterogeneous cores, which feature unit cells that vary in geometry throughout the panel according to the internal loads present at each unit cell's location. A discrete-truss core panel may be selectively strengthened to more efficiently support bending loads. Future research is proposed and additional areas for lightweight sandwich panel development are explained.
Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri
2014-01-01
In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
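As a rough, hedged illustration of the kind of server-side orchestration described above (not the FARSIGHT API), the sketch below chains placeholder mosaicking, preprocessing and segmentation stages, runs several datasets through a thread pool, and logs each step. All paths, stage names and the number of workers are hypothetical.

```python
import logging
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def mosaic(tile_dir: Path) -> Path:
    logging.info("mosaicking %s", tile_dir)
    return tile_dir / "mosaic.tif"                 # placeholder output path

def preprocess(mosaic_path: Path) -> Path:
    logging.info("preprocessing %s", mosaic_path)
    return mosaic_path.with_name("preprocessed.tif")

def segment(image_path: Path) -> Path:
    logging.info("segmenting %s", image_path)
    return image_path.with_name("labels.tif")

def run_dataset(tile_dir: Path) -> Path:
    """One dataset's pipeline: mosaic -> preprocess -> segment."""
    return segment(preprocess(mosaic(tile_dir)))

if __name__ == "__main__":
    datasets = [Path(f"/data/device_{i:02d}") for i in range(4)]  # hypothetical paths
    with ThreadPoolExecutor(max_workers=4) as pool:   # multi-threaded execution
        for result in pool.map(run_dataset, datasets):
            logging.info("finished %s", result)
```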
Clissett, Philip; Porock, Davina; Harwood, Rowan H; Gladman, John R F
2013-12-01
To explore the experiences of family carers of people with cognitive impairment during admission to hospital. Providing appropriate care in acute hospitals for people with co-morbid cognitive impairment, especially dementia or delirium or both, is challenging to healthcare professionals. One key element is close working with family members. Qualitative interview study. Semi-structured interviews with family carers of 34 older people who had been admitted to a UK general hospital and had co-morbid cognitive impairment. Interviews conducted in 2009 and 2010. Analysis was undertaken using Strauss and Corbin's framework. The findings elaborate a core problem, 'disruption from normal routine' and a core process, 'gaining or giving a sense of control to cope with disruption'. Family carers responded to disruption proactively by trying to make sense of the situation and attempting to gain control for themselves or the patient. They tried to stay informed, communicate with staff about the patient and plan for the future. The interaction of the core problem and the core process resulted in outcomes where family members either valued the support of hospital staff and services or were highly critical of the care provided. Family carers are not passive in the face of the disruption of hospitalization and respond both by trying to involve themselves in the care and support of their relative and by trying to work in partnership with members of staff. Nurses need to foster this relationship conscientiously. © 2013 John Wiley & Sons Ltd.
Trahearn, Nicholas; Tsang, Yee Wah; Cree, Ian A; Snead, David; Epstein, David; Rajpoot, Nasir
2017-06-01
Automation of downstream analysis may offer many potential benefits to routine histopathology. One area of interest for automation is the scoring of multiple immunohistochemical markers to predict the patient's response to targeted therapies. Automated serial slide analysis of this kind requires robust registration to identify common tissue regions across sections. We present an automated method for co-localized scoring of Estrogen Receptor and Progesterone Receptor (ER/PR) in breast cancer core biopsies using whole slide images. Regions of tumor in a series of fifty consecutive breast core biopsies were identified by annotation on H&E whole slide images. Sequentially cut immunohistochemically stained sections were scored manually before being digitally scanned and then exported into JPEG 2000 format. A two-stage registration process was performed to identify the annotated regions of interest in the immunohistochemistry sections, which were then scored using the Allred system. Overall correlation between manual and automated scoring for ER and PR was 0.944 and 0.883, respectively, with 90% of ER and 80% of PR scores within one point of agreement. This proof-of-principle study indicates that slide registration can be used as a basis for automation of the downstream analysis of clinically relevant biomarkers in the majority of cases. The approach is likely to be improved by implementation of safeguarding analysis steps after registration. © 2016 International Society for Advancement of Cytometry.
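For readers unfamiliar with the Allred system mentioned above, the sketch below shows how an Allred score is commonly assembled from the proportion of positive tumour cells and the average staining intensity. The cut-offs follow the usually published scheme and are an assumption here, not taken from the paper's implementation.

```python
def allred_score(positive_fraction: float, intensity: int) -> int:
    """Allred score = proportion score (0-5) + intensity score (0-3).

    positive_fraction: fraction of tumour cells staining positive (0-1).
    intensity: average staining intensity, 0 (none) to 3 (strong).
    """
    if not 0 <= positive_fraction <= 1 or intensity not in (0, 1, 2, 3):
        raise ValueError("inputs out of range")
    if positive_fraction == 0:
        proportion = 0
    elif positive_fraction < 0.01:
        proportion = 1
    elif positive_fraction <= 0.10:
        proportion = 2
    elif positive_fraction <= 1 / 3:
        proportion = 3
    elif positive_fraction <= 2 / 3:
        proportion = 4
    else:
        proportion = 5
    return proportion + intensity

print(allred_score(0.45, 2))   # 45% positive cells, moderate staining -> 6
```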
Project DULCE: Strengthening Families through Enhanced Primary Care
ERIC Educational Resources Information Center
Sege, Robert; Kaplan-Sanof, Margot; Morton, Samantha J.; Velasco-Hodgson, M. Carolina; Preer, Genevieve; Morakinyo, Grace; DeVos, Ed; Krathen, Julie
2014-01-01
Project DULCE (Developmental Understanding and Legal Collaboration for Everyone) integrated the Strengthening Families approach to building family protective factors into routine health care visits for infants in a primary health care setting. The core collaborators--Boston Medical Center pediatric primary care, the medical-legal partnership…
Nesher, Nahum; Uretzky, Gideon; Insler, Steven; Nataf, Patrick; Frolkis, Inna; Pineau, Emmanuelle; Cantoni, Emmanuel; Bolotin, Gil; Vardi, Moshe; Pevni, Dimitry; Lev-Ran, Oren; Sharony, Ram; Weinbroum, Avi A
2005-06-01
Perioperative hypothermia might be detrimental to the patient undergoing off-pump coronary artery bypass surgery. We assessed the efficacy of the Allon thermoregulation system (MTRE Advanced Technologies Ltd, Or-Akiva, Israel) compared with that of routine thermal care in maintaining normothermia during and after off-pump coronary artery bypass surgery. Patients undergoing off-pump coronary artery bypass surgery were perioperatively and randomly warmed with the 2 techniques (n = 45 per group). Core temperature, hemodynamics, and troponin I, interleukin 6, interleukin 8, and interleukin 10 blood levels were assessed. The mean temperature of the patients in the Allon thermoregulation system group (AT group) was significantly (P < .005) higher than that of the patients receiving routine thermal care (the RTC group); less than 40% of the latter reached 36 degrees C compared with 100% of the former. The cardiac index was higher and the systemic vascular resistance lower (P < .05), by 16% and 25%, respectively, in the AT group compared with the RTC group during the 4 postoperative hours. End-of-surgery interleukin 6 levels and 24-hour postoperative troponin I levels were significantly (P < .01) lower in the AT group than in the RTC group. The RTC group's troponin levels closely correlated with their interleukin 6 levels at the end of the operation (R = 0.51, P = .002). Unlike routine thermal care, the Allon thermoregulation system maintains core normothermia in more than 80% of patients undergoing off-pump coronary artery bypass surgery. Normothermia is associated with better cardiac and vascular conditions, a lower cardiac injury rate, and a lower inflammatory response. The close correlation between the increased interleukin 6 and troponin I levels in the routine thermal care group indicates a potential deleterious effect of lowered temperature on the patient's outcome.
[Influence of different types of posts and cores on color of IPS-Empress 2 crown].
Li, Dong-fang; Yang, Jing-yuan; Yang, Xing-mei; Yang, Liu; Xu, Qiang; Guan, Hong-yu; Wan, Qian-bing
2007-10-01
To evaluate the influence of different types of posts and cores on the final color of the IPS-Empress 2 crown. Five types of posts and cores (Cerapost with Empress Cosmo, Cerapost with composite resin, gilded Ni-Cr alloy, gold alloy and Ni-Cr alloy) were made. The shifts in color at three points of the IPS-Empress 2 crown surface (cervical, middle and incisal) with the different posts and cores were measured with a spectroradiometer (PR-650). The L* a* b* values of the zirconium oxide and gilded Ni-Cr alloy posts and cores with the ceramic crown were the highest. The L* and a* values of the zirconium oxide posts with composite cores were higher, while the b* values were lower. The L* a* b* values of the Ni-Cr alloy were lower than those of the gold alloy and were the lowest. In combination with the IPS-Empress 2 crown, zirconium oxide posts are suitable for routine use in the anterior dentition, and gilded Ni-Cr alloy and gold alloy posts and cores can be recommended for clinical practice. Ni-Cr alloy posts and cores cannot be recommended for clinical practice.
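The abstract reports colour shifts as changes in L*a*b* coordinates. One common way to summarize such a shift as a single number is the CIE76 colour difference, sketched below; the abstract does not state which difference formula, if any, was used, and the readings in the example are invented.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triplets."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical readings at one measurement point with two different cores:
print(round(delta_e_cie76((72.0, 1.5, 18.0), (69.5, 0.8, 20.2)), 2))   # ~3.4
```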
Microbiology of the lower ocean crust - Preliminary results from IODP Expedition 360, Atlantis Bank
NASA Astrophysics Data System (ADS)
Sylvan, J. B.; Edgcomb, V. P.; Burgaud, G.; Klein, F.; Schubotz, F.; Expedition 360 Scientists, I.
2016-12-01
International Ocean Discovery Program (IODP) Expedition 360 represents the first leg of a multi-phase drilling program, SloMo, aimed at investigating the nature of the lower crust and Moho at slow-spreading ridges. The goal of Expedition 360 was to recover a representative transect of the lower oceanic crust formed at Atlantis Bank, an oceanic core complex on the SW Indian Ridge. We present here a preliminary analysis of microbial communities sampled from Hole U1473A, drilled to 789.7 m below seafloor during Expedition 360. Sub-sampling of core sections was conducted in a newly designed plexiglass enclosure with positive air pressure and HEPA-filtered air, providing a clean environment for microbiology sampling aboard the JOIDES Resolution. Adenosine triphosphate, an indicator of microbial biomass, was quantified above detection in 23 of 66 samples analyzed. We measured exoenzyme activity for alkaline phosphatase (AP), leucine aminopeptidase and arginine aminopeptidase in 16 samples and found AP to be very low but above background for 14 of the samples, with the highest activities measured between 10 and 70 m below seafloor (mbsf) and peaks again at 158 and 307 mbsf, while both peptidase enzymes were above detection for only one sample at 715 mbsf. Isolates of fungi obtained from core samples, as well as analyses of lipid and DNA biomarkers and Raman spectra for a few of our rock core samples, provide initial insights into microbial communities in the lower oceanic crust. Finally, a new tracer of seawater and drilling mud contamination, perfluoromethyldecalin (PFMD), was tested for the first time and its performance compared with the commonly used tracer perfluoromethylcyclohexane (PMCH). PFMD was run during coring operations for ten samples and was routinely detected in the drilling fluids, usually detected on the outside of uncleaned cores, and rarely above detection on the cleaned outside of cores. It was below detection on the inside of cores, indicating that penetration of drill fluids to the interior of whole-round drill cores, where we collected our samples, is unlikely.
The compressed average image intensity metric for stereoscopic video quality assessment
NASA Astrophysics Data System (ADS)
Wilczewski, Grzegorz
2016-09-01
The following article depicts insights towards design, creation and testing of a genuine metric designed for a 3DTV video quality evaluation. The Compressed Average Image Intensity (CAII) mechanism is based upon stereoscopic video content analysis, setting its core feature and functionality to serve as a versatile tool for an effective 3DTV service quality assessment. Being an objective type of quality metric it may be utilized as a reliable source of information about the actual performance of a given 3DTV system, under strict providers evaluation. Concerning testing and the overall performance analysis of the CAII metric, the following paper presents comprehensive study of results gathered across several testing routines among selected set of samples of stereoscopic video content. As a result, the designed method for stereoscopic video quality evaluation is investigated across the range of synthetic visual impairments injected into the original video stream.
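The abstract does not give the CAII formula. As a hedged illustration of the kind of content-analysis step such a metric could rest on, the sketch below computes the mean luma of each decoded (compressed) frame for the left and right views and compares the two sequences; the frame arrays are synthetic placeholders rather than real decoded video.

```python
import numpy as np

def sequence_mean_intensity(frames):
    """Mean pixel intensity of each frame, averaged over the whole sequence.
    `frames` is an iterable of 2-D luma (grayscale) arrays."""
    return float(np.mean([f.mean() for f in frames]))

# Synthetic stand-ins for decoded frames of the left and right views.
rng = np.random.default_rng(0)
left = [rng.integers(0, 256, (120, 160)).astype(np.uint8) for _ in range(30)]
right = [f.copy() for f in left]
right[10] = (right[10] * 0.8).astype(np.uint8)   # simulate an impaired frame

print(f"left view mean intensity:  {sequence_mean_intensity(left):.1f}")
print(f"right view mean intensity: {sequence_mean_intensity(right):.1f}")
```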
ERIC Educational Resources Information Center
Deskins, John A.; Lorenze, Stephanie Morris
2012-01-01
Support for arts education is a favorite issue in the education media. Policymakers of all stripes routinely declare their devotion to the arts and its importance in students' lives. They often cite findings that students who have arts education outperform their peers on most indicators of academic achievement. Many will relate their own…
Hadziyannis, Emilia; Minopetrou, Martha; Georgiou, Anastasia; Spanou, Fotini; Koskinas, John
2013-01-01
Background: Hepatitis C viral (HCV) load detection and quantification is routinely accomplished by HCV RNA measurement, an expensive but essential test both for the diagnosis and treatment of chronic hepatitis C (CHC). HCV core antigen (Ag) testing has been suggested as an attractive alternative to molecular diagnostics. The aim of the study was to evaluate an automated chemiluminescent immunoassay (CLIA) for HCV core Ag measurement in comparison to quantitative HCV RNA determination. Methods: HCV Ag was measured in 105 anti-HCV positive patients, of whom 89 were HCV RNA positive with CHC and 16 HCV RNA negative after spontaneous HCV clearance. Viral load was quantified with branched DNA (bDNA, Versant, Siemens). Sera were stored at -70°C and then tested with the Architect HCV Ag test (Abbott Laboratories), a two-step CLIA assay with high throughput and minimal handling of the specimens. Statistical analysis was performed on logarithmically transformed values. Results: HCV Ag was detectable and quantifiable in 83/89 and in the grey zone in 4/89 HCV RNA positive sera. HCV Ag was undetectable in all 16 HCV RNA negative samples. The sample with the lowest viral load that tested positive for HCV Ag contained 1200 IU/mL HCV RNA. There was a positive correlation between HCV RNA and HCV Ag (r=0.89). The HCV RNA/HCV Ag ratio varied from 1.5 to 3.25. Conclusion: The HCV core Ag is an easy test with comparable sensitivity (>90%) and satisfactory correlation with the HCV RNA bDNA assay. Its role in diagnostics and other clinical applications has to be determined based on cost effectiveness. PMID:24714621
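The correlation above is computed on log-transformed values. A minimal sketch of that comparison is shown below, with invented paired measurements chosen only to be roughly consistent with the ranges reported in the abstract (they are not study data).

```python
import numpy as np

# Hypothetical paired measurements (IU/mL for RNA; arbitrary antigen units).
hcv_rna = np.array([1.2e3, 5.4e4, 2.1e5, 8.8e5, 3.0e6, 1.1e7])
hcv_ag = np.array([10.0,  40.0,  200.0, 900.0, 3.0e3, 1.2e4])

log_rna, log_ag = np.log10(hcv_rna), np.log10(hcv_ag)
r = np.corrcoef(log_rna, log_ag)[0, 1]          # Pearson r on log values
ratio = log_rna / log_ag                        # RNA/Ag ratio on the log scale
print(f"Pearson r on log values: {r:.2f}")
print("log RNA / log Ag ratios:", np.round(ratio, 2))
```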
NASA Technical Reports Server (NTRS)
Ahmadian, M.; Inman, D. J.
1982-01-01
Systems described by the second-order matrix differential equation, with mass, damping, and stiffness matrices, are considered. An interactive design routine is presented for positive definite mass, damping, and stiffness matrices. Design is accomplished by adjusting the mass, damping, and stiffness matrices to obtain a desired oscillation behavior. The algorithm also features interactively modifying the physical structure of the system and obtaining the matrix structure and a number of other system properties. In the case of a general system, where the M, C, and K matrices lack any special properties, a routine for the eigenproblem solution of the system is developed. The latent roots are obtained by computing the characteristic polynomial of the system and solving for its roots. The above routines are prepared in FORTRAN IV and prove usable on machines with low core memory.
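The report obtains the latent roots from the characteristic polynomial. A common alternative, sketched here under the assumption that the system has the standard second-order form M q̈ + C q̇ + K q = 0, is to linearize the quadratic eigenvalue problem into a first-order companion form and use a dense eigensolver; the matrices in the example are illustrative only.

```python
import numpy as np

def latent_roots(M, C, K):
    """Latent roots of det(M*s**2 + C*s + K) = 0 via companion
    linearization: eigenvalues of [[0, I], [-M^-1 K, -M^-1 C]]."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K,        -Minv @ C]])
    return np.linalg.eigvals(A)

# Example: a damped 2-DOF system (illustrative values only).
M = np.diag([2.0, 1.0])
C = np.array([[0.4, -0.1], [-0.1, 0.2]])
K = np.array([[6.0, -2.0], [-2.0, 3.0]])
print(np.sort_complex(latent_roots(M, C, K)))
```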
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
Ojeda-May, Pedro; Nam, Kwangho
2017-08-08
The strategy and implementation of scalable and efficient semiempirical (SE) QM/MM methods in CHARMM are described. The serial version of the code was first profiled to identify routines that required parallelization. Afterward, the code was parallelized and accelerated with three approaches. The first approach was the parallelization of the entire QM/MM routines, including the Fock matrix diagonalization routines, using the CHARMM message passing interface (MPI) machinery. In the second approach, two different self-consistent field (SCF) energy convergence accelerators were implemented using density and Fock matrices as targets for their extrapolations in the SCF procedure. In the third approach, the entire QM/MM and MM energy routines were accelerated by implementing the hybrid MPI/open multiprocessing (OpenMP) model, in which both task- and loop-level parallelization strategies were adopted to balance loads between different OpenMP threads. The present implementation was tested on two solvated enzyme systems (including <100 QM atoms) and an SN2 symmetric reaction in water. The MPI version outperformed the existing SE QM methods in CHARMM, which include the SCC-DFTB and SQUANTUM methods, by at least 4-fold. The use of SCF convergence accelerators further accelerated the code by ∼12-35% depending on the size of the QM region and the number of CPU cores used. Although the MPI version displayed good scalability, the performance was diminished for large numbers of MPI processes due to the overhead associated with MPI communications between nodes. This issue was partially overcome by the hybrid MPI/OpenMP approach, which displayed better scalability for a larger number of CPU cores (up to 64 CPUs in the tested systems).
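As a hedged, toy illustration of the second approach (using the density matrix as the target of an SCF convergence accelerator), the sketch below applies simple linear mixing (damping) of the density matrix in a made-up self-consistent model. It is not the CHARMM implementation, and the model Hamiltonian is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_occ, g, alpha = 6, 3, 0.1, 0.5          # toy sizes and mixing factor

A = rng.standard_normal((n, n))
H0 = np.diag(np.arange(n, dtype=float)) + 0.05 * (A + A.T)  # fixed core part

def density_from(H, n_occ):
    """Build an idempotent density matrix from the lowest n_occ orbitals."""
    _, C = np.linalg.eigh(H)
    C_occ = C[:, :n_occ]
    return C_occ @ C_occ.T

P = density_from(H0, n_occ)                   # initial guess
for it in range(1, 201):
    F = H0 + g * P                            # toy density-dependent 'Fock' build
    P_new = density_from(F, n_occ)
    P_next = (1.0 - alpha) * P + alpha * P_new   # damped (mixed) density update
    if np.linalg.norm(P_next - P) < 1e-10:
        P = P_next
        break
    P = P_next

print(f"converged after {it} SCF iterations; "
      f"idempotency error {np.linalg.norm(P @ P - P):.1e}")
```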
Neutronics calculation of RTP core
NASA Astrophysics Data System (ADS)
Rabir, Mohamad Hairie B.; Zin, Muhammad Rawi B. Mohamed; Karim, Julia Bt. Abdul; Bayar, Abi Muttaqin B. Jalal; Usang, Mark Dennis Anak; Mustafa, Muhammad Khairul Ariff B.; Hamzah, Na'im Syauqi B.; Said, Norfarizan Bt. Mohd; Jalil, Muhammad Husamuddin B.
2017-01-01
Reactor calculation and simulation are significantly important to ensure safety and better utilization of a research reactor. The Malaysian PUSPATI TRIGA Reactor (RTP) achieved initial criticality on June 28, 1982. The reactor is designed to support various fields of basic nuclear research, manpower training, and radioisotope production. Since the early 1990s, neutronics modelling has been used as part of routine in-core fuel management activities. Several computer codes have been used at RTP since then, based on 1D neutron diffusion, 2D neutron diffusion and 3D Monte Carlo neutron transport methods. This paper describes current progress and gives an overview of neutronics modelling development at RTP. Several important parameters were analysed for the latest core configuration, such as keff, reactivity, neutron flux, power distribution and fission product build-up. The developed core neutronics model was validated by comparison with experimental and measurement data. Along with the RTP core model, a calculation procedure was also developed to establish better prediction capability of RTP's behaviour.
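Among the parameters analysed are keff and reactivity; for reference, the two are related by the standard definition (a textbook relation, not a result specific to this paper):

\[
\rho = \frac{k_{\mathrm{eff}} - 1}{k_{\mathrm{eff}}},
\qquad \text{e.g. } k_{\mathrm{eff}} = 1.005 \;\Rightarrow\; \rho \approx +0.50\%\ \Delta k/k .
\]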
An Open Source modular platform for hydrological model implementation
NASA Astrophysics Data System (ADS)
Kolberg, Sjur; Bruland, Oddbjørn
2010-05-01
An implementation framework for setup and evaluation of spatio-temporal models is developed, forming a highly modularized distributed model system. The ENKI framework allows building space-time models for hydrological or other environmental purposes from a suite of separately compiled subroutine modules. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines in a fixed framework. The open-source licence and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational hydropower forecasting or other water resource management. Written in C++, ENKI uses a plug-in structure to build a complete model from separately compiled subroutine implementations. These modules contain very little code apart from the core process simulation and are compiled as dynamic-link libraries (DLLs). A narrow interface allows the main executable to recognise the number and type of the different variables in each routine. The framework then exposes these variables to the user within the proper context, ensuring that time series exist for input variables, initialisation for states, GIS data sets for static map data, manually or automatically calibrated values for parameters, etc. ENKI is designed to meet three different levels of involvement in model construction:
• Model application: running and evaluating a given model; regional calibration against arbitrary data using a rich suite of objective functions, including likelihood and Bayesian estimation; uncertainty analysis directed towards input or parameter uncertainty. Users at this level need not know the model's composition of subroutines, the internal variables in the model, or how method modules are created.
• Model analysis: linking together different process methods, including parallel setup of alternative methods for solving the same task, and investigating the effect of different spatial discretization schemes. Users at this level need not write or compile computer code, or handle file IO for each module.
• Routine implementation and testing: implementation of new process-simulating methods/equations, specialised objective functions or quality-control routines, and testing of these in an existing framework. Developers at this level need not implement a user or model interface for the new routine, handle IO, or administer model setup and runs, calibration and validation routines, etc.
Originally developed for Norway's largest hydropower producer, Statkraft, ENKI is now being turned into an Open Source project. At the time of writing, the licence and the project administration are not yet established. It also remains to port the application to other compilers and computer platforms. However, we hope that ENKI will prove useful for both academic and operational users.
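ENKI itself is C++ and loads its routines as separately compiled DLLs through a narrow interface that declares each routine's inputs, states, parameters and map data. The sketch below is a much-simplified Python analogue of that plug-in idea, with an invented degree-day snow routine; the names and structure are illustrative assumptions, not the ENKI API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Variable:
    name: str
    kind: str          # "input", "state", "parameter", or "map"

@dataclass
class RoutineModule:
    """A 'plug-in': declares its variables so the framework can wire
    time series, states, parameters and GIS maps to it by name."""
    name: str
    variables: List[Variable]
    step: Callable[[Dict[str, float]], Dict[str, float]]

def degree_day_snow_step(v: Dict[str, float]) -> Dict[str, float]:
    melt = max(0.0, v["degree_day_factor"] * (v["air_temp"] - 0.0))
    melt = min(melt, v["swe"])
    return {"swe": v["swe"] - melt + v["precip_snow"], "melt": melt}

snow_module = RoutineModule(
    name="degree_day_snow",
    variables=[Variable("air_temp", "input"),
               Variable("precip_snow", "input"),
               Variable("swe", "state"),
               Variable("degree_day_factor", "parameter")],
    step=degree_day_snow_step,
)

# The 'framework' inspects the declared variables and drives the module.
state = {"swe": 120.0}
forcing = {"air_temp": 3.0, "precip_snow": 0.0}
params = {"degree_day_factor": 2.5}
print(snow_module.step({**state, **forcing, **params}))   # {'swe': 112.5, 'melt': 7.5}
```

In ENKI the equivalent declaration role is played by the compiled module's exported variable descriptions, which the framework uses to attach time series, GIS maps and calibrated parameter values to each routine.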
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aleksandrowicz, J.
1963-03-01
The experimental equipment used in the work at the horizontal reactor channels is listed. Diagrams of the utilization of the nominal reactor power and of the core loading are given, and the reactivity fractions of the separate fuel assemblies are determined, together with the diagram of reactivity versus burnup. Reactor channels and space used for sample irradiation and isotope production are described, and the total number of irradiations is given. Results of the measurements connected with routine reactor operation are quoted, namely: analysis of water purity in the primary circuit, analysis of the work of the ion exchanger and mechanical filter, and analysis of air activity in the special ventilation system. Data are given concerning radiation protection of personnel, including individual monitoring. Leak testing of the fuel elements is discussed. Damage to the reactor equipment and the appearance of alarm signals are described. (auth)
Accurate image-charge method by the use of the residue theorem for core-shell dielectric sphere
NASA Astrophysics Data System (ADS)
Fu, Jing; Xu, Zhenli
2018-02-01
An accurate image-charge method (ICM) is developed for ionic interactions outside a core-shell structured dielectric sphere. Core-shell particles have wide applications, and their theoretical investigation requires efficient methods for evaluating the Green's function used to calculate pairwise interactions of ions. The ICM is based on an inverse Mellin transform applied to the coefficients of the spherical harmonic series of the Green's function, such that the polarization charge due to the dielectric boundaries is represented by a series of image point charges and an image line charge. The residue theorem is used to accurately calculate the density of the line charge. Numerical results show that the ICM is promising for fast evaluation of the Green's function, and thus it is useful for theoretical investigations of core-shell particles. This routine can also be applied to other problems with spherical dielectric interfaces, such as multilayered media and Debye-Hückel equations.
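For orientation, the simpler single-interface case (a point charge q at distance s > a from the centre of a homogeneous dielectric sphere of radius a, with interior permittivity ε_i and exterior permittivity ε_o) has the textbook exterior polarization potential

\[
\Phi_{\mathrm{pol}}(r,\theta) \;=\; \frac{q}{4\pi\varepsilon_o}\sum_{l=0}^{\infty}
\frac{l\,(\varepsilon_o-\varepsilon_i)}{l\,\varepsilon_i+(l+1)\,\varepsilon_o}\,
\frac{a^{2l+1}}{s^{\,l+1}\,r^{\,l+1}}\,P_l(\cos\theta),
\qquad r>a .
\]

The core-shell geometry treated in the paper generalizes these l-dependent coefficients, and the ICM approximates the resulting series by image point charges plus an image line charge whose density is obtained via the residue theorem. The expansion above is quoted as background, not taken from the paper.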
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen-Mayer, H; Tosh, R
2015-06-15
Purpose: To reconcile air kerma and calorimetry measurements in a prototype calorimeter for obtaining absorbed dose in diagnostic CT beams. While corrections for thermal artifacts are routine and generally small in calorimetry of radiotherapy beams, large differences in the relative stopping powers of calorimeter materials at the lower energies typical of CT beams greatly magnify their effects. Work to date on the problem attempts to reconcile laboratory measurements with modeling output from Monte Carlo and finite-element analysis of heat transfer. Methods: Small thermistor beads were embedded in a polystyrene (PS) core element of 1 cm diameter, which was inserted into a cylindrical HDPE phantom of 30 cm diameter and subjected to radiation in a diagnostic CT x-ray imaging system. Resistance changes in the thermistors due to radiation heating were monitored via a lock-in amplifier. Multiple 3-second exposures were recorded at 8 different dose rates from the CT system, and least-squares fits to the experimental data were compared to an expected thermal response obtained by finite-element analysis incorporating source terms based on semi-empirical modeling and Monte Carlo simulation. Results: Experimental waveforms exhibited large thermal artifacts with fast time constants, associated with excess heat in wires and glass, and smaller steps attributable to radiation heating of the core material. Preliminary finite-element analysis follows the transient component of the signal qualitatively, but predicts a slower decay of temperature spikes. This was supplemented by non-linear least-squares fits incorporating semi-empirical formulae for heat transfer, which were used to obtain dose-to-PS in reasonable agreement with the output of Monte Carlo calculations that convert air kerma to absorbed dose. Conclusion: Discrepancies between the finite-element analysis and our experimental data testify to the very significant heat transfer correction required for absorbed dose calorimetry of diagnostic CT beams. The results obtained here are being used to refine both simulations and the design of calorimeter core components.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HT Condor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HT Condor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
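As a hedged sketch of the kind of glue the HT Condor component provides, the script below writes a one-job HTCondor submit description for a model scenario and hands it to condor_submit; the executable name, arguments and paths are placeholders, not part of the CI-WATER stack.

```python
import subprocess
import tempfile
from pathlib import Path

SUBMIT_TEMPLATE = """\
executable   = run_scenario.sh
arguments    = --scenario {scenario}
output       = logs/{scenario}.out
error        = logs/{scenario}.err
log          = logs/{scenario}.log
request_cpus = 4
queue
"""

def submit_scenario(scenario: str) -> None:
    """Write a one-job HTCondor submit file for a scenario and submit it."""
    Path("logs").mkdir(exist_ok=True)
    with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
        f.write(SUBMIT_TEMPLATE.format(scenario=scenario))
        submit_file = f.name
    subprocess.run(["condor_submit", submit_file], check=True)

if __name__ == "__main__":
    for scenario in ["landuse_2030", "snowmelt_wet_year"]:   # hypothetical names
        submit_scenario(scenario)
```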
Lobb, Eric C
2016-07-08
Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine, and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.
Rafferty, Rae; Fairbrother, Greg
2015-06-01
To introduce a theory which describes the process of, and explicates the factors moderating, the acquisition and integration of leadership coaching skills into the routine practice of senior nurses and midwives. Organizations invest significant resources in leadership coaching programs to ensure that coaching is embedded as a core function of the manager's role. However, even after training, many managers remain unable to undertake this role successfully. The process by which health professionals translate 'manager as coach' training into successful practice outcomes has remained largely unexplored. A grounded theory study design. Data, collected between February 2012 and May 2013, included in-depth interviews with 20 senior nurses and midwives who had attended a leadership coaching program and analysis of nine reflective practice journals. Multiple researchers coded and analysed the data using constant comparative techniques. The outcomes of coaching training ranged from inappropriate use of the coaching skills through to transformed managerial practice. These outcomes were influenced by the dynamic interaction of three central domains of the emergent theoretical model: pre-existing individual perceptions, program elements and contemporaneous experiences. Interactions occurred within the domains and between them, impacting on activators such as courage, motivation, commitment and confidence. The study offers new insights into how senior nurses and midwives acquire and integrate coaching skills into their routine practice. The process is described as multifactorial and dynamic and has implications for the training design, delivery and organizational support of future leadership coaching programs. © 2015 John Wiley & Sons Ltd.
An efficient and portable SIMD algorithm for charge/current deposition in Particle-In-Cell codes
Vincenti, H.; Lobet, M.; Lehe, R.; ...
2016-09-19
In current computer architectures, data movement (from die to network) is by far the most energy consuming part of an algorithm (≈20 pJ/word on-die to ≈10,000 pJ/word on the network). To increase memory locality at the hardware level and reduce energy consumption related to data movement, future exascale computers tend to use many-core processors on each compute node that will have a reduced clock speed to allow for efficient cooling. To compensate for the frequency decrease, machine vendors are making use of long SIMD instruction registers that are able to process multiple data with one arithmetic operator in one clock cycle. SIMD register length is expected to double every four years. As a consequence, Particle-In-Cell (PIC) codes will have to achieve good vectorization to fully take advantage of these upcoming architectures. In this paper, we present a new algorithm that allows for efficient and portable SIMD vectorization of current/charge deposition routines that are, along with the field gathering routines, among the most time consuming parts of the PIC algorithm. Our new algorithm uses a particular data structure that takes into account memory alignment constraints and avoids gather/scatter instructions that can significantly affect vectorization performance on current CPUs. The new algorithm was successfully implemented in the 3D skeleton PIC code PICSAR and tested on Haswell Xeon processors (AVX2, 256-bit wide data registers). Results show a factor of ×2 to ×2.5 speed-up in double precision for particle shape factors of orders 1–3. The new algorithm can be applied as is on future KNL (Knights Landing) architectures that will include AVX-512 instruction sets with 512-bit register lengths (8 doubles/16 singles). Program summary: Program Title: vec_deposition; Program Files doi: http://dx.doi.org/10.17632/nh77fv9k8c.1; Licensing provisions: BSD 3-Clause; Programming language: Fortran 90; External routines/libraries: OpenMP > 4.0. Nature of problem: Exascale architectures will have many-core processors per node with long vector data registers capable of performing one single instruction on multiple data during one clock cycle. Data register lengths are expected to double every four years, and this pushes for new portable solutions for efficiently vectorizing Particle-In-Cell codes on these future many-core architectures. One of the main hotspot routines of the PIC algorithm is the current/charge deposition, for which there is no efficient and portable vector algorithm. Solution method: Here we provide an efficient and portable vector algorithm for current/charge deposition routines that uses a new data structure, which significantly reduces gather/scatter operations. Vectorization is controlled using OpenMP 4.0 compiler directives, which ensures portability across different architectures. Restrictions: Here we do not provide the full PIC algorithm with an executable but only vector routines for current/charge deposition. These scalar/vector routines can be used as library routines in your 3D Particle-In-Cell code. However, to get the best performance out of the vector routines you have to satisfy the two following requirements: (1) Your code should implement particle tiling (as explained in the manuscript) to allow for maximized cache reuse and reduce memory accesses that can hinder vector performance. The routines can be used directly on each particle tile.
(2) You should compile your code with a Fortran 90 compiler (e.g. Intel, GNU or Cray) and provide proper alignment flags and compiler alignment directives (more details in the README file).
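As a hedged, scalar illustration of the two ingredients the abstract emphasizes (particle tiling plus conflict-free accumulation), the NumPy sketch below groups particles by tile, deposits order-1 (cloud-in-cell) charge into a private per-tile buffer, and reduces the buffers onto the global grid. It is a simplified stand-in for the idea, not the vectorized Fortran routines distributed with the paper.

```python
import numpy as np

def deposit_charge_tiled(x, q, nx, dx, tile_size=16):
    """1-D cloud-in-cell charge deposition with per-tile private buffers."""
    rho = np.zeros(nx + 1)                       # global grid (nodes 0..nx)
    cell = np.floor(x / dx).astype(int)          # cell index of each particle
    frac = x / dx - cell                         # fractional position in cell
    tile = cell // tile_size                     # which tile owns the particle
    for t in np.unique(tile):
        sel = tile == t
        buf = np.zeros(nx + 1)                   # private buffer for this tile
        # np.add.at handles repeated indices correctly (scatter-add)
        np.add.at(buf, cell[sel],     q[sel] * (1.0 - frac[sel]))
        np.add.at(buf, cell[sel] + 1, q[sel] * frac[sel])
        rho += buf                               # reduction onto the global grid
    return rho

rng = np.random.default_rng(0)
nx, dx = 64, 1.0
x = rng.uniform(0.0, nx * dx, 10_000)
q = np.full(x.size, 1.0e-3)
rho = deposit_charge_tiled(x, q, nx, dx)
print(f"total deposited charge {rho.sum():.3f} (expected {q.sum():.3f})")
```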
Expanding the North American Breeding Bird Survey analysis to include additional species and regions
Sauer, John; Niven, Daniel; Pardieck, Keith L.; Ziolkowski, David; Link, William
2017-01-01
The North American Breeding Bird Survey (BBS) contains data for >700 bird species, but analyses often focus on a core group of ∼420 species. We analyzed data for 122 species of North American birds for which data exist in the North American Breeding Bird Survey (BBS) database but are not routinely analyzed on the BBS Summary and Analysis Website. Many of these species occur in the northern part of the continent, on routes that fall outside the core survey area presently analyzed in the United States and southern Canada. Other species not historically analyzed occur in the core survey area with very limited data but have large portions of their ranges in Mexico and farther south. A third group of species not historically analyzed included species thought to be poorly surveyed by the BBS, such as rare, coastal, or nocturnal species. For 56 species found primarily in regions north of the core survey area, we expanded the scope of the analysis, using data from 1993 to 2014 during which ≥3 survey routes had been sampled in 6 northern strata (Bird Conservation Regions in Alaska, Yukon, and Newfoundland and Labrador) and fitting log-linear hierarchical models for an augmented BBS survey area that included both the new northern strata and the core survey area. We also applied this model to 168 species historically analyzed in the BBS that had data from these additional northern strata. For both groups of species we calculated survey-wide trends for both the core and augmented survey areas from 1993 to 2014; for species that did not occur in the newly defined strata, we computed trends from 1966 to 2014. We evaluated trend estimates in terms of established credibility criteria for BBS results, screening for imprecise trends, small samples, and low relative abundance. Inclusion of data from the northern strata permitted estimation of trends for 56 species not historically analyzed, but only 4 of these were reasonably monitored and an additional 13 were questionably monitored; 39 of these species were likely poorly monitored because of small numbers of samples or very imprecisely estimated trends. Only 4 of 66 "new" species found in the core survey area were reasonably monitored by the BBS; 20 were questionably monitored; and 42 were likely poorly monitored by the BBS because of limitations in precision, abundance, or sample size. The hierarchical analyses we present provide a means for reasonable inclusion of the additional species and strata in a common analysis with data from the core area, a critical step in the evolution of the BBS as a continent-scale survey. We recommend that results be presented both 1) from 1993 to the present using the expanded survey area, and 2) from 1966 to the present for the core survey area. Although most of the "new" species we analyzed were poorly monitored by the BBS during 1993–2014, continued expansion of the BBS will improve the quality of information in future analyses for these species and for the many other species presently monitored by the BBS.
NASA Astrophysics Data System (ADS)
López, Gloria I.; Bialik, Or; Waldmann, Nicolas
2017-04-01
When dealing with fine-grained, organic-rich, colour-monotone, underwater marine sediment cores retrieved from the continental shelf or slope, the initial visual impression upon split-opening the vessels is often of a "disappointing" homogeneous, monotonous, continuous archive. Only after thorough, micro- to macro-scale, multi-parameter investigation does the sediment reveal its treasures: first through measurements on the intact core itself, which depict its contents for the first time, and subsequently through the destructive, multi-proxy sample-based analyses. Usually, routine Multi-Sensor Core Logger (MSCL) measurements of petrophysical parameters (e.g. magnetic susceptibility, density, P-wave velocity) on un-split sediment cores are the first undertaken, either on board in the field or back at the laboratory. Less often done, but equally valuable, are continuous X-ray and CT scan imaging of the same intact archives. Upon splitting, routine granulometry, micro- and macro-fossil and invertebrate identification, and total organic/inorganic carbon content (TOC/TIC), among other analyses, take place. The geochronology is usually established by AMS 14C dating of selected organic-rich units; less commonly, Optically Stimulated Luminescence (OSL) dating is used on the coarser-grained, siliciclastic layers. A relatively new luminescence tool, the Portable OSL Reader, normally employed to rapidly assess the luminescence signal of untreated poly-mineral samples and guide targeted field sampling for full OSL dating, was used for the first time on marine sediment cores as a novel petrophysical characterization tool, with striking results. In this study, two 2 m long underwater piston sediment cores recovered from 200 m water depth on the continental shelf off southern Israel were subjected to pulsed-photon stimulation (PPSL), yielding favourable luminescence signals along their entire lengths. Notably, luminescence signals were obtained on both cores even though they had already been split open. Both cores showed monotonous, homogeneous characteristics down-core in most of the results obtained from the non-destructive and destructive tests. One of the cores showed several small higher-energy events, including a Mass Transport Deposit (MTD) within its first 10 cm, only fully visible on the CT scan imaging, the PPSL profile and the particle-size distribution plot. This initial investigation demonstrates the feasibility and usefulness of luminescence profiling as a new sedimentological and petrophysical proxy to better visualize homogeneous yet complex, fine-grained, underwater archives. Moreover, it helps to understand the continuity of the stratigraphy and the linearity of deposition of the sediment, besides assisting in the estimation of relative ages, provided that good OSL ages are obtained throughout the recovered archive.
NASA Astrophysics Data System (ADS)
Herrington, A. R.; Reed, K. A.
2018-02-01
A set of idealized experiments is developed using the Community Atmosphere Model (CAM) to understand the vertical velocity response to reductions in forcing scale that is known to occur when the horizontal resolution of the model is increased. The test consists of a set of rising bubble experiments, in which the horizontal radius of the bubble and the model grid spacing are simultaneously reduced. The test is performed with moisture, by incorporating moist physics routines of varying complexity, although convection schemes are not considered. Results confirm that the vertical velocity in CAM is, to first order, proportional to the inverse of the horizontal forcing scale, which is consistent with a scale analysis of the dry equations of motion. In contrast, experiments in which the coupling time step between the moist physics routines and the dynamical core (i.e., the "physics" time step) is relaxed back to more conventional values result in severely damped vertical motion at high resolution, degrading the scaling. A set of aqua-planet simulations using different physics time steps is found to be consistent with the results of the idealized experiments.
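The first-order scaling quoted above follows from a scale analysis of the incompressible continuity equation; in the sketch below, U and W denote characteristic horizontal and vertical velocities and L and H the horizontal and vertical forcing scales (these symbols are ours, not necessarily the paper's):

    \partial_x u + \partial_z w = 0
    \;\Longrightarrow\;
    \frac{U}{L} \sim \frac{W}{H}
    \;\Longrightarrow\;
    W \sim \frac{H\,U}{L} \;\propto\; \frac{1}{L}

so, with H and U held fixed, halving the horizontal forcing scale roughly doubles the expected vertical velocity.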
Berenger, Byron M; Berry, Chrystal; Peterson, Trevor; Fach, Patrick; Delannoy, Sabine; Li, Vincent; Tschetter, Lorelee; Nadon, Celine; Honish, Lance; Louie, Marie; Chui, Linda
2015-01-01
A standardised method for determining Escherichia coli O157:H7 strain relatedness using whole genome sequencing or virulence gene profiling is not yet established. We sought to assess the capacity of either high-throughput polymerase chain reaction (PCR) of 49 virulence genes, core-genome single nucleotide variants (SNVs) or k-mer clustering to discriminate between outbreak-associated and sporadic E. coli O157:H7 isolates. Three outbreaks and multiple sporadic isolates from the province of Alberta, Canada were included in the study. Two of the outbreaks occurred concurrently in 2014 and one occurred in 2012. Pulsed-field gel electrophoresis (PFGE) and multilocus variable-number tandem repeat analysis (MLVA) were employed as comparator typing methods. The virulence gene profiles of isolates from the 2012 and 2014 Alberta outbreak events and contemporary sporadic isolates were mostly identical; therefore the set of virulence genes chosen in this study was not discriminatory enough to distinguish between outbreak clusters. Concordant with PFGE and MLVA results, core-genome SNV and k-mer phylogenies clustered isolates from the 2012 and 2014 outbreaks as distinct events. k-mer phylogenies demonstrated increased discriminatory power compared with core SNV phylogenies. Prior to the widespread implementation of whole genome sequencing for routine public health use, issues surrounding cost, technical expertise, software standardisation, and data sharing/comparisons must be addressed.
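To make the k-mer approach concrete, here is a minimal sketch of k-mer profiling with a simple Jaccard distance that could feed a clustering or tree-building step; the study's actual k-mer pipeline and parameters are not given in the abstract, so the value of k and the distance measure are illustrative only.

    from collections import Counter

    def kmer_profile(sequence, k=21):
        # Count every overlapping k-mer in an assembled genome (or read set).
        return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

    def jaccard_distance(profile_a, profile_b):
        # 1 - Jaccard similarity of the two k-mer sets; smaller means more related.
        set_a, set_b = set(profile_a), set(profile_b)
        return 1.0 - len(set_a & set_b) / len(set_a | set_b)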
The Structure of Medical Informatics Journal Literature
Morris, Theodore A.; McCain, Katherine W.
1998-01-01
Objective: Medical informatics is an emergent interdisciplinary field described as drawing upon and contributing to both the health sciences and information sciences. The authors elucidate the disciplinary nature and internal structure of the field. Design: To better understand the field's disciplinary nature, the authors examine the intercitation relationships of its journal literature. To determine its internal structure, they examine its journal cocitation patterns. Measurements: The authors used data from the Science Citation Index (SCI) and Social Science Citation Index (SSCI) to perform intercitation studies among productive journal titles, and software routines from SPSS to perform multivariate data analyses on cocitation data for proposed core journals. Results: Intercitation network analysis suggests that a core literature exists, one mark of a separate discipline. Multivariate analyses of cocitation data suggest that major focus areas within the field include biomedical engineering, biomedical computing, decision support, and education. The interpretable dimensions of multidimensional scaling maps differed for the SCI and SSCI data sets. Strong links to the information science literature were not found. Conclusion: The authors saw indications of a core literature and of several major research fronts. The field appears to be viewed differently by authors writing in journals indexed by SCI from those writing in journals indexed by SSCI, with more emphasis placed on computers and engineering versus decision making by the former and more emphasis on theory versus application (clinical practice) by the latter. PMID:9760393
Core elements of physiotherapy in cerebral palsy children: proposal for a trial checklist.
Meghi, P; Rossetti, L; Corrado, C; Maran, E; Arosio, N; Ferrari, A
2012-03-01
Currently, the international literature describes physiotherapy in cerebral palsy (CP) children only in generic terms (traditional / standard / background / routine). The aim of this study is to create a checklist capable of describing the different modalities employed in physiotherapeutic treatment by means of an unbiased, common, universal, standardised language. A preliminary checklist was outlined by a group of physiotherapists specialised in child rehabilitation. To trial the checklist, several physiotherapists from various paediatric units all over Italy, with different methodological approaches and backgrounds, were involved. Using the interpretative model proposed by Ferrari et al., and through collective analysis and discussion of clinical videos, the core elements were progressively selected and codified. A reliability study was then carried out by eight expert physiotherapists using an inter-rater agreement model. The checklist analyses therapeutic proposals of CP rehabilitation through the description of settings, exercises and facilitations, and consists of items and variables which codify all possible physiotherapeutic interventions. It is accompanied by written explanations, demonstrative videos, caregiver interviews and descriptions of applied environmental adaptations. All checklist items obtained a high level of agreement (according to Cohen's kappa coefficient), revealing that the checklist is clear and easily interpretable. The checklist should facilitate interaction and communication between specialists and families, and lead to comparable research studies and scientific advances. Its main value is the ability to correlate therapeutic results with the core elements of the physiotherapy adopted.
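For reference, the agreement statistic cited above is, for two raters, Cohen's kappa (with eight raters the study would in practice use a multi-rater extension such as Fleiss' kappa or averaged pairwise kappas; the abstract does not specify which):

    \kappa = \frac{p_o - p_e}{1 - p_e},
    \qquad
    p_e = \sum_{k} p_{k}^{(1)}\, p_{k}^{(2)}

where p_o is the observed proportion of agreement and p_e is the agreement expected by chance, computed from each rater's marginal category proportions p_k^{(1)} and p_k^{(2)}.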
A Purposeful Dynamic Stretching Routine
ERIC Educational Resources Information Center
Leon, Craig; Oh, Hyun-Ju; Rana, Sharon
2012-01-01
Dynamic stretching, which involves moving parts of the body and gradually increases range of motion, speed of movement, or both through controlled, sport-specific movements, has become the popular choice of pre-exercise warm-up. This type of warm-up has evolved to encompass several variations, but at its core is the principal theme that preparing…
Schools Together: Enhancing the Citizenship Curriculum through a Non-Formal Education Programme
ERIC Educational Resources Information Center
O'Connor, Una
2012-01-01
In divided societies education for diversity, often introduced via the combined approaches of civic education, citizenship education and community-relations activity, is advocated as a core element of the school curriculum. Its delivery, through formal and non-formal educational approaches, has been routinely recognised as an opportunity for…
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 3 2013-10-01 2013-10-01 false Condition of participation: Nursing services-Waiver of requirement that substantially all nursing services be routinely provided directly by a hospice...: Patient Care Core Services § 418.66 Condition of participation: Nursing services—Waiver of requirement...
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 3 2014-10-01 2014-10-01 false Condition of participation: Nursing services-Waiver of requirement that substantially all nursing services be routinely provided directly by a hospice...: Patient Care Core Services § 418.66 Condition of participation: Nursing services—Waiver of requirement...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 3 2012-10-01 2012-10-01 false Condition of participation: Nursing services-Waiver of requirement that substantially all nursing services be routinely provided directly by a hospice...: Patient Care Core Services § 418.66 Condition of participation: Nursing services—Waiver of requirement...
ERIC Educational Resources Information Center
Rupp, André A.; van Rijn, Peter W.
2018-01-01
We review the GDINA and CDM packages in R for fitting cognitive diagnosis/diagnostic classification models. We first provide a summary of their core capabilities and then use both simulated and real data to compare their functionalities in practice. We found that the most relevant routines in the two packages appear to be more similar than…
Fekete, Szabolcs; Fekete, Jeno
2011-04-15
The performance of 5 cm long narrow-bore columns packed with 2.6-2.7 μm core-shell particles and a column packed with 1.7 μm totally porous particles was compared in very fast gradient separations of polar neutral active pharmaceutical compounds. Peak capacities as a function of flow-rate and gradient time were measured. Peak capacities around 160-170 could be achieved within 25 min with these 5 cm long columns. The highest peak capacity was obtained with the Kinetex column; however, it was found that as the flow-rate increases, the peak capacity of the new Poroshell-120 column approaches that obtained with the Kinetex column. Considering the column permeability, peak capacity per unit time and per unit pressure was also calculated. In this comparison, the advantage of sub-3 μm core-shell particles over sub-2 μm totally porous particles is more pronounced. Moreover, it was found that the very similarly sized (d(p)=2.7 μm) and structured (ρ=0.63) new Poroshell-120 and the earlier introduced Ascentis Express particles showed different efficiency. The results show that 5 cm long narrow-bore columns packed with sub-3 μm core-shell particles offer the possibility of very fast and efficient gradient separations; thus these columns can be applied for fast screening measurements in routine pharmaceutical analysis, such as cleaning validation. Copyright © 2011 Elsevier B.V. All rights reserved.
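As a hedged reminder of the figure of merit involved (the paper's exact expressions may differ), the gradient peak capacity is commonly estimated from the gradient time t_g and the average peak width w̄, and the comparison above then normalizes it per unit time and per unit pressure drop ΔP:

    n_c \approx 1 + \frac{t_g}{\bar{w}},
    \qquad
    \text{figures of merit:}\quad \frac{n_c}{t_g} \quad\text{and}\quad \frac{n_c}{t_g\,\Delta P}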
Schmid-Bindert, Gerald; Wang, Yongsheng; Jiang, Hongbin; Sun, Hui; Henzler, Thomas; Wang, Hao; Pilz, Lothar R.; Ren, Shengxiang; Zhou, Caicun
2013-01-01
Background Multiple biomarker testing is necessary to facilitate individualized treatment of lung cancer patients. More than 80% of lung cancers are diagnosed based on very small tumor samples. Often there is not enough tissue for molecular analysis. We compared three minimally invasive sampling methods with respect to RNA quantity for molecular testing. Methods 106 small biopsies were prospectively collected by three different methods: forceps biopsy, endobronchial ultrasound (EBUS)-guided transbronchial needle aspiration (TBNA), and CT-guided core biopsy. Samples were split into two halves. One part was formalin fixed and paraffin embedded for standard pathological evaluation. The other part was put in RNAlater for immediate RNA/DNA extraction. If the pathologist confirmed the diagnosis of non-small cell lung cancer (NSCLC), the following molecular markers were tested: EGFR mutation, ERCC1, RRM1 and BRCA1. Results Overall, RNA extraction was possible in 101 out of 106 patients (95.3%). We found 49% adenocarcinomas, 38% squamous carcinomas, and 14% not otherwise specified (NOS). The highest RNA yield came from endobronchial ultrasound-guided needle aspiration, which was significantly higher than bronchoscopy (37.74±41.09 vs. 13.74±15.53 ng, respectively; P = 0.005) and numerically higher than CT-guided core biopsy (37.74±41.09 vs. 28.72±44.27 ng, respectively; P = 0.244). EGFR mutation testing was feasible in 100% of evaluable patients, and its incidence was 40.8%, 7.9% and 14.3% in the adenocarcinoma, squamous carcinoma and NSCLC NOS subgroups, respectively. There was no difference in the feasibility of molecular testing between the three sampling methods, with feasibility rates for ERCC1, RRM1 and BRCA1 of 91%, 87% and 81%, respectively. Conclusion All three methods can provide sufficient tumor material for multiple-biomarker testing from routinely obtained small biopsies in lung cancer patients. In our study, EBUS-guided needle aspiration provided the highest amount of tumor RNA compared with bronchoscopy or CT-guided core biopsy. Thus EBUS should be considered an acceptable option for tissue acquisition for molecular testing. PMID:24205040
Major refit of R/V MARION DUFRESNE and giant sediment corer improvements
NASA Astrophysics Data System (ADS)
Leau, Hélène; Réaud, Yvan
2015-04-01
The French research vessel MARION DUFRESNE is equipped with a unique sediment coring facility, called CALYPSO, developed initially by Yvon Balut at the French Polar Institute Paul-Emile Victor (IPEV), which operates the vessel 217 days per year in all oceans. The CALYPSO sediment corer routinely retrieves 50 m long undisturbed sediment cores at any water depth, and presently holds the worldwide record for the longest core ever retrieved, 64.5 m. The vessel is therefore a fantastic opportunity for the paleoceanographic community to carry out expeditions at sea. Over the last 20 years, many international IMAGES coring expeditions were organized in all the ocean basins around the world on board the R/V MARION DUFRESNE. More than 1500 cores were retrieved, leading to major advances in the paleoceanography and paleoclimatology of the Late Quaternary. The vessel will celebrate her 20th anniversary in 2015 and will undergo a major refit of hull and machinery, public spaces, and scientific equipment. The coring capability is currently being developed to further improve: the length of the retrievable core, with an objective of 75 m long cores in routine operation; the quality of the sediment (minimal disturbance), with a specially designed coring cable with controlled minimum elasticity; the safety of operations at sea; the quality control of operations, with a suite of sensors and software allowing detailed monitoring of each coring operation; the time required for each operation; and the collection of environmental data at the same time as the coring operations. The detailed description of the upgrades will be presented. They include a new suite of acoustic sensors that will be integrated on board the vessel during the 4-month shipyard stay from April to July 2015, among which are a KONGSBERG EM122 multibeam echo-sounder and an SBP 120-3 sub-bottom profiler, both mounted on a gondola fitted under the hull of the vessel. This equipment will provide the highest-quality images of the seabed and upper sediment, allowing accurate surveys of the coring targets. Scientific deck equipment, such as the coring winch and associated A-frame and the hydrological winch and deployment system, will be upgraded, with the coring operating equipment brought up to 45 t SWL. The ship's side will be modified to allow the deployment of a 75 m long corer. Finally, the laboratories will be renovated and the main PC room will be rearranged in order to offer more space to the scientific party and to secure a new dedicated IT room. The vessel will undergo a series of tests at sea before being available to the scientific community again from January 2016. As before her mid-life refit, IPEV wishes to carry on training programs such as "University at Sea" and the EGU-supported "Teachers at Sea" program in association with future paleoceanographic activities.
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and TinkerPop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analyses with less hardware at lower cost. All of this can be accomplished without the requirement for the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
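The canonical illustration of graph analytics expressed as sparse linear algebra is breadth-first search driven by repeated sparse matrix-vector products. The sketch below uses SciPy rather than a GraphBLAS library, and the function name is our own, so treat it as an illustration of the formulation rather than of the system described in the paper.

    import numpy as np
    from scipy.sparse import csr_matrix

    def bfs_levels(adjacency, source):
        """BFS levels via repeated sparse matrix-vector products.

        adjacency : n x n scipy.sparse adjacency matrix (entry [i, j] != 0
                    means an edge i -> j; symmetric for undirected graphs).
        Returns an array of BFS levels, -1 for unreachable vertices.
        """
        n = adjacency.shape[0]
        levels = np.full(n, -1)
        frontier = np.zeros(n, dtype=bool)
        frontier[source] = True
        level = 0
        while frontier.any():
            levels[frontier] = level
            # One vector-times-matrix step advances the frontier; masking by
            # (levels == -1) keeps only vertices not yet visited.
            reached = (adjacency.T @ frontier.astype(float)) > 0
            frontier = reached & (levels == -1)
            level += 1
        return levels

    # Example: a path graph 0-1-2 gives levels [0, 1, 2] from source 0.
    A = csr_matrix(np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]]))
    print(bfs_levels(A, 0))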
Integrated NMR Core and Log Investigations With Respect to ODP LEG 204
NASA Astrophysics Data System (ADS)
Arnold, J.; Pechnig, R.; Clauser, C.; Anferova, S.; Blümich, B.
2005-12-01
NMR techniques are widely used in the oil industry and are one of the most suitable methods to evaluate in-situ formation porosity and permeability. Recently, efforts have been directed towards adapting NMR methods to the Ocean Drilling Program (ODP) and the upcoming Integrated Ocean Drilling Program (IODP). We apply a newly developed light-weight, mobile NMR core scanner as a non-destructive instrument to routinely determine rock porosity and to estimate the pore-size distribution. The NMR core scanner is used for transverse relaxation measurements on water-saturated core sections using a CPMG sequence with a short echo time. A regularized Laplace-transform analysis yields the distribution of transverse relaxation times T2. In homogeneous magnetic fields, T2 is proportional to the pore diameter of rocks. Hence, the T2 signal maps the pore-size distribution of the studied rock samples. For fully saturated samples, the integral of the distribution curve and the CPMG echo amplitude extrapolated to zero echo time are proportional to porosity. Preliminary results show that the NMR core scanner is a suitable tool to determine rock porosity and to estimate the pore-size distribution of limestones and sandstones. Presently, our investigations focus on Leg 204, where NMR Logging-While-Drilling (LWD) was performed for the first time in ODP. Leg 204 was drilled into Hydrate Ridge on the Cascadia accretionary margin, offshore Oregon. All drilling and logging operations were highly successful, providing excellent core, wireline, and LWD data from adjacent boreholes. Cores recovered during Leg 204 consist mainly of clay and claystone. As the NMR core scanner operates at frequencies higher than that of the well-logging sensor, it has a shorter dead time. This advantage makes the NMR core scanner sensitive to signals with T2 values down to 0.1 ms, compared with 3 ms in NMR logging. Hence, we can study even rocks with small pores, such as the mud cores recovered during Leg 204. We present a comparison of data from core scanning and NMR logging. Future integration of conventional wireline data and electrical borehole wall images (RAB/FMS) will provide a detailed characterization of the sediments in terms of lithology, petrophysics, and fluid-flow properties.
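The "regularized Laplace-transform analysis" mentioned above amounts to inverting a multi-exponential CPMG decay for a non-negative T2 distribution. A minimal sketch using Tikhonov-damped non-negative least squares is given below; the regularization weight, bin grid, and function names are illustrative, not the scanner's actual processing chain.

    import numpy as np
    from scipy.optimize import nnls

    def t2_distribution(echo_times, amplitudes, t2_bins, alpha=1.0):
        """Estimate a T2 distribution from CPMG echo decay data.

        echo_times : echo times t_k (s)
        amplitudes : measured echo amplitudes M(t_k)
        t2_bins    : candidate T2 values, typically log-spaced (s)
        alpha      : Tikhonov regularization weight (illustrative default)
        Returns f such that M(t) ~= sum_j f_j * exp(-t / T2_j); the sum of f
        approximates the zero-time amplitude, which scales with porosity.
        """
        K = np.exp(-np.outer(echo_times, 1.0 / t2_bins))       # decay kernel
        # Stack a damping block so ||K f - m||^2 + alpha^2 ||f||^2 is minimized
        # subject to f >= 0.
        A = np.vstack([K, alpha * np.eye(len(t2_bins))])
        b = np.concatenate([amplitudes, np.zeros(len(t2_bins))])
        f, _ = nnls(A, b)
        return f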
NASA Technical Reports Server (NTRS)
Parker, K. C.; Torian, J. G.
1980-01-01
A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.
Culture-independent Analysis of Pediatric Broncho-alveolar Lavage (BAL) Specimens.
Zachariah, Philip; Ryan, Chanelle; Nadimpalli, Sruti; Coscia, Gina; Kolb, Michelle; Smith, Hannah; Foca, Marc; Saiman, Lisa; Planet, Paul J
2018-06-07
The clinical utility of culture-independent testing of pediatric bronchoalveolar lavage (BAL) specimens is unknown. Additionally, the variability of the pediatric pulmonary microbiome with patient characteristics is not well understood. Study subjects were ≤22 years old and underwent BAL from May 2013 - July 2015 as part of clinical care. DNA extracted from BAL specimens was used for 16S rRNA gene-based (16S-based) microbiome analysis, and the results were compared to routine cultures from the same samples. Indices of microbial diversity and relative taxon abundances were compared based on subject characteristics. From 81 participants (51% male, median age 9 years), 89 samples were collected and the 16S rRNA genes of 77 (86.5%) samples were successfully analyzed. Subjects included 23 cystic fibrosis (CF), 19 immunocompromised (IC), and 28 non-immunocompromised (nIC) patients. Of 68 organisms identified in culture, 16S-based analyses were concordant with culture results in 66 (97.1%), and also identified potentially clinically significant taxa missed by cultures (e.g. Staphylococcus, Legionella, Pseudomonas). Significant differences in abundance were noted mostly for non-cultured taxa with diagnosis and antibiotic use (Veillonella, Corynebacterium, Haemophilus). The microbiota of CF samples was less diverse, and a "core" group of fifteen taxa shared across samples was identified. 16S-based culture-independent analysis was concordant with routine cultures and showed potential to detect non-cultured pathogens. While 16S-based testing identified relative changes in organism abundance associated with clinical characteristics, distinct microbiome profiles associated with disease states were not identified.
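The abstract does not state which diversity indices were computed; as one common example, the Shannon index can be obtained directly from per-taxon read counts, as in this minimal sketch (function name illustrative):

    import numpy as np

    def shannon_diversity(counts):
        """Shannon diversity H' from per-taxon read counts for one sample."""
        counts = np.asarray(counts, dtype=float)
        p = counts[counts > 0] / counts.sum()    # relative abundances
        return float(-np.sum(p * np.log(p)))     # higher = more diverse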
Weight optimization of an aerobrake structural concept for a lunar transfer vehicle
NASA Technical Reports Server (NTRS)
Bush, Lance B.; Unal, Resit; Rowell, Lawrence F.; Rehder, John J.
1992-01-01
An aerobrake structural concept for a lunar transfer vehicle was weight optimized through the use of the Taguchi design method, finite element analyses, and element sizing routines. Six design parameters were chosen to represent the aerobrake structural configuration. The design parameters included honeycomb core thickness, diameter-depth ratio, shape, material, number of concentric ring frames, and number of radial frames. Each parameter was assigned three levels. The aerobrake structural configuration with the minimum weight was 44 percent less than the average weight of all the remaining satisfactory experimental configurations. In addition, the results of this study have served to bolster the advocacy of the Taguchi method for aerospace vehicle design. Both reduced analysis time and an optimized design demonstrated the applicability of the Taguchi method to aerospace vehicle design.
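As a hedged illustration of the Taguchi-style screening used above, the sketch below performs a simple analysis of means: it averages the response (here, structural weight from the finite-element sizing runs) over every run that used a given level of each design parameter and keeps the weight-minimizing level per parameter. The orthogonal array and responses are assumed to be supplied by the finite-element analyses; all names are illustrative.

    import numpy as np

    def best_levels_by_mean(levels, responses):
        """Taguchi-style analysis of means for a minimization objective.

        levels    : (n_runs, n_factors) integer array of factor levels (e.g. 0-2)
        responses : (n_runs,) array of responses (e.g. aerobrake weight)
        Returns the index of the lowest-mean level for each factor.
        """
        levels = np.asarray(levels)
        responses = np.asarray(responses, dtype=float)
        best = []
        for f in range(levels.shape[1]):
            lvls = np.unique(levels[:, f])
            means = [responses[levels[:, f] == lv].mean() for lv in lvls]
            best.append(int(lvls[int(np.argmin(means))]))   # minimize weight
        return best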
Holschbach, A; Kriete, A; Schäffer, R
1990-01-01
Papillae with fibrovascular cores are characteristic of papillary carcinoma of the thyroid. Papillae may also be found in diffuse hyperplasia, nodular hyperplasia, Hashimoto's disease and follicular adenoma. Tissues from ten benign hyperplasias and ten papillary carcinomas were reconstructed from serial sections with three-dimensional reconstruction programs. Significant qualitative and quantitative differences were found between the hyperplasias and the carcinomas. The principal differences between papillae of papillary carcinoma and hyperplasia were seen more clearly in the three-dimensional reconstruction than by means of morphometric methods. Certain criteria, e.g. the volume of papillae, were useful only with regard to the third dimension. Nevertheless, three-dimensional reconstruction of biological tissue is a time-consuming procedure which is not yet suitable for routine examination.
2014-01-01
Background Diagnosing adipocytic tumors can be challenging because it is often difficult to morphologically distinguish between benign, intermediate and malignant adipocytic tumors, and other sarcomas that are histologically similar. Recently, a number of tumor-specific chromosome translocations and associated fusion genes have been identified in adipocytic tumors and atypical lipomatous tumors/well-differentiated liposarcomas (ALT/WDL), which have a supernumerary ring and/or giant chromosome marker with amplified sequences of the MDM2 and CDK4 genes. The purpose of this study was to investigate whether quantitative real-time polymerase chain reaction (PCR) could be used to amplify MDM2 and CDK4 from total RNA samples obtained from core-needle biopsy sections for the diagnosis of ALT/WDL. Methods A series of lipoma (n = 124) and ALT/WDL (n = 44) cases were analyzed for cytogenetic analysis and lipoma fusion genes, as well as for MDM2 and CDK4 expression by real-time PCR. Moreover, the expression of MDM2 and CDK4 in whole tissue sections was compared with that in core-needle biopsy sections of the same tumor in order to determine whether real-time PCR could be used to distinguish ALT/WDL from lipoma at the preoperative stage. Results In whole tissue sections, the medians for MDM2 and CDK4 expression in ALT/WDL were higher than those in the lipomas (P < 0.05). Moreover, karyotype subdivisions with rings and/or giant chromosomes had higher MDM2 and CDK4 expression levels compared to karyotypes with 12q13-15 rearrangements, other abnormal karyotypes, and normal karyotypes (P < 0.05). On the other hand, MDM2 and CDK4 expression levels in core-needle biopsy sections were similar to those in whole-tissue sections (MDM2: P = 0.6, CDK4: P = 0.8, Wilcoxon signed-rank test). Conclusion Quantitative real-time PCR of total RNA can be used to evaluate the MDM2 and CDK4 expression levels in core-needle biopsies and may be useful for distinguishing ALT/WDL from adipocytic tumors. Thus, total RNA from core-needle biopsy sections may have potential as a routine diagnostic tool for other tumors where gene overexpression is a feature of the tumor. PMID:24965044
Manios, Yannis; Vlachopapadopoulou, Elpis; Moschonis, George; Karachaliou, Feneli; Psaltopoulou, Theodora; Koutsouki, Dimitra; Bogdanis, Gregory; Carayanni, Vilelmine; Hatzakis, Angelos; Michalacos, Stefanos
2016-12-01
Early identification of infants at high risk of becoming obese in later childhood or adolescence can be of vital importance in any obesity prevention initiative. The aim of the present study was to examine the utility and applicability of the "Childhood Obesity Risk Evaluation (CORE)" index as a screening tool for the early prediction of obesity in childhood and adolescence. Anthropometric and socio-demographic data were collected cross-sectionally and retrospectively from a representative sample of 5946 children and adolescents, and were combined to calculate the CORE-index score. Logistic regression analyses were performed to examine the associations of the CORE-index score with obesity by gender and age group, and cut-off point analysis was also applied to identify the optimal value of the CORE-index score that differentiates obese from non-obese children. The mean CORE-index score in the total sample was 3.06 (sd 1.92) units (range 0-11 units). Each unit increase in the CORE-index score was found to be associated with a 30% (95% C.I. 1.24-1.36) increased likelihood of obesity in childhood or adolescence, while the optimal cut-off value of the CORE-index score that predicted obesity with the highest possible sensitivity and specificity was found to be 3.5. The present study supports the utility and applicability of the CORE-index as a screening tool for the early identification of infants that are potentially at higher risk of becoming obese in their childhood and adolescence. This tool could be routinely used by health professionals to identify infants at high risk and to provide appropriate counselling to their parents and caregivers so as to maximize the effectiveness of early obesity prevention initiatives. What is known? • Childhood obesity has reached epidemic proportions worldwide. • Certain perinatal and socio-demographic indices that were previously identified as correlates of childhood obesity in children were combined to develop the CORE-index, a screening tool that estimates obesity risk in 9-13 year-old children. What is new? • The utility and applicability of the CORE-index as a screening tool can be extended to the age range of 6-15 years. • The CORE-index is a cost-effective screening tool that can assist health professionals in initiating obesity preventive measures from early life.
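A cut-off chosen for the "highest possible sensitivity and specificity" is typically found by maximizing Youden's J over candidate thresholds. The abstract does not state the exact criterion used, so the sketch below is a plausible reconstruction with illustrative names, not the authors' procedure.

    import numpy as np

    def optimal_cutoff(scores, obese):
        """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1.

        scores : CORE-index scores (integers 0-11)
        obese  : 1 if the child/adolescent is obese, 0 otherwise
        """
        scores = np.asarray(scores, dtype=float)
        obese = np.asarray(obese, dtype=bool)
        best_j, best_cut = -1.0, None
        for cut in np.unique(scores) + 0.5:          # mid-point thresholds, e.g. 3.5
            pred = scores >= cut
            sens = pred[obese].mean() if obese.any() else 0.0
            spec = (~pred[~obese]).mean() if (~obese).any() else 0.0
            j = sens + spec - 1.0
            if j > best_j:
                best_j, best_cut = j, cut
        return best_cut, best_j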
Estimation of Circadian Body Temperature Rhythm Based on Heart Rate in Healthy, Ambulatory Subjects.
Sim, Soo Young; Joo, Kwang Min; Kim, Han Byul; Jang, Seungjin; Kim, Beomoh; Hong, Seungbum; Kim, Sungwan; Park, Kwang Suk
2017-03-01
Core body temperature is a reliable marker of circadian rhythm. As characteristics of the circadian body temperature rhythm change during diverse health problems, such as sleep disorders and depression, body temperature monitoring is often used in clinical diagnosis and treatment. However, the use of current thermometers for circadian rhythm monitoring is impractical in daily life. As heart rate is a physiological signal relevant to thermoregulation, we investigated the feasibility of using heart rate monitoring to estimate the circadian body temperature rhythm. Various heart rate parameters and core body temperature were simultaneously acquired in 21 healthy, ambulatory subjects during their routine life. The performance of regression analysis and of the extended Kalman filter in estimating daily body temperature and the circadian indicators (mesor, amplitude, and acrophase) was evaluated. For daily body temperature estimation, mean R-R interval (RRI), mean heart rate (MHR), or normalized MHR provided a mean root mean square error of approximately 0.40 °C with both techniques. For mesor estimation, regression analysis showed better performance than the extended Kalman filter. However, the extended Kalman filter, combined with RRI or MHR, provided better accuracy in terms of amplitude and acrophase estimation. We suggest that this noninvasive and convenient method for estimating the circadian body temperature rhythm could reduce discomfort during body temperature monitoring in daily life. This, in turn, could facilitate more clinical studies based on the circadian body temperature rhythm.
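The circadian indicators named above (mesor, amplitude, acrophase) are conventionally obtained from a single-harmonic cosinor fit, which reduces to ordinary linear least squares. The sketch below shows that step only; it is not the paper's regression or Kalman-filter pipeline, and the names are illustrative.

    import numpy as np

    def cosinor_fit(t_hours, temperature, period=24.0):
        """Fit mesor M, amplitude A and acrophase phi of
        y ~ M + A * cos(2*pi*t/period - phi) by linear least squares."""
        t = np.asarray(t_hours, dtype=float)
        y = np.asarray(temperature, dtype=float)
        omega = 2.0 * np.pi / period
        # Linearize: y = M + beta*cos(omega t) + gamma*sin(omega t)
        X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
        mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]
        amplitude = np.hypot(beta, gamma)
        acrophase = np.arctan2(gamma, beta)   # radians; peak time = acrophase / omega
        return mesor, amplitude, acrophase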
Linked Learning: Using Learning Time Creatively to Prepare Students for College and Career
ERIC Educational Resources Information Center
Almond, Monica R.; Miller, Tiffany D.
2014-01-01
American public education is in a constant state of experimentation, with new waves of reforms and education initiatives unveiled routinely--many recycled and some reinvented. Yet few are truly innovative. The newest and most promising reform thus far are the Common Core State Standards, which are rigorous standards in English language arts and…
75 FR 47646 - Idaho State University; Notice of Issuance of Director's Decision
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-06
... detector from the reactor core. In the petition Dr. Crawford states that this was cited as an Apparent... detector. The personnel roof access hatch was also addressed in Rev. 3 and Rev. 4 of the Physical Security... procedures to wear protective clothing to routinely remove the activated startup channel detector from the...
Non-Traditional Muscular Strength and Endurance Activities for Elementary and Middle School Children
ERIC Educational Resources Information Center
Maina, Michael P.; Feather, Ryan; Edmunds, Cynthia; Maina, Julie Schlegel; Ryan, Stu; Griffin, Michael
2014-01-01
Over the past decade many muscular strength and endurance routines have been introduced to children and adults toward improving overall health and fitness. When performed correctly, there are countless benefits to performing weight bearing resistance-type exercises to develop the upper, lower, and core areas of the body. The National Association…
ERIC Educational Resources Information Center
McKenzie, Karen; Murray, George C.; Prior, Seamus; Stark, Lynda
2011-01-01
An evaluation of a Scottish secondary school-based counselling service for students aged 11 to 18 is presented. Improvement in student emotional well-being was measured using the Young Persons Clinical Outcomes for Routine Evaluation (YP CORE) questionnaire and participant questionnaires which were developed for the study. Significant improvements…
Implementing an Evidence-Based Instructional Routine to Enhance Comprehension of Expository Text
ERIC Educational Resources Information Center
Wexler, Jade; Reed, Deborah K.; Mitchell, Marisa; Doyle, Brie; Clancy, Erin
2015-01-01
To graduate from high school and become competitive in the workplace and other postsecondary endeavors, adolescents are required to meet rigorous standards, such as the Common Core State Standards. To meet such standards, teachers must teach students to read and make sense of increasingly complex content-area expository text. Secondary teachers,…
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin
2014-05-01
During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" on Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership-class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational-readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource-intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "Titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560,640 equivalent cores. Scientific applications, such as CESM, are also required to demonstrate a "computational readiness capability" to efficiently scale across and utilize 20% of the entire system. The 0.25 degree configuration of the spectral element dynamical core of the Community Atmosphere Model (CAM-SE), the atmospheric component of CESM, has been demonstrated to scale efficiently across more than 5,000 nodes (80,000 CPU cores) on Titan. The tracer transport routines of CAM-SE have also been ported to take advantage of the hybrid many-core architecture of Titan using GPUs [see EGU2014-4233], yielding over 2X speedup when transporting over 100 tracers. The high-throughput I/O in CESM, based on the Parallel IO Library (PIO), is being further augmented to support even higher resolutions and to enhance resiliency. The application performance of the individual runs is archived in a database and routinely analyzed to identify and rectify performance degradation during the course of the experiments. The various resources available at the OLCF now support a scientific workflow that facilitates high-resolution climate modelling. A high-speed, center-wide parallel file system called ATLAS, capable of 1 TB/s, is available on Titan as well as on the clusters used for analysis (Rhea) and visualization (Lens/EVEREST). Long-term archiving is provided by the HPSS storage system. The Earth System Grid (ESG), featuring search and discovery, is also used to deliver data.
The end-to-end workflow allows OLCF users to efficiently share data and publish results in a timely manner.
GOCE User Toolbox and Tutorial
NASA Astrophysics Data System (ADS)
Knudsen, P.; Benveniste, J.
2011-07-01
The GOCE User Toolbox GUT is a compilation of tools for the utilisation and analysis of GOCE Level 2 products. GUT supports applications in Geodesy, Oceanography and Solid Earth Physics. The GUT Tutorial provides information and guidance on how to use the toolbox for a variety of applications. GUT consists of a series of advanced computer routines that carry out the required computations. It may be used on Windows PCs, UNIX/Linux workstations, and Macs. The toolbox is supported by The GUT Algorithm Description and User Guide and The GUT Install Guide. A set of a-priori data and models is made available as well. GUT has been developed in a collaboration within the GUT Core Group. The GUT Core Group: S. Dinardo, D. Serpe, B.M. Lucas, R. Floberghagen, A. Horvath (ESA), O. Andersen, M. Herceg (DTU), M.-H. Rio, S. Mulet, G. Larnicol (CLS), J. Johannessen, L. Bertino (NERSC), H. Snaith, P. Challenor (NOC), K. Haines, D. Bretherton (NCEO), C. Hughes (POL), R.J. Bingham (NU), G. Balmino, S. Niemeijer, I. Price, L. Cornejo (S&T), M. Diament, I. Panet (IPGP), C.C. Tscherning (KU), D. Stammer, F. Siegismund (UH), and T. Gruber (TUM).
Mori, Taizo; Hegmann, Torsten
2016-01-01
Size, shape, overall composition, and surface functionality largely determine the properties and applications of metal nanoparticles. Aside from well-defined metal clusters, their composition is often estimated assuming a quasi-spherical shape of the nanoparticle core. With decreasing diameter of the assumed circumscribed sphere, particularly in the range of only a few nanometers, the estimated nanoparticle composition increasingly deviates from the real composition, leading to significant discrepancies between anticipated and experimentally observed composition, properties, and characteristics. We here assembled a compendium of tables, models, and equations for thiol-protected gold nanoparticles that will allow experimental scientists to more accurately estimate the composition of their gold nanoparticles using TEM image analysis data. The estimates obtained from following the routines described here will then serve as a guide for further analytical characterization of as-synthesized gold nanoparticles by other bulk (thermal, structural, chemical, and compositional) and surface characterization techniques. While the tables, models, and equations are dedicated to gold nanoparticles, the composition of other metal nanoparticle cores with face-centered cubic lattices can easily be estimated simply by substituting the value for the radius of the metal atom of interest.
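As a hedged companion to the tables and equations described above, the usual first-pass estimate of core composition from a TEM diameter assumes a quasi-spherical core with bulk fcc gold density (about 59 atoms per cubic nanometre); the compendium in the paper is precisely intended to refine this approximation for very small cores. The function name below is illustrative.

    import numpy as np

    def gold_atoms_from_diameter(d_core_nm):
        """Estimate the number of Au atoms in a quasi-spherical core of
        diameter d_core_nm (nm), assuming bulk gold density."""
        r_nm = d_core_nm / 2.0
        v_core_nm3 = (4.0 / 3.0) * np.pi * r_nm**3
        rho_g_per_cm3 = 19.32            # bulk gold density
        molar_mass = 196.97              # g/mol
        avogadro = 6.022e23
        atoms_per_nm3 = rho_g_per_cm3 * 1e-21 * avogadro / molar_mass  # ~59
        return v_core_nm3 * atoms_per_nm3

    # Example: a 2 nm core corresponds to roughly 250 gold atoms.
    print(round(gold_atoms_from_diameter(2.0)))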
Rydberg atoms in hollow-core photonic crystal fibres.
Epple, G; Kleinbach, K S; Euser, T G; Joly, N Y; Pfau, T; Russell, P St J; Löw, R
2014-06-19
The exceptionally large polarizability of highly excited Rydberg atoms (six orders of magnitude higher than that of ground-state atoms) makes them of great interest in fields such as quantum optics, quantum computing, quantum simulation and metrology. However, if they are to be used routinely in applications, a major requirement is their integration into technically feasible, miniaturized devices. Here we show that a Rydberg medium based on room-temperature caesium vapour can be confined in broadband-guiding kagome-style hollow-core photonic crystal fibres. Three-photon spectroscopy performed on a caesium-filled fibre detects Rydberg states up to a principal quantum number of n=40. Besides small energy-level shifts, we observe narrow lines confirming the coherence of the Rydberg excitation. Using different Rydberg states and core diameters, we study the influence of confinement within the fibre core after different exposure times. Understanding these effects is essential for the successful future development of novel applications based on integrated room-temperature Rydberg systems.
Information systems in healthcare - state and steps towards sustainability.
Lenz, R
2009-01-01
To identify core challenges and first steps on the way to sustainable information systems in healthcare. Recent articles on healthcare information technology and related articles from Medical Informatics and Computer Science were reviewed and analyzed. Core challenges that couldn't be solved over the years are identified. The two core problem areas are process integration, meaning to effectively embed IT-systems into routine workflows, and systems integration, meaning to reduce the effort for interconnecting independently developed IT-components. Standards for systems integration have improved a lot, but their usefulness is limited where system evolution is needed. Sustainable Healthcare Information Systems should be based on system architectures that support system evolution and avoid costly system replacements every five to ten years. Some basic principles for the design of such systems are separation of concerns, loose coupling, deferred systems design, and service oriented architectures.
Watson, Todd; Graning, Jessica; McPherson, Sue; Carter, Elizabeth; Edwards, Joshuah; Melcher, Isaac; Burgess, Taylor
2017-02-01
Dance performance requires not only lower extremity muscle strength and endurance, but also sufficient core stabilization during dynamic dance movements. While previous studies have identified a link between core muscle performance and lower extremity injury risk, what has not been determined is if an extended core stabilization training program will improve specific measures of dance performance. This study examined the impact of a nine-week core stabilization program on indices of dance performance, balance measures, and core muscle performance in competitive collegiate dancers. Within-subject repeated measures design. A convenience sample of 24 female collegiate dance team members (age = 19.7 ± 1.1 years, height = 164.3 ± 5.3 cm, weight 60.3 ± 6.2 kg, BMI = 22.5 ± 3.0) participated. The intervention consisted of a supervised and non-supervised core (trunk musculature) exercise training program designed specifically for dance team participants performed three days/week for nine weeks in addition to routine dance practice. Prior to the program implementation and following initial testing, transversus abdominis (TrA) activation training was completed using the abdominal draw-in maneuver (ADIM) including ultrasound imaging (USI) verification and instructor feedback. Paired t tests were conducted regarding the nine-week core stabilization program on dance performance and balance measures (pirouettes, single leg balance in passe' releve position, and star excursion balance test [SEBT]) and on tests of muscle performance. A repeated measures (RM) ANOVA examined four TrA instruction conditions of activation: resting baseline, self-selected activation, immediately following ADIM training and four days after completion of the core stabilization training program. Alpha was set at 0.05 for all analysis. Statistically significant improvements were seen on single leg balance in passe' releve and bilateral anterior reach for the SEBT (both p ≤ 0.01), number of pirouettes (p = 0.011), and all measures of strength (p ≤ 0.05) except single leg heel raise. The RM ANOVA on mean percentage of change in TrA was significant; post hoc paired t tests demonstrated significant improvements in dancers' TrA activations across the four instruction conditions. This core stabilization training program improves pirouette ability, balance (static and dynamic), and measures of muscle performance. Additionally, ADIM training resulted in immediate and short-term (nine-week) improvements in TrA activation in a functional dance position. 2b.
Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane
2016-02-01
To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from late 1990's-2013 in social science, healthcare, management and philosophy. Five Complexity Theory core concepts extracted were 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests routine surgical fasting practice is habituated in the social healthcare system and therefore it cannot easily be reversed. A reduction to fasting times requires an incentivised new approach to emerge in the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to change fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
1980-01-01
MATHPAC image-analysis library is collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. MATHPAC library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.
[Health services research using the example of atopic dermatitis].
Schmitt, J
2011-03-01
Within the past years, health services research projects have critically analyzed the management of atopic eczema (AE) in routine care, quantified the utility of controlling severe AE, and introduced an international standardization of core outcome measures for AE. With a prevalence of 16%, AE is the most frequent chronic condition of all among children and adolescents seeking medical care. Despite the lower prevalence in adults, about 60% of patients with AE in routine care are adults. There is a clinically relevant comorbidity of AE and psychiatric conditions. Independent of patient age and physician specialty, topical corticosteroids dominate the outpatient treatment of AE. However, there is considerable heterogeneity in the management of AE between treating physicians. Despite a lack of clinical trials, systemic corticosteroids are most frequently prescribed for severe AE. In contrast, cyclosporine plays only a minor role in the routine care of severe AE, although its efficacy is well documented in trials. This observation stimulated a head-to-head trial that indicated superiority of cyclosporine over prednisolone for severe adult AE. The control of severe AE has high priority from the perspective of the general population and from the patients' perspective. Competence of the treating physician, disease severity, and the patient's competence to adjust treatment to disease activity are the main determinants of patient satisfaction. Aiming for better comparability of clinical trials and better translation of trial evidence into clinical practice, we conducted a Delphi exercise including clinical experts from 11 countries, editors of international dermatological journals, regulatory agencies, and patient representatives. The preliminary core set of outcome domains for eczema trials as defined by the panel included symptoms, physician-assessed clinical signs, and a measure of long-term control of flares. Symptoms such as itching should be regularly assessed in clinical practice. The presented studies indicate that health services research not only describes and critically analyzes the effectiveness of routine clinical care, but is also translational research in that it may stimulate clinical trials and generate new, clinically relevant hypotheses for experimental studies.
Routines of families with adolescents with autistic disorders: a comparison study.
Bagatell, Nancy J; Cram, Megan; Alvarez, Christian G; Loehle, Laura
2014-02-01
Research has consistently shown that families with children with autism spectrum disorders (ASD) have difficulty engaging in family routines, yet little is known about families with adolescents with ASD. The purpose of this study is to compare the routines of families with adolescents with ASD (FASD) and families with typically developing adolescents. Twenty families in each group were compared using the Family Routines Inventory and supplemental questions. Data were analyzed using the Mann-Whitney U test and content analysis. No significant difference between groups was found; however, there was a trend toward significance in the subscale of mealtime routines in both endorsement and adherence. Analysis of open-ended questions revealed differences in how routines were carried out. Occupational therapists should consider assessing and addressing routines of importance to FASD to increase family health and well-being. Further research is needed to better understand the routines of FASD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivera-Sanfeliz, Gerant, E-mail: gerantrivera@ucsd.edu; Kinney, Thomas B.; Rose, Steven C.
2005-06-15
Purpose: To describe our experience with ultrasound (US)-guided percutaneous liver biopsies using the INRAD 18G Express core needle biopsy system. Methods: One hundred and fifty-four consecutive percutaneous core liver biopsy procedures were performed in 153 men in a single institution over 37 months. The medical charts, pathology reports, and radiology files were retrospectively reviewed. The number of needle passes, type of guidance, change in hematocrit level, and adequacy of specimens for histologic analysis were evaluated. Results: All biopsies were performed for histologic staging of chronic liver diseases. The majority of patients had hepatitis C (134/153, 90.2%). All patients were discharged to home after 4 hr of postprocedural observation. In 145 of 154 (94%) biopsies, a single needle pass was sufficient for diagnosis. US guidance was utilized in all but one of the procedures (153/154, 99.4%). The mean hematocrit decrease was 1.2% (44.1-42.9%). Pain requiring narcotic analgesia, the most frequent complication, occurred in 28 of 154 procedures (18.2%). No major complications occurred. The specimens were diagnostic in 152 of 154 procedures (98.7%). Conclusions: Single-pass percutaneous US-guided liver biopsy with the INRAD 18G Express core needle biopsy system is safe and provides definitive pathologic diagnosis of chronic liver disease. It can be performed on an outpatient basis. Routine post-biopsy monitoring of hematocrit level in stable, asymptomatic patients is probably not warranted.
Gupta, Priyanka; Sharma, Amil; Pathak, Vivek K; Mankeliya, Saurabh; Bhardwaj, Shivanshu; Dhanare, Poorvasha
2017-12-01
Post and core restorations are routinely used for restoring grossly decayed tooth structures. Various chemical agents are known to affect the interfacial adhesion between the post and the core. Hence, we planned the present study to evaluate the effect of various post-surface treatments on the interfacial strength between the posts and the composite materials used for building up the core portion. The present study assessed the effect of surface conditioning of posts on the interfacial adhesion in post-core restorations. A total of 80 clear post-tapers were included and were divided broadly into four study groups based on the chemical treatment protocol used. The chemical treatments included alkaline potassium permanganate, hydrogen peroxide, and phosphoric acid; the fourth group served as the control group. Composite core material was used for building up the core. Tensile load testing was done on a universal testing machine. All results were analyzed with the Statistical Package for the Social Sciences (SPSS) software. The highest bond strength was observed in the group treated with alkaline potassium permanganate, while the lowest was observed in the control group, followed by the hydrogen peroxide group. Comparison of the mean bond strengths between the study groups yielded significant results. The chemical treatment protocol significantly alters the mean bond strength of the post and core restoration. Potassium permanganate significantly increases the bond strength between the fiber post and the core restoration.
Intrauterine temperature during intrapartum amnioinfusion: a prospective observational study.
Tomlinson, T M; Schaecher, C; Sadovsky, Y; Gross, G
2012-07-01
To determine the influence of routine intrapartum amnioinfusion (AI) on intrauterine temperature. Prospective observational study. Maternity unit, Barnes Jewish Hospital, St Louis, MO, USA. Forty women with singleton gestations and an indication for intrapartum intrauterine pressure catheter placement. Using a temperature probe, we digitally recorded intrauterine temperature every 10 minutes during labour. Amnioinfusion was administered according to a standard protocol using saline equilibrated to the ambient temperature. Mean intrauterine temperature during labour. Participants were monitored for a mean of 280 minutes (range 20-820). A total of 164 intrauterine temperature readings in the AI cohort were compared with 797 control measurements. When compared with controls, we observed a lower intrauterine temperature in the AI cohort (36.4 versus 37.4°C, P<0.01). More measurements in the AI cohort were recorded in the presence of intrapartum fever (40% versus 30%). A subgroup analysis of measurements recorded in afebrile parturients revealed an even greater effect of AI (1.5°C decrease, 37.3 versus 35.8°C, P<0.01). Routine intrapartum AI using saline equilibrated to a mean ambient temperature of 25.0°C reduces intrauterine temperature and may thereby affect fetal core temperature. © 2012 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2012 RCOG.
Cowdrey, Natasha D; Waller, Glenn
2015-12-01
Psychotherapists report routinely not practising evidence-based treatments. However, there is little research examining the content of therapy from the patient perspective. This study examined the self-reported treatment experiences of individuals who had been told that they had received cognitive-behavior therapy (CBT) for their eating disorder. One hundred and fifty-seven such sufferers (mean age = 25.69 years) were recruited from self-help organisations. Participants completed an online survey assessing demographics, clinical characteristics, and therapy components. The use of evidence-based CBT techniques varied widely, with core elements for the eating disorders (e.g., weighing and food monitoring) used at well below the optimum level, while a number of unevidenced techniques were reported as being used commonly. Cluster analysis showed that participants received different patterns of intervention under the therapist label of 'CBT', with evidence-based CBT being the least common. Therapist age and patient diagnosis were related to the pattern of intervention delivered. It appears that clinicians are not subscribing to a transdiagnostic approach to the treatment of eating disorders. Patient recollections in this study support the conclusion that evidence-based practice is not routinely undertaken with this client group, even when the therapy offered is described as such. Copyright © 2015 Elsevier Ltd. All rights reserved.
ISE: An Integrated Search Environment. The manual
NASA Technical Reports Server (NTRS)
Chu, Lon-Chan
1992-01-01
Integrated Search Environment (ISE), a software package that implements hierarchical searches with meta-control, is described in this manual. ISE is a collection of problem-independent routines that support the solution of search problems; these core routines handle the control of searches and maintain search-related statistics. By separating the problem-dependent and problem-independent components in ISE, new search methods based on a combination of existing methods can be developed by coding a single master control program. Further, new applications solved by searches can be developed by coding the problem-dependent parts and reusing the problem-independent parts already developed. Potential users of ISE are designers of new application solvers and new search algorithms, and users of experimental application solvers and search algorithms. ISE is designed to be user-friendly and information-rich. In this manual, the organization of ISE is described and several experiments carried out on ISE are also described.
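ISE's own routines are not reproduced here; as a rough illustration of the separation the manual describes, the following Python sketch (all names hypothetical) shows a problem-independent best-first search driver that keeps the control logic and statistics, while a new application supplies only the problem-dependent callbacks.

```python
import heapq

def best_first_search(start, expand, is_goal, priority):
    """Problem-independent driver: search control and statistics live here."""
    stats = {"expanded": 0, "generated": 1}
    frontier = [(priority(start), 0, start)]
    tie = 1
    while frontier:
        _, _, state = heapq.heappop(frontier)
        stats["expanded"] += 1
        if is_goal(state):
            return state, stats
        for child in expand(state):                   # problem-dependent part
            stats["generated"] += 1
            heapq.heappush(frontier, (priority(child), tie, child))
            tie += 1
    return None, stats

# Problem-dependent parts for a toy problem: reach a target integer via +1 or *2.
target = 37
result, stats = best_first_search(
    1,
    expand=lambda n: [n + 1, n * 2],
    is_goal=lambda n: n == target,
    priority=lambda n: abs(target - n),
)
print(result, stats)
```

A different application would reuse the same driver unchanged and only swap in new `expand`, `is_goal`, and `priority` functions, which is the reuse pattern the manual emphasizes.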
A wireline piston core barrel for sampling cohesionless sand and gravel below the water table
Zapico, Michael M.; Vales, Samuel; Cherry, John A.
1987-01-01
A coring device has been developed to obtain long and minimally disturbed samples of saturated cohesionless sand and gravel. The coring device, which includes a wireline and piston, was developed specifically for use during hollow-stem auger drilling but it also offers possibilities for cable tool and rotary drilling. The core barrel consists of an inner liner made of inexpensive aluminum or plastic tubing, a piston for core recovery, and an exterior steel housing that protects the liner when the core barrel is driven into the aquifer. The core barrel, which is approximately 1.6m (5.6 feet) long, is advanced ahead of the lead auger by hammering at the surface on drill rods that are attached to the core barrel. After the sampler has been driven 1.5m (5 feet), the drill rods are detached and a wireline is used to hoist the core barrel, with the sample contained in the aluminum or plastic liner, to the surface. A vacuum developed by the piston during the coring operation provides good recovery of both the sediment and aquifer fluids contained in the sediment. In the field the sample tubes can be easily split along their length for on-site inspection or they can be capped with the pore water fluids inside and transported to the laboratory. The cores are 5cm (2 inches) in diameter by 1.5m (5 feet) long. Core acquisition to depths of 35m (115 feet), with a recovery greater than 90 percent, has become routine in University of Waterloo aquifer studies. A large diameter (12.7cm [5 inch]) version has also been used successfully. Nearly continuous sample sequences from sand and gravel aquifers have been obtained for studies of sedimentology, hydraulic conductivity, hydrogeochemistry and microbiology.
A Conservative Approach to the Management of a Dental Trauma for Immediate Natural Esthetics.
Mahesh Patni, Pallav; Jain, Pradeep; Jain Patni, Mona
2016-06-01
The fracture of front teeth is one of the routine presentations of traumatic injuries. The treatment of a fractured tooth involving the pulp includes root canal therapy and post placement followed by core build-up, or extraction of the fractured tooth if it is not restorable. We report a case of an adult male who had traumatized both maxillary central incisors following a blow sustained during domestic violence. He had lost the fractured fragment of the right central incisor, while the left incisor had complicated fractures with fragments retained, attached to the soft tissue. Following radiovisiography (RVG), both incisors were conservatively treated in a single visit by reattachment and post and core techniques. The reattachment and post and core techniques reported here are reasonably easy to perform while providing immediate and lasting results, helping patients regain social confidence and functionality.
Semantic Technologies for Re-Use of Clinical Routine Data.
Kreuzthaler, Markus; Martínez-Costa, Catalina; Kaiser, Peter; Schulz, Stefan
2017-01-01
Routine patient data in electronic patient records are only partly structured, and an even smaller segment is coded, mainly for administrative purposes. Large parts are only available as free text. Transforming this content into a structured and semantically explicit form is a prerequisite for querying and information extraction. The core of the system architecture presented in this paper is based on SAP HANA in-memory database technology using the SAP Connected Health platform for data integration as well as for clinical data warehousing. A natural language processing pipeline analyses unstructured content and maps it to a standardized vocabulary within a well-defined information model. The resulting semantically standardized patient profiles are used for a broad range of clinical and research application scenarios.
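The SAP HANA/Connected Health pipeline itself is not shown in the abstract; as a hedged, toy illustration of the general idea of mapping free text to a standardized vocabulary, the Python sketch below uses a hypothetical terminology dictionary (the concept codes are invented, not real SNOMED CT identifiers).

```python
import re

# Toy terminology: surface forms mapped to standardized concept codes.
# Codes are made up for illustration only.
TERMINOLOGY = {
    "myocardial infarction": "C_0001",
    "heart attack": "C_0001",        # synonym normalized to the same concept
    "diabetes mellitus": "C_0002",
    "hypertension": "C_0003",
}

def annotate(text):
    """Return (concept_code, character_span) pairs found in the free text."""
    found = []
    lowered = text.lower()
    for surface, code in TERMINOLOGY.items():
        for match in re.finditer(re.escape(surface), lowered):
            found.append((code, match.span()))
    return sorted(found, key=lambda item: item[1])

note = "Patient with known hypertension, admitted after a heart attack."
print(annotate(note))   # prints the standardized codes with their text spans
```

A production pipeline would add tokenization, negation handling, and an information model around the codes; the sketch only shows the normalization step that makes free text queryable.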
Surface-Source Downhole Seismic Analysis in R
Thompson, Eric M.
2007-01-01
This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available to install on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. The analysis of the travel-time data consists of two parts: the identification of layer interfaces and the inversion for the velocity of each layer. The analyst usually picks layer interfaces by visual inspection of the travel-time data. I have also developed an algorithm that automatically finds boundaries, which can save a significant amount of time when analyzing a large number of sites. The results of the automatic routines should be reviewed to check that they are reasonable. The interactivity of these scripts allows the user to add and remove layers quickly, thus allowing rapid feedback on how the residuals are affected by each additional parameter in the inversion. In addition, the script allows many models to be compared at the same time.
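The R scripts themselves are not reproduced here; the Python sketch below is a simplified stand-in for the inversion step only, assuming straight rays from a surface source at a known offset and already-picked layer interfaces (the published method applies more careful slant-path handling), with a weighted least-squares solve for per-layer slowness.

```python
import numpy as np

def invert_slowness(depths, times, interfaces, src_offset, sigma=None):
    """Least-squares layer velocities from downhole travel times (straight rays)."""
    tops = np.concatenate(([0.0], interfaces))
    bottoms = np.concatenate((interfaces, [np.inf]))
    G = np.zeros((len(depths), len(tops)))
    for i, z in enumerate(depths):
        slant = np.sqrt(src_offset**2 + z**2) / z          # straight-ray stretch factor
        for j, (zt, zb) in enumerate(zip(tops, bottoms)):
            G[i, j] = max(0.0, min(z, zb) - zt) * slant    # ray path length in layer j
    w = np.ones(len(times)) if sigma is None else 1.0 / np.asarray(sigma)
    Gw, tw = G * w[:, None], np.asarray(times) * w
    slowness, *_ = np.linalg.lstsq(Gw, tw, rcond=None)
    return 1.0 / slowness                                  # layer velocities

depths = np.array([2.0, 4.0, 6.0, 8.0, 10.0])              # receiver depths (m), synthetic
times = np.array([0.011, 0.021, 0.029, 0.036, 0.043])      # picked S-wave times (s), synthetic
print(invert_slowness(depths, times, interfaces=np.array([5.0]), src_offset=1.0))
```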
Mineral and Lithology Mapping of Drill Core Pulps Using Visible and Infrared Spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, G. R., E-mail: G.Taylor@unsw.edu.au
2000-12-15
A novel approach for using field spectrometry for determining both the mineralogy and the lithology of drill core pulps (powders) is developed and evaluated. The methodology is developed using material from a single drillhole through a mineralized sequence of rocks from central New South Wales. Mineral library spectra are used in linear unmixing routines to determine the mineral abundances in drill core pulps that represent between 1 m and 3 m of core. Comparison with X-Ray Diffraction (XRD) analyses shows that for most major constituents, spectrometry provides an estimate of quantitative mineralogy that is as reliable as that provided by XRD. Confusion between the absorption features of calcite and those of chlorite causes the calcite contents determined by spectrometry to be unreliable. Convex geometry is used to recognize the spectra of those samples that are extreme and are representative of unique lithologies. Linear unmixing is used to determine the abundance of these lithologies in each drillhole sample and these abundances are used to interpret the geology of the drillhole. The interpreted geology agrees well with conventional drillhole logs of the visible geology and photographs of the split core. The methods developed provide a quick and cost-effective way of determining the lithology and alteration mineralogy of drill core pulps.
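The paper's unmixing routines are not given in the abstract; a common way to pose linear spectral unmixing is as a non-negative least-squares problem over library endmember spectra, which the hedged Python sketch below illustrates with synthetic spectra (not the study's data).

```python
import numpy as np
from scipy.optimize import nnls

# E: library endmember spectra (n_bands x n_minerals); y: measured pulp spectrum.
# All numbers below are synthetic, for illustration only.
rng = np.random.default_rng(0)
n_bands, n_minerals = 200, 4
E = np.abs(rng.normal(size=(n_bands, n_minerals)))
true_abundances = np.array([0.5, 0.3, 0.2, 0.0])
y = E @ true_abundances + 0.01 * rng.normal(size=n_bands)   # noisy mixture

abund, residual = nnls(E, y)        # non-negative least-squares unmixing
abund /= abund.sum()                # normalize to fractional abundances
print(np.round(abund, 3), residual)
```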
Tang, Jing-Hua; An, Xin; Lin, Xi; Gao, Yuan-Hong; Liu, Guo-Chen; Kong, Ling-Heng; Pan, Zhi-Zhong; Ding, Pei-Rong
2015-10-20
Patients with pathological complete remission (pCR) after being treated with neoadjuvant chemoradiotherapy (nCRT) have better long-term outcomes and may receive conservative treatments in locally advanced rectal cancer (LARC). The study aimed to evaluate the value of forceps biopsy and core needle biopsy in prediction of pCR in LARC treated with nCRT. In total, 120 patients entered this study. Sixty-one consecutive patients received preoperative forceps biopsy during endoscopic examination. Ex vivo core needle biopsy was performed in resected specimens of another 43 consecutive patients. The accuracy for ex vivo core needle biopsy was significantly higher than for forceps biopsy (76.7% vs. 36.1%; p < 0.001). The sensitivity for ex vivo core needle biopsy was significantly lower in good responders (TRG 3) than in poor responders (TRG ≤ 2) (52.9% vs. 94.1%; p = 0.017). In vivo core needle biopsy was further performed in 16 patients with good response. Eleven patients had residual cancer cells in final resected specimens, among whom 4 (36.4%) were biopsy positive. In conclusion, routine forceps biopsy was of limited value in identifying pCR after nCRT. Although core needle biopsy might further identify a subset of patients with residual cancer cells, the accuracy was not substantially increased in good responders.
Inflight and Preflight Detection of Pitot Tube Anomalies
NASA Technical Reports Server (NTRS)
Mitchell, Darrell W.
2014-01-01
The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates sensors' response time to the dynamic component (noise) found in the signal of these same sensors. This analysis technique has used existing electrical signals of Pitot tube sensors that result from measured processes during inflight conditions and/or induced signals in preflight conditions to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants. The application of this technology for the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology will be used to quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.
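The AMS Corporation noise-analysis procedure is proprietary and not detailed in the abstract; as a hedged illustration of the underlying idea, the Python sketch below estimates an apparent sensor response time from the autocorrelation of a signal's fluctuating component, on the assumption that a blocked Pitot line behaves like a low-pass filter and therefore decorrelates more slowly (all signals synthetic).

```python
import numpy as np

def apparent_response_time(signal, dt):
    """Estimate a response time from the first 1/e crossing of the autocorrelation."""
    x = signal - np.mean(signal)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]
    below = np.nonzero(acf < np.exp(-1))[0]
    return (below[0] if below.size else len(acf)) * dt

rng = np.random.default_rng(1)
dt, n = 0.01, 5000
white = rng.normal(size=n)
healthy = white                                                # fast-responding sensor
blocked = np.convolve(white, np.ones(50) / 50, mode="same")    # sluggish (filtered) response
for name, sig in [("healthy", healthy), ("blocked", blocked)]:
    print(name, round(apparent_response_time(sig, dt), 3), "s")
```

In this toy example the "blocked" signal shows a much longer apparent response time, which is the kind of shift a monitoring routine could flag to the pilot.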
Iftimia, Nicusor; Park, Jesung; Maguluri, Gopi; Krishnamurthy, Savitri; McWatters, Amanda; Sabir, Sharjeel H
2018-02-01
We report the development and the pre-clinical testing of a new technology based on optical coherence tomography (OCT) for investigating tissue composition at the tip of the core biopsy needle. While ultrasound, computed tomography, and magnetic resonance imaging are routinely used to guide needle placement within a tumor, they still do not provide the resolution needed to investigate tissue cellularity (ratio between viable tumor and benign stroma) at the needle tip prior to taking a biopsy core. High resolution OCT imaging, however, can be used to investigate tissue morphology at the micron scale, and thus to determine if the biopsy core would likely have the expected composition. Therefore, we implemented this capability within a custom-made biopsy gun and evaluated its capability for a correct estimation of tumor tissue cellularity. A pilot study on a rabbit model of soft tissue cancer has shown the capability of this technique to provide correct evaluation of tumor tissue cellularity in over 85% of the cases. These initial results indicate the potential benefit of the OCT-based approach for improving the success of the core biopsy procedures.
A Xhosa language translation of the CORE-OM using South African university student samples.
Campbell, Megan M; Young, Charles
2016-10-01
The translation of well established psychometric tools from English into Xhosa may assist in improving access to psychological services for Xhosa speakers. The aim of this study was to translate the Clinical Outcomes in Routine Evaluation - Outcome Measure (CORE-OM), a measure of general distress and dysfunction developed in the UK, into Xhosa for use at South African university student counselling centres. The CORE-OM and embedded CORE-10 were translated into Xhosa using a five-stage translation design. This design included (a) forward-translation, (b) back-translation, (c) committee approach, (d) qualitative piloting, and (e) quantitative piloting on South African university students. Clinical and general samples were drawn from English-medium South African universities. Clinical samples were generated from university student counselling centres. General student samples were generated through random stratified cluster sampling of full-time university students. Qualitative feedback from the translation process and results from quantitative piloting of the 34-item CORE-OM English and Xhosa versions supported the reduction of the scale to 10 items. This reduced scale is referred to as the South African CORE-10 (SA CORE-10). A measurement and structural model of the SA CORE-10 English version was developed and cross-validated using an English-speaking university student sample. Equivalence of this model with the SA CORE-10 Xhosa version was investigated using a first-language Xhosa-speaking university sample. Partial measurement equivalence was achieved at the metric level. The resultant SA CORE-10 Xhosa and English versions provide core measures of distress and dysfunction. Additional, culture- and language-specific domains could be added to increase sensitivity and specificity. © The Author(s) 2016.
Xenon-induced power oscillations in a generic small modular reactor
NASA Astrophysics Data System (ADS)
Kitcher, Evans Damenortey
As world demand for energy continues to grow at unprecedented rates, the world energy portfolio of the future will inevitably include a nuclear energy contribution. It has been suggested that the Small Modular Reactor (SMR) could play a significant role in the spread of civilian nuclear technology to nations previously without nuclear energy. As part of the design process, the SMR design must be assessed for the threat to operations posed by xenon-induced power oscillations. In this research, a generic SMR design was analyzed with respect to just such a threat. In order to do so, a multi-physics coupling routine was developed with MCNP/MCNPX as the neutronics solver. Thermal hydraulic assessments were performed using a single channel analysis tool developed in Python. Fuel and coolant temperature profiles were implemented in the form of temperature-dependent fuel cross sections generated using the SIGACE code and reactor core coolant densities. The Power Axial Offset (PAO) and Xenon Axial Offset (XAO) parameters were chosen to quantify any oscillatory behavior observed. The methodology was benchmarked against literature results of startup tests performed at a four-loop PWR in Korea. The developed benchmark model replicated the pertinent features of the reactor within ten percent of the literature values. The results of the benchmark demonstrated that the developed methodology captured the desired phenomena accurately. Subsequently, a high-fidelity SMR core model was developed and assessed. Results of the analysis revealed an inherently stable SMR design at beginning of core life and end of core life under full-power and half-power conditions. The effect of axial discretization, stochastic noise and convergence of the Monte Carlo tallies in the calculations of the PAO and XAO parameters was investigated. All were found to be quite small and the inherently stable nature of the core design with respect to xenon-induced power oscillations was confirmed. Finally, a preliminary investigation into excess reactivity control options for the SMR design was conducted, confirming the generally held notion that existing PWR control mechanisms can be used in iPWR SMRs with similar effectiveness. With the desire to operate the SMR under a boron-free coolant condition, erbium oxide integral burnable absorber fuel rods were identified as a possible means to replace the dispersed absorber effect of soluble boron in the reactor coolant.
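The dissertation's exact tallying procedure is not given in the abstract; the Python sketch below simply assumes the conventional PWR definition of axial offset, (P_top - P_bottom) / (P_top + P_bottom), applied to a node-wise axial power profile (profile values are illustrative).

```python
import numpy as np

def axial_offset(power_profile):
    """Conventional axial offset: (P_top - P_bottom) / (P_top + P_bottom)."""
    p = np.asarray(power_profile, dtype=float)
    half = len(p) // 2
    p_bottom, p_top = p[:half].sum(), p[half:].sum()   # node 0 = core bottom
    return (p_top - p_bottom) / (p_top + p_bottom)

# Example: a bottom-peaked 20-node profile gives a negative offset.
z = np.linspace(0.0, 1.0, 20)
profile = np.sin(np.pi * z) * (1.2 - 0.4 * z)
print(round(axial_offset(profile), 4))
```

Tracking this quantity (and the analogous xenon offset) over successive coupled-physics time steps is what reveals whether an induced axial perturbation grows or damps out.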
Facilitating Infant/Toddler Skills in Family-Child Routines.
ERIC Educational Resources Information Center
Stremel, Kathleen; And Others
This paper on facilitating skill development of infants and toddlers with disabilities within family-child routines focuses on: (1) developing a routine analysis by incorporating multiple Individual Family Service Plan (IFSP) objectives into family-selected routines; (2) utilizing systematic family training procedures to integrate targeted skills…
NASA Technical Reports Server (NTRS)
Jewett, M. E.; Duffy, J. F.; Czeisler, C. A.
2000-01-01
A double-stimulus experiment was conducted to evaluate the phase of the underlying circadian clock following light-induced phase shifts of the human circadian system. Circadian phase was assayed by constant routine from the rhythm in core body temperature before and after a three-cycle bright-light stimulus applied near the estimated minimum of the core body temperature rhythm. An identical, consecutive three-cycle light stimulus was then applied, and phase was reassessed. Phase shifts to these consecutive stimuli were no different from those obtained in a previous study following light stimuli applied under steady-state conditions over a range of circadian phases similar to those at which the consecutive stimuli were applied. These data suggest that circadian phase shifts of the core body temperature rhythm in response to a three-cycle stimulus occur within 24 h following the end of the 3-day light stimulus and that this poststimulus temperature rhythm accurately reflects the timing of the underlying circadian clock.
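The phase assays referred to above are derived by fitting a periodic model to the core-temperature series; as a simplified, hedged illustration (ignoring the correlated-noise structure that published methods handle more carefully), the Python sketch below fits a single 24-h harmonic by least squares and reports the fitted temperature minimum as the phase marker (all data synthetic).

```python
import numpy as np

def fitted_temperature_minimum(t_hours, temp, period=24.0):
    """Fit temp ~ mu + A*cos(wt) + B*sin(wt); return the clock time of the fitted minimum."""
    w = 2.0 * np.pi / period
    X = np.column_stack([np.ones_like(t_hours),
                         np.cos(w * t_hours), np.sin(w * t_hours)])
    mu, A, B = np.linalg.lstsq(X, temp, rcond=None)[0]
    phi = np.arctan2(B, A)                   # model equals mu + R*cos(wt - phi)
    return ((phi + np.pi) / w) % period      # cosine is minimal where wt - phi = pi

rng = np.random.default_rng(2)
t = np.arange(0.0, 40.0, 0.25)               # 40 h of quarter-hourly samples
true_min = 5.0                               # assumed minimum at 05:00
temp = 37.0 - 0.4 * np.cos(2 * np.pi * (t - true_min) / 24) + 0.05 * rng.normal(size=t.size)
print(round(fitted_temperature_minimum(t, temp), 2))   # close to 5.0
```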
Efficiency of static core turn-off in a system-on-a-chip with variation
Cher, Chen-Yong; Coteus, Paul W; Gara, Alan; Kursun, Eren; Paulsen, David P; Schuelke, Brian A; Sheets, II, John E; Tian, Shurong
2013-10-29
A processor-implemented method for improving efficiency of a static core turn-off in a multi-core processor with variation, the method comprising: conducting via a simulation a turn-off analysis of the multi-core processor at the multi-core processor's design stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's design stage includes a first output corresponding to a first multi-core processor core to turn off; conducting a turn-off analysis of the multi-core processor at the multi-core processor's testing stage, wherein the turn-off analysis of the multi-core processor at the multi-core processor's testing stage includes a second output corresponding to a second multi-core processor core to turn off; comparing the first output and the second output to determine if the first output is referring to the same core to turn off as the second output; outputting a third output corresponding to the first multi-core processor core if the first output and the second output are both referring to the same core to turn off.
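As a schematic, hedged rendering of the comparison step described in the claim (names and data purely illustrative, not taken from the patent), the Python sketch below selects a candidate core to turn off from a design-stage simulation and from a per-chip test-stage analysis, and confirms the choice only when the two stages agree.

```python
def select_core_to_turn_off(design_stage_scores, test_stage_scores):
    """Return the core index both stages agree on, else None.

    Scores are per-core efficiency penalties for turning that core off
    (lower is better); the numbers used here are illustrative only.
    """
    first = min(design_stage_scores, key=design_stage_scores.get)   # design-stage pick
    second = min(test_stage_scores, key=test_stage_scores.get)      # test-stage pick
    return first if first == second else None

design = {0: 0.9, 1: 0.4, 2: 0.7, 3: 0.8}   # simulated at design time
test = {0: 0.8, 1: 0.3, 2: 0.9, 3: 0.7}     # measured per chip, reflecting variation
print(select_core_to_turn_off(design, test))  # core 1: both stages agree
```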
NASA Astrophysics Data System (ADS)
Lougheed, Bryan; Metcalfe, Brett; Wacker, Lukas
2017-04-01
Marine sediment cores used in palaeoceanography form the basis of our current understanding of past global climate and ocean chemistry. Precision and accuracy of geochronological control in these sediment cores are crucial in unravelling the timing of rapid shifts in palaeoclimate and, ultimately, the interdependency of global climate mechanisms and their causality. Aware of the problems associated with bioturbation (the mixing of ocean sediments by benthic organisms), palaeoceanographers generally aim to retrieve sediment cores from locations with high sediment accumulation rates, thus minimising the influence of bioturbation as much as possible. However, the practice of concentrating only on areas of the ocean floor with high sediment accumulation rates has the potential to introduce a geographical bias into our understanding of global palaeoclimate. For example, global time averaged sediment accumulation rates for the ocean floor (excluding continental margins) indicate that vast areas of the ocean floor have sediment accumulation rates below the recommended minimum of 10 cm/ka. Whilst many studies have focussed on quantifying the impact of bioturbation on our understanding of the past, few have attempted to overcome the problems associated with bioturbation. Recent pioneering developments in 14C AMS at the Laboratory of Ion Beam Physics at ETH Zürich have led to the development of the Mini Carbon Dating System (MICADAS). This compact 14C AMS system can be coupled to a carbonate handling system, thus enabling the direct AMS measurement of gaseous samples, i.e. without graphitisation, allowing for the analysis of carbonate samples of <100 μg. Likewise, while earlier isotope ratio mass spectrometry (IRMS) technology required a minimum of 100 μg of carbonate to produce a successful δ18O measurement, more recent advances in IRMS technology have made routine measurements of as little as 5 μg possible. Combining both analytical techniques enables palaeoclimate reconstructions that are independent of depth. Here, we present work on a low-sedimentation-rate core (~2 cm/ka) in the North Atlantic (core T86-10P, 37° 8.13' N, 29° 59.15' W) on single shells of the benthic species of foraminifera, Cibicidoides wuellerstorfi. Preliminary downcore single specimen 14C data display a large scatter in 14C ages for the various discrete 1 cm depth intervals analysed. In the case of depth intervals where three or more single specimens have been analysed, we find that the standard deviation in 14C age varies between 1210 and 9437 14C yr, with the mean variation for all such discrete depths being 3384 14C yr.
O'Sullivan, Daniel J.; O'Sullivan, Orla; McSweeney, Paul L. H.; Sheehan, Jeremiah J.
2015-01-01
We sought to determine if the time, within a production day, that a cheese is manufactured has an influence on the microbial community present within that cheese. To facilitate this, 16S rRNA amplicon sequencing was used to elucidate the microbial community dynamics of brine-salted continental-type cheese in cheeses produced early and late in the production day. Differences in the microbial composition of the core and rind of the cheese were also investigated. Throughout ripening, it was apparent that cheeses produced late in the day had a more diverse microbial population than their early equivalents. Spatial variation between the cheese core and rind was also noted in that cheese rinds were initially found to have a more diverse microbial population but thereafter the opposite was the case. Interestingly, the genera Thermus, Pseudoalteromonas, and Bifidobacterium, not routinely associated with a continental-type cheese produced from pasteurized milk, were detected. The significance, if any, of the presence of these genera will require further attention. Ultimately, the use of high-throughput sequencing has facilitated a novel and detailed analysis of the temporal and spatial distribution of microbes in this complex cheese system and established that the period during a production cycle at which a cheese is manufactured can influence its microbial composition. PMID:25636841
Seeking Humanizing Care in Patient-Centered Care Process: A Grounded Theory Study.
Cheraghi, Mohammad Ali; Esmaeili, Maryam; Salsali, Mahvash
Patient-centered care is both a goal in itself and a tool for enhancing health outcomes. The application of patient-centered care in health care services globally, however, is diverse. This article reports on a study that sought to introduce patient-centered care. The aim of this study is to explore the process of providing patient-centered care in critical care units. The study used a grounded theory method. Data were collected in 5 critical care units at Tehran University of Medical Sciences. Purposive and theoretical sampling directed the collection of data using 29 semistructured interviews with 27 participants (nurses, patients, and physicians). Data obtained were analyzed according to the analysis stages of grounded theory and constant comparison to identify the concepts, context, and process of the study. The core category of this grounded theory is "humanizing care," which consisted of 4 interrelated phases, including patient acceptance, purposeful patient assessment and identification, understanding patients, and patient empowerment. A core category of humanizing care integrated the theory. Humanizing care was both an outcome and a process. Patient-centered care is a dynamic and multifaceted process provided according to the nurses' understanding of the concept. Patient-centered care does not involve repeating routine tasks; rather, it requires an all-embracing understanding of the patients and showing respect for their values, needs, and preferences.
Suppa, Per; Hampel, Harald; Kepp, Timo; Lange, Catharina; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph
2016-01-01
MRI-based hippocampus volume, a core feasible biomarker of Alzheimer's disease (AD), is not yet widely used in clinical patient care, partly due to lack of validation of software tools for hippocampal volumetry that are compatible with routine workflow. Here, we evaluate fully-automated and computationally efficient hippocampal volumetry with FSL-FIRST for prediction of AD dementia (ADD) in subjects with amnestic mild cognitive impairment (aMCI) from phase 1 of the Alzheimer's Disease Neuroimaging Initiative. Receiver operating characteristic analysis of FSL-FIRST hippocampal volume (corrected for head size and age) revealed an area under the curve of 0.79, 0.70, and 0.70 for prediction of aMCI-to-ADD conversion within 12, 24, or 36 months, respectively. Thus, FSL-FIRST provides about the same power for prediction of progression to ADD in aMCI as other volumetry methods.
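The reported discrimination can be reproduced conceptually with a rank-based AUC calculation; the Python sketch below (synthetic volumes, not ADNI data) computes the area under the ROC curve for converter versus non-converter status from corrected hippocampal volumes, using the Mann-Whitney formulation of AUC.

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Rank-based (Mann-Whitney) AUC: P(score_pos > score_neg), ties counted as 0.5."""
    pos = np.asarray(scores_pos)[:, None]
    neg = np.asarray(scores_neg)[None, :]
    return float(np.mean((pos > neg) + 0.5 * (pos == neg)))

rng = np.random.default_rng(3)
# Smaller corrected hippocampal volume should indicate conversion, so score = -volume.
vol_converters = rng.normal(5.6, 0.6, size=60)    # mL, synthetic
vol_stable = rng.normal(6.2, 0.6, size=120)       # mL, synthetic
auc = roc_auc(-vol_converters, -vol_stable)
print(round(auc, 2))   # on the order of the reported 0.7-0.8 range
```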
Entanglement-Based Machine Learning on a Quantum Computer
NASA Astrophysics Data System (ADS)
Cai, X.-D.; Wu, D.; Su, Z.-E.; Chen, M.-C.; Wang, X.-L.; Li, Li; Liu, N.-L.; Lu, C.-Y.; Pan, J.-W.
2015-03-01
Machine learning, a branch of artificial intelligence, learns from previous experience to optimize performance, which is ubiquitous in various fields such as computer sciences, financial analysis, robotics, and bioinformatics. A challenge is that machine learning with the rapidly growing "big data" could become intractable for classical computers. Recently, quantum machine learning algorithms [Lloyd, Mohseni, and Rebentrost, arXiv.1307.0411] were proposed which could offer an exponential speedup over classical algorithms. Here, we report the first experimental entanglement-based classification of two-, four-, and eight-dimensional vectors to different clusters using a small-scale photonic quantum computer, which are then used to implement supervised and unsupervised machine learning. The results demonstrate the working principle of using quantum computers to manipulate and classify high-dimensional vectors, the core mathematical routine in machine learning. The method can, in principle, be scaled to larger numbers of qubits, and may provide a new route to accelerate machine learning.
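The experiment's quantum circuits are not described here; as a hedged classical analogue, the Python sketch below performs the vector-distance classification that the entanglement-based routine is reported to accelerate, assigning a query vector to the cluster whose reference vector is closer (vectors are illustrative).

```python
import numpy as np

def classify(query, reference_a, reference_b):
    """Assign the query to cluster A or B by Euclidean distance to reference vectors."""
    da = np.linalg.norm(query - reference_a)
    db = np.linalg.norm(query - reference_b)
    return "A" if da < db else "B"

rng = np.random.default_rng(4)
ref_a = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])   # 8-dimensional references
ref_b = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0])
query = ref_a + 0.2 * rng.normal(size=8)
print(classify(query, ref_a, ref_b))   # "A"
```

The quantum approach estimates these distances through interference on entangled registers, which is where the claimed exponential speedup for very high-dimensional vectors would come from.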
RAPID RESCUE: BREAKING THE MOLD OF ROUTINE CONTINGENCY RESPONSE FOR PERSONNEL RECOVERY
2016-10-23
ultimately lead to timelier response and greater economy of force for an already critically strained Air Force core function. ... achieve economy of force. When an OPLAN calls for PR, a capability is requested rather than an individual unit. The existing UTCs are too rigid and ... the combatant commander and operational planners to achieve unity and economy of force without exceeding PR capacity.
Structural health monitoring of compression connectors for overhead transmission lines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Hong; Wang, Jy-An John; Swindeman, Joseph P
Two-stage aluminum conductor steel-reinforced (ACSR) compression connectors are extensively used in US overhead transmission lines. The connectors are made by crimping a steel sleeve onto a steel core and an aluminum sleeve over aluminum conductive strands. The connectors are designed to operate at temperatures up to 125°C, but their performance is increasingly degrading because of overloading of lines. Currently, electric utilities conduct routine line inspections using thermal and electrical measurements. However, information about the structural integrity of connectors cannot be obtained. In this work, structural health monitoring (SHM) of compression connectors was studied using electromechanical impedance (EMI) analysis. Lead zirconate titanate (PZT)-5A was identified as a smart material for SHM. A flexible high-temperature bonding layer was used to address challenges in PZT integration due to a significant difference in the coefficients of thermal expansion of PZT and the aluminum substrate. The steel joint on the steel core was investigated because it is responsible for the ultimate tensile strength of the connector. Tensile testing was used to create structural damage to the joint, or steel core pullout, and thermal cycling introduced additional structural perturbations. EMI measurements were conducted between the tests. The root mean square deviation (RMSD) of EMI was identified as a damage index. The use of steel joints has been shown to enable SHM under simulated conditions. The EMI signature is sensitive to variations in structural conditions. RMSD can be correlated to the structural health of a connector and has potential for use in the SHM and structural integrity evaluation.
Miller's Pyramid and Core Competency Assessment: A Study in Relationship Construct Validity.
Williams, Betsy White; Byrne, Phil D; Welindt, Dillon; Williams, Michael V
2016-01-01
Continuous professional development relies on the link between performance and an educational process aimed at improving knowledge and skill. One of the most broadly used frameworks for assessing skills is Miller's Pyramid. This Pyramid has a series of levels of achievement beginning with knowledge (at the base) and ending with routine application in the clinical setting. The purpose of this study was to determine the degree of convergence of two measurement methods, one based on Miller's framework, the second using the Accreditation Council for Graduate Medical Education/American Board of Medical Specialties (ACGME/ABMS) Core Competency framework. The data were gathered from the faculty of a large, Midwestern regional health care provider and hospital system. Data from 264 respondents were studied. The 360° data were from raters of physicians holding supervisory roles in the organization. The scale items were taken from an instrument that has been validated for both structure and known group prediction. The Miller scale was purposely built for this application. The questions were designed to describe each level of the model. The Miller scale was reduced to a single dimension. This result was then regressed on the items from the 360° item ratings. Results of a multivariate analysis of variance isolated a significant relationship between the Miller's Pyramid score and the competency items (P < 0.001). These findings demonstrate a relationship between measures based on Miller's framework and behavioral measures based on the ABMS/ACGME core competencies. Equally important is the finding that while they are related they are not identical. These findings have implications for continuous professional development programing design.
Structural health monitoring of compression connectors for overhead transmission lines
NASA Astrophysics Data System (ADS)
Wang, Hong; Wang, Jy-An John; Swindeman, Joseph P.; Ren, Fei; Chan, John
2017-04-01
Two-stage aluminum conductor steel-reinforced (ACSR) compression connectors are extensively used in US overhead transmission lines. The connectors are made by crimping a steel sleeve onto a steel core and an aluminum sleeve over electrical conducting aluminum strands. The connectors are designed to operate at temperatures up to 125°C, but their performance is increasingly degrading because of overloading of lines. Currently, electric utilities conduct routine line inspections using thermal and electrical measurements, but these methods do not provide information about the structural integrity of connectors. In this work, structural health monitoring (SHM) of compression connectors was studied using electromechanical impedance (EMI) analysis. Lead zirconate titanate (PZT)-5A was identified as a smart material for SHM. A flexible high-temperature bonding layer was used to address challenges in PZT integration due to a significant difference in the coefficients of thermal expansion of PZT and the aluminum substrate. The steel joint on the steel core was investigated because it is responsible for the ultimate tensile strength of the connector. Tensile testing was used to induce structural damage to the joint, or steel core pullout, and thermal cycling introduced additional structural perturbations. EMI measurements were conducted between the tests. The root mean square deviation (RMSD) of EMI was identified as a damage index. The use of steel joints has been shown to enable SHM under simulated conditions. The EMI signature is sensitive to variations in structural conditions. RMSD can be correlated to the structural health of a connector and has potential for use in the SHM and structural integrity evaluation.
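The RMSD damage index named in both records is commonly computed as the root-mean-square deviation of the EMI signature from a baseline measurement; the Python sketch below assumes that conventional definition (the studies' exact frequency bands and normalization are not given in the abstracts), using synthetic impedance signatures.

```python
import numpy as np

def rmsd_damage_index(baseline_impedance, current_impedance):
    """RMSD (%) between a baseline EMI signature and a later measurement."""
    z0 = np.asarray(baseline_impedance, dtype=float)
    z1 = np.asarray(current_impedance, dtype=float)
    return 100.0 * np.sqrt(np.sum((z1 - z0) ** 2) / np.sum(z0 ** 2))

# Synthetic signatures: a resonance peak that shifts and shrinks after damage.
f = np.linspace(10e3, 100e3, 500)                       # Hz
z_healthy = 50 + 20 / (1 + ((f - 55e3) / 2e3) ** 2)
z_damaged = 50 + 15 / (1 + ((f - 57e3) / 2e3) ** 2)
print(round(rmsd_damage_index(z_healthy, z_damaged), 2), "%")
```

A larger index between successive inspections would indicate a growing structural change in the joint, which is the correlation the studies report.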
Jolley, Keith A.; Reed, Elizabeth; Martinez-Urtaza, Jaime
2017-01-01
Vibrio parahaemolyticus is an important human foodborne pathogen whose transmission is associated with the consumption of contaminated seafood, with a growing number of infections reported over recent years worldwide. A multilocus sequence typing (MLST) database for V. parahaemolyticus was created in 2008, and a large number of clones have been identified, causing severe outbreaks worldwide (sequence type 3 [ST3]), recurrent outbreaks in certain regions (e.g., ST36), or spreading to other regions where they are nonendemic (e.g., ST88 or ST189). The current MLST scheme uses sequences of 7 genes to generate an ST, which results in a powerful tool for inferring the population structure of this pathogen, although with limited resolution, especially compared to pulsed-field gel electrophoresis (PFGE). The application of whole-genome sequencing (WGS) has become routine for trace back investigations, with core genome MLST (cgMLST) analysis as one of the most straightforward ways to explore complex genomic data in an epidemiological context. Therefore, there is a need to generate a new, portable, standardized, and more advanced system that provides higher resolution and discriminatory power among V. parahaemolyticus strains using WGS data. We sequenced 92 V. parahaemolyticus genomes and used the genome of strain RIMD 2210633 as a reference (with a total of 4,832 genes) to determine which genes were suitable for establishing a V. parahaemolyticus cgMLST scheme. This analysis resulted in the identification of 2,254 suitable core genes for use in the cgMLST scheme. To evaluate the performance of this scheme, we performed a cgMLST analysis of 92 newly sequenced genomes, plus an additional 142 strains with genomes available at NCBI. cgMLST analysis was able to distinguish related and unrelated strains, including those with the same ST, clearly showing its enhanced resolution over conventional MLST analysis. It also distinguished outbreak-related from non-outbreak-related strains within the same ST. The sequences obtained from this work were deposited and are available in the public database (http://pubmlst.org/vparahaemolyticus). The application of this cgMLST scheme to the characterization of V. parahaemolyticus strains provided by different laboratories from around the world will reveal the global picture of the epidemiology, spread, and evolution of this pathogen and will become a powerful tool for outbreak investigations, allowing for the unambiguous comparison of strains with global coverage. PMID:28330888
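A cgMLST comparison of this kind reduces, at its core, to counting allele differences across the shared core-gene loci of two strains; the Python sketch below (toy profiles, hypothetical locus names, far fewer than the scheme's 2,254 loci) shows that pairwise allelic distance, which clustering for outbreak investigation is typically built on.

```python
def allelic_distance(profile_a, profile_b):
    """Count core-gene loci with differing allele calls (missing calls are skipped)."""
    shared = [locus for locus in profile_a
              if locus in profile_b and profile_a[locus] and profile_b[locus]]
    diffs = sum(profile_a[locus] != profile_b[locus] for locus in shared)
    return diffs, len(shared)

# Toy allele profiles over a handful of loci; locus names are hypothetical.
strain_1 = {"VP0001": 3, "VP0002": 7, "VP0003": 1, "VP0004": 5}
strain_2 = {"VP0001": 3, "VP0002": 7, "VP0003": 2, "VP0004": 5}     # 1 difference
strain_3 = {"VP0001": 9, "VP0002": 4, "VP0003": 2, "VP0004": None}  # one missing call

print(allelic_distance(strain_1, strain_2))   # (1, 4)
print(allelic_distance(strain_1, strain_3))   # (3, 3)
```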
Lucyshyn, Joseph M.; Irvin, Larry K.; Blumberg, E. Richard; Laverty, Robelyn; Horner, Robert H.; Sprague, Jeffrey R.
2015-01-01
We conducted an observational study of parent-child interaction in home activity settings (routines) of families raising young children with developmental disabilities and problem behavior. Our aim was to empirically investigate the construct validity of coercion in typical but unsuccessful family routines. The long-term goal was to develop an expanded ecological unit of analysis that may contribute to sustainable behavioral family intervention. Ten children with autism and/or mental retardation and their families participated. Videotaped observations were conducted in typical but unsuccessful home routines. Parent-child interaction in routines was coded in real time and sequential analyses were conducted to test hypotheses about coercive processes. Following observation, families were interviewed about the social validity of the construct. Results confirmed the presence of statistically significant, attention-driven coercive processes in routines in which parents were occupied with non-child centered tasks. Results partially confirmed the presence of escape-driven coercive processes in routines in which parent demands are common. Additional analysis revealed an alternative pattern with greater magnitude. Family perspectives suggested the social validity of the construct. Results are discussed in terms of preliminary, partial evidence for coercive processes in routines of families of children with developmental disabilities. Implications for behavioral assessment and intervention design are discussed. PMID:26321883
Modular thermal analyzer routine, volume 1
NASA Technical Reports Server (NTRS)
Oren, J. A.; Phillips, M. A.; Williams, D. R.
1972-01-01
The Modular Thermal Analyzer Routine (MOTAR) is a general thermal analysis routine with strong capabilities for performing thermal analysis of systems containing flowing fluids, fluid system controls (valves, heat exchangers, etc.), life support systems, and thermal radiation situations. Its modular organization permits the analysis of a very wide range of thermal problems, from simple problems containing a few conduction nodes to those requiring complicated flow and radiation analysis, with each problem type being analyzed with peak computational efficiency and maximum ease of use. The organization and programming methods applied to MOTAR achieved a high degree of computer utilization efficiency in terms of computer execution time and storage space required for a given problem. The computer time required to perform a given problem on MOTAR is approximately 40 to 50 percent of that required by the currently existing, widely used routines. The computer storage requirement for MOTAR is approximately 25 percent more than the most commonly used routines for the simplest problems, but the data storage techniques for the more complicated options should save a considerable amount of space.
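MOTAR's source is not shown in the report; as a hedged illustration of the kind of conduction-node calculation such routines perform, the Python sketch below advances a small lumped-parameter thermal network with an explicit time step (node capacitances, conductances, and the boundary temperature are made-up values).

```python
import numpy as np

def step_network(T, C, G, dt, boundary_temp=300.0):
    """One explicit time step of a lumped-parameter conduction network.

    T: node temperatures (K); C: node thermal capacitances (J/K);
    G: symmetric node-to-node conductance matrix with zero diagonal (W/K);
    node 0 is held at a fixed boundary temperature.
    """
    q = G @ T - G.sum(axis=1) * T          # q_i = sum_j G_ij * (T_j - T_i)
    T_new = T + dt * q / C
    T_new[0] = boundary_temp               # boundary node
    return T_new

# Three nodes in a chain: boundary (300 K), interior, and tip, both starting at 350 K.
G = np.array([[0.0, 2.0, 0.0],
              [2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
C = np.array([1.0e3, 5.0e2, 2.0e2])
T = np.array([300.0, 350.0, 350.0])
for _ in range(2000):                      # 2000 steps of 0.5 s
    T = step_network(T, C, G, dt=0.5)
print(np.round(T, 2))                      # interior and tip relax toward 300 K
```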
Towe, Vivian L.; Acosta, Joie D.; Chandra, Anita
2017-01-01
Nongovernmental organizations (NGOs) are being integrated into U.S. strategies to expand the services that are available during health security threats like disasters. Identifying better ways to classify NGOs and their services could optimize disaster planning. We surveyed NGOs about the types of services they provided during different disaster phases. Survey responses were used to categorize NGO services as core—critical to fulfilling their organizational mission—or adaptive—services implemented during a disaster based on community need. We also classified NGOs as being core or adaptive types of organizations by calculating the percentage of each NGO’s services classified as core. Service types classified as core were mainly social services, while adaptive service types were those typically relied upon during disasters (e.g., warehousing, food services, etc.). In total, 120 NGOs were classified as core organizations, meaning they mainly provided the same services across disaster phases, while 100 NGOs were adaptive organizations, meaning their services changed. Adaptive NGOs were eight times more likely to report routinely participating in disaster planning as compared to core NGOs. One reason for this association may be that adaptive NGOs are more aware of the changing needs in their communities across disaster phases because of their involvement in disaster planning. PMID:29160810
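The core/adaptive classification described above rests on the share of an NGO's services that are core to its mission; the Python sketch below applies that idea with a hypothetical 50% cutoff (the survey's actual threshold and service taxonomy are not specified in the abstract).

```python
def classify_ngo(services):
    """Label an NGO 'core' or 'adaptive' from per-service core flags.

    services: dict mapping service type -> True if the service is core to the
    mission, False if adopted ad hoc during disasters. The 50% cutoff is assumed.
    """
    core_share = sum(services.values()) / len(services)
    return ("core" if core_share >= 0.5 else "adaptive"), round(core_share, 2)

food_bank = {"food services": True, "case management": True, "warehousing": False}
relief_group = {"sheltering": False, "warehousing": False, "counseling": True}
print(classify_ngo(food_bank))      # ('core', 0.67)
print(classify_ngo(relief_group))   # ('adaptive', 0.33)
```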
Abrahim, Ahmed; Al-Sayah, Mohammad; Skrdla, Peter; Bereznitski, Yuri; Chen, Yadan; Wu, Naijun
2010-01-05
Fused-core silica stationary phases represent a key technological advancement in the arena of fast HPLC separations. These phases are made by fusing a 0.5 microm porous silica layer onto 1.7 microm nonporous silica cores. The reduced intra-particle flow path of the fused particles provides superior mass transfer kinetics and better performance at high mobile phase velocities, while the fused-core particles provide lower pressure than sub-2 microm particles. In this work, chromatographic performance of the fused-core particles (Ascentis Express) was investigated and compared to that of sub-2 microm porous particles (1.8 microm Zorbax Eclipse Plus C18 and 1.7 microm Acquity BEH C18). Specifically, retention, selectivity, and loading capacity were systematically compared for these two types of columns. Other chromatographic parameters such as efficiency and pressure drop were also studied. Although the fused-core column was found to provide better analyte shape selectivity, both columns had similar hydrophobic, hydrogen bonding, total ion-exchange, and acidic ion-exchange selectivities. As expected, the retention factors and sample loading capacity on the fused-core particle column were slightly lower than those for the sub-2 microm particle column. However, the most dramatic observation was that similar efficiency separations to the sub-2 microm particles could be achieved using the fused-core particles, without the expense of high column back pressure. The low pressure of the fused-core column allows fast separations to be performed routinely on a conventional LC system without significant loss in efficiency or resolution. Applications to the HPLC impurity profiling of drug substance candidates were performed using both types of columns to validate this last point.
Rubel, Oliver; Bowen, Benjamin P
2018-01-01
Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.
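BASTet's actual interfaces are not reproduced here; as a rough illustration of the data-provenance idea it implements, the Python sketch below wraps an analysis function so that its name, parameters, and input/output shapes are appended to a provenance log that could be stored and shared alongside the derived data (all names and the toy analysis are hypothetical).

```python
import functools, json, time
import numpy as np

PROVENANCE_LOG = []   # in a real system this would be persisted with the data file

def record_provenance(func):
    """Record the analysis name, parameters, and array shapes for each call."""
    @functools.wraps(func)
    def wrapper(data, **params):
        result = func(data, **params)
        PROVENANCE_LOG.append({
            "analysis": func.__name__,
            "parameters": params,
            "input_shape": list(np.shape(data)),
            "output_shape": list(np.shape(result)),
            "timestamp": time.time(),
        })
        return result
    return wrapper

@record_provenance
def total_ion_image(spectra_cube, m_lo=0, m_hi=None):
    """Toy MSI reduction: sum intensities over an m/z window for each pixel."""
    return spectra_cube[:, :, m_lo:m_hi].sum(axis=2)

cube = np.random.default_rng(5).random((8, 8, 100))   # x, y, m/z bins (synthetic)
img = total_ion_image(cube, m_lo=10, m_hi=60)
print(json.dumps(PROVENANCE_LOG, indent=2))
```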
Saxena, Manoj K; Taylor, Colman; Billot, Laurent; Bompoint, Severine; Gowardman, John; Roberts, Jason A; Lipman, Jeffery; Myburgh, John
2015-01-01
Strategies to prevent pyrexia in patients with acute neurological injury may reduce secondary neuronal damage. The aim of this study was to determine the safety and efficacy of the routine administration of 6 grams/day of intravenous paracetamol in reducing body temperature following severe traumatic brain injury, compared to placebo. A multicentre, randomised, blind, placebo-controlled clinical trial in adult patients with traumatic brain injury (TBI). Patients were randomised to receive an intravenous infusion of either 1g of paracetamol or 0.9% sodium chloride (saline) every 4 hours for 72 hours. The primary outcome was the mean difference in core temperature during the study intervention period. Forty-one patients were included in this study: 21 were allocated to paracetamol and 20 to saline. The median (interquartile range) number of doses of study drug was 18 (17-18) in the paracetamol group and 18 (16-18) in the saline group (P = 0.85). From randomisation until 4 hours after the last dose of study treatment, there were 2798 temperature measurements (median 73 [67-76] per patient). The mean ± standard deviation temperature was 37.4±0.5°C in the paracetamol group and 37.7±0.4°C in the saline group (absolute difference -0.3°C; 95% confidence interval -0.6 to 0.0; P = 0.09). There were no significant differences in the use of physical cooling, or episodes of hypotension or hepatic abnormalities, between the two groups. The routine administration of 6g/day of intravenous paracetamol did not significantly reduce core body temperature in patients with TBI. Australian New Zealand Clinical Trials Registry ACTRN12609000444280.
Walker, Andreas; Bergmann, Matthias; Camdereli, Jennifer; Kaiser, Rolf; Lübke, Nadine; Timm, Jörg
2017-06-01
HCV treatment options and cure rates have tremendously increased in the last decade. Although a pan-genotype HCV treatment has recently been approved, most DAA therapies are still genotype specific. Resistance-associated variants (RAVs) can limit the efficacy of DAA therapy and are associated with increased risk for therapy failure. With the approval of DAA regimens that recommend resistance testing prior to therapy, correct assessment of the genotype and testing for viruses with RAVs is clinically relevant. However, genotyping and resistance testing is generally done in costly and laborious separate reactions. The aim of the study was to establish a genotype-independent full-genome reverse transcription protocol to generate a template for both genotyping and resistance testing and to implement it into our routine diagnostic setup. The complete HCV genome was reverse transcribed with a pan-genotype primer binding at the 3'end of the viral RNA. This cDNA served as template for transcription of the genotyping amplicon in the core region as well as for the resistance testing of NS3, NS5A, and NS5B. With the established RT-protocol the HCV core region was successfully amplified and genotyped from 124 out of 125 (99.2%) HCV-positive samples. The amplification efficiency of RAV containing regions in NS3, NS5A, NS5B was 96.2%, 96.6% and 94.4%, respectively. We developed a method for HCV full-genome cDNA synthesis and implemented it into a routine diagnostic setup. This cDNA can be used as template for genotyping amplicons covering the core or NS5B region as well as for resistance testing amplicons in NS3, NS5A and NS5B. Copyright © 2017 Elsevier B.V. All rights reserved.
Kinetic turbulence simulations at extreme scale on leadership-class systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Bei; Ethier, Stephane; Tang, William
2013-01-01
Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20-billion-dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q, on the 786,432 cores of Mira at ALCF and, recently, on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems by enabling routine simulations at unprecedented size (130 million grid points, ITER scale) and resolution (65 billion particles).
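The gyrokinetic physics in GTC-P is far beyond a short example, but the basic particle-in-cell cycle the abstract refers to (deposit charge on a grid, solve for the field, gather it back and push the particles) can be illustrated in one dimension. The following is a minimal electrostatic 1D sketch in NumPy, not the GTC-P algorithm; grid size, particle number and normalizations are arbitrary illustrative choices.

```python
import numpy as np

# Minimal 1D electrostatic particle-in-cell cycle (illustrative only; not GTC-P).
ng, npart, L, dt = 64, 10000, 2 * np.pi, 0.1
dx = L / ng
rng = np.random.default_rng(0)
x = rng.uniform(0, L, npart)          # particle positions
v = rng.normal(0, 1, npart)           # particle velocities
q_over_m, weight = -1.0, L / npart    # arbitrary normalized charge/mass and particle weight

for step in range(100):
    # 1) Deposit charge onto the grid with linear (cloud-in-cell) weighting.
    g = x / dx
    i = np.floor(g).astype(int) % ng
    frac = g - np.floor(g)
    rho = np.zeros(ng)
    np.add.at(rho, i, weight * (1 - frac))
    np.add.at(rho, (i + 1) % ng, weight * frac)
    rho -= rho.mean()                 # neutralizing background

    # 2) Solve Poisson's equation on the periodic grid with an FFT: -phi'' = rho.
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E = np.real(np.fft.ifft(-1j * k * phi_k))   # E = -d(phi)/dx

    # 3) Gather the field at particle positions and push the particles.
    E_p = E[i] * (1 - frac) + E[(i + 1) % ng] * frac
    v += q_over_m * E_p * dt
    x = (x + v * dt) % L
```

In production gyrokinetic codes the same deposit/solve/gather/push structure is distributed over many domains and threads, which is where the multithreading and domain decomposition optimizations mentioned above apply.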
Biological, Physical, And Chemical Data From Gulf of Mexico Core PE0305-GC1
Osterman, Lisa E.; Swarzenski, Peter W.; Hollander, David
2010-01-01
This paper presents benthic foraminiferal census data, along with magnetic susceptibility, 210Pb, radiocarbon, and geochemical measurements from gravity core PE0305-GC1 (=GC1). Core GC1 was collected from the Louisiana continental shelf as part of an initiative to investigate the geographic and temporal extent of hypoxia, low-oxygen water, in the Gulf of Mexico. Hypoxia (<1.4 ml/l or <2 ppm oxygen concentration) in Gulf of Mexico waters can eventually lead to the death of marine species. The development of hypoxia off the Mississippi delta has increased steadily since routine and systematic measurements began in 1985 and has been linked to the use of fertilizer in the Mississippi basin. Benthic foraminifers provide a proxy to track the development of hypoxia prior to 1985. Previous work determined that the relative occurrence of three low-oxygen-tolerant species is highest in the hypoxia zone. The cumulative percentage of these three species (% Pseudononion atlanticum + % Epistominella vitrea + % Buliminella morgani = PEB index of hypoxia) was used to investigate fluctuations in paleohypoxia in four cores, including the upper 60 cm of GC1. In this report, we compile all available data from GC1 as the basis for further publications.
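The PEB index defined above is simply a cumulative relative abundance; a minimal sketch of the calculation, using hypothetical census counts rather than the GC1 data, is:

```python
def peb_index(counts):
    """Cumulative % of the three low-oxygen-tolerant species (PEB index of hypoxia).

    `counts` maps species name -> number of individuals in one census sample.
    """
    peb_species = ("Pseudononion atlanticum", "Epistominella vitrea", "Buliminella morgani")
    total = sum(counts.values())
    return 100.0 * sum(counts.get(s, 0) for s in peb_species) / total

# Hypothetical census sample (not data from core GC1):
sample = {"Pseudononion atlanticum": 40, "Epistominella vitrea": 25,
          "Buliminella morgani": 15, "other benthic species": 120}
print(f"PEB index: {peb_index(sample):.1f}%")   # -> 40.0%
```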
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wardaya, P. D.; Noh, K. A. B. M.; Yusoff, W. I. B. W.
This paper discusses a new approach for investigating the seismic wave velocity of rock, specifically carbonates, as affected by their pore structures. While the conventional routine of seismic velocity measurement depends heavily on extensive laboratory experiments, the proposed approach adopts the digital rock physics view, which relies on numerical experiments. Thus, instead of using a core sample, we use the thin section image of carbonate rock to measure the effective velocity of seismic waves travelling through it. In the numerical experiment, thin section images act as the medium on which wave propagation is simulated. For the modeling, an advanced technique based on an artificial neural network was employed to build the velocity and density profile, replacing each image's RGB pixel value with the seismic velocity and density of the corresponding rock constituent. Then, an ultrasonic wave was simulated to propagate through the thin section image using the finite difference time domain method, under the assumption of an acoustic, isotropic medium. Effective velocities were drawn from the recorded signal and compared to velocity models from the Wyllie time-average model and the Kuster-Toksoz rock physics model. To perform the modeling, image analysis routines were undertaken to quantify the pore aspect ratio, which is assumed to represent the rock's pore structure. In addition, the porosity and mineral fractions required for velocity modeling were quantified using an integrated neural network and image analysis technique. It was found that Kuster-Toksoz gives a closer prediction to the measured velocity than the Wyllie time-average model. We also conclude that the Wyllie time average, which does not incorporate a pore-structure parameter, deviates significantly for samples having more than 40% porosity. Utilizing this approach, we found good agreement between the numerical experiment and the theoretically derived rock physics model for estimating the effective seismic wave velocity of rock.
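For reference, the Wyllie time-average model mentioned above treats slowness as a porosity-weighted average of fluid and matrix slownesses, 1/V = phi/V_fluid + (1 - phi)/V_matrix, which contains no pore-structure term and hence deviates for high-porosity carbonates. A minimal sketch, with illustrative matrix and fluid velocities (not values from the paper), is:

```python
def wyllie_velocity(phi, v_matrix, v_fluid):
    """Effective P-wave velocity from the Wyllie time-average equation.

    1/V = phi/V_fluid + (1 - phi)/V_matrix -- no pore-structure dependence,
    which is why it breaks down for complex-pore, high-porosity carbonates.
    """
    return 1.0 / (phi / v_fluid + (1.0 - phi) / v_matrix)

# Illustrative values only: calcite matrix ~6640 m/s, brine ~1500 m/s.
for phi in (0.1, 0.2, 0.4):
    print(f"porosity {phi:.0%}: Vp ~ {wyllie_velocity(phi, 6640.0, 1500.0):.0f} m/s")
```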
NASA Astrophysics Data System (ADS)
Cécillon, Lauric; Quénéa, Katell; Anquetil, Christelle; Barré, Pierre
2015-04-01
Due to its large heterogeneity at all scales (from the soil core to the globe), several measurements are often mandatory to obtain a meaningful value of a measured soil property. A large number of measurements can therefore be needed to study a soil property, whatever the scale of the study. Moreover, several soil investigation techniques produce large and complex datasets; pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS), for example, produces complex three-way data. In this context, straightforward methods designed to speed up data treatment are needed to deal with large datasets. Py-GC-MS is a powerful and frequently used tool to characterize soil organic matter (SOM). However, treating the results of a Py-GC-MS analysis of a soil sample is time consuming (number of peaks, co-elution, etc.), and the treatment of large sets of Py-GC-MS results is rather laborious. Moreover, peak position shifts and baseline drifts between analyses make automated data treatment difficult. These problems can be addressed using Parallel Factor Analysis 2 (PARAFAC2; Kiers et al., 1999; Bro et al., 1999). This algorithm has been applied frequently to chromatographic data but has never been applied to analyses of SOM. We developed a Matlab routine, based on existing Matlab packages, dedicated to the simultaneous treatment of dozens of pyrochromatogram mass spectra. We applied this routine to 40 soil samples. The benefits and expected improvements of our method will be discussed in our poster. References: Kiers et al. (1999) PARAFAC2 - Part I. A direct fitting algorithm for the PARAFAC2 model. Journal of Chemometrics, 13: 275-294. Bro et al. (1999) PARAFAC2 - Part II. Modeling chromatographic data with retention time shifts. Journal of Chemometrics, 13: 295-309.
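The authors' routine is Matlab-based. Purely as an illustration of the same idea in Python, a PARAFAC2 decomposition of a stack of pyrochromatograms (one retention-time x m/z matrix per sample, with retention-time shifts allowed between samples) might be sketched as below; this assumes the tensorly library exposes a parafac2 function with this call pattern, and the array sizes and rank are hypothetical.

```python
import numpy as np
from tensorly.decomposition import parafac2   # assumed API; check your tensorly version

# Hypothetical data: one (retention time x m/z) intensity matrix per soil sample.
# PARAFAC2 allows the retention-time mode to differ (shift) between samples.
rng = np.random.default_rng(0)
slices = [rng.random((60 + i, 80)) for i in range(15)]   # 15 samples, 80 m/z channels

# Decompose into a small number of "pure component" profiles; the rank is a user choice.
decomposition = parafac2(slices, rank=4, n_iter_max=100, init="random")
# The returned object bundles the component weights, the factor matrices for the
# sample, elution and m/z modes, and per-sample projection matrices; exact accessors
# depend on the tensorly version, so inspect the object interactively.
print(type(decomposition))
```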
City Core - detecting the anthropocene in urban lake cores
NASA Astrophysics Data System (ADS)
Kjaer, K. H.; Ilsøe, P.; Andresen, C. S.; Rasmussen, P.; Andersen, T. J.; Frei, R.; Schreiber, N.; Odgaard, B.; Funder, S.; Holm, J. M.; Andersen, K.
2011-12-01
Here, we present preliminary results from lake cores taken in ditches associated with the historical fortifications enclosing the oldest part of central Copenhagen, with the aim of gaining new knowledge from sediment deposits related to anthropogenic activities. We have examined sediment cores with X-ray fluorescence (XRF) analyzers to correlate element patterns from urban and industrial emissions. Thus, we aim to track these patterns back in time - long before routine recording of the atmospheric environment began around 1978. Furthermore, we compare our data to alternative sources of information in order to constrain and extend the temporal dating limits (approximately 1890) achieved from 210Pb activity. From customs reports and statistical sources, information on imported volumes of coal, metals and oil was obtained, and contaminants from these substances were related to the sediment archives. Intriguingly, we find a steep increase in the import of coal and metals matching the exponential increase of lead and zinc counts from XRF recordings of the sediment cores. From this finding, we claim to have constrained the onset of urban industrialization. In order to confirm the age resolution of the lake cores, sedimentary ancient DNA (sedaDNA) was extracted from the sediments. With it we attempt to trace the planting of well-documented exotic plants in, for instance, the Botanical Garden. Through extraction and sampling of sedaDNA from these floral and arboreal specimens, we intend to locate their stratigraphic horizons in the sediment core. These findings may anchor the record back to 1872, when the garden was established on the area of the former fortification. Through this line of research, we hope to gain important supplementary knowledge of sedaDNA leaching frequencies within freshwater sediments.
Hakeem, Abdul R; Birks, Theodore; Azeem, Qasim; Di Franco, Filippo; Gergely, Szabolcs; Harris, Adrian M
2016-06-01
There is conflicting evidence for the use of warmed, humidified carbon dioxide (CO2) for creating pneumoperitoneum during laparoscopic cholecystectomy. A few studies have reported less post-operative pain and lower analgesic requirements when warmed CO2 was used. This systematic review and meta-analysis aims to analyse the literature on the use of warmed CO2 in comparison to standard temperature CO2 during laparoscopic cholecystectomy. Systematic review and meta-analysis carried out in line with the PRISMA guidelines. Primary outcomes of interest were post-operative pain at 6 h, day 1 and day 2 following laparoscopic cholecystectomy. Secondary outcomes were analgesic usage and drop in intra-operative core body temperature. Standardised Mean Difference (SMD) was calculated for continuous variables. Six randomised controlled trials (RCTs) met the inclusion criteria (n = 369). There was no significant difference in post-operative pain at 6 h [3 RCTs; SMD = -0.66 (-1.33, 0.02) (Z = 1.89) (P = 0.06)], day 1 [4 RCTs; SMD = -0.51 (-1.47, 0.44) (Z = 1.05) (P = 0.29)] and day 2 [2 RCTs; SMD = -0.96 (-2.30, 0.37) (Z = 1.42) (P = 0.16)] between the warmed CO2 and standard CO2 group. There was no difference in analgesic usage between the two groups, but pooled analysis was not possible. Two RCTs reported a significant drop in intra-operative core body temperature, but there were no adverse events related to this. This review showed no difference in post-operative pain and analgesic requirements between warmed and standard CO2 insufflation during laparoscopic cholecystectomy. Currently there is not enough high-quality evidence to suggest routine usage of warmed CO2 for creating pneumoperitoneum during laparoscopic cholecystectomy. Copyright © 2015 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
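The pooling behind such SMD estimates is typically inverse-variance weighting of the per-trial standardised mean differences. A minimal fixed-effect sketch, using hypothetical per-trial estimates rather than the values reported in this review (which may well have used a random-effects model), is:

```python
import math

def pooled_smd(smds, ses):
    """Fixed-effect inverse-variance pooling of standardised mean differences."""
    weights = [1.0 / se ** 2 for se in ses]                       # weight = 1 / variance
    pooled = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)   # 95% confidence interval
    return pooled, ci

# Hypothetical per-trial estimates (not the trials analysed above):
smds = [-0.70, -0.20, -0.55]
ses = [0.35, 0.30, 0.40]
print(pooled_smd(smds, ses))
```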
Cox, Helen S; Mbhele, Slindile; Mohess, Neisha; Whitelaw, Andrew; Muller, Odelia; Zemanay, Widaad; Little, Francesca; Azevedo, Virginia; Simpson, John; Boehme, Catharina C; Nicol, Mark P
2014-11-01
Xpert MTB/RIF is approved for use in tuberculosis (TB) and rifampicin-resistance diagnosis. However, data are limited on the impact of Xpert under routine conditions in settings with high TB burden. A pragmatic prospective cluster-randomised trial of Xpert for all individuals with presumptive (symptomatic) TB compared to the routine diagnostic algorithm of sputum microscopy and limited use of culture was conducted in a large TB/HIV primary care clinic. The primary outcome was the proportion of bacteriologically confirmed TB cases not initiating TB treatment by 3 mo after presentation. Secondary outcomes included time to TB treatment and mortality. Unblinded randomisation occurred on a weekly basis. Xpert and smear microscopy were performed on site. Analysis was both by intention to treat (ITT) and per protocol. Between 7 September 2010 and 28 October 2011, 1,985 participants were assigned to the Xpert (n = 982) and routine (n = 1,003) diagnostic algorithms (ITT analysis); 882 received Xpert and 1,063 routine (per protocol analysis). 13% (32/257) of individuals with bacteriologically confirmed TB (smear, culture, or Xpert) did not initiate treatment by 3 mo after presentation in the Xpert arm, compared to 25% (41/167) in the routine arm (ITT analysis, risk ratio 0.51, 95% CI 0.33-0.77, p = 0.0052). The yield of bacteriologically confirmed TB cases among patients with presumptive TB was 17% (167/1,003) with routine diagnosis and 26% (257/982) with Xpert diagnosis (ITT analysis, risk ratio 1.57, 95% CI 1.32-1.87, p<0.001). This difference in diagnosis rates resulted in a higher rate of treatment initiation in the Xpert arm: 23% (229/1,003) and 28% (277/982) in the routine and Xpert arms, respectively (ITT analysis, risk ratio 1.24, 95% CI 1.06-1.44, p = 0.013). Time to treatment initiation was improved overall (ITT analysis, hazard ratio 0.76, 95% CI 0.63-0.92, p = 0.005) and among HIV-infected participants (ITT analysis, hazard ratio 0.67, 95% CI 0.53-0.85, p = 0.001). There was no difference in 6-mo mortality with Xpert versus routine diagnosis. Study limitations included incorrect intervention allocation for a high proportion of participants and that the study was conducted in a single clinic. These data suggest that in this routine primary care setting, use of Xpert to diagnose TB increased the number of individuals with bacteriologically confirmed TB who were treated by 3 mo and reduced time to treatment initiation, particularly among HIV-infected participants. Pan African Clinical Trials Registry PACTR201010000255244. Please see later in the article for the Editors' Summary.
Nursing Routine Data as a Basis for Association Analysis in the Domain of Nursing Knowledge
Sellemann, Björn; Stausberg, Jürgen; Hübner, Ursula
2012-01-01
This paper describes the data mining method of association analysis within the framework of Knowledge Discovery in Databases (KDD), with the aim of identifying standard patterns of nursing care. The approach is application-oriented and is applied to nursing routine data recorded with the LEP Nursing 2 method. The increasing use of information technology in hospitals, especially of nursing information systems, requires the storage of large data sets, which hitherto have not always been analyzed adequately. Three association analyses, for the days of admission, surgery and discharge, have been performed. The results of almost 1.5 million generated association rules indicate that it is valid to apply association analysis to nursing routine data. All rules are semantically trivial, since they reflect existing knowledge from the domain of nursing. This may be due either to the LEP Nursing 2 method or to the nursing activities themselves. Nonetheless, association analysis may in future become a useful analytical tool on the basis of structured nursing routine data. PMID:24199122
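As a generic illustration of the association-analysis step (not the LEP Nursing 2 data or the authors' software), frequent itemsets and rules over one-hot-encoded nursing activities could be mined with the mlxtend library; the activity names below are hypothetical.

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical one-hot matrix: rows = patient-days, columns = documented nursing activities.
records = pd.DataFrame(
    [
        {"vital_signs": 1, "mobilisation": 1, "wound_care": 0, "discharge_planning": 0},
        {"vital_signs": 1, "mobilisation": 1, "wound_care": 1, "discharge_planning": 0},
        {"vital_signs": 1, "mobilisation": 0, "wound_care": 0, "discharge_planning": 1},
        {"vital_signs": 1, "mobilisation": 1, "wound_care": 0, "discharge_planning": 1},
    ]
).astype(bool)

# Mine itemsets occurring in at least half the records, then derive high-confidence rules.
frequent = apriori(records, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```

On real routine data the interesting question is usually which high-support rules are *not* trivially explained by existing nursing knowledge.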
Pure flat epithelial atypia: is there a place for routine surgery?
Ceugnart, L; Doualliez, V; Chauvet, M-P; Robin, Y-M; Bachelle, F; Chaveron, C; Rocourt, N; Pouliquen, G; Jarraya, H; Taieb, S
2013-09-01
To determine whether it is appropriate to routinely undertake surgery if flat epithelial atypia (FEA) or pure flat epithelial atypia (pFEA) is found on large-core biopsy. Between 2005 and 2010, 1678 large-core biopsy procedures were carried out, which led to 136 FEA sites being identified, 63 of which, across 59 patients, were pFEA (four patients had two sites of pFEA each). Forty-eight patients underwent further surgical excision, equating to 52 excised sites of pFEA. Of the 52 operated sites, there were 20 benign lesions (38%), 26 borderline lesions (56%), and three ductal carcinomas in situ (6%). The rate of histologic underestimation was put at 3.8%. Of the three cases that were underestimated, one was discarded because the definitive histology was not representative of the site from which microcalcifications had initially been taken. The other two cases that were underestimated were found in patients with an increased individual risk of breast cancer. In patients with no personal or first-degree family history of breast cancer, after complete or subtotal image-guided excision of the radiological lesion, and while excluding images fitting BI-RADS 5, annual monitoring may be offered as an alternative to surgical excision in view of the absence of underestimation found in our study. Copyright © 2013 Éditions françaises de radiologie. Published by Elsevier Masson SAS. All rights reserved.
Fitness characteristics of a suburban special weapons and tactics team.
Pryor, Riana R; Colburn, Deanna; Crill, Matthew T; Hostler, David P; Suyama, J
2012-03-01
Special Weapons and Tactics (SWAT) operators are specialized law enforcement officers who traditionally perform their duties with higher anticipated workloads because of additional body armor, weapons, and equipment used for enhanced operations and protection. This elevated workload increases the need for SWAT operators to improve or maintain their physical fitness to consistently perform routine operations. Typical tasks require trunk rotation, overhead upper extremity use, upper and lower body strength use, and long waiting periods followed by explosive movements while wearing additional equipment. Eleven male SWAT operators from 1 SWAT team performed flexibility, strength, power, and aerobic capacity tests and a variety of job-related tasks. Data were compared with age- and gender-based normative data. Fitness testing revealed that officers ranked high on tests of muscular strength (leg strength, 90th percentile; bench press, 85th percentile); however, body composition (55th percentile), core body strength, and flexibility ranked lower. Furthermore, aerobic capacity and muscular power had a wide range of scores and were also not ideal to support maximal performance during routine operations. These data can assist exercise specialists in designing fitness programs tailored to the job-related tasks of SWAT operators. Fitness programming for law enforcement should focus on improving aerobic fitness, flexibility, core strength, and muscular power while maintaining muscular strength to meet the needs of these specialized officers.
Sim, Y T; Litherland, J; Lindsay, E; Hendry, P; Brauer, K; Dobson, H; Cordiner, C; Gagliardi, T; Smart, L
2015-05-01
To identify factors affecting upgrade rates from B5a (non-invasive) preoperative core biopsies to invasive disease at surgery and ways to improve screening performance. This was a retrospective analysis of 1252 cases of B5a biopsies across all six Scottish Breast Screening Programmes (BSPs), spanning 2004 to 2012. Final surgical histopathology was correlated with radiological and biopsy factors. Data were analysed in Microsoft Excel, and the standard chi-squared test was used to evaluate statistical significance. B5a upgrade rates for the units ranged from 19.2% to 29.2%, with an average of 23.6%. Mean sizes of invasive tumours were small (3-11 mm). The upgrade rate was significantly higher for cases where the main mammographic abnormality was mass, distortion, or asymmetry, compared with micro-calcification alone (33.2% versus 21.7%, p = 0.0004). The upgrade rate was significantly lower with the use of large-volume vacuum-assisted biopsy (VAB) devices than with 14 G core needles (19.9% versus 26%, p = 0.013), and in stereotactic than in ultrasound-guided biopsies (21.2% versus 36.1%, p < 0.001). Heterogeneity of data from different centres limited evaluation of other potential factors. Upgrade rates are lower for cases with micro-calcification as the sole mammographic feature and with the use of VAB devices. Nevertheless, there is variation in practice across Scottish BSPs, including first-line biopsy technique and/or device; and it is of interest that a few centres maintain low upgrade rates despite not using VAB routinely for biopsy of micro-calcification. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Kriegsmann, Mark; Casadonte, Rita; Kriegsmann, Jörg; Dienemann, Hendrik; Schirmacher, Peter; Hendrik Kobarg, Jan; Schwamborn, Kristina; Stenzinger, Albrecht; Warth, Arne; Weichert, Wilko
2016-01-01
Histopathological subtyping of non-small cell lung cancer (NSCLC) into adenocarcinoma (ADC) and squamous cell carcinoma (SqCC) is of utmost relevance for treatment stratification. However, current immunohistochemistry (IHC)-based typing approaches on biopsies are imperfect; therefore, novel analytical methods for reliable subtyping are needed. We analyzed formalin-fixed paraffin-embedded tissue cores of NSCLC by Matrix-assisted laser desorption/ionization (MALDI) imaging on tissue microarrays to identify and validate discriminating MALDI imaging profiles for NSCLC subtyping. 110 ADC and 98 SqCC were used to train a Linear Discriminant Analysis (LDA) model. Results were validated on a separate set of 58 ADC and 60 SqCC. Selected differentially expressed proteins were identified by tandem mass spectrometry and validated by IHC. The LDA classification model incorporated 339 m/z values. In the validation cohort, in 117 cases (99.1%) MALDI classification on tissue cores was in accordance with the pathological diagnosis made on the resection specimen. Overall, three cases in the combined cohorts were discordant; after reevaluation, two were initially misclassified by pathology whereas one was classified incorrectly by MALDI. Identification of differentially expressed peptides detected well-known IHC discriminators (CK5, CK7), but also less well known differentially expressed proteins (CK15, HSP27). In conclusion, MALDI imaging on NSCLC tissue cores as small biopsy equivalents is capable of discriminating lung ADC and SqCC with very high accuracy. In addition, replacing multislide IHC with a one-slide MALDI approach may also save tissue for subsequent predictive molecular testing. We therefore advocate pursuing routine diagnostic implementation strategies for MALDI imaging in solid tumor typing. PMID:27473201
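As a generic illustration of the classification step only (a cross-validated linear discriminant analysis on a peak-intensity feature matrix, not the authors' MALDI pipeline), a scikit-learn sketch might look like this; the random feature matrix is a placeholder for real spectra.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder training data: one row per tissue core, one column per m/z feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(208, 339))          # e.g. 110 ADC + 98 SqCC cores, 339 m/z values
y = np.array([0] * 110 + [1] * 98)       # 0 = ADC, 1 = SqCC

model = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")   # ~0.5 on random placeholder data
```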
Discovering epistasis in large scale genetic association studies by exploiting graphics cards.
Chen, Gary K; Guo, Yunfei
2013-12-03
Despite the enormous investments made in collecting DNA samples and generating germline variation data across thousands of individuals in modern genome-wide association studies (GWAS), progress has been frustratingly slow in explaining much of the heritability in common disease. Today's paradigm of testing independent hypotheses on each single nucleotide polymorphism (SNP) marker is unlikely to adequately reflect the complex biological processes in disease risk. Alternatively, modeling risk as an ensemble of SNPs that act in concert in a pathway, and/or interact non-additively on log risk for example, may be a more sensible way to approach gene mapping in modern studies. Implementing such analyses genome-wide can quickly become intractable because even modest-size SNP panels on modern genotype arrays (500k markers) pose a combinatorial nightmare, requiring tens of billions of models to be tested for evidence of interaction. In this article, we provide an in-depth analysis of programs that have been developed to explicitly overcome these enormous computational barriers through the use of processors on graphics cards known as Graphics Processing Units (GPU). We include tutorials on GPU technology, which will convey why they are growing in appeal with today's numerical scientists. One obvious advantage is the impressive density of microprocessor cores that are available on only a single GPU. Whereas high end servers feature up to 24 Intel or AMD CPU cores, the latest GPU offerings from nVidia feature over 2600 cores. Each compute node may be outfitted with up to 4 GPU devices. Success on GPUs varies across problems. However, epistasis screens fare well due to the high degree of parallelism exposed in these problems. Papers that we review routinely report GPU speedups of over two orders of magnitude (>100x) over standard CPU implementations.
Sugano, Mitsutoshi; Shimada, Masashi; Moriyoshi, Miho; Kitagawa, Kiyoki; Nakashima, Hiromi; Wada, Hideo; Yanagihara, Katsunori; Fujisawa, Shinya; Yonekawa, Osamu; Honda, Takayuki
2012-05-01
Routine laboratory data are discussed using time series analysis in reversed clinicopathological conferences (R-CPC) at Shinshu University School of Medicine. We can identify fine changes in the laboratory data and the importance of negative data (without any changes) using time series analysis. Routine laboratory tests can be performed repeatedly and relatively cheaply, and time series analysis can be performed. The examination process of routine laboratory data in the R-CPC is almost the same as the process of taking physical findings. Firstly, general findings are checked and then the state of each organ is examined. Although routine laboratory data are cheap, we can obtain much more information about a patient's state than from physical examinations. In this R-CPC, several specialists in the various fields of laboratory medicine discussed the routine laboratory data of a patient, and we tried to understand the detailed state of the patient. R-CPC is an educational method for examining laboratory data, and we reconfirmed its usefulness in elucidating the clinical state of the patient.
O'Sullivan, Daniel J; Cotter, Paul D; O'Sullivan, Orla; Giblin, Linda; McSweeney, Paul L H; Sheehan, Jeremiah J
2015-04-01
We sought to determine if the time, within a production day, that a cheese is manufactured has an influence on the microbial community present within that cheese. To facilitate this, 16S rRNA amplicon sequencing was used to elucidate the microbial community dynamics of brine-salted continental-type cheese in cheeses produced early and late in the production day. Differences in the microbial composition of the core and rind of the cheese were also investigated. Throughout ripening, it was apparent that cheeses produced late in the day had a more diverse microbial population than their early equivalents. Spatial variation between the cheese core and rind was also noted in that cheese rinds were initially found to have a more diverse microbial population but thereafter the opposite was the case. Interestingly, the genera Thermus, Pseudoalteromonas, and Bifidobacterium, not routinely associated with a continental-type cheese produced from pasteurized milk, were detected. The significance, if any, of the presence of these genera will require further attention. Ultimately, the use of high-throughput sequencing has facilitated a novel and detailed analysis of the temporal and spatial distribution of microbes in this complex cheese system and established that the period during a production cycle at which a cheese is manufactured can influence its microbial composition. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Ollier, Laurence; Laffont, Catherine; Kechkekian, Aurore; Doglio, Alain; Giordanengo, Valérie
2008-12-01
Routine use of the automated chemiluminescent microparticle immunoassay Abbott ARCHITECT anti-HBc for diagnosis of hepatitis B is limited in cases of borderline-reactive sera with a low signal close to the cut-off index. In order to determine the significance of anti-HBc detection when borderline reactivity occurs using the ARCHITECT anti-HBc assay, a comparative study was designed. 3540 serum samples collected over a 2-month period in the hospital of Nice were examined for markers of HBV infection (HBsAg, anti-HBs and anti-HBc). One hundred seven samples with sufficient volume and with borderline reactivity by the ARCHITECT assay were tested by two other anti-HBc assays, a microparticle enzyme immunoassay (MEIA, AxSYM Core, Abbott Laboratories, IL, USA) and an enzyme linked fluorescent assay (ELFA, VIDAS Anti-HBc Total II, bioMérieux, Lyon, France). Only 46 samples were confirmed by the AxSYM and the VIDAS assays. Additional serological information linked to patient history showed that the remaining samples (61) were false positives (11), had low titers of anti-HBc antibodies (13), or were inconclusive (37). This comparative study highlighted the existence of a grey zone around the cut-off index. Confirmation with a different immunoassay is needed to establish the diagnosis of HBV infection for sera that are borderline reactive in the ARCHITECT anti-HBc assay.
Towards a Formative Assessment of Classroom Competencies (FACCs) for postgraduate medical trainees.
Gray, Christopher S; Hildreth, Anthony J; Fisher, Catherine; Brown, Andrew; Jones, Anita; Turner, Ruth; Boobis, Leslie
2008-12-17
An assumption of clinical competency is no longer acceptable or feasible in routine clinical practice. We sought to determine the feasibility, practicability and efficacy of undertaking a formal assessment of clinical competency for all postgraduate medical trainees in a large NHS foundation trust. FY1 doctors were asked to complete a questionnaire to determine prior experience and self-reported confidence in performing the GMC core competencies. From this, a consensus panel of key partners considered and developed an 8-station Objective Structured Clinical Examination (OSCE) circuit to assess clinical competencies in all training-grade medical staff... The OSCE was then administered to all training-grade doctors as part of their NHS trust induction process. 106 (87.6% of all trainees) participated in the assessment during the first 14 days of appointment. Candidates achieved high median raw percentage scores for the majority of stations; however, analysis of pre-defined critical errors and omissions identified important areas of concern. The performance of newly qualified FY1 doctors was significantly better than that of other grades for the arterial blood gas estimation and nasogastric tube insertion stations. Delivering a formal classroom assessment of clinical competencies to all trainees as part of the induction process was both feasible and useful. The assessment identified areas of concern for future training and also served to provide reassurance about the proficiency of trainees in undertaking the majority of core competencies.
Overview of Recent Alcator C-Mod Highlights
NASA Astrophysics Data System (ADS)
Marmar, Earl; C-Mod Team
2013-10-01
Analysis and modeling of recent C-Mod experiments has yielded significant results across multiple research topics. I-mode provides routine access to high confinement plasma (H98 up to 1.2) in quasi-steady state, without large ELMs; pedestal pressure and impurity transport are regulated by short-wavelength EM waves, and core turbulence is reduced. Multi-channel transport is being investigated in Ohmic and RF-heated plasmas, using advanced diagnostics to validate non-linear gyrokinetic simulations. Results from the new field-aligned ICRF antenna, including significantly reduced high-Z metal impurity contamination, and greatly improved load-tolerance, are being understood through antenna-plasma modeling. Reduced LHCD efficiency at high density correlates with parametric decay and enhanced edge absorption. Strong flow drive and edge turbulence suppression are seen from LHRF, providing new approaches for plasma control. Plasma density profiles directly in front of the LH coupler show non-linear modifications, with important consequences for wave coupling. Disruption-mitigation experiments using massive gas injection at multiple toroidal locations show unexpected results, with potentially significant implications for ITER. First results from a novel accelerator-based PMI diagnostic are presented. What would be the world's first actively-heated high-temperature advanced tungsten divertor is designed and ready for construction. Conceptual designs are being developed for an ultra-advanced divertor facility, Alcator DX, to attack key FNSF and DEMO heat-flux challenges integrated with a high-performance core. Supported by USDOE.
Campbell, Malcolm; Gibson, Will; Hall, Andy; Richards, David; Callery, Peter
2008-05-01
Web-based technologies are increasingly being used to create modes of online learning for nurses but their effect has not been assessed in nurse education. Assess whether participation in face-to-face discussion seminars or online asynchronous discussion groups had different effects on educational attainment in a web-based course. Non-randomised or quasi-experimental design with two groups-students choosing to have face-to-face discussion seminars and students choosing to have online discussions. The Core Methods module of a postgraduate research methods course. All 114 students participating in the first 2 yr during which the course teaching material was delivered online. Assignment mark for Core Methods course module. Background details of the students, their choices of modules and assignment marks were collected as part of the routine course administration. Students' online activities were identified using the student tracking facility within WebCT. Regression models were fitted to explore the association between available explanatory variables and assignment mark. Students choosing online discussions had a higher Core Methods assignment mark (mean 60.8/100) than students choosing face-to-face discussions (54.4); the difference was statistically significant (t=3.13, df=102, p=0.002), although this ignores confounding variables. Among online discussion students, assignment mark was significantly correlated with the numbers of discussion messages read (Kendall's tau(b)=0.22, p=0.050) and posted (Kendall's tau(b)=0.27, p=0.017); among face-to-face discussion students, it was significantly associated with the number of non-discussion hits in WebCT (Kendall's tau(b)=0.19, p=0.036). In regression analysis, choice of discussion method, whether an M.Phil./Ph.D. student, number of non-discussion hits in WebCT, number of online discussion messages read and number posted were associated with assignment mark at the 5% level of significance when taken singly; in combination, only whether an M.Phil./Ph.D. student (p=0.024) and number of non-discussion hits (p=0.045) retained significance. This study demonstrates that a research methods course can be delivered to postgraduate healthcare students at least as successfully by an entirely online method in which students participate in online discussion as by a blended method in which students accessing web-based teaching material attend face-to-face seminar discussions. Increased online activity was associated with higher assignment marks. The study highlights new opportunities for educational research that arise from the use of virtual learning environments that routinely record the activities of learners and tutors.
Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J
2012-11-09
A new alternative data processing tool set, metAlignID, is developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software is developed to run multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is as such capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries) (NIST=National Institute of Standards and Technology) and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO, ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data-analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than the standard processing and also adds a quantitative estimate. The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly-automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
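The batch-processing design described above (one worker per processor core, each converted data file handled independently) can be sketched generically; search_library below is a hypothetical placeholder for the pre-processing and library-search step and is not part of the metAlignID API.

```python
import glob
from concurrent.futures import ProcessPoolExecutor

def search_library(netcdf_path):
    """Hypothetical placeholder: pre-process one converted data file and
    search it against a user-composed target library, returning the hits."""
    # ... noise reduction, peak picking and spectral matching would go here ...
    return netcdf_path, []

if __name__ == "__main__":
    files = sorted(glob.glob("converted/*.cdf"))
    # One worker process per core; each raw data set is handled independently,
    # so a batch of ~30 converted files can be screened concurrently.
    with ProcessPoolExecutor() as pool:
        for path, hits in pool.map(search_library, files):
            print(path, len(hits), "target compounds found")
```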
Brandes, Ivo F; Perl, Thorsten; Bauer, Martin; Bräuer, Anselm
2015-02-01
Reliable continuous perioperative core temperature measurement is of major importance. The pulmonary artery catheter is currently the gold standard for measuring core temperature but is invasive and expensive. Using a manikin, we evaluated the new, noninvasive SpotOn™ temperature monitoring system (SOT). With a sensor placed on the lateral forehead, SOT uses zero heat flux technology to noninvasively measure core temperature; and because the forehead is devoid of thermoregulatory arteriovenous shunts, a piece of bone cement served as a model of the frontal bone in this study. Bias, limits of agreements, long-term measurement stability, and the lowest measurable temperature of the device were investigated. Bias and limits of agreement of the temperature data of two SOTs and of the thermistor placed on the manikin's surface were calculated. Measurements obtained from SOTs were similar to thermistor values. The bias and limits of agreement lay within a predefined clinically acceptable range. Repeat measurements differed only slightly, and stayed stable for hours. Because of its temperature range, the SOT cannot be used to monitor temperatures below 28°C. In conclusion, the new SOT could provide a reliable, less invasive and cheaper alternative for measuring perioperative core temperature in routine clinical practice. Further clinical trials are needed to evaluate these results.
High-performance 3D compressive sensing MRI reconstruction.
Kim, Daehyun; Trzasko, Joshua D; Smelyanskiy, Mikhail; Haider, Clifton R; Manduca, Armando; Dubey, Pradeep
2010-01-01
Compressive Sensing (CS) is a nascent sampling and reconstruction paradigm that describes how sparse or compressible signals can be accurately approximated using many fewer samples than traditionally believed. In magnetic resonance imaging (MRI), where scan duration is directly proportional to the number of acquired samples, CS has the potential to dramatically decrease scan time. However, the computationally expensive nature of CS reconstructions has so far precluded their use in routine clinical practice - instead, more-easily generated but lower-quality images continue to be used. We investigate the development and optimization of a proven inexact quasi-Newton CS reconstruction algorithm on several modern parallel architectures, including CPUs, GPUs, and Intel's Many Integrated Core (MIC) architecture. Our (optimized) baseline implementation on a quad-core Core i7 is able to reconstruct a 256 × 160 × 80 volume of the neurovasculature from an 8-channel, 10× undersampled data set within 56 seconds, which is already a significant improvement over existing implementations. The latest six-core Core i7 reduces the reconstruction time further to 32 seconds. Moreover, we show that the CS algorithm benefits from modern throughput-oriented architectures. Specifically, our CUDA-based implementation on an NVIDIA GTX480 reconstructs the same dataset in 16 seconds, while Intel's Knights Ferry (KNF) of the MIC architecture even reduces the time to 12 seconds. Such a level of performance allows the neurovascular dataset to be reconstructed within a clinically viable time.
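To convey the flavour of an iterative CS reconstruction (here a plain ISTA/soft-thresholding loop with an undersampled Fourier operator and an image-domain sparsity prior, not the authors' inexact quasi-Newton solver or their data), a small NumPy sketch is:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
x_true = np.zeros((n, n)); x_true[40:60, 50:90] = 1.0      # sparse toy "image"
mask = rng.random((n, n)) < 0.3                            # ~30% of k-space sampled

def A(x):    # undersampled, orthonormal 2-D Fourier "acquisition"
    return mask * np.fft.fft2(x, norm="ortho")

def AH(y):   # adjoint operator
    return np.fft.ifft2(mask * y, norm="ortho")

y = A(x_true)                                              # measured k-space data
lam, x = 0.05, np.zeros((n, n), dtype=complex)
for _ in range(200):                                       # ISTA iterations; step size 1 is
    grad = AH(A(x) - y)                                    # valid since ||A||_2 <= 1 here
    z = x - grad
    x = np.exp(1j * np.angle(z)) * np.maximum(np.abs(z) - lam, 0)   # soft-thresholding

err = np.linalg.norm(np.abs(x) - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

Real clinical reconstructions replace the image-domain L1 prior with wavelet or total-variation penalties and multi-coil sensitivity models, which is where the computational cost discussed above comes from.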
An Analysis of Students' Mistakes on Routine Slope Tasks
ERIC Educational Resources Information Center
Cho, Peter; Nagle, Courtney
2017-01-01
This study extends past research on students' understanding of slope by analyzing college students' mistakes on routine tasks involving slope. We conduct quantitative and qualitative analysis of students' mistakes to extract information regarding slope conceptualizations described in prior research. Results delineate procedural proficiencies and…
Shiferaw, Atsede Mazengia; Zegeye, Dessalegn Tegabu; Assefa, Solomon; Yenit, Melaku Kindie
2017-08-07
Using reliable information from routine health information systems over time is an important aid to improving health outcomes, tackling disparities, enhancing efficiency, and encouraging innovation. In Ethiopia, routine health information utilization for enhancing performance is poor among health workers, especially at the peripheral levels of health facilities. Therefore, this study aimed to assess routine health information system utilization and associated factors among health workers at government health institutions in East Gojjam Zone, Northwest Ethiopia. An institution-based cross-sectional study was conducted at government health institutions of East Gojjam Zone, Northwest Ethiopia from April to May, 2013. A total of 668 health workers were selected from government health institutions, using the cluster sampling technique. Data collected using a standard structured and self-administered questionnaire and an observational checklist were cleaned, coded, and entered into Epi-info version 3.5.3, and transferred into SPSS version 20 for further statistical analysis. Variables with a p-value of less than 0.05 in multiple logistic regression analysis were considered statistically significant factors for the utilization of routine health information systems. The study revealed that 45.8% of the health workers had a good level of routine health information utilization. HMIS training [AOR = 2.72, 95% CI: 1.60, 4.62], good data analysis skills [AOR = 6.40, 95% CI: 3.93, 10.37], supervision [AOR = 2.60, 95% CI: 1.42, 4.75], regular feedback [AOR = 2.20, 95% CI: 1.38, 3.51], and favorable attitude towards health information utilization [AOR = 2.85, 95% CI: 1.78, 4.54] were found to be significantly associated with a good level of routine health information utilization. More than half of the health workers working at government health institutions of East Gojjam were poor health information users compared with the findings of other studies. HMIS training, data analysis skills, supervision, regular feedback, and favorable attitude were factors related to routine health information system utilization. Therefore, comprehensive training, supportive supervision, and regular feedback are highly recommended for improving routine health information utilization among health workers at government health facilities.
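The adjusted odds ratios (AORs) above come from multiple logistic regression. A generic sketch of that computation with statsmodels, using a hypothetical data frame in place of the study data, is:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical analysis data set: one row per health worker, binary outcome and predictors.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "good_utilization": rng.integers(0, 2, 300),   # outcome: good routine HIS use
    "hmis_training":    rng.integers(0, 2, 300),
    "analysis_skills":  rng.integers(0, 2, 300),
    "supervision":      rng.integers(0, 2, 300),
    "regular_feedback": rng.integers(0, 2, 300),
})

X = sm.add_constant(df[["hmis_training", "analysis_skills", "supervision", "regular_feedback"]])
fit = sm.Logit(df["good_utilization"], X).fit(disp=False)

aor = np.exp(fit.params)        # adjusted odds ratios = exponentiated coefficients
ci = np.exp(fit.conf_int())     # 95% confidence intervals on the OR scale
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```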
Towards shared patient records: an architecture for using routine data for nationwide research.
Knaup, Petra; Garde, Sebastian; Merzweiler, Angela; Graf, Norbert; Schilling, Freimut; Weber, Ralf; Haux, Reinhold
2006-01-01
Ubiquitous information is currently one of the most challenging slogans in medical informatics research. An adequate architecture for shared electronic patient records is needed which can use data for multiple purposes and which is extensible to new research questions. We introduce eardap as an architecture for using routine data for nationwide clinical research in a multihospital environment. eardap can be characterized as terminology-based. The main advantage of our approach is its extensibility to new items and new research questions. Once the definition of items for a research question is finished, a consistent, corresponding database can be created without any informatics skills. Our experiences in pediatric oncology in Germany have shown the applicability of eardap. The functions of our core system were in routine clinical use in several hospitals. We validated the terminology management system (TMS) and the module generation tool with the basic data set of pediatric oncology. Multiple usability depends mainly on the quality of item planning in the TMS; high-quality harmonization will lead to a larger amount of multiply used data. When using eardap, special emphasis is to be placed on interfaces to local hospital information systems and data security issues.
ARCHANGEL: Galaxy Photometry System
NASA Astrophysics Data System (ADS)
Schombert, James
2011-07-01
ARCHANGEL is a Unix-based package for the surface photometry of galaxies. While oriented toward large angular size systems (i.e., many pixels), its tools can be applied to any imaging data of any size. The package core contains routines to perform the following critical galaxy photometry functions: sky determination; frame cleaning; ellipse fitting; profile fitting; and total and isophotal magnitudes. The goal of the package is to provide an automated, assembly-line type of reduction system for galaxy photometry of space-based or ground-based imaging data. The procedures outlined in the documentation are flux independent; thus, these routines can be used for non-optical data as well as typical imaging datasets. ARCHANGEL has been tested on several current OS's (RedHat Linux, Ubuntu Linux, Solaris, Mac OS X). A tarball for installation is available at the download page. The main routines are Python and FORTRAN based; therefore, a current installation of Python and a FORTRAN compiler are required. The ARCHANGEL package also contains Python hooks to the PGPLOT package, an XML processor and network tools which automatically link to data archives (i.e. NED, HST, 2MASS, etc) to download images in a non-interactive manner.
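As a generic illustration of two of the core functions listed above (sky determination by iterative sigma clipping, and a total magnitude from pixels above an isophotal threshold), not the ARCHANGEL implementation itself:

```python
import numpy as np

def sky_level(image, clip=3.0, iterations=5):
    """Estimate the sky as the sigma-clipped mean of the frame (illustrative only)."""
    pixels = image.ravel().astype(float)
    for _ in range(iterations):
        mean, std = pixels.mean(), pixels.std()
        pixels = pixels[np.abs(pixels - mean) < clip * std]
    return pixels.mean(), pixels.std()

def isophotal_magnitude(image, sky, sky_sigma, zero_point=25.0, threshold_sigma=2.0):
    """Total magnitude from all pixels above an isophotal threshold over sky."""
    signal = image - sky
    flux = signal[signal > threshold_sigma * sky_sigma].sum()
    return zero_point - 2.5 * np.log10(flux)

# Toy frame: flat sky plus a Gaussian "galaxy" (all values illustrative).
yy, xx = np.mgrid[:256, :256]
frame = 100.0 + 50.0 * np.exp(-(((xx - 128) ** 2 + (yy - 128) ** 2) / (2 * 10.0 ** 2)))
frame += np.random.default_rng(0).normal(0, 2.0, frame.shape)

sky, sigma = sky_level(frame)
print(f"sky = {sky:.1f}, m_iso = {isophotal_magnitude(frame, sky, sigma):.2f}")
```

The package's ellipse and profile fitting add the shape information (isophote geometry, surface-brightness profile) that this simple threshold photometry ignores.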
Capture and surveillance of quad-bike (ATV)-related injuries in administrative data collections.
Mitchell, Rebecca J; Grzebieta, Raphael; Rechnitzer, George
2016-09-01
Identifying quad-bike-related injuries in administrative data collections can be problematic. This study sought to determine whether quad-bike-related injuries could be identified in routinely collected administrative data collections in New South Wales (NSW), Australia, and to determine the information recorded according to World Health Organization (WHO) injury surveillance guidelines that could assist injury prevention efforts. Five routinely collected administrative data collections in NSW in the period 2000-2012 were reviewed. The WHO core minimum data items recorded in each of the five data collections ranged from 37.5% to 75.0%. Age and sex of the injured individual were the only data items that were recorded in all data collections. The data collections did not contain detailed information on the circumstances of quad bike incidents. Major improvements are needed in the information collected in these data-sets, if their value is to be increased and used for injury prevention purposes.
Larson, Elizabeth; Miller-Bishoff, Thomas
2014-01-01
Using mixed methods, this study examined the relationship between the psychological well-being (PWB) of caregivers of children with disabilities and their orchestration of daily routines within their ecological niche. Thirty-nine U.S. caregivers completed in-depth interviews, PWB Scales, and the Family Time and Routines Index (FTRI). We used a multi-step analysis. Interview data were coded and vignettes created without knowledge of PWB and FTRI ratings. Next, the relationship between the quantitative measures was analyzed. Four groups were created using FTRI-extent and PWB means: (1) low routine-low PWB, (2) low routine-high PWB, (3) high routine-low PWB, and (4) high routine-high PWB. We examined qualitative differences in key features between groups. Findings: Total PWB and FTRI scores were not significantly correlated, PWB Purpose in Life and FTRI-extent scores were moderately positively correlated, and the PWB Environmental Mastery and FTRI-extent correlation approached significance. Qualitative findings describe caregivers' structuring of routines, intensity of oversight, support in routines, management of dinner, paid work, and needs for respite. The four groups differed in paid work, household support, the degree to which the child could self-occupy, Environmental Mastery, and opportunities to recuperate. Caregivers with higher levels of well-being and more regular routines did paid work, had supportive spouses, had children who more often could follow routines, had higher Environmental Mastery, could orchestrate a family meal, and had breaks from care in either work or leisure. All Native American caregivers and Mexican American caregivers with spouses were in the high routine-high PWB group. Insight into this complex negotiation between family members within daily routines may provide practitioners with a better understanding of how to work within family circles to foster therapeutic alliances, identify focused intervention targets, and promote positive family-wide outcomes. PMID:24910625
Analysis of routine pilot-controller communication
NASA Technical Reports Server (NTRS)
Morrow, Daniel G.; Lee, Alfred; Rodvold, Michelle
1990-01-01
Although pilot-controller communication is central to aviation safety, this area of aviation human factors has not been extensively researched. Most research has focused on what kinds of communication problems occur. A more complete picture of communication problems requires understanding how communication usually works in routine operations. A sample of routine pilot-controller communication in the TRACON environment is described. After describing several dimensions of routine communication, three kinds of communication problems are treated: inaccuracies such as incorrect readbacks, procedural deviations such as missing callsigns and readbacks, and nonroutine transactions where pilot and controller must deal with misunderstandings or other communication problems. Preliminary results suggest these problems are not frequent events in daily operations. However, analysis of the problems that do occur suggest some factors that may cause them.
Cavanagh, Kate; Seccombe, Nick; Lidbetter, Nicky
2011-07-01
The efficacy and effectiveness of a computerized cognitive behavioural therapy (CCBT) package, Beating the Blues, have been demonstrated in a large randomized controlled trial and several pragmatic studies in the National Health Service (NHS). The current study tests the generalizability of this finding to the implementation of CCBT in a service user-led, third sector Self Help Clinic. 510 referrals for the Beating the Blues program were received over a 16-month period in routine care. The Patient Health Questionnaire Depression (PHQ-9) and Anxiety (GAD-7) Scales were administered pre-treatment and during each treatment session. The 10-item Clinical Outcomes in Routine Evaluation-Outcome Measure (CORE-OM), Work and Social Adjustment Scale and Patient Experience Questionnaire were also administered pre-treatment and immediately on completing treatment. More than two-thirds of referrals were suitable for treatment and completed a baseline assessment; 84% of these started the Beating the Blues program. Two hundred and twenty-six people meeting caseness criteria at baseline completed at least two sessions of CCBT. Of these, 50% met recovery criteria at their final point of measurement. Completer and intention-to-treat analysis also demonstrated statistically and clinically significant improvements on key outcome measures. CCBT can be effectively implemented in a service user-led, third sector Self Help Clinic, increasing access to psychological therapies to meet local needs for tier two interventions for depression and anxiety.
Cognitive-behavioural treatment for weight loss in primary care: a prospective study.
Eichler, Klaus; Zoller, Marco; Steurer, Johann; Bachmann, Lucas M
2007-09-08
Cognitive-behavioural treatment (CBT) is effective for weight loss in obese patients, but such programmes are difficult to implement in primary care. We assessed whether implementation of a community-based CBT weight loss programme for adults in routine care is feasible and prospectively assessed patient outcome. The weight loss programme was provided by a network of Swiss general practitioners in cooperation with a community centre for health education. We chose a five-step strategy focusing on structure of care rather than primarily addressing individual physician behaviour. A multidisciplinary core group of trained CBT instructors acted as the central element of the programme. Overweight and obese adults from the community (BMI >25 kg/m2) were included. We used a patient perspective to report the impact on delivery of care and assessed weight change of consecutive participants prospectively with a follow-up of 12 months. Twenty-eight courses, with 16 group meetings each, were initiated over a period of 3 years. 44 of 110 network physicians referred patients to the programme. 147 of 191 study participants were monitored for one year (attrition rate: 23%). Median weight loss after 12 months for 147 completers was 4 kg (IQR: 1-7 kg; intention-to-treat analysis for 191 participants: 2 kg, IQR: 0-5 kg). The programme produced a clinically meaningful weight loss in our participants, with a relatively low attrition rate. Implementation of an easily accessible CBT programme for weight loss in daily routine primary care is feasible.
Maindal, Helle Terkildsen; Støvring, Henrik; Sandbaek, Annelli
2014-08-29
The periodic health check-up has been a fundamental part of routine medical practice for decades, despite a lack of consensus regarding its value in health promotion and disease prevention. A large-scale Danish population-based preventive programme 'Check your health' was developed based on available evidence of screening and successive accepted treatment, prevention for diseases and health promotion, and is closely aligned with the current health care system. The objective of the 'Check your health' [CORE] trial is to investigate effectiveness on health outcomes of a preventive health check offered at a population level to all individuals aged 30-49 years, and to establish the cost-effectiveness. The trial will be conducted as a pragmatic household-cluster randomised controlled trial involving 10,505 individuals. All individuals within a well-defined geographical area in the Central Denmark Region, Denmark (DK) were randomised to be offered a preventive health check (Intervention group, n = 5250) or to maintain routine access to healthcare until a delayed intervention (Comparison group, n = 5255). The programme consists of a health examination which yields an individual risk profile, and according to this participants are assigned to one of the following interventions: (a) referral to a health promoting consultation in general practice, (b) behavioural programmes at the local Health Centre, or (c) no need for follow-up. The primary outcomes at 4 years follow-up are: ten-year risk of a fatal cardiovascular event (Heart-SCORE model), physical activity level (self-report and cardiorespiratory fitness), quality of life (SF12), sick leave and labour market attachment. Cost-effectiveness will be evaluated according to life years gained, direct costs and total health costs. Intention-to-treat analysis will be performed. Results from the largest Danish health check programme conducted within the current healthcare system, spanning the sectors which share responsibility for the individual, will provide a scientific basis to be used in the development of systems to optimise population health in the 21st century. The trial is registered at ClinicalTrials.gov, ID NCT02028195 (7 March 2014).
Gimbel, Sarah; Mwanza, Moses; Nisingizwe, Marie Paul; Michel, Cathy; Hirschhorn, Lisa
2017-12-21
High-quality data are critical to inform, monitor and manage health programs. Over the seven-year African Health Initiative of the Doris Duke Charitable Foundation, three of the five Population Health Implementation and Training (PHIT) partnership projects in Mozambique, Rwanda, and Zambia introduced strategies to improve the quality and evaluation of routinely-collected data at the primary health care level, and stimulate its use in evidence-based decision-making. Using the Consolidated Framework for Implementation Research (CFIR) as a guide, this paper: 1) describes and categorizes data quality assessment and improvement activities of the projects, and 2) identifies core intervention components and implementation strategy adaptations introduced to improve data quality in each setting. The CFIR was adapted through a qualitative theme reduction process involving discussions with key informants from each project, who identified two domains and ten constructs most relevant to the study aim of describing and comparing each country's data quality assessment approach and implementation process. Data were collected on each project's data quality improvement strategies, activities implemented, and results via a semi-structured questionnaire with closed and open-ended items administered to health management information systems leads in each country, with complementary data abstraction from project reports. Across the three projects, intervention components that aligned with user priorities and government systems were perceived to be relatively advantageous, and more readily adapted and adopted. Activities that both assessed and improved data quality (including data quality assessments, mentorship and supportive supervision, establishment and/or strengthening of electronic medical record systems), received higher ranking scores from respondents. Our findings suggest that, at a minimum, successful data quality improvement efforts should include routine audits linked to ongoing, on-the-job mentoring at the point of service. This pairing of interventions engages health workers in data collection, cleaning, and analysis of real-world data, and thus provides important skills building with on-site mentoring. The effect of these core components is strengthened by performance review meetings that unify multiple health system levels (provincial, district, facility, and community) to assess data quality, highlight areas of weakness, and plan improvements.
Jones, K.; Kim, K.; Patel, B.; Kelsen, S.; Braverman, A.; Swinton, D.; Gafken, P.; Jones, L.; Lane, W.; Neveu, J.; Leung, H.; Shaffer, S.; Leszyk, J.; Stanley, B.; Fox, T.; Stanley, A.; Yeung, Anthony
2013-01-01
Proteomic research can benefit from simultaneous access to multiple cutting-edge mass spectrometers. Eighteen core facilities responded to our investigators seeking service through the ABRF Discussion Forum. Five of the facilities selected completed four plasma proteomics experiments as routine fee-for-service. Each biological experiment entailed an iTRAQ 4-plex proteome comparison of immunodepleted plasma provided as 30 labeled-peptide fractions. Identical samples were analyzed by two AB SCIEX TripleTOF 5600 and three Thermo Orbitrap (Elite/Velos Pro/Q Exactive) instruments. 480 LC-MS/MS runs delivered >250 GB of data over two months. We compare herein routine service analyses of three peptide fractions of different peptide abundance. Data files from each instrument were studied to develop optimal analysis parameters to compare with default parameters in Mascot Distiller 2.4, ProteinPilot 4.5 beta, AB Sciex MS Data Converter 1.3 beta, and Proteome Discoverer 1.3. Peak-picking for the TripleTOFs was best with ProteinPilot 4.5 beta, while Mascot Distiller and Proteome Discoverer were comparable for the Orbitraps. We compared protein identification and quantitation against the SwissProt 2012_07 database using Mascot Server 2.4.01 versus ProteinPilot. By all search methods, up to two-fold more proteins were identified using the Q Exactive than with the other instruments. The Q Exactive also excelled in the number of unique significant peptide ion sequences. However, software-dependent impact on subsequent interpretation, due to peptide modifications, can be critical. These findings may have special implications for iTRAQ plasma proteomics. For the low abundance peptide ions, the slope of the dynamic range drop-off in the plasma proteome is uniquely sharp compared with cell lysates. Our study provides data for testable improvements in the operation of these mass spectrometers. More importantly, we have demonstrated a new, affordable, expedient workflow for investigators to perform proteomic experiments through the ABRF infrastructure. (We acknowledge John Cottrell for optimizing the peak-picking parameters for Mascot Distiller).
76 FR 4435 - Privacy Act of 1974; Report of Modified or Altered System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-25
... entry, computer systems analysis and computer programming services. The contractors promptly return data... Proposed Routine Use Disclosures of Data in the System: This System of Records contains information such as... compatible use of data is known as a "routine use". The routine uses proposed for this System are...
Computer routine adds plotting capabilities to existing programs
NASA Technical Reports Server (NTRS)
Harris, J. C.; Linnekin, J. S.
1966-01-01
PLOTAN, a generalized plot analysis routine written for the IBM 7094 computer, minimizes the difficulties in adding plot capabilities to large existing programs. PLOTAN is used in conjunction with a binary tape writing routine and has the ability to plot any variable on the intermediate binary tape as a function of any other.
Gonzalez-Escalona, Narjol; Jolley, Keith A; Reed, Elizabeth; Martinez-Urtaza, Jaime
2017-06-01
Vibrio parahaemolyticus is an important human foodborne pathogen whose transmission is associated with the consumption of contaminated seafood, with a growing number of infections reported over recent years worldwide. A multilocus sequence typing (MLST) database for V. parahaemolyticus was created in 2008, and a large number of clones have been identified, causing severe outbreaks worldwide (sequence type 3 [ST3]), recurrent outbreaks in certain regions (e.g., ST36), or spreading to other regions where they are nonendemic (e.g., ST88 or ST189). The current MLST scheme uses sequences of 7 genes to generate an ST, which results in a powerful tool for inferring the population structure of this pathogen, although with limited resolution, especially compared to pulsed-field gel electrophoresis (PFGE). The application of whole-genome sequencing (WGS) has become routine for trace back investigations, with core genome MLST (cgMLST) analysis as one of the most straightforward ways to explore complex genomic data in an epidemiological context. Therefore, there is a need to generate a new, portable, standardized, and more advanced system that provides higher resolution and discriminatory power among V. parahaemolyticus strains using WGS data. We sequenced 92 V. parahaemolyticus genomes and used the genome of strain RIMD 2210633 as a reference (with a total of 4,832 genes) to determine which genes were suitable for establishing a V. parahaemolyticus cgMLST scheme. This analysis resulted in the identification of 2,254 suitable core genes for use in the cgMLST scheme. To evaluate the performance of this scheme, we performed a cgMLST analysis of 92 newly sequenced genomes, plus an additional 142 strains with genomes available at NCBI. cgMLST analysis was able to distinguish related and unrelated strains, including those with the same ST, clearly showing its enhanced resolution over conventional MLST analysis. It also distinguished outbreak-related from non-outbreak-related strains within the same ST. The sequences obtained from this work were deposited and are available in the public database (http://pubmlst.org/vparahaemolyticus). The application of this cgMLST scheme to the characterization of V. parahaemolyticus strains provided by different laboratories from around the world will reveal the global picture of the epidemiology, spread, and evolution of this pathogen and will become a powerful tool for outbreak investigations, allowing for the unambiguous comparison of strains with global coverage. Copyright © 2017 Gonzalez-Escalona et al.
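The cgMLST comparison described above reduces each genome to an allele number at each of the 2,254 core loci, and relatedness between strains is then assessed by counting loci at which the allele assignments differ. A minimal sketch of that pairwise comparison, using hypothetical allele profiles rather than entries from the pubmlst.org database:

```python
from itertools import combinations

def cgmlst_distance(profile_a, profile_b):
    """Count core loci at which two allele profiles differ.

    Loci untyped in either profile (None) are ignored, a common convention
    when assemblies have incomplete locus coverage.
    """
    shared = [(a, b) for a, b in zip(profile_a, profile_b)
              if a is not None and b is not None]
    return sum(1 for a, b in shared if a != b)

# Hypothetical profiles: allele numbers at 6 core loci (the real scheme uses 2,254).
profiles = {
    "strain_01": (1, 4, 2, 7, 3, 1),
    "strain_02": (1, 4, 2, 9, 3, 1),    # differs at one locus
    "strain_03": (5, 2, None, 8, 1, 6), # unrelated; one locus untyped
}

for name_a, name_b in combinations(profiles, 2):
    d = cgmlst_distance(profiles[name_a], profiles[name_b])
    print(f"{name_a} vs {name_b}: {d} allele differences")
```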
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naqvi, S
2014-06-15
Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as virtual experiments that give deeper and longer-lasting understanding of core principles. The student can then make sound judgements in novel situations encountered beyond routine clinical activities.
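The second tool described above integrates trajectories of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. The following is a hedged sketch of that idea with a simple explicit momentum update and a hypothetical uniform magnetic field; it is not the authors' code, and a production integrator would normally use a symplectic scheme such as the Boris push.

```python
import numpy as np

Q = -1.602176634e-19   # electron charge (C)
M = 9.1093837015e-31   # electron mass (kg)
C = 2.99792458e8       # speed of light (m/s)

def push(p, r, e_field, b_field, dt):
    """One explicit step of dp/dt = q(E + v x B) with relativistic velocity."""
    gamma = np.sqrt(1.0 + np.dot(p, p) / (M * C) ** 2)
    v = p / (gamma * M)
    p_new = p + Q * (e_field(r) + np.cross(v, b_field(r))) * dt
    r_new = r + v * dt
    return p_new, r_new

# Hypothetical fields: uniform B along z, no E -> circular gyration.
b_field = lambda r: np.array([0.0, 0.0, 1.0e-3])   # tesla
e_field = lambda r: np.zeros(3)

p = M * np.array([1.0e7, 0.0, 0.0])   # initial momentum (kg m/s), about 0.03c
r = np.zeros(3)
trajectory = []
for _ in range(2000):
    p, r = push(p, r, e_field, b_field, dt=1.0e-11)
    trajectory.append(r.copy())
print("final position (m):", trajectory[-1])
```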
Study and Design of Flight Data Recording Systems for Military Aircraft
1976-06-01
minicomputer (PDP-11/40) with 24K of core memory and a disk operating system. Peripherals include a CRT terminal, two 9-track magnetic tape drives, a 19 high... in question-answer mode. The NTSB plans to adapt an existing routine to the PDP-11/40 which will prepare a ground track of the aircraft from the... 20 microseconds). Like PMOS memory, multiple power supplies were required. The next generation of microprocessors was implemented on a 40-pin package
The 1992 activities of the International GPS Geodynamics Service (IGS).
NASA Astrophysics Data System (ADS)
Beutler, G.
The primary goal of the International GPS Geodynamics Service (IGS) is to give the scientific community high quality GPS orbits (and related information like earth orientation parameters) to perform regional or local GPS analyses without further orbit improvement. The declared goal of the three month 1992 IGS Test Campaign was the routine production of accurate GPS orbits using the observations of about 30 globally distributed IGS Core Sites. IGS Epoch Campaigns will be organized about every second year.
Armstrong, Andrea F; Valliant, John F
2010-09-21
Carboranes form stable complexes with the [M(CO)3]+ (M = 99mTc, Re) core and are viable ligands for the development of targeted radiopharmaceuticals. 99mTc-carborane complexes were found to exhibit substantially different 1,2→1,7 cage isomerisation behaviour than their Re counterparts, challenging the validity of the routine use of rhenium as a surrogate for the development of technetium-99m based molecular imaging agents.
A pluggable framework for parallel pairwise sequence search.
Archuleta, Jeremy; Feng, Wu-chun; Tilevich, Eli
2007-01-01
The current and near future of the computing industry is one of multi-core and multi-processor technology. Most existing sequence-search tools have been designed with a focus on single-core, single-processor systems. This discrepancy between software design and hardware architecture substantially hinders sequence-search performance by not allowing full utilization of the hardware. This paper presents a novel framework that will aid the conversion of serial sequence-search tools into a parallel version that can take full advantage of the available hardware. The framework, which is based on a software architecture called mixin layers with refined roles, enables modules to be plugged into the framework with minimal effort. The inherent modular design improves maintenance and extensibility, thus opening up a plethora of opportunities for advanced algorithmic features to be developed and incorporated while routine maintenance of the codebase persists.
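The mixin-layers idea described above composes a search tool from stacked layers, each refining the behaviour of the layer beneath it, so that a parallel layer can be plugged onto a serial base without rewriting it. A hedged Python sketch of that composition pattern follows; the class names and the toy scoring function are illustrative and are not part of the actual framework.

```python
from concurrent.futures import ProcessPoolExecutor

class SerialSearch:
    """Base layer: naive serial scoring of a query against a database."""
    def search(self, query, database):
        return {name: self.score(query, seq) for name, seq in database.items()}

    def score(self, a, b):
        # Toy similarity: count of matching positions (stand-in for real scoring).
        return sum(1 for x, y in zip(a, b) if x == y)

def _search_chunk(args):
    # Worker function: run the serial base layer on one database chunk.
    query, chunk = args
    return SerialSearch().search(query, chunk)

def ChunkedLayer(Base):
    """Mixin layer: refines search() to split the database across processes."""
    class Chunked(Base):
        def search(self, query, database, workers=2):
            items = list(database.items())
            chunks = [dict(items[i::workers]) for i in range(workers)]
            results = {}
            with ProcessPoolExecutor(max_workers=workers) as pool:
                for part in pool.map(_search_chunk, [(query, c) for c in chunks]):
                    results.update(part)
            return results
    return Chunked

# Compose the stack: plug the parallel layer onto the serial base.
ParallelSearch = ChunkedLayer(SerialSearch)

if __name__ == "__main__":
    db = {"seqA": "ACGTACGT", "seqB": "ACGAACGA", "seqC": "TTTTACGT"}
    print(ParallelSearch().search("ACGTACGT", db, workers=2))
```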
Confinement in Wendelstein 7-X Limiter Plasmas
Hirsch, M.; Dinklage, A.; Alonso, A.; ...
2017-06-14
Observations on confinement in the first experimental campaign on the optimized stellarator Wendelstein 7-X are summarized. In this phase W7-X was equipped with five inboard limiters only, and the discharge length was therefore restricted to avoid local overheating. Stationary plasmas are limited to low densities below 2–3 × 10^19 m^-3. With the available 4.3 MW of ECR heating, core T_e ~ 8 keV and T_i ~ 1–2 keV are achieved routinely, resulting in energy confinement times τ_E between 80 ms and 150 ms. For these conditions the plasmas show characteristics of core electron root confinement with peaked T_e profiles and positive E_r up to about half of the minor radius. Profiles and plasma currents respond to on- and off-axis heating and to co- and counter-ECCD, respectively.
Munthali, Tendai; Musonda, Patrick; Mee, Paul; Gumede, Sehlulekile; Schaap, Ab; Mwinga, Alwyn; Phiri, Caroline; Kapata, Nathan; Michelo, Charles; Todd, Jim
2017-06-13
The extent to which routinely collected HIV data from Zambia has been used in peer-reviewed published articles remains unexplored. This paper is an analysis of peer-reviewed articles that utilised routinely collected HIV data from Zambia within six programme areas from 2004 to 2014. Articles on HIV, published in English, listed in the Directory of open access journals, African Journals Online, Google scholar, and PubMed were reviewed. Only articles from peer-reviewed journals, that utilised routinely collected data and included quantitative data analysis methods were included. Multi-country studies involving Zambia and another country, where the specific results for Zambia were not reported, as well as clinical trials and intervention studies that did not take place under routine care conditions were excluded, although community trials which referred patients to the routine clinics were included. Independent extraction was conducted using a predesigned data collection form. Pooled analysis was not possible due to diversity in topics reviewed. A total of 69 articles were extracted for review. Of these, 7 were excluded. From the 62 articles reviewed, 39 focused on HIV treatment and retention in care, 15 addressed prevention of mother-to-child transmission, 4 assessed social behavioural change, and 4 reported on voluntary counselling and testing. In our search, no articles were found on condom programming or voluntary male medical circumcision. The most common outcome measures reported were CD4+ count, clinical failure or mortality. The population analysed was children in 13 articles, women in 16 articles, and both adult men and women in 33 articles. During the 10 year period of review, only 62 articles were published analysing routinely collected HIV data in Zambia. Serious consideration needs to be made to maximise the utility of routinely collected data, and to benefit from the funds and efforts to collect these data. This could be achieved with government support of operational research and publication of findings based on routinely collected Zambian HIV data.
PATHFINDER ATOMIC POWER PLANT TECHNICAL PROGRESS REPORT FOR JULY 1, 1959- SEPTEMBER 30, 1959
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1960-10-31
Fuel Element Research and Development. Dynamic and static corrosion tests on 8001 Al were completed. Annealing of 1100 cladding on 5083 and M400 cladding on X2219 was tested at 500 deg C, and investigation continued on producing X8101 Al alloy cladding in tube plates by extrusion. Boiler fuel element capsule irradiation tests and subassembly tests are described. Heat transfer loop studies and fuel fabrication for the critical facility are reported. Boiler fuel element mechanical design and testing progress is described, and the superheater fuel element temperature evaluating routine is discussed. Low-enrichment superheater fuel element development included design studies and stainless steel powder and UO2 powder fabrication studies. Reactor Mechanical Studies. Research is reported on vessel and structure design, fabrication, and testing; recirculation system design; steam separator tests; and control rod studies. Nuclear Analysis. Reactor physics studies are reported on nuclear constants, baffle plate analysis, comparison of core representations, delayed neutron fraction, and shielding analysis of the reactor building. Reactor and system dynamics and critical experiments were also studied. Chemistry. Progress is reported on the recombiner, radioactive gas removal and storage, ion exchanger, and radiochemical processing. (For preceding period see ACNP-5915.) (T.R.H.)
Application of the Toyota Production System improves core laboratory operations.
Rutledge, Joe; Xu, Min; Simpson, Joanne
2010-01-01
To meet the increased clinical demands of our hospital expansion, improve quality, and reduce costs, our tertiary care, pediatric core laboratory used the Toyota Production System lean processing to reorganize our 24-hour, 7 d/wk core laboratory. A 4-month, consultant-driven process removed waste, led to a physical reset of the space to match the work flow, and developed a work cell for our random access analyzers. In addition, visual controls, single piece flow, standard work, and "5S" were instituted. The new design met our goals as reflected by achieving and maintaining improved turnaround time (TAT; mean for creatinine reduced from 54 to 23 minutes) with increased testing volume (20%), monetary savings (4 full-time equivalents), decreased variability in TAT, and better space utilization (25% gain). The project had the unanticipated consequence of eliminating STAT testing because our in-laboratory TAT for routine testing was less than our prior STAT turnaround goal. The viability of this approach is demonstrated by sustained gains and further PDCA (Plan, Do, Check, Act) improvements during the 4 years after completion of the project.
Rigla, Mercedes
2011-01-01
Although current systems for continuous glucose monitoring (CGM) are the result of progressive technological improvement, and although a beneficial effect on glucose control has been demonstrated, few patients are using them. Something similar has happened to telemedicine (TM); in spite of the long-term experience, which began in the early 1980s, no TM system has been widely adopted, and presential visits are still almost the only way diabetologists and patients communicate. The hypothesis developed in this article is that neither CGM nor TM will ever be routinely implemented separately, and their consideration as essential elements for standard diabetes care will one day come from their integration as parts of a telemedical monitoring platform. This platform, which should include artificial intelligence for giving decision support to patients and physicians, will represent the core of a more complex global agent for diabetes care, which will provide control algorithms and risk analysis among other essential functions. © 2010 Diabetes Technology Society.
Report on the survey for electrostatic discharges on Mars using NASA's Deep Space Network (DSN)
NASA Astrophysics Data System (ADS)
Arabshahi, S.; Majid, W.; Geldzahler, B.; Kocz, J.; Schulter, T.; White, L.
2017-12-01
The Martian atmosphere has strong dust activity. It is suggested that the larger regional storms are capable of producing electric fields large enough to initiate electrostatic discharges. The storms have a charging process similar to that of terrestrial dust devils, and have hot cores and complicated vortex winds similar to those of terrestrial thunderstorms. However, due to uncertainties in our understanding of the electrical environment of the storms and the absence of related in-situ measurements, the existence (or non-existence) of such electrostatic discharges on the planet is yet to be confirmed. Knowing about the electrical activity on Mars is essential for future human exploration of the planet. We have recently launched a long-term monitoring campaign at NASA's Madrid Deep Space Communication Complex (MDSCC) to search for powerful discharges on Mars. The search occurs during routine tracking of Mars-orbiting spacecraft by Deep Space Network (DSN) radio telescopes. In this presentation, we will report on the results of processing and analysis of the data from the first six months of our campaign.
Coleman, Mary Thoesen; Nasraty, Soraya; Ostapchuk, Michael; Wheeler, Stephen; Looney, Stephen; Rhodes, Sandra
2003-05-01
The Accreditation Council for Graduate Medical Education (ACGME) recommends integrating improvement activities into residency training. A curricular change was designed at the Department of Family and Community Medicine, University of Louisville, to address selected ACGME competencies by incorporating practice-based improvement activities into the routine clinical work of family medicine residents. Teams of residents, faculty, and office staff completed clinical improvement projects at three ambulatory care training sites. Residents were given academic credit for participation in team meetings. After 6 months, residents presented results to faculty, medical students, other residents, and staff from all three training sites. Residents, staff, and faculty were recognized for their participation. Resident teams demonstrated ACGME competencies in practice-based improvement: Chart audits indicated improvement in clinical projects; quality improvement tools demonstrated analysis of root causes and understanding of the process; plan-do-study-act cycle worksheets demonstrated the change process. Improvement activities that affect patient care and demonstrate selected ACGME competencies can be successfully incorporated into the daily work of family medicine residents.
Multi-Core Processor Memory Contention Benchmark Analysis Case Study
NASA Technical Reports Server (NTRS)
Simon, Tyler; McGalliard, James
2009-01-01
Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.
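The memory-subsystem contention discussed above can be made visible with a small microbenchmark: stream over arrays too large for cache with one worker, then with several workers sharing the same node, and compare per-worker times. A hedged sketch follows, with arbitrary array size and worker counts; a rigorous benchmark would also control core pinning, repetition, and NUMA placement.

```python
import time
import numpy as np
from multiprocessing import Process

N = 50_000_000  # ~400 MB of float64 per worker: large enough to defeat caches

def stream_sum():
    a = np.ones(N)
    t0 = time.perf_counter()
    for _ in range(5):
        a.sum()          # memory-bandwidth-bound reduction
    print(f"  per-process streaming time: {time.perf_counter() - t0:.2f} s")

def run(workers):
    print(f"{workers} concurrent worker(s):")
    procs = [Process(target=stream_sum) for _ in range(workers)]
    t0 = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(f"  wall time: {time.perf_counter() - t0:.2f} s")

if __name__ == "__main__":
    run(1)   # baseline: one core streaming alone
    run(4)   # contention: if per-process time grows, workers share memory bandwidth
```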
Cox, Helen S.; Mbhele, Slindile; Mohess, Neisha; Whitelaw, Andrew; Muller, Odelia; Zemanay, Widaad; Little, Francesca; Azevedo, Virginia; Simpson, John; Boehme, Catharina C.; Nicol, Mark P.
2014-01-01
Background Xpert MTB/RIF is approved for use in tuberculosis (TB) and rifampicin-resistance diagnosis. However, data are limited on the impact of Xpert under routine conditions in settings with high TB burden. Methods and Findings A pragmatic prospective cluster-randomised trial of Xpert for all individuals with presumptive (symptomatic) TB compared to the routine diagnostic algorithm of sputum microscopy and limited use of culture was conducted in a large TB/HIV primary care clinic. The primary outcome was the proportion of bacteriologically confirmed TB cases not initiating TB treatment by 3 mo after presentation. Secondary outcomes included time to TB treatment and mortality. Unblinded randomisation occurred on a weekly basis. Xpert and smear microscopy were performed on site. Analysis was both by intention to treat (ITT) and per protocol. Between 7 September 2010 and 28 October 2011, 1,985 participants were assigned to the Xpert (n = 982) and routine (n = 1,003) diagnostic algorithms (ITT analysis); 882 received Xpert and 1,063 routine (per protocol analysis). 13% (32/257) of individuals with bacteriologically confirmed TB (smear, culture, or Xpert) did not initiate treatment by 3 mo after presentation in the Xpert arm, compared to 25% (41/167) in the routine arm (ITT analysis, risk ratio 0.51, 95% CI 0.33–0.77, p = 0.0052). The yield of bacteriologically confirmed TB cases among patients with presumptive TB was 17% (167/1,003) with routine diagnosis and 26% (257/982) with Xpert diagnosis (ITT analysis, risk ratio 1.57, 95% CI 1.32–1.87, p<0.001). This difference in diagnosis rates resulted in a higher rate of treatment initiation in the Xpert arm: 23% (229/1,003) and 28% (277/982) in the routine and Xpert arms, respectively (ITT analysis, risk ratio 1.24, 95% CI 1.06–1.44, p = 0.013). Time to treatment initiation was improved overall (ITT analysis, hazard ratio 0.76, 95% CI 0.63–0.92, p = 0.005) and among HIV-infected participants (ITT analysis, hazard ratio 0.67, 95% CI 0.53–0.85, p = 0.001). There was no difference in 6-mo mortality with Xpert versus routine diagnosis. Study limitations included incorrect intervention allocation for a high proportion of participants and that the study was conducted in a single clinic. Conclusions These data suggest that in this routine primary care setting, use of Xpert to diagnose TB increased the number of individuals with bacteriologically confirmed TB who were treated by 3 mo and reduced time to treatment initiation, particularly among HIV-infected participants. Trial registration Pan African Clinical Trials Registry PACTR201010000255244 PMID:25423041
TU-FG-201-04: Computer Vision in Autonomous Quality Assurance of Linear Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, H; Jenkins, C; Yu, S
Purpose: Routine quality assurance (QA) of linear accelerators represents a critical and costly element of a radiation oncology center. Recently, a system was developed to autonomously perform routine quality assurance on linear accelerators. The purpose of this work is to extend this system and contribute computer vision techniques for obtaining quantitative measurements for a monthly multi-leaf collimator (MLC) QA test specified by TG-142, namely leaf position accuracy, and demonstrate extensibility for additional routines. Methods: Grayscale images of a picket fence delivery on a radioluminescent phosphor-coated phantom are captured using a CMOS camera. Collected images are processed to correct for camera distortions, rotation and alignment, reduce noise, and enhance contrast. The location of each MLC leaf is determined through logistic fitting and a priori modeling based on knowledge of the delivered beams. Using the data collected and the criteria from TG-142, a decision is made on whether or not the leaf position accuracy of the MLC passes or fails. Results: The locations of all MLC leaf edges are found for three different picket fence images in a picket fence routine to 0.1 mm (1 pixel) precision. The program to correct for image alignment and determination of leaf positions requires a runtime of 21–25 seconds for a single picket, and 44–46 seconds for a group of three pickets on a standard workstation CPU, 2.2 GHz Intel Core i7. Conclusion: MLC leaf edges were successfully found using techniques in computer vision. With the addition of computer vision techniques to the previously described autonomous QA system, the system is able to quickly perform complete QA routines with minimal human contribution.
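Locating a leaf edge by logistic fitting, as described above, amounts to fitting a sigmoid to the intensity profile across the penumbra and taking its midpoint as the sub-pixel edge position. A minimal sketch of that single step on synthetic data follows; the camera correction, beam model, and pass/fail logic of the actual system are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, low, high, x0, k):
    """Sigmoidal edge model: background 'low', open-field 'high', edge at x0."""
    return low + (high - low) / (1.0 + np.exp(-k * (x - x0)))

# Synthetic 1-D intensity profile across one leaf edge (pixel units).
x = np.arange(60, dtype=float)
true_edge = 27.4
profile = logistic(x, 120.0, 900.0, true_edge, 1.8)
profile += np.random.default_rng(0).normal(0.0, 8.0, x.size)   # detector noise

# Initial guesses: min/max intensity, steepest-gradient pixel, unit slope.
p0 = [profile.min(), profile.max(), x[np.argmax(np.gradient(profile))], 1.0]
(low, high, x0, k), _ = curve_fit(logistic, x, profile, p0=p0)
print(f"fitted edge position: {x0:.2f} px (true {true_edge} px)")
```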
Fourier and Wavelet Analysis of Coronal Time Series
NASA Astrophysics Data System (ADS)
Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.
2016-10-01
Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
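The central point above is that a confidence level is only meaningful relative to a pertinent model of the mean power spectrum; for many coronal series an AR(1) (red-noise) background is a more defensible null than white noise. The sketch below tests periodogram peaks against a lag-1 autoregressive background; it is illustrative only and does not reproduce the authors' Monte-Carlo-derived global confidence levels.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n, dt = 1024, 12.0                       # samples and cadence (s), arbitrary values
x = np.zeros(n)
for i in range(1, n):                    # synthetic AR(1) series, lag-1 coefficient 0.7
    x[i] = 0.7 * x[i - 1] + rng.normal()

freq = np.fft.rfftfreq(n, dt)[1:]
power = (np.abs(np.fft.rfft(x - x.mean())[1:]) ** 2) * 2.0 / n

# AR(1) theoretical mean spectrum (Torrence & Compo 1998 form), using the
# lag-1 autocorrelation estimated from the series, normalised to the mean power.
alpha = np.corrcoef(x[:-1], x[1:])[0, 1]
theory = (1 - alpha**2) / (1 + alpha**2 - 2 * alpha * np.cos(2 * np.pi * freq * dt))
theory *= power.mean() / theory.mean()

conf95 = theory * chi2.ppf(0.95, df=2) / 2.0   # 95% local confidence level
print("frequencies exceeding the 95% red-noise level:", freq[power > conf95])
```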
Nunamaker, T R
1983-01-01
This article provides an illustrative application of Data Envelopment Analysis (DEA) methodology to the measurement of routine nursing service efficiency at a group of Wisconsin hospitals. The DEA efficiency ratings and cost savings estimates are then compared to those resulting from application of Medicare's routine cost limitation to the sample data. DEA is also used to determine if any changes in the potential for efficient operations occurred during the 1978-1979 period. Empirical results reflected the fundamental differences between the DEA and cost-per-patient-day approaches. No evidence was found to support the notion that the overall potential for efficient delivery of routine services by the sample institutions was greater in one year than another. PMID:6874357
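DEA rates each decision-making unit by solving a small linear program; in the input-oriented CCR form, the unit's inputs are scaled down by a factor theta while a nonnegative combination of peer units must still match its outputs. A hedged sketch with made-up hospital data follows; the article's actual inputs, outputs, and sample are not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = hospitals, inputs = [nursing hours, supply cost],
# outputs = [patient days, weighted case mix].
X = np.array([[120.0, 30.0], [150.0, 40.0], [100.0, 35.0], [160.0, 50.0]])
Y = np.array([[800.0, 95.0], [820.0, 90.0], [700.0, 99.0], [850.0, 92.0]])
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(j0):
    """Input-oriented CCR efficiency of unit j0:
    min theta  s.t.  sum_j lam_j x_j <= theta * x_j0,  sum_j lam_j y_j >= y_j0,  lam >= 0.
    Variables are [theta, lam_1, ..., lam_n]."""
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[j0], X.T]            # sum_j lam_j x_ij - theta x_i,j0 <= 0
    A_out = np.c_[np.zeros(s), -Y.T]     # -sum_j lam_j y_rj <= -y_r,j0
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(m), -Y[j0]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for j in range(n):
    print(f"hospital {j}: CCR efficiency = {ccr_efficiency(j):.3f}")
```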
Pupek, Alex; Matthewson, Beverly; Whitman, Erin; Fullarton, Rachel; Chen, Yu
2017-08-28
The pneumatic tube system (PTS) is commonly used in modern clinical laboratories to provide quick specimen delivery. However, its impact on sample integrity and laboratory testing results is still debatable. In addition, each PTS installation and configuration is unique to its institution. We sought to validate our Swisslog PTS by comparing routine chemistry, hematology, coagulation and blood gas test results and sample integrity indices between duplicate samples transported either manually or by PTS. Duplicate samples were delivered to the core laboratory manually by human courier or via the Swisslog PTS. Head-to-head comparisons of 48 routine chemistry, hematology, coagulation and blood gas laboratory tests, and three sample integrity indices were conducted on 41 healthy volunteers and 61 adult patients. The PTS showed no impact on sample hemolysis, lipemia, or icterus indices (all p>0.05). Although alkaline phosphatase, total bilirubin and hemoglobin reached statistical significance (p=0.009, 0.027 and 0.012, respectively), all had very low average bias, which ranged from 0.01% to 2%. Potassium, total hemoglobin and percent deoxyhemoglobin were statistically significant for the neonatal capillary tube study (p=0.011, 0.033 and 0.041, respectively) but no biases greater than ±4% were identified for these parameters. All observed differences in these 48 laboratory tests were not clinically significant. The modern PTS investigated in this study is acceptable for reliable sample delivery for routine chemistry, hematology, coagulation and blood gas (in syringe and capillary tube) laboratory tests.
2015-01-01
...different PRBC transfusion volumes. We performed multivariate regression analysis using HRV metrics and routine vital signs to test the hypothesis that... The study sponsors did not have any role in the study design, data collection, analysis and interpretation of data, report writing, or the decision to... The primary outcome was hemorrhagic injury plus different PRBC transfusion volumes.
1984-12-01
BLOCK DATA: Default values for variables input by menus.
LIBR: Interface with frame I/O routines.
SNSR: Interface with sensor routines.
ATMOS: Interface with...
Routines included in the frame I/O interface:
LIBR: Selects options for input or output to a data library.
FRREAD: Reads frame from file and/or...
One year of campaigns in Cameroon: effects on routine health services
Mounier-Jack, Sandra; Edengue, Jean Marie; Lagarde, Mylene; Baonga, Simon Franky; Ongolo-Zogo, Pierre
2016-01-01
Background: Targeted campaigns have been reported to disrupt routine health services in low- and middle-income countries. The objective of this study was to evaluate the average effect of public health campaigns over 1 year on routine services such as antenatal care, routine vaccination and outpatient services. Method: We collected daily activity data in 60 health facilities in two regions of Cameroon that traditionally undergo different intensities of campaign activity, the Centre region (low) and the Far North (high), to ascertain effects on routine services. For each outcome, we restricted our analysis to the public health centres for which good data were available and excluded private health facilities given their small number. We used segment-linear regression to account for the longitudinal nature of the data, and assessed whether the number of routine activities decreased in health facilities during periods when campaigns occurred. The analysis controlled for secular trends and serial correlation. Results: We found evidence that vaccination campaigns had a negative impact on routine activities, decreasing outpatient visits when they occurred (Centre: −9.9%, P = 0.079; Far North: −11.6%, P = 0.025). The average negative effect on routine services [outpatient visits −18% (P = 0.02) and antenatal consultations −70% [P = 0.001]) was most pronounced in the Far North during ‘intensive’ campaigns that usually require high mobilization of staff. Discussion: With an increasing number of interventions delivered by campaigns and in the context of elimination and eradication targets, these are important results for countries and agencies to consider. Achieving disease control targets hinges on ensuring high uptake of routine services. Therefore, we suggest that campaigns should systematically monitor ‘impact on routine services’, while also devising concrete strategies to mitigate potential adverse effects. PMID:27175031
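The analysis described above is an interrupted (segmented) time-series regression: daily counts of routine activities are regressed on a secular time trend plus an indicator for campaign days, with standard errors adjusted for serial correlation. A hedged sketch on simulated data follows; it is not the study's actual model specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
days = np.arange(365)
# Two hypothetical 10-day campaign periods during the year.
campaign = ((days >= 120) & (days < 130)) | ((days >= 250) & (days < 260))
visits = 40 + 0.01 * days - 5.0 * campaign + rng.normal(0, 4, days.size)

# OLS with a time trend and a campaign indicator; HAC (Newey-West) errors
# account for serial correlation in the daily series.
X = sm.add_constant(np.column_stack([days, campaign.astype(float)]))
model = sm.OLS(visits, X).fit(cov_type="HAC", cov_kwds={"maxlags": 7})
print(f"daily trend: {model.params[1]:+.3f} visits/day")
print(f"average change on campaign days: {model.params[2]:+.2f} visits "
      f"(p = {model.pvalues[2]:.3f})")
```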
Operational Research: Evaluating Multimodel Implementations for 24/7 Runtime Environments
NASA Astrophysics Data System (ADS)
Burkhart, J. F.; Helset, S.; Abdella, Y. S.; Lappegard, G.
2016-12-01
We present a new open source framework for operational hydrologic rainfall-runoff modeling. The Statkraft Hydrologic Forecasting Toolbox (Shyft) is distinct from existing frameworks in that two primary goals are to provide: i) modern, professionally developed source code, and ii) a platform that is robust and ready for operational deployment. Developed jointly between Statkraft AS and The University of Oslo, the framework is currently in operation in both private and academic environments. The hydrology presently available in the distribution is simple and proven. Shyft provides a platform for distributed hydrologic modeling in a highly efficient manner. In its current operational deployment at Statkraft, Shyft is used to provide daily 10-day forecasts for critical reservoirs. In a research setting, we have developed a novel implementation of the SNICAR model to assess the impact of aerosol deposition on snow packs. Several well-known rainfall-runoff algorithms are available for use, allowing for intercomparing different approaches based on available data and the geographical environment. The well-known HBV model is a default option, and other routines with more localized methods handling snow and evapotranspiration, or simplifications of catchment scale processes, are included. For the latter, we have implemented the Kirchner response routine. Because the framework is developed in Norway, a variety of snow-melt routines, including simplified degree-day models or more advanced energy balance models, may be selected. Ensemble forecasts, multi-model implementations, and statistical post-processing routines enable a robust toolbox for investigating optimal model configurations in an operational setting. The Shyft core is written in modern templated C++ and has Python wrappers developed for easy access to module sub-routines. The code is developed such that the modules that make up a "method stack" are easy to modify and customize, allowing one to create new methods and test them rapidly. Due to the simple architecture and ease of access to the module routines, we see Shyft as an optimal choice to evaluate new hydrologic routines in an environment requiring robust, professionally developed software and welcome further community participation.
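The "method stack" mentioned above assembles a catchment model from interchangeable routines (snow, evapotranspiration, response), so that alternatives can be swapped and compared. A minimal Python sketch of that composition pattern follows; the routine equations are drastically simplified placeholders, not Shyft's implementations.

```python
from dataclasses import dataclass

@dataclass
class DegreeDaySnow:
    """Toy snow routine: melt proportional to degrees above freezing."""
    melt_factor: float = 3.0  # mm / (degC day)
    swe: float = 0.0          # snow water equivalent (mm)

    def step(self, precip_mm, temp_c):
        if temp_c <= 0.0:
            self.swe += precip_mm
            return 0.0
        melt = min(self.swe, self.melt_factor * temp_c)
        self.swe -= melt
        return precip_mm + melt  # liquid water reaching the ground

@dataclass
class KirchnerLikeResponse:
    """Toy storage-discharge response (stand-in for a Kirchner-type routine)."""
    storage: float = 10.0
    k: float = 0.05

    def step(self, recharge_mm):
        self.storage += recharge_mm
        discharge = self.k * self.storage
        self.storage -= discharge
        return discharge

@dataclass
class MethodStack:
    snow: DegreeDaySnow
    response: KirchnerLikeResponse

    def run(self, precip, temp):
        return [self.response.step(self.snow.step(p, t))
                for p, t in zip(precip, temp)]

stack = MethodStack(DegreeDaySnow(), KirchnerLikeResponse())
flow = stack.run(precip=[5, 0, 10, 2, 0, 0], temp=[-2, -1, 1, 3, 5, 6])
print([round(q, 2) for q in flow])
```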
User's Manual: Routines for Radiative Heat Transfer and Thermometry
NASA Technical Reports Server (NTRS)
Risch, Timothy K.
2016-01-01
Determining the intensity and spectral distribution of radiation emanating from a heated surface has applications in many areas of science and engineering. Areas of research in which the quantification of spectral radiation is used routinely include thermal radiation heat transfer, infrared signature analysis, and radiation thermometry. In the analysis of radiation, it is helpful to be able to predict the radiative intensity and the spectral distribution of the emitted energy. Presented in this report is a set of routines written in Microsoft Visual Basic for Applications (VBA) (Microsoft Corporation, Redmond, Washington) and incorporating functions specific to Microsoft Excel (Microsoft Corporation, Redmond, Washington) that are useful for predicting the radiative behavior of heated surfaces. These routines include functions for calculating quantities of primary importance to engineers and scientists. In addition, the routines also provide the capability to use such information to determine surface temperatures from spectral intensities and for calculating the sensitivity of the surface temperature measurements to unknowns in the input parameters.
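The core calculation behind radiation thermometry is the Planck spectral radiance and its inversion: given a measured spectral intensity and an assumed emissivity, solve for the surface temperature. The report's VBA routines are not reproduced here; the following is a hedged Python sketch of that relationship.

```python
import math
from scipy.optimize import brentq

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance (W / m^2 / sr / m) at one wavelength."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / math.expm1(b)

def radiance_temperature(measured, wavelength_m, emissivity=1.0):
    """Invert emissivity * Planck(T) = measured radiance for T (K)."""
    f = lambda t: emissivity * planck_radiance(wavelength_m, t) - measured
    return brentq(f, 100.0, 6000.0)

# Round trip at 0.9 um for a 1500 K graybody with emissivity 0.85.
lam = 0.9e-6
signal = 0.85 * planck_radiance(lam, 1500.0)
print(f"recovered temperature: {radiance_temperature(signal, lam, 0.85):.1f} K")
```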
Larsen, V H; Waldau, T; Gravesen, H; Siggaard-Andersen, O
1996-01-01
To describe a clinical case where an extremely low erythrocyte 2,3-diphosphoglycerate concentration (2,3-DPG) was discovered by routine blood gas analysis supplemented by computer calculation of derived quantities. The finding of a low 2,3-DPG revealed a severe hypophosphatemia. Open, uncontrolled study of a patient case. Intensive care observation over 41 days. A 44-year-old woman with an abdominal abscess. Surgical drainage, antibiotics and parenteral nutrition. Daily routine blood gas analyses with computer calculation of the hemoglobin oxygen affinity and estimation of the 2,3-DPG. An abrupt decline of 2,3-DPG was observed late in the course, coincident with a pronounced hypophosphatemia. The fall in 2,3-DPG was verified by enzymatic analysis. 2,3-DPG may be estimated by computer calculation of routine blood gas data. A low 2,3-DPG, which may be associated with hypophosphatemia, causes an unfavorable increase in hemoglobin oxygen affinity, which reduces oxygen release to the tissues.
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of a simple plotting routine is sufficient. With the Postscript language becoming popular, there are more and more Postscript laser printers now available. Simple, versatile, low cost plotting routines that can generate output on high quality laser printers are needed, and standard FORTRAN plotting routines that generate output in the Postscript language seem a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the Postscript language.
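A routine of the kind described above simply writes PostScript drawing commands (moveto/lineto/stroke) as plain text. The report's FORTRAN source is not reproduced here; the following is a hedged Python sketch of the same idea for a single polyline plot.

```python
def write_ps_plot(filename, xs, ys, width=468, height=324, margin=54):
    """Write a minimal single-curve PostScript plot (coordinates in points)."""
    xmin, xmax, ymin, ymax = min(xs), max(xs), min(ys), max(ys)
    sx = lambda x: margin + (x - xmin) / (xmax - xmin) * width
    sy = lambda y: margin + (y - ymin) / (ymax - ymin) * height
    with open(filename, "w") as f:
        f.write("%!PS-Adobe-3.0\n")
        # Axes box around the plotting area.
        f.write(f"newpath {margin} {margin} moveto "
                f"{margin + width} {margin} lineto "
                f"{margin + width} {margin + height} lineto "
                f"{margin} {margin + height} lineto closepath stroke\n")
        # Data polyline.
        f.write(f"newpath {sx(xs[0]):.1f} {sy(ys[0]):.1f} moveto\n")
        for x, y in zip(xs[1:], ys[1:]):
            f.write(f"{sx(x):.1f} {sy(y):.1f} lineto\n")
        f.write("stroke\nshowpage\n")

xs = [i * 0.1 for i in range(101)]
ys = [x * x for x in xs]
write_ps_plot("curve.ps", xs, ys)
```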
Framework GRASP: routine library for optimize processing of aerosol remote sensing observation
NASA Astrophysics Data System (ADS)
Fuertes, David; Torres, Benjamin; Dubovik, Oleg; Litvinov, Pavel; Lapyonok, Tatyana; Ducos, Fabrice; Aspetsberger, Michael; Federspiel, Christian
We present the development of a framework for the Generalized Retrieval of Aerosol and Surface Properties (GRASP) algorithm developed by Dubovik et al. (2011). The framework is a source code project that attempts to strengthen the value of the GRASP inversion algorithm by transforming it into a library that will be used later for a group of customized application modules. The functions of the independent modules include managing the configuration of the code execution, as well as preparing the input and output. The framework provides a number of advantages in utilization of the code. First, it implements loading data to the core of the scientific code directly from memory without passing through intermediary files on disk. Second, the framework allows consecutive use of the inversion code without the re-initiation of the core routine when new input is received. These features are essential for optimizing performance of the data production in processing of large observation sets, such as satellite images, by GRASP. Furthermore, the framework is a very convenient tool for further development, because this open-source platform is easily extended for implementing new features. For example, it could accommodate loading of raw data directly into the inversion code from a specific instrument not included in the default settings of the software. Finally, it will be demonstrated that from the user's point of view, the framework provides a flexible, powerful and informative configuration system.
Implementation Science Supports Core Clinical Competencies: An Overview and Clinical Example.
Kirchner, JoAnn E; Woodward, Eva N; Smith, Jeffrey L; Curran, Geoffrey M; Kilbourne, Amy M; Owen, Richard R; Bauer, Mark S
2016-12-08
Instead of asking clinicians to work faster or longer to improve quality of care, implementation science provides another option. Implementation science is an emerging interdisciplinary field dedicated to studying how evidence-based practice can be adopted into routine clinical care. This article summarizes principles and methods of implementation science, illustrates how they can be applied in a routine clinical setting, and highlights their importance to practicing clinicians as well as clinical trainees. A hypothetical clinical case scenario is presented that explains how implementation science improves clinical practice. The case scenario is also embedded within a real-world implementation study to improve metabolic monitoring for individuals prescribed antipsychotics. Context, recipient, and innovation (ie, the evidence-based practice) factors affected improvement of metabolic monitoring. To address these factors, an external facilitator and a local quality improvement team developed an implementation plan involving a multicomponent implementation strategy that included education, performance reports, and clinician follow-up. The clinic remained compliant with recommended metabolic monitoring at 1-year follow up. Implementation science improves clinical practice by addressing context, recipient, and innovation factors and uses this information to develop and utilize specific strategies that improve clinical practice. It also enriches clinical training, aligning with core competencies by the Accreditation Council for Graduate Medical Education and American Boards of Medical Specialties. By learning how to change clinical practice through implementation strategies, clinicians are more able to adapt in complex systems of practice. © Copyright 2016 Physicians Postgraduate Press, Inc.
The significance of routines in nursing practice.
Rytterström, Patrik; Unosson, Mitra; Arman, Maria
2011-12-01
The aim of this study was to illuminate the significance of routines in nursing practice. Clinical nursing is performed under the guidance of routines to varying degrees. In the nursing literature, routine is described as having both negative and positive aspects, but use of the term is inconsistent, and empirical evidence is sparse. In the research on organisational routines, a distinction is made between routine as a rule and routine as action. A qualitative design using a phenomenological-hermeneutic approach. Data collection from three focus groups focused on nurses' experience of routines. Seventeen individual interviews from a previous study focusing on caring culture were also analysed in a secondary qualitative analysis. All participants were employed as 'qualified nursing pool' nurses. Routines are experienced as pragmatic, obstructive and meaningful. The aim of the pragmatic routine was to ensure that daily working life works; this routine is practised more on the basis of rational arguments and obvious intentions. The obstructive routine had negative consequences for nursing practice and was described as nursing losing its humanity and violating the patient's integrity. The meaningful routine involved becoming one with the routine and for the nurses, it felt right and meaningful to adapt to it. Routines become meaningful when the individual action is in harmony with the cultural pattern on which the nursing work is based. Instead of letting contemporary practice passively become routine, routines can be assessed and developed using research and theoretical underpinnings as a starting point for nursing practice. Leaders have a special responsibility to develop and support meaningful routines. One approach could be to let wards examine their routines from a patient perspective on the basis of the themes of pragmatic, meaningful and obstructive routine. © 2010 Blackwell Publishing Ltd.
Yamada, Kiyofumi; Song, Yan; Hippe, Daniel S; Sun, Jie; Dong, Li; Xu, Dongxiang; Ferguson, Marina S; Chu, Baocheng; Hatsukami, Thomas S; Chen, Min; Zhou, Cheng; Yuan, Chun
2012-11-29
Carotid intraplaque hemorrhage (IPH) and lipid rich necrotic core (LRNC) have been associated with accelerated plaque growth, luminal narrowing, future surface disruption and development of symptomatic events. The aim of this study was to evaluate the quantitative relationships between high intensity signals (HIS) in the plaque on TOF-MRA and IPH or LRNC volumes as measured by multicontrast weighted CMR. Seventy six patients with a suspected carotid artery stenosis or carotid plaque by ultrasonography underwent multicontrast carotid CMR. HIS presence and volume were measured from TOF-MRA MIP images while IPH and LRNC volumes were separately measured from multicontrast CMR. For detecting IPH, HIS on MIP images overall had high specificity (100.0%, 95% CI: 93.0 - 100.0%) but relatively low sensitivity (32%, 95% CI: 20.8 - 47.9%). However, the sensitivity had a significant increasing relationship with underlying IPH volume (p = 0.033) and degree of stenosis (p = 0.022). Mean IPH volume was 2.7 times larger in those with presence of HIS than in those without (142.8 ± 97.7 mm(3) vs. 53.4 ± 56.3 mm(3), p = 0.014). Similarly, mean LRNC volume was 3.4 times larger in those with HIS present (379.8 ± 203.4 mm(3) vs. 111.3 ± 122.7 mm(3), p = 0.001). There was a strong correlation between the volume of the HIS region and the IPH volume measured from multicontrast CMR (r = 0.96, p < 0.001). MIP images are easily reformatted from three minute, routine, clinical TOF sequences. High intensity signals in carotid plaque on TOF-MRA MIP images are associated with increased intraplaque hemorrhage and lipid-rich necrotic core volumes. The technique is most sensitive in patients with moderate to severe stenosis.
Shahzad, Khalid; Menon, Ashok; Turner, Paul; Ward, Jeremy; Pursnani, Kishore; Alkhaffaf, Bilal
2015-08-01
The prompt recognition of complications is essential in reducing morbidity following anti-reflux surgery. Consequently, many centres employ a policy of routine post-operative contrast studies. The study aimed to examine whether routine contrast studies more effectively recognised early post-operative complications following anti-reflux surgery compared with selective use. This was a retrospective analysis of 240 adults who had undergone primary anti-reflux surgery. Selective use of water-soluble contrast swallows was employed for 115 patients (Group 1) while 125 patients (Group 2) had routine studies. 10 (0.9%) patients from Group 1 underwent contrast studies, four (40%) of which were abnormal. Routine studies in Group 2 identified thirty-two abnormalities (27%) however the inter-group difference was not significant (p = 0.32). Only one case from group 2 required immediate re-intervention. This was not statistically significant (p = 0.78). Multivariate analysis found no significant association between selective or routine imaging and re-intervention rates. One patient from group 2 presented three days following discharge with wrap migration requiring reoperation despite a normal post-operative study. Routine use of contrast imaging following anti-reflux and hiatus hernia surgery is not necessary. It does not identify a significantly greater number of post-operative complications in comparison to selective use. Additionally, routine use of contrast studies does not ensure the diagnosis of all complications in the post-operative period. Copyright © 2015 IJS Publishing Group Limited. Published by Elsevier Ltd. All rights reserved.
Deeny, Sarah R; Steventon, Adam
2015-01-01
Socrates described a group of people chained up inside a cave, who mistook shadows of objects on a wall for reality. This allegory comes to mind when considering ‘routinely collected data’—the massive data sets, generated as part of the routine operation of the modern healthcare service. There is keen interest in routine data and the seemingly comprehensive view of healthcare they offer, and we outline a number of examples in which they were used successfully, including the Birmingham OwnHealth study, in which routine data were used with matched control groups to assess the effect of telephone health coaching on hospital utilisation. Routine data differ from data collected primarily for the purposes of research, and this means that analysts cannot assume that they provide the full or accurate clinical picture, let alone a full description of the health of the population. We show that major methodological challenges in using routine data arise from the difficulty of understanding the gap between patient and their ‘data shadow’. Strategies to overcome this challenge include more extensive data linkage, developing analytical methods and collecting more data on a routine basis, including from the patient while away from the clinic. In addition, creating a learning health system will require greater alignment between the analysis and the decisions that will be taken; between analysts and people interested in quality improvement; and between the analysis undertaken and public attitudes regarding appropriate use of data. PMID:26065466
NASA Astrophysics Data System (ADS)
Ridgeway, William K.; Millar, David P.; Williamson, James R.
2013-04-01
Fluorescence Correlation Spectroscopy (FCS) is widely used to quantify reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes.
Catalogue identifier: AEOP_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOP_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 50189
No. of bytes in distributed program, including test data, etc.: 6135283
Distribution format: tar.gz
Programming language: C/Assembly.
Computer: Any with GCC and library support.
Operating system: Linux and OS X (data acq. for Linux only due to library availability), not tested on Windows.
RAM: ≥512 MB.
Classification: 16.4.
External routines: NIDAQmx (National Instruments), Gnu Scientific Library, GTK+, PLplot (optional)
Nature of problem: Fluorescence Triple Correlation Spectroscopy required three things: data acquisition at faster speeds than were possible without expensive custom hardware, triple-correlation routines that could process 1/2 TB data sets rapidly, and fitting routines capable of handling several to a hundred fit parameters and 14,000+ data points, each with error estimates.
Solution method: A novel data acquisition concept mixed signal processing with off-the-shelf hardware and data-parallel processing using 128-bit registers found in desktop CPUs. Correlation algorithms used fractal data structures and multithreading to reduce data analysis times. Global fitting was implemented with robust minimization routines and provides feedback that allows the user to critically inspect initial guesses and fits.
Restrictions: Data acquisition only requires a National Instruments data acquisition card (it was tested on Linux using card PCIe-6251) and a simple home-built circuit.
Unusual features: Hand-coded x86-64 assembly for data acquisition loops (platform-independent C code also provided).
Additional comments: A complete collection of tools to perform Fluorescence Triple Correlation Spectroscopy, from data acquisition to two-tau correlation of large data sets, to model fitting.
Running time: 1-5 h of data analysis per hour of data collected. Varies depending on data-acquisition length, time resolution, data density and number of cores used for correlation integrals.
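The two-delay-time correlation at the heart of the Toolbox can be sketched in a few lines. The Python fragment below is an illustration only, not the distributed code (which is C and assembly); the function and variable names are invented. It computes a naive triple correlation G3(tau1, tau2) on toy photon-count traces over a quasi-logarithmic (multiple-tau-style) lag grid:

```python
import numpy as np

def triple_correlation(i1, i2, i3, taus):
    """Naive two-delay-time triple correlation G3(tau1, tau2) of three
    discretely sampled intensity traces (toy illustration only)."""
    d1, d2, d3 = i1 - i1.mean(), i2 - i2.mean(), i3 - i3.mean()
    norm = i1.mean() * i2.mean() * i3.mean()
    n = len(i1)
    g3 = np.empty((len(taus), len(taus)))
    for a, t1 in enumerate(taus):
        for b, t2 in enumerate(taus):
            m = n - max(t1, t2)  # overlap length for this lag pair
            g3[a, b] = np.mean(d1[:m] * d2[t1:t1 + m] * d3[t2:t2 + m]) / norm
    return g3

# toy data: three detector channels sharing a fluctuating component
rng = np.random.default_rng(0)
shared = rng.poisson(5, 100_000).astype(float)
chans = [shared + rng.poisson(2, shared.size) for _ in range(3)]
lags = np.unique(np.round(np.logspace(0, 3, 16)).astype(int))  # quasi multiple-tau spacing
print(triple_correlation(*chans, lags).shape)
```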
Lei, Lei; Chen, Daqin; Huang, Ping; Xu, Ju; Zhang, Rui; Wang, Yuansheng
2013-11-21
NaGdF4 is regarded as an ideal upconversion (UC) host material for lanthanide (Ln(3+)) activators because of its unique crystal structure, high Ln(3+) solubility, low phonon energy and high photochemical stability, and Ln(3+)-doped NaGdF4 UC nanocrystals (NCs) have been widely investigated as bio-imaging and magnetic resonance imaging agents recently. To realize their practical applications, controlling the size and uniformity of the monodisperse Ln(3+)-doped NaGdF4 UC NCs is highly desired. Unlike routine routes that finely adjust multiple experimental parameters, herein we provide a facile and straightforward strategy to modify the size and uniformity of NaGdF4 NCs via alkaline-earth doping for the first time. With the increase of alkaline-earth doping content, the size of the NaGdF4 NCs increases gradually, while the size uniformity is retained. We attribute this "focusing" of the size distribution to the diffusion-controlled growth of NaGdF4 NCs induced by alkaline-earth doping. Importantly, adopting the Ca(2+)-doped Yb/Er:NaGdF4 NCs as cores, complete Ca/Yb/Er:NaGdF4@NaYF4 core-shell particles with excellent size uniformity can be easily achieved. However, when the Yb/Er:NaGdF4 NCs without Ca(2+) doping are taken as cores, they are not perfectly covered by NaYF4 shells, and the obtained products are non-uniform in size. As a result, the UC emission intensity of the complete core-shell NCs increases by about 30 times in comparison with that of the cores, owing to the effective surface passivation of the Ca(2+)-doped cores and therefore protection of Er(3+) in the cores from the non-radiative decay caused by surface defects, whereas the UC intensity of the incomplete core-shell NCs is enhanced by only 3 times.
Computing Lives And Reliabilities Of Turboprop Transmissions
NASA Technical Reports Server (NTRS)
Coy, J. J.; Savage, M.; Radil, K. C.; Lewicki, D. G.
1991-01-01
Computer program PSHFT calculates lifetimes of a variety of aircraft transmissions. Consists of main program, series of subroutines applying to specific configurations, generic subroutines for analysis of properties of components, subroutines for analysis of system, and common block. Main program selects routines used in analysis and causes them to operate in desired sequence. Series of configuration-specific subroutines read in configuration data, perform force and life analyses for components (with help of generic component-property-analysis subroutines), fill property array, call up system-analysis routines, and finally print out results of analysis for system and components. Written in FORTRAN 77(IV).
Ekelin, Maria; Crang-Svalenius, Elizabeth; Nordström, Berit; Dykes, Anna-Karin
2008-01-01
To conceptualize women's and their partners' experiences and ways of handling the situation before, during, and after second trimester ultrasound examination with the diagnosis of a nonviable fetus. A grounded theory study. A Swedish regional hospital. Nine women and 6 men (n=15) were interviewed within a year of the event. The core category was Unexpected change in life. Four categories that were encompassed by the core category emerged: (a) Deceived by a false sense of security; (b) Confronting reality; (c) Grieving; and (d) Reorientation. These parents were unprepared for the diagnosis of a nonviable fetus. In addition to the crisis reaction, they realized that the sense of security they had experienced was false. As different care givers were involved, the need for a care plan was evident. Support from care givers was a very important factor.
Safety pharmacology--current and emerging concepts.
Hamdam, Junnat; Sethu, Swaminathan; Smith, Trevor; Alfirevic, Ana; Alhaidari, Mohammad; Atkinson, Jeffrey; Ayala, Mimieveshiofuo; Box, Helen; Cross, Michael; Delaunois, Annie; Dermody, Ailsa; Govindappa, Karthik; Guillon, Jean-Michel; Jenkins, Rosalind; Kenna, Gerry; Lemmer, Björn; Meecham, Ken; Olayanju, Adedamola; Pestel, Sabine; Rothfuss, Andreas; Sidaway, James; Sison-Young, Rowena; Smith, Emma; Stebbings, Richard; Tingle, Yulia; Valentin, Jean-Pierre; Williams, Awel; Williams, Dominic; Park, Kevin; Goldring, Christopher
2013-12-01
Safety pharmacology (SP) is an essential part of the drug development process that aims to identify and predict adverse effects prior to clinical trials. SP studies are described in the International Conference on Harmonisation (ICH) S7A and S7B guidelines. The core battery and supplemental SP studies evaluate effects of a new chemical entity (NCE) at both anticipated therapeutic and supra-therapeutic exposures on major organ systems, including cardiovascular, central nervous, respiratory, renal and gastrointestinal. This review outlines the current practices and emerging concepts in SP studies including frontloading, parallel assessment of core battery studies, use of non-standard species, biomarkers, and combining toxicology and SP assessments. Integration of the newer approaches to routine SP studies may significantly enhance the scope of SP by refining and providing mechanistic insight to potential adverse effects associated with test compounds. Copyright © 2013 Elsevier Inc. All rights reserved.
Boundary perturbations coupled to core 3/2 tearing modes on the DIII-D tokamak
NASA Astrophysics Data System (ADS)
Tobias, B.; Yu, L.; Domier, C. W.; Luhmann, N. C., Jr.; Austin, M. E.; Paz-Soldan, C.; Turnbull, A. D.; Classen, I. G. J.; the DIII-D Team
2013-09-01
High confinement (H-mode) discharges on the DIII-D tokamak are routinely subject to the formation of long-lived, non-disruptive magnetic islands that degrade confinement and limit fusion performance. Simultaneous, 2D measurement of electron temperature fluctuations in the core and edge regions allows for reconstruction of the radially resolved poloidal mode number spectrum and phase of the global plasma response associated with these modes. Coherent, n = 2 excursions of the plasma boundary are found to be the result of coupling to an n = 2, kink-like mode which arises locked in phase to the 3/2 island chain. This coupling dictates the relative phase of the displacement at the boundary with respect to the tearing mode. This unambiguous phase relationship, for which no counter-examples are observed, is presented as a test for modeling of the perturbed fields to be expected outside the confined plasma.
Opportunities in low-level radiocarbon microtracing: applications and new technology
Vuong, Le Thuy; Song, Qi; Lee, Hee Joo; Roffel, Ad F; Shin, Seok-Ho; Shin, Young G; Dueker, Stephen R
2016-01-01
14C-radiolabeled (radiocarbon) drug studies are central to defining the disposition of therapeutics in clinical development. Concerns over radiation, however, have dissuaded investigators from conducting these studies as often as their utility may merit. Accelerator mass spectrometry (AMS), originally designed for carbon dating and geochronology, has changed the outlook for in-human radiolabeled testing. The high sensitivity of AMS affords human clinical testing with vastly reduced radiation exposure (microtracing) and chemical exposure (microdosing). Early iterations of AMS were unsuitable for routine biomedical use due to the instruments' large size and associated per-sample costs. The situation is changing with advances in the core and peripheral instrumentation. We review the important milestones in applied AMS research and recent advances in the core technology platform. We also look ahead to an entirely new class of 14C detection systems that use lasers to measure carbon dioxide in small gas cells. PMID:28031933
Dong, Ming-Ming; Wang, Cheng-Wei; Wu, Zheng-Xiang; Zhang, Yang; Pan, Huai-Hai; Zhao, Quan-Zhong
2013-07-01
We report on the fabrication of stress-induced optical channel waveguides and waveguide splitters with laser-depressed cladding by femtosecond laser. The laser beam was focused into neodymium doped phosphate glass by an objective producing a destructive filament. By moving the sample along an enclosed routine in the horizontal plane followed by a minor descent less than the filament length in the vertical direction, a cylinder with rarified periphery and densified center region was fabricated. Lining up the segments in partially overlapping sequence enabled waveguiding therein. The refractive-index contrast, near- and far-field mode distribution and confocal microscope fluorescence image of the waveguide were obtained. 1-to-2, 1-to-3 and 1-to-4 splitters were also machined with adjustable splitting ratio. Compared with traditional femtosecond laser writing methods, waveguides prepared by this approach showed controllable mode conduction, strong field confinement, large numerical aperture, low propagation loss and intact core region.
Personal finances for the physician: a primer on maintaining and protecting your earnings.
Hill, Austin D; Ortega, Marc E; Williams, Anthony C
2014-07-01
Personal finance is a key component to your success as a physician. Your clinical practice does not exist in a vacuum unaffected by circumstances and decisions in your personal life. Though some events in your personal life that can negatively affect your practice are random and unavoidable, consistently making sound decisions regarding your personal life and finances will allow you to continue practicing at a high level. Most core principles of personal finance are common sense and do not involve high-level math. Although the concepts are straightforward, people, including physicians, routinely fail to make good decisions at the most elementary level. The core common sense principles for financial success are: do not get divorced, manage your own money, live in a state without state income tax, and drive an old car. Follow these tenets and the path to a successful and satisfactory retirement will be smooth.
Invasive meningococcal disease epidemiology and control measures: a framework for evaluation.
Caro, J Jaime; Möller, Jörgen; Getsios, Denis; Coudeville, L; El-Hadi, Wissam; Chevat, Catherine; Nguyen, Van Hung; Caro, Ingrid
2007-06-29
Meningococcal disease can have devastating consequences. As new vaccines emerge, it is necessary to assess their impact on public health. In the absence of long-term real world data, modeling the effects of different vaccination strategies is required. Discrete event simulation provides a flexible platform with which to conduct such evaluations. A discrete event simulation of the epidemiology of invasive meningococcal disease was developed to quantify the potential impact of implementing routine vaccination of adolescents in the United States with a quadrivalent conjugate vaccine protecting against serogroups A, C, Y, and W-135. The impact of vaccination is assessed including both the direct effects on individuals vaccinated and the indirect effects resulting from herd immunity. The simulation integrates a variety of epidemiologic and demographic data, with core information on the incidence of invasive meningococcal disease and outbreak frequency derived from data available through the Centers for Disease Control and Prevention. Simulation of the potential indirect benefits of vaccination resulting from herd immunity draws on data from the United Kingdom, where routine vaccination with a conjugate vaccine has been in place for a number of years. Cases of disease are modeled along with their health consequences, as is the occurrence of disease outbreaks. When run without a strategy of routine immunization, the simulation accurately predicts the age-specific incidence of invasive meningococcal disease and the site-specific frequency of outbreaks in the United States. 2,807 cases are predicted annually, resulting in over 14,000 potential life years lost due to invasive disease. In base case analyses of routine vaccination, life years lost due to infection are reduced by over 45% (to 7,600) when routinely vaccinating adolescents 12 years of age at 70% coverage. Sensitivity analyses indicate that herd immunity plays an important role when this population is targeted for vaccination. While 1,100 cases are avoided annually when herd immunity effects are included, in the absence of any herd immunity, the number of cases avoided with routine vaccination falls to 380 annually. The duration of vaccine protection also strongly influences results. In the absence of appropriate real world data on outcomes associated with large-scale vaccination programs, decisions on optimal immunization strategies can be aided by discrete event simulations such as the one described here. Given the importance of herd immunity on outcomes associated with routine vaccination, published estimates of the economic efficiency of routine vaccination with a quadrivalent conjugate vaccine in the United States may have considerably underestimated the benefits associated with a policy of routine immunization of adolescents.
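As a rough illustration of the discrete event simulation machinery described above (not the published model), the sketch below uses a priority queue of timed events to draw invasive-disease cases at a constant annual incidence; the population size, incidence value and event types are placeholder assumptions:

```python
import heapq, random

def simulate(n_people=10_000, incidence=0.5 / 100_000, years=10, seed=1):
    """Minimal discrete event simulation skeleton: schedule infection events
    from a constant annual incidence and tally cases over the horizon.
    (Illustrative only; rates and event structure are placeholders.)"""
    random.seed(seed)
    events = []  # (time in years, event kind, person id)
    for person in range(n_people):
        # exponential waiting time to invasive disease at the given incidence
        t = random.expovariate(incidence)
        if t < years:
            heapq.heappush(events, (t, "infection", person))
    cases = 0
    while events:
        t, kind, person = heapq.heappop(events)
        if kind == "infection":
            cases += 1
            # follow-on events (sequelae, outbreak spread, vaccination effects,
            # ...) would be scheduled back onto the queue here
    return cases

print(simulate())
```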
Accelerating the Mining of Influential Nodes in Complex Networks through Community Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halappanavar, Mahantesh; Sathanur, Arun V.; Nandi, Apurba
Computing a set of influential nodes of a given size to ensure maximal spread of influence on a complex network is a challenging problem impacting multiple applications. A rigorous approach to influence maximization involves utilization of optimization routines that come with a high computational cost. In this work, we propose to exploit the existence of communities in complex networks to accelerate the mining of influential seeds. We provide intuitive reasoning to explain why our approach should be able to provide speedups without significantly degrading the extent of the spread of influence when compared to the case of influence maximization without using the community information. Additionally, we have parallelized the complete workflow by leveraging an existing parallel implementation of the Louvain community detection algorithm. We then conduct a series of experiments on a dataset with three representative graphs to first verify our implementation and then demonstrate the speedups. Our method achieves speedups ranging from 3x to 28x for graphs with a small number of communities while nearly matching or even exceeding the activation performance on the entire graph. Complexity analysis reveals that dramatic speedups are possible for larger graphs that contain a correspondingly larger number of communities. In addition to the speedups obtained from the utilization of the community structure, scalability results show up to 6.3x speedup on 20 cores relative to the baseline run on 2 cores. Finally, current limitations of the approach are outlined along with the planned next steps.
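The acceleration idea, restricting the expensive greedy seed search to a small candidate pool drawn from each community, can be sketched as follows. This Python toy assumes the communities are already known (the paper uses a parallel Louvain implementation) and estimates independent-cascade spread by Monte Carlo; all names and values are illustrative:

```python
import random
from collections import defaultdict

def ic_spread(graph, seeds, p=0.05, trials=200, seed=0):
    """Monte Carlo estimate of independent-cascade spread from a seed set."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            nxt = []
            for u in frontier:
                for v in graph[u]:
                    if v not in active and rng.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        total += len(active)
    return total / trials

def greedy_seeds(graph, candidates, k):
    """Plain greedy influence maximization restricted to a candidate set."""
    seeds = []
    for _ in range(k):
        best = max((c for c in candidates if c not in seeds),
                   key=lambda c: ic_spread(graph, seeds + [c]))
        seeds.append(best)
    return seeds

def community_accelerated(graph, communities, k):
    """Keep only the top-degree nodes of each community, then run greedy over
    that much smaller pool -- loosely mirroring the speedup idea above."""
    pool = []
    for comm in communities:
        pool += sorted(comm, key=lambda n: len(graph[n]), reverse=True)[:k]
    return greedy_seeds(graph, pool, k)

# toy graph: two dense communities joined by a single bridge edge
graph = defaultdict(list)
edges = [(0, 1), (0, 2), (1, 2), (2, 3),   # community A
         (4, 5), (4, 6), (5, 6), (3, 4)]   # community B + bridge
for u, v in edges:
    graph[u].append(v)
    graph[v].append(u)
print(community_accelerated(graph, [{0, 1, 2, 3}, {4, 5, 6}], k=2))
```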
NASA Astrophysics Data System (ADS)
Zhang, Penghui; Zhang, Jinliang; Wang, Jinkai; Li, Ming; Liang, Jie; Wu, Yingli
2018-05-01
Flow units classification can be used in reservoir characterization. In addition, characterizing the reservoir interval into flow units is an effective way to simulate the reservoir. Paraflow units (PFUs), the second level of flow units, are used to estimate the spatial distribution of continental clastic reservoirs at the detailed reservoir description stage. In this study, we investigate a nonroutine methodology to predict the external and internal distribution of PFUs. The methodology outlined enables the classification of PFUs using sandstone core samples and log data. The relationships between porosity, permeability and pore throat aperture radii (r35) values were established for core and log data obtained from 26 wells from the Funing Formation, Gaoji Oilfield, Subei Basin, China. The present study refines predicted PFUs at logged (0.125-m) intervals, a scale much smaller than that of routine methods. Meanwhile, three-dimensional models are built using sequential indicator simulation to characterize PFUs in wells. Four distinct PFUs are classified and located based on the statistical methodology of cluster analysis, and each PFU has different seepage ability. The results of this study demonstrate that the obtained models are able to quantify reservoir heterogeneity. Due to different petrophysical characteristics and seepage ability, PFUs have a significant impact on the distribution of the remaining oil. Considering these factors allows a more accurate understanding of reservoir quality, especially within non-marine sandstone reservoirs.
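A rough sketch of the classification step, under assumptions that differ from the study: the classic Winland correlation stands in for the paper's own core-calibrated porosity-permeability-r35 relationships, the core-plug values are synthetic, and k-means replaces the cluster analysis actually used:

```python
import numpy as np
from sklearn.cluster import KMeans

def winland_r35(perm_md, phi_frac):
    """Classic Winland correlation for the 35th-percentile pore throat radius
    (micrometres); used here only as a stand-in for the relationships
    developed in the study from its own core data."""
    phi_pct = phi_frac * 100.0
    log_r35 = 0.732 + 0.588 * np.log10(perm_md) - 0.864 * np.log10(phi_pct)
    return 10.0 ** log_r35

# hypothetical core-plug measurements: permeability (mD) and porosity (fraction)
rng = np.random.default_rng(42)
phi = rng.uniform(0.08, 0.30, 200)
perm = 10 ** rng.normal(1.0, 1.0, 200)
r35 = winland_r35(perm, phi)

# group the samples into four paraflow units by clustering log(r35)
features = np.log10(r35).reshape(-1, 1)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
for pfu in range(4):
    print(f"PFU {pfu}: n={np.sum(labels == pfu)}, "
          f"median r35={np.median(r35[labels == pfu]):.2f} um")
```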
A New Generation of Real-Time Systems in the JET Tokamak
NASA Astrophysics Data System (ADS)
Alves, Diogo; Neto, Andre C.; Valcarcel, Daniel F.; Felton, Robert; Lopez, Juan M.; Barbalace, Antonio; Boncagni, Luca; Card, Peter; De Tommasi, Gianmaria; Goodyear, Alex; Jachmich, Stefan; Lomas, Peter J.; Maviglia, Francesco; McCullen, Paul; Murari, Andrea; Rainford, Mark; Reux, Cedric; Rimini, Fernanda; Sartori, Filippo; Stephen, Adam V.; Vega, Jesus; Vitelli, Riccardo; Zabeo, Luca; Zastrow, Klaus-Dieter
2014-04-01
Recently, a new recipe for developing and deploying real-time systems has become increasingly adopted in the JET tokamak. Powered by the advent of x86 multi-core technology and the reliability of JET's well established Real-Time Data Network (RTDN) to handle all real-time I/O, an official Linux vanilla kernel has been demonstrated to be able to provide real-time performance to user-space applications that are required to meet stringent timing constraints. In particular, a careful rearrangement of the Interrupt ReQuests' (IRQs) affinities together with the kernel's CPU isolation mechanism allows one to obtain either soft or hard real-time behavior depending on the synchronization mechanism adopted. Finally, the Multithreaded Application Real-Time executor (MARTe) framework is used for building applications particularly optimised for exploring multi-core architectures. In the past year, four new systems based on this philosophy have been installed and are now part of JET's routine operation. The focus of the present work is on the configuration aspects that enable these new systems' real-time capability. Details are given about the common real-time configuration of these systems, followed by a brief description of each system together with results regarding their real-time performance. A cycle time jitter analysis of a user-space MARTe based application synchronizing over a network is also presented. The goal is to compare its deterministic performance while running on a vanilla and on a Messaging Real time Grid (MRG) Linux kernel.
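A minimal sketch of the CPU pinning that underlies this approach, assuming a Linux host whose CPUs are held back via the kernel's isolcpus boot parameter; the CPU number, IRQ handling and file paths are placeholders, and the real JET systems are built on the MARTe framework rather than Python:

```python
import os

# Pin the calling thread to an isolated CPU (e.g. one listed in the kernel's
# isolcpus= boot parameter) so the scheduler keeps other tasks off it.
ISOLATED_CPU = 3  # placeholder; depends on the machine's isolcpus configuration

os.sched_setaffinity(0, {ISOLATED_CPU})          # Linux-only call
print("running on CPUs:", os.sched_getaffinity(0))

# Steering interrupts away from (or onto) a CPU is done through procfs; this
# needs root privileges and a real IRQ number, so it is shown but not called.
def set_irq_affinity(irq: int, cpu_mask: int) -> None:
    with open(f"/proc/irq/{irq}/smp_affinity", "w") as f:
        f.write(f"{cpu_mask:x}\n")
```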
Publications - GMC 418 | Alaska Division of Geological & Geophysical Surveys
DGGS GMC 418 Publication Details. Title: Porosity, permeability, grain density core analysis (CT scans), and core photos from the ConocoPhillips N. Cook Inlet
Hopkins, Jammie M; Glenn, Beth A; Cole, Brian L; McCarthy, William; Yancey, Antronette
2012-06-01
Integrating organizationally targeted wellness strategies into the routine conduct of business has shown promise in engaging captive audiences at highest risk of obesity and obesity-related health consequences. This paper presents a process evaluation of the implementation of the University of California, Los Angeles, Working Out Regularly Keeps Individuals Nurtured and Going (WORKING) pilot study. WORKING focuses on integrating physical activity and nutrition practices into workplace routine during non-discretionary paid work time. The purpose of the evaluation was to assess the quality of implementation and to understand factors that facilitated or hindered organizations' full uptake of the intervention. Fifteen worksites were randomly assigned to an intervention condition. Qualitative data were gathered through routine site visits and informant interviews conducted throughout each worksite's intervention period. Worksites were classified into one of four implementation success categories based on their level of adoption and maintenance of core intervention strategies. Six key factors emerged that were related to implementation success: site layout and social climate, wellness infrastructure, number and influence of Program Champions, leadership involvement, site innovation and creativity. This pilot study has informed the conduct of WORKING II; a cluster randomized controlled trial aimed at enrolling 60-70 worksites in Los Angeles County.
An evaluation of a zero-heat-flux cutaneous thermometer in cardiac surgical patients.
Eshraghi, Yashar; Nasr, Vivian; Parra-Sanchez, Ivan; Van Duren, Albert; Botham, Mark; Santoscoy, Thomas; Sessler, Daniel I
2014-09-01
Although core temperature can be measured invasively, there are currently no widely available, reliable, noninvasive thermometers for its measurement. We thus compared a prototype zero-heat-flux thermometer with simultaneous measurements from a pulmonary artery catheter. Specifically, we tested the hypothesis that zero-heat-flux temperatures are sufficiently accurate for routine clinical use. Core temperature was measured from the thermistor of a standard pulmonary artery catheter and with a prototype zero-heat-flux deep-tissue thermometer in 105 patients having nonemergent cardiac surgery. Zero-heat-flux probes were positioned on the lateral forehead and lateral neck. Skin surface temperature probes were attached to the forehead just adjacent to the zero-heat-flux probe. Temperatures were recorded at 1-minute intervals, excluding the period of cardiopulmonary bypass, and for the first 4 postoperative hours. Zero-heat-flux and pulmonary artery temperatures were compared with bias analysis; differences exceeding 0.5°C were considered to be potentially clinically important. The mean duration in the operating room was 279 ± 75 minutes, and the mean cross-clamp time was 118 ± 50 minutes. All subjects were monitored for an additional 4 hours in the intensive care unit. The average overall difference between forehead zero-heat-flux and pulmonary artery temperatures (i.e., forehead minus pulmonary artery) was -0.23°C (95% limits of agreement of ±0.82); 78% of the differences were ≤0.5°C. The average intraoperative temperature difference was -0.08°C (95% limits of agreement of ±0.88); 84% of the differences were ≤0.5°C. The average postoperative difference was -0.32°C (95% limits of agreement of ±0.75); 84% of the differences were ≤0.5°C. Bias and precision values for neck site were similar to the forehead values. Uncorrected forehead skin temperature showed an increasing negative bias as core temperature decreased. Core temperature can be noninvasively measured using the zero-heat-flux method. Bias was small, but precision was slightly worse than our designated 0.5°C limits compared with measurements from a pulmonary artery catheter.
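The bias and 95% limits-of-agreement figures quoted above come from a standard Bland-Altman-style calculation, sketched below on made-up paired readings (illustrative values only, not study data):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two sets of
    paired measurements, plus the fraction of differences within 0.5 deg C."""
    diff = np.asarray(method_a) - np.asarray(method_b)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    within_half_deg = np.mean(np.abs(diff) <= 0.5) * 100
    return bias, (bias - loa, bias + loa), within_half_deg

# illustrative paired readings (deg C): zero-heat-flux vs pulmonary artery
zhf = [36.1, 36.4, 35.9, 36.8, 37.0, 36.2]
pa  = [36.3, 36.5, 36.2, 36.9, 37.1, 36.5]
bias, limits, pct = bland_altman(zhf, pa)
print(f"bias={bias:.2f} C, limits of agreement={limits}, within 0.5 C: {pct:.0f}%")
```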
VESUVIO Data Analysis Goes MANTID
NASA Astrophysics Data System (ADS)
Jackson, S.; Krzystyniak, M.; Seel, A. G.; Gigg, M.; Richards, S. E.; Fernandez-Alonso, F.
2014-12-01
This paper describes ongoing efforts to implement the reduction and analysis of neutron Compton scattering data within the MANTID framework. Recently, extensive work has been carried out to integrate the bespoke data reduction and analysis routines written for VESUVIO with the MANTID framework. While the programs described in this document are designed to replicate the functionality of the Fortran and Genie routines already in use, most of them have been written from scratch and are not based on the original code base.
Method of Testing and Predicting Failures of Electronic Mechanical Systems
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, Frances A.
1996-01-01
A method employing a knowledge base of human expertise comprising a reliability model analysis implemented for diagnostic routines is disclosed. The reliability analysis comprises digraph models that determine target events created by hardware failures, human actions, and other factors affecting the system operation. The reliability analysis contains a wealth of human expertise information that is used to build automatic diagnostic routines and which provides a knowledge base that can be used to solve other artificial intelligence problems.
Mookiah, M R K; Rohrmeier, A; Dieckmeyer, M; Mei, K; Kopp, F K; Noel, P B; Kirschke, J S; Baum, T; Subburaj, K
2018-04-01
This study investigated the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. The results showed an acceptable reproducibility of texture features, and these features could discriminate the healthy/osteoporotic fracture cohort with an accuracy of 83%. The aim of this study is to investigate the feasibility of opportunistic osteoporosis screening in routine contrast-enhanced MDCT exams using texture analysis. We performed texture analysis at the spine in routine MDCT exams and investigated the effect of intravenous contrast medium (IVCM) (n = 7), slice thickness (n = 7), the long-term reproducibility (n = 9), and the ability to differentiate healthy/osteoporotic fracture cohort (n = 9 age- and gender-matched pairs). Eight texture features were extracted using gray level co-occurrence matrix (GLCM). The independent sample t test was used to rank the features of healthy/fracture cohort and classification was performed using support vector machine (SVM). The results revealed significant correlations between texture parameters derived from MDCT scans with and without IVCM (r up to 0.91), slice thickness of 1 mm versus 2 and 3 mm (r up to 0.96), and scan-rescan (r up to 0.59). The performance of the SVM classifier was evaluated using 10-fold cross-validation and revealed an average classification accuracy of 83%. Opportunistic osteoporosis screening at the spine using specific texture parameters (energy, entropy, and homogeneity) and SVM can be performed in routine contrast-enhanced MDCT exams.
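A compact sketch of the feature-extraction and classification chain described, assuming synthetic patches rather than MDCT data: a hand-rolled grey-level co-occurrence matrix yields energy, entropy and homogeneity, and a linear SVM is scored with 10-fold cross-validation:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def glcm_features(patch, levels=16):
    """GLCM for a horizontal one-pixel offset, plus energy, entropy and
    homogeneity (a subset of the eight features used in the study)."""
    q = np.floor(patch / (patch.max() + 1e-9) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    energy = np.sum(p ** 2)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))
    return [energy, entropy, homogeneity]

# two synthetic texture classes: smooth patches vs high-variance patches
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        patch = rng.normal(100, 5 if label == 0 else 30, (32, 32))
        X.append(glcm_features(np.clip(patch, 0, None)))
        y.append(label)
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=10)
print("10-fold CV accuracy:", scores.mean())
```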
The influence of geometric imperfections on the stability of three-layer beams with foam core
NASA Astrophysics Data System (ADS)
Wstawska, Iwona
2017-01-01
The main objective of this work is the numerical analysis (FE analysis) of the stability of three-layer beams with a metal foam core (alumina foam core). The beams were subjected to pure bending. The analysis of the local buckling was performed. Furthermore, the influence of geometric parameters of the beam and material properties of the core (linear and non-linear model) on critical load values and buckling shape was also investigated. The calculations were made on a family of beams with different mechanical properties of the core (elastic and elastic-plastic material). In addition, the influence of geometric imperfections on deflection and normal stress values of the core and the faces has been evaluated.
Goldhahn, Jörg; Beaton, Dorcas; Ladd, Amy; Macdermid, Joy; Hoang-Kim, Amy
2014-02-01
Lack of standardization of outcome measurement has hampered an evidence-based approach to clinical practice and research. We adopted a process of reviewing evidence on current use of measures and appropriate theoretical frameworks for health and disability to inform a consensus process that was focused on deriving the minimal set of core domains in distal radius fracture. We agreed on the following seven core recommendations: (1) pain and function were regarded as the primary domains, (2) very brief measures were needed for routine administration in clinical practice, (3) these brief measures could be augmented by additional measures that provide more detail or address additional domains for clinical research, (4) measurement of pain should include measures of both intensity and frequency as core attributes, (5) a numeric pain scale, e.g. visual analogue scale or visual numeric scale or the pain subscale of the patient-reported wrist evaluation (PRWE) questionnaires were identified as reliable, valid and feasible measures to measure these concepts, (6) for function, either the Quick Disability of the arm, shoulder and hand questionnaire or PRWE-function subscale was identified as reliable, valid and feasible measures, and (7) a measure of participation and treatment complications should be considered core outcomes for both clinical practice and research. We used a sound methodological approach to form a comprehensive foundation of content for outcomes in the area of distal radius fractures. We recommend the use of symptom and function as separate domains in the ICF core set in clinical research or practice for patients with wrist fracture. Further research is needed to provide more definitive measurement properties of measures across all domains.
Daniel, Hubert D-J; David, Joel; Raghuraman, Sukanya; Gnanamony, Manu; Chandy, George M; Sridharan, Gopalan; Abraham, Priya
2017-05-01
Based on genetic heterogeneity, hepatitis C virus (HCV) is classified into seven major genotypes and 64 subtypes. In spite of the sequence heterogeneity, all genotypes share an identical complement of colinear genes within the large open reading frame. The genetic interrelationships between these genes are consistent among genotypes. Due to this property, complete sequencing of the HCV genome is not required. HCV genotypes along with subtypes are critical for planning antiviral therapy. Certain genotypes are also associated with higher progression to liver cirrhosis. In this study, 100 blood samples were collected from individuals who came for routine HCV genotype identification. These samples were used for the comparison of two different genotyping methods (5'NCR PCR-RFLP and HCV core type-specific PCR) with NS5b sequencing. Of the 100 samples genotyped using 5'NCR PCR-RFLP and HCV core type-specific PCR, 90% (κ = 0.913, P < 0.00) and 96% (κ = 0.794, P < 0.00) correlated with NS5b sequencing, respectively. Sixty percent and 75% of discordant samples by 5'NCR PCR-RFLP and HCV core type-specific PCR, respectively, belonged to genotype 6. All the HCV genotype 1 subtypes were classified accurately by both the methods. This study shows that the 5'NCR-based PCR-RFLP and the HCV core type-specific PCR-based assays correctly identified HCV genotypes except genotype 6 from this region. Direct sequencing of the HCV core region was able to identify all the genotype 6 from this region and serves as an alternative to NS5b sequencing. © 2016 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tasseff, Byron
2016-07-29
NUFLOOD Version 1.x is a surface-water hydrodynamic package designed for the simulation of overland flow of fluids. It consists of various routines to address a wide range of applications (e.g., rainfall-runoff, tsunami, storm surge) and real-time, interactive visualization tools. NUFLOOD has been designed for general-purpose computers and workstations containing multi-core processors and/or graphics processing units. The software is easy to use and extensible, constructed with instructors, students, and practicing engineers in mind. NUFLOOD is intended to assist the water resource community in planning against water-related natural disasters.
Epidural analgesia during labour, routinely or on request: a cost-effectiveness analysis.
Bonouvrié, Kimberley; van den Bosch, Anouk; Roumen, Frans J M E; van Kuijk, Sander M; Nijhuis, Jan G; Evers, Silvia M A A; Wassen, Martine M L H
2016-12-01
To assess the cost-effectiveness of routine labour epidural analgesia (EA), from a societal perspective, as compared with labour analgesia on request. Women delivering of a singleton in cephalic presentation beyond 36+0 weeks' gestation were randomly allocated to routine labour EA or analgesia on request in one university and one non-university teaching hospital in the Netherlands. Costs included all medical, non-medical and indirect costs from randomisation to 6 weeks postpartum. Effectiveness was defined as a non-operative, spontaneous vaginal delivery without EA-related maternal adverse effects. Incremental cost-effectiveness ratio (ICER) was defined as the ratio of the difference in costs and the difference in effectiveness between both groups. Data were analysed according to intention to treat and divided into a base case analysis and a sensitivity analysis. Total delivery costs in the routine EA group (n=233) were higher than in the labour on request group (n=255) (difference -€ 322, 95% CI -€ 60 to € 355) due to more medication costs (including EA), a longer stay in the labour ward, and more operations including caesarean sections. Total postpartum hospital costs in the routine EA group were lower (difference -€ 344, 95% CI -€ 1338 to € 621) mainly due to less neonatal admissions (difference -€ 472, 95% CI -€ 1297 to € 331), whereas total postpartum home and others costs were comparable (difference -€ 20, 95% CI -€ 267 to € 248, and -€ 1, 95% CI -€ 67 to € 284, respectively). As a result, the overall mean costs per woman were comparable between the routine EA group and the analgesia on request group (€ 8.708 and € 8.710, respectively, mean difference -€ 2, 95% CI -€ 1.012 to € 916). Routine labour EA resulted in more deliveries with maternal adverse effects, nevertheless the ICER remained low (€ 8; bootstrap 95% CI -€ 6.120 to € 8.659). The cost-effectiveness acceptability curve indicated a low probability that routine EA is cost-effective. Routine labour EA generates comparable costs as analgesia on request, but results in more operative deliveries and more EA-related maternal adverse effects. Based on cost-effectiveness, no preference can be given to routine labour EA as compared with analgesia on request. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.; Jackson, Wade C.
2008-01-01
A simple analysis method has been developed for predicting the residual compressive strength of impact-damaged sandwich panels. The method is tailored for honeycomb core-based sandwich specimens that exhibit an indentation growth failure mode under axial compressive loading, which is driven largely by the crushing behavior of the core material. The analysis method is in the form of a finite element model, where the impact-damaged facesheet is represented using shell elements and the core material is represented using spring elements, aligned in the thickness direction of the core. The nonlinear crush response of the core material used in the analysis is based on data from flatwise compression tests. A comparison with a previous analysis method and some experimental data shows good agreement with results from this new approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porter, W.P.
1987-08-01
The vacuum gas-analysis system is running and we use doubly labeled water routinely in our research. We have the computer-controlled respirometer system running, and can routinely measure respiratory gases and body temperatures in small, living mammals. Our heat- and mass-balance model calculations of field metabolism and water loss agree exceptionally well with doubly labeled water measurements of the field metabolism and water loss for Sceloporus undulatus in the sandhill country of western Nebraska. We established theoretical links between our biophysical models of individual animals and population dynamics and community models in the literature. Our general, dry, porous-media model works for bird feathers as well as fur. Now heat and mass fluxes for any endotherm in any physical environment can be calculated with accuracy knowing only allometry, insulation properties, core or skin temperature and environmental conditions. We developed a general respiratory water-loss model using a coupled molar and heat balance. We developed a general counter-current heat exchange model with a peripheral heat generation and iterative skin temperature solution. We are testing our endotherm model on four species of temperate and tropical birds. We are completing parameter measurements for a test of the ectotherm model on an herbivorous lizard, like the test described above for the carnivorous lizard, Sceloporus undulatus. We have begun lab and field studies of the energetic costs of parasitism and climate on a small endotherm (using tapeworm infection of white-footed mice, Peromyscus leucopus). 8 figs.
Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders
NASA Technical Reports Server (NTRS)
Lovejoy, Andrew E.; Schultz, Marc R.
2012-01-01
Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
Radner, Helga; Chatzidionysiou, Katerina; Nikiphorou, Elena; Gossec, Laure; Hyrich, Kimme L; Zabalan, Condruta; van Eijk-Hustings, Yvonne; Williamson, Paula R; Balanescu, Andra; Burmester, Gerd R; Carmona, Loreto; Dougados, Maxime; Finckh, Axel; Haugeberg, Glenn; Hetland, Merete Lund; Oliver, Susan; Porter, Duncan; Raza, Karim; Ryan, Patrick; Santos, Maria Jose; van der Helm-van Mil, Annette; van Riel, Piet; von Krause, Gabrielle; Zavada, Jakub; Dixon, William G; Askling, Johan
2018-04-01
Personalised medicine, new discoveries and studies on rare exposures or outcomes require large samples that are increasingly difficult for any single investigator to obtain. Collaborative work is limited by heterogeneities, both what is being collected and how it is defined. To develop a core set for data collection in rheumatoid arthritis (RA) research which (1) allows harmonisation of data collection in future observational studies, (2) acts as a common data model against which existing databases can be mapped and (3) serves as a template for standardised data collection in routine clinical practice to support generation of research-quality data. A multistep, international multistakeholder consensus process was carried out involving voting via online surveys and two face-to-face meetings. A core set of 21 items ('what to collect') and their instruments ('how to collect') was agreed: age, gender, disease duration, diagnosis of RA, body mass index, smoking, swollen/tender joints, patient/evaluator global, pain, quality of life, function, composite scores, acute phase reactants, serology, structural damage, treatment and comorbidities. The core set should facilitate collaborative research, allow for comparisons across studies and harmonise future data from clinical practice via electronic medical record systems. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Rakha, Emad A; Pigera, Marian; Shin, Sandra J; D'Alfonso, Timothy; Ellis, Ian O; Lee, Andrew H S
2016-07-01
The recent American Society of Clinical Oncology/College of American Pathologists guidelines for human epidermal growth factor receptor 2 (HER2) testing in breast cancer recommend repeat testing based on tumour grade, tumour type, and hormone receptor status. The aim of this study was to test the value of these criteria. HER2 status was concordant in the core biopsies and excision specimens in 392 of 400 invasive carcinomas. The major reasons for discordance were amplification around the cut-off for positivity and tumour heterogeneity. Of 116 grade 3 carcinomas that were HER2-negative in the core biopsy, four were HER2-positive in the excision specimen. Three of these four either showed borderline negative amplification in the core biopsy or were heterogeneous. None of the 55 grade 1 carcinomas were HER2-positive. Review of repeat testing of HER2 in routine practice suggested that it may also be of value for multifocal tumours and if recommended by the person assessing the in-situ hybridization. Mandatory repeat HER2 testing of grade 3 HER2-negative carcinomas is not appropriate. This is particularly true if repeat testing is performed after borderline negative amplification in the core biopsy or in HER2-negative heterogeneous carcinomas. © 2015 John Wiley & Sons Ltd.
Islam, Anwarul
2018-06-01
The optimal clinical evaluation of the bone marrow requires an examination of air-dried and well-stained films of the aspirated tissue along with a histopathological evaluation of adequately processed and properly stained core biopsy specimens. A bone marrow evaluation can be essential in establishing a diagnosis, determining the efficacy of treatment in haematological disorders and to monitor haematological status of patients following bone marrow/stem cell transplantation. It is also an essential component of the staging process for newly diagnosed malignancies. Currently available bone marrow aspiration needles are quite satisfactory and if properly used provide good-quality specimens for morphological evaluation. However, if a bone marrow core biopsy is concerned, several needles are currently in use but not all of them provide good-quality biopsy specimens for histological evaluation or are user friendly. We have compared the recently introduced Moeller Medical single use bone marrow core biopsy needle with the Jamshidi needle with marrow acquisition cradle (CareFusion), J-needle (Cardinal Health) and OnControl device (Vidacare). It is concluded that the Moeller Medical needle system has definite advantages over others and is recommended for routine use. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Honda, Takayuki; Tozuka, Minoru
2015-09-01
In the reversed clinicopathological conference (R-CPC), three specialists in laboratory medicine interpreted routine laboratory data independently in order to understand the detailed state of a patient. R-CPC is an educational method to use laboratory data appropriately, and it is also important to select differential diagnoses in a process of clinical reasoning in addition to the present illness and physical examination. Routine laboratory tests can be performed repeatedly at a relatively low cost, and their time-series analysis can be performed. Interpretation of routine laboratory data is almost the same as taking physical findings. General findings are initially checked and then the state of each organ is examined. Although routine laboratory tests cost little, we can gain much more information from them about the patient than physical examinations.
Ruskova, Lenka; Raclavsky, Vladislav
2011-09-01
Routine medical microbiology diagnostics relies on conventional cultivation followed by phenotypic techniques for identification of pathogenic bacteria and fungi. This is not only due to tradition and economy but also because it provides pure culture needed for antibiotic susceptibility testing. This review focuses on the potential of High Resolution Melting Analysis (HRMA) of double-stranded DNA for future routine medical microbiology. Search of MEDLINE database for publications showing the advantages of HRMA in routine medical microbiology for identification, strain typing and further characterization of pathogenic bacteria and fungi in particular. The results show increasing numbers of newly-developed and more tailor-made assays in this field. For microbiologists unfamiliar with technical aspects of HRMA, we also provide insight into the technique from the perspective of microbial characterization. We can anticipate that the routine availability of HRMA in medical microbiology laboratories will provide a strong stimulus to this field. This is already envisioned by the growing number of medical microbiology applications published recently. The speed, power, convenience and cost effectiveness of this technology virtually predestine that it will advance genetic characterization of microbes and streamline, facilitate and enrich diagnostics in routine medical microbiology without interfering with the proven advantages of conventional cultivation.
Hogue, Aaron; Dauber, Sarah
2013-04-01
This study describes a multimethod evaluation of treatment fidelity to the family therapy (FT) approach demonstrated by front-line therapists in a community behavioral health clinic that utilized FT as its routine standard of care. Study cases (N=50) were adolescents with conduct and/or substance use problems randomly assigned to routine family therapy (RFT) or to a treatment-as-usual clinic not aligned with the FT approach (TAU). Observational analyses showed that RFT therapists consistently achieved a level of adherence to core FT techniques comparable to the adherence benchmark established during an efficacy trial of a research-based FT. Analyses of therapist-report measures found that compared to TAU, RFT demonstrated strong adherence to FT and differentiation from three other evidence-based practices: cognitive-behavioral therapy, motivational interviewing, and drug counseling. Implications for rigorous fidelity assessments of evidence-based practices in usual care settings are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Philosophical Provocation: The Lifeblood of Clinical Ethics.
McCullough, Laurence B
2017-02-01
The daily work of the clinical ethics teacher and clinical ethics consultant falls into the routine of classifying clinical cases by ethical type and proposing ethically justified alternatives for the professionally responsible management of a specific type of case. Settling too far into this routine creates the risk of philosophical inertia, which is not good either for the clinical ethicist or for the field of clinical ethics. The antidote to this philosophical inertia and resultant blinkered vision of clinical ethics is sustained, willing exposure to philosophical provocation. The papers in this clinical ethics issue of the Journal of Medicine and Philosophy provide just such philosophical provocation related to core topics in clinical ethics: the distinction between clinical practice and clinical research; telemedicine, or medicine at a distance; illness narratives; the concept of the placebo effect; and sex reassignment. © The Author 2017. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Equipment for linking the AutoAnalyzer on-line to a computer
Simpson, D.; Sims, G. E.; Harrison, M. I.; Whitby, L. G.
1971-01-01
An Elliott 903 computer with 8K central core store and magnetic tape backing store has been operated for approximately 20 months in a clinical chemistry laboratory. Details of the equipment designed for linking AutoAnalyzers on-line to the computer are described, and data presented concerning the time required by the computer for different processes. The reliability of the various components in daily operation is discussed. Limitations in the system's capabilities have been defined, and ways of overcoming these are delineated. At present, routine operations include the preparation of worksheets for a limited range of tests (five channels), monitoring of up to 11 AutoAnalyzer channels at a time on a seven-day week basis (with process control and automatic calculation of results), and the provision of quality control data. Cumulative reports can be printed out on those analyses for which computer-prepared worksheets are provided but the system will require extension before these can be issued sufficiently rapidly for routine use. PMID:5551384
Macinnes, Marlene; Macpherson, Gary; Austin, Jessica; Schwannauer, Matthias
2016-12-30
Previous research has found an association between childhood trauma and insecure attachment and psychological distress, risk of violence and engagement in therapy. The aim of this study was to investigate the relationships between these factors in a forensic population. Sixty-four participants from three secure psychiatric hospitals completed the Childhood Trauma Questionnaire (CTQ), the Relationship Scales Questionnaire (RSQ) and the Clinical Outcomes in Routine Evaluation - Outcome Measure (CORE-OM). Overall scores from participants' Historical Clinical Risk Management Violence Risk Assessment Scheme, (HCR-20) were calculated. Staff evaluated participants' engagement in therapy via completion of the Service Engagement Scale (SES). This retrospective study found childhood trauma and insecure attachment significantly predicted psychological distress and risk of violence. No associations with engagement were found, but methodological reasons for this outcome were acknowledged. The importance of routinely assessing for a history of childhood trauma and insecure attachment was highlighted. Crown Copyright © 2016. Published by Elsevier Ireland Ltd. All rights reserved.
Held, Christian; Wenzel, Jens; Webel, Rike; Marschall, Manfred; Lang, Roland; Palmisano, Ralf; Wittenberg, Thomas
2011-01-01
In order to improve reproducibility and objectivity of fluorescence microscopy based experiments and to enable the evaluation of large datasets, flexible segmentation methods are required which are able to adapt to different stainings and cell types. This adaptation is usually achieved by manually adjusting the segmentation methods' parameters, which is time consuming and challenging for biologists with no knowledge of image processing. To avoid this, parameters of the presented methods automatically adapt to user-generated ground truth to determine the best method and the optimal parameter setup. These settings can then be used for segmentation of the remaining images. As robust segmentation methods form the core of such a system, the currently used watershed transform based segmentation routine is replaced by a fast marching level set based segmentation routine which incorporates knowledge on the cell nuclei. Our evaluations reveal that incorporation of multimodal information improves segmentation quality for the presented fluorescent datasets.
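The parameter-adaptation idea can be illustrated with a much simpler segmenter than the fast marching level set routine used in the paper: a global threshold is tuned against a user-drawn ground-truth mask via the Dice score and then reused on the remaining images. All names and values below are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def segment(img, threshold, min_size=20):
    """Very simple nucleus segmentation: global threshold plus size filter."""
    labels, _ = ndimage.label(img > threshold)
    sizes = np.bincount(labels.ravel())          # sizes[0] is the background
    keep_ids = np.flatnonzero(sizes >= min_size)
    keep_ids = keep_ids[keep_ids != 0]
    return np.isin(labels, keep_ids)

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2 * np.logical_and(a, b).sum() / max(a.sum() + b.sum(), 1)

def fit_threshold(img, ground_truth, candidates):
    """Pick the threshold that best reproduces the user-drawn ground truth;
    the chosen value is then reused on the remaining images."""
    return max(candidates, key=lambda t: dice(segment(img, t), ground_truth))

# toy image: two bright blobs on a noisy background, with a hand-made mask
rng = np.random.default_rng(1)
img = rng.normal(10, 2, (64, 64))
img[10:20, 10:20] += 30
img[40:55, 35:50] += 30
truth = np.zeros_like(img, bool)
truth[10:20, 10:20] = truth[40:55, 35:50] = True
best = fit_threshold(img, truth, candidates=np.linspace(15, 35, 21))
print("selected threshold:", best, "dice:", dice(segment(img, best), truth))
```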
Quantum supercharger library: hyper-parallelism of the Hartree-Fock method.
Fernandes, Kyle D; Renison, C Alicia; Naidoo, Kevin J
2015-07-05
We present here a set of algorithms that completely rewrites the Hartree-Fock (HF) computations common to many legacy electronic structure packages (such as GAMESS-US, GAMESS-UK, and NWChem) into a massively parallel compute scheme that takes advantage of hardware accelerators such as Graphical Processing Units (GPUs). The HF compute algorithm is core to a library of routines that we name the Quantum Supercharger Library (QSL). We briefly evaluate the QSL's performance and report that it accelerates a HF 6-31G Self-Consistent Field (SCF) computation by up to 20 times for medium sized molecules (such as a buckyball) when compared with mature Central Processing Unit algorithms available in the legacy codes in regular use by researchers. It achieves this acceleration by massive parallelization of the one- and two-electron integrals and optimization of the SCF and Direct Inversion in the Iterative Subspace routines through the use of GPU linear algebra libraries. © 2015 Wiley Periodicals, Inc.
s-core network decomposition: A generalization of k-core analysis to weighted networks
NASA Astrophysics Data System (ADS)
Eidsaa, Marius; Almaas, Eivind
2013-12-01
A broad range of systems spanning biology, technology, and social phenomena may be represented and analyzed as complex networks. Recent studies of such networks using k-core decomposition have uncovered groups of nodes that play important roles. Here, we present s-core analysis, a generalization of k-core (or k-shell) analysis to complex networks where the links have different strengths or weights. We demonstrate the s-core decomposition approach on two random networks (ER and configuration model with scale-free degree distribution) where the link weights are (i) random, (ii) correlated, and (iii) anticorrelated with the node degrees. Finally, we apply the s-core decomposition approach to the protein-interaction network of the yeast Saccharomyces cerevisiae in the context of two gene-expression experiments: oxidative stress in response to cumene hydroperoxide (CHP), and fermentation stress response (FSR). We find that the innermost s-cores are (i) different from innermost k-cores, (ii) different for the two stress conditions CHP and FSR, and (iii) enriched with proteins whose biological functions give insight into how yeast manages these specific stresses.
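A minimal sketch of the s-core decomposition on a weighted adjacency matrix (toy data; with unit link weights the same routine reduces to the familiar k-core):

```python
import numpy as np

def s_core(weights, s):
    """Return the node indices of the s-core of a weighted, undirected network
    given its symmetric weight matrix: repeatedly remove nodes whose strength
    (sum of incident link weights) falls below s."""
    w = np.array(weights, dtype=float)
    alive = np.ones(len(w), dtype=bool)
    while True:
        strength = w[alive][:, alive].sum(axis=1)
        weak = strength < s
        if not weak.any():
            break
        alive[np.flatnonzero(alive)[weak]] = False
    return np.flatnonzero(alive)

# toy weighted network: a tightly connected triangle plus two weak satellites
W = np.array([[0, 3, 3, 1, 0],
              [3, 0, 3, 0, 0],
              [3, 3, 0, 0, 1],
              [1, 0, 0, 0, 0],
              [0, 0, 1, 0, 0]], float)
print(s_core(W, s=4))   # -> the triangle {0, 1, 2}
```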
Widely applicable MATLAB routines for automated analysis of saccadic reaction times.
Leppänen, Jukka M; Forssman, Linda; Kaatiala, Jussi; Yrttiaho, Santeri; Wass, Sam
2015-06-01
Saccadic reaction time (SRT) is a widely used dependent variable in eye-tracking studies of human cognition and its disorders. SRTs are also frequently measured in studies with special populations, such as infants and young children, who are limited in their ability to follow verbal instructions and remain in a stable position over time. In this article, we describe a library of MATLAB routines (Mathworks, Natick, MA) that are designed to (1) enable completely automated implementation of SRT analysis for multiple data sets and (2) cope with the unique challenges of analyzing SRTs from eye-tracking data collected from poorly cooperating participants. The library includes preprocessing and SRT analysis routines. The preprocessing routines (i.e., moving median filter and interpolation) are designed to remove technical artifacts and missing samples from raw eye-tracking data. The SRTs are detected by a simple algorithm that identifies the last point of gaze in the area of interest, but, critically, the extracted SRTs are further subjected to a number of postanalysis verification checks to exclude values contaminated by artifacts. Example analyses of data from 5- to 11-month-old infants demonstrated that SRTs extracted with the proposed routines were in high agreement with SRTs obtained manually from video records, robust against potential sources of artifact, and exhibited moderate to high test-retest stability. We propose that the present library has wide utility in standardizing and automating SRT-based cognitive testing in various populations. The MATLAB routines are open source and can be downloaded from http://www.uta.fi/med/icl/methods.html .
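The published routines are MATLAB and available from the URL above; the Python re-sketch below is an illustration only, with placeholder thresholds, and mirrors the described steps: a moving median filter, interpolation of short drop-outs, and an SRT taken as the time from target onset to the last gaze sample still inside the central area of interest:

```python
import numpy as np

def preprocess(x, win=5):
    """Moving median filter plus linear interpolation of missing samples
    (real pipelines also cap the length of gaps that may be interpolated)."""
    x = np.asarray(x, float)
    pad = win // 2
    padded = np.pad(x, pad, mode="edge")
    filt = np.array([np.nanmedian(padded[i:i + win]) for i in range(len(x))])
    bad = np.isnan(filt)
    if bad.any() and (~bad).any():
        filt[bad] = np.interp(np.flatnonzero(bad), np.flatnonzero(~bad), filt[~bad])
    return filt

def saccadic_rt(gaze_x, times, onset_time, aoi=(-1.0, 1.0)):
    """SRT = time from peripheral-target onset to the last gaze sample inside
    the central AOI; returns NaN if gaze never leaves the AOI. Post-hoc
    verification checks (as described above) would be applied afterwards."""
    gaze_x = preprocess(gaze_x)
    after = times >= onset_time
    inside = (gaze_x >= aoi[0]) & (gaze_x <= aoi[1])
    left = np.flatnonzero(after & ~inside)
    if left.size == 0:
        return np.nan
    last_inside = left[0] - 1
    return times[last_inside] - onset_time

t = np.arange(0, 1.0, 0.01)                       # 100 Hz samples, seconds
x = np.where(t < 0.35, 0.0, 8.0)                  # gaze jumps out of the AOI at 350 ms
x[20:23] = np.nan                                 # a short tracking drop-out
print("SRT:", saccadic_rt(x, t, onset_time=0.1))  # ~0.24 s
```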
Lucyshyn, Joseph M; Fossett, Brenda; Bakeman, Roger; Cheremshynski, Christy; Miller, Lynn; Lohrmann, Sharon; Binnendyk, Lauren; Khan, Sophia; Chinn, Stephen; Kwon, Samantha; Irvin, Larry K
2015-12-01
The efficacy and consequential validity of an ecological approach to behavioral intervention with families of children with developmental disabilities was examined. The approach aimed to transform coercive into constructive parent-child interaction in family routines. Ten families participated, including 10 mothers and fathers and 10 children 3-8 years old with developmental disabilities. Thirty-six family routines were selected (2 to 4 per family). Dependent measures included child problem behavior, routine steps completed, and coercive and constructive parent-child interaction. For each family, a single case, multiple baseline design was employed with three phases: baseline, intervention, and follow-up. Visual analysis evaluated the functional relation between intervention and improvements in child behavior and routine participation. Nonparametric tests across families evaluated the statistical significance of these improvements. Sequential analyses within families and univariate analyses across families examined changes from baseline to intervention in the percentage and odds ratio of coercive and constructive parent-child interaction. Multiple baseline results documented functional or basic effects for 8 of 10 families. Nonparametric tests showed these changes to be significant. Follow-up showed durability at 11 to 24 months postintervention. Sequential analyses documented the transformation of coercive into constructive processes for 9 of 10 families. Univariate analyses across families showed significant improvements in 2- and 4-step coercive and constructive processes but not in odds ratio. Results offer evidence of the efficacy of the approach and consequential validity of the ecological unit of analysis, parent-child interaction in family routines. Future studies should improve efficiency, and outcomes for families experiencing family systems challenges.
Optimizing zonal advection of the Advanced Research WRF (ARW) dynamics for Intel MIC
NASA Astrophysics Data System (ADS)
Mielikainen, Jarno; Huang, Bormin; Huang, Allen H.
2014-10-01
The Weather Research and Forecast (WRF) model is the most widely used community weather forecast and research model in the world. There are two distinct varieties of WRF. The Advanced Research WRF (ARW) is an experimental, advanced research version featuring very high resolution. The WRF Nonhydrostatic Mesoscale Model (WRF-NMM) has been designed for forecasting operations. WRF consists of dynamics code and several physics modules. The WRF-ARW core is based on an Eulerian solver for the fully compressible nonhydrostatic equations. In this paper, we use the Intel Many Integrated Core (MIC) architecture to substantially increase the performance of a zonal advection subroutine, one of the most time-consuming routines in the ARW dynamics core. Advection advances the explicit perturbation horizontal momentum equations by adding in the large-timestep tendency along with the small-timestep pressure gradient tendency. We describe the challenges we met during the development of a high-speed dynamics code subroutine for the MIC architecture. Furthermore, lessons learned from the code optimization process are discussed. The results show that the optimizations improved performance of the original code on the Xeon Phi 5110P by a factor of 2.4x.
Use of standard hypodermic needles for accessing laparoscopic adjustable gastric band ports.
Bewsher, Samuel Mark; Azzi, Anthony; Wright, Timothy
2010-06-01
Laparoscopic adjustable gastric banding is a common and successful method of surgically treating morbid obesity. A recipient will have to attend their surgeon's rooms a number of times to optimally adjust the amount of fluid in the band and hence the amount of restriction. Manufacturers suggest that the ports should be accessed with special non-coring needles that may not always be available in regional or remote centers, and this could create a safety risk in cases where urgent band deflation is required. Ports of two different brands were repeatedly accessed over 100 times in the same location while containing fluid under pressure, using a standard beveled 21 g hypodermic needle (SN) and a 20 g Huber-tipped non-coring needle (NCN). The path the needle types took through the port septum was also examined. There was no leakage of fluid from any of the ports tested. Neither SN nor NCN passed through the port septum down their axis, but rather in a direction closer to that of their beveled surface. There is no more risk of "coring" the septum with an SN than with an NCN. SN can be used safely and routinely to access laparoscopic adjustable gastric band ports.
Safety pharmacology — Current and emerging concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamdam, Junnat; Sethu, Swaminathan; Smith, Trevor
2013-12-01
Safety pharmacology (SP) is an essential part of the drug development process that aims to identify and predict adverse effects prior to clinical trials. SP studies are described in the International Conference on Harmonisation (ICH) S7A and S7B guidelines. The core battery and supplemental SP studies evaluate effects of a new chemical entity (NCE) at both anticipated therapeutic and supra-therapeutic exposures on major organ systems, including cardiovascular, central nervous, respiratory, renal and gastrointestinal. This review outlines the current practices and emerging concepts in SP studies including frontloading, parallel assessment of core battery studies, use of non-standard species, biomarkers, and combining toxicology and SP assessments. Integration of the newer approaches to routine SP studies may significantly enhance the scope of SP by refining and providing mechanistic insight to potential adverse effects associated with test compounds. - Highlights: • SP — mandatory non-clinical risk assessments performed during drug development. • SP organ system studies ensure the safety of clinical participants in FiH trials. • Frontloading in SP facilitates lead candidate drug selection. • Emerging trends: integrating SP-Toxicological endpoints; combined core battery tests.
Bertazzoni, Stefania; Williams, Angela; Jones, Darcy A B; Syme, Robert A; Tan, Kar-Chun; Hane, James Kyawzwar
2018-04-17
Fungal pathogen genomes can often be divided into core and accessory regions. The accessory genome may comprise either accessory regions (ARs) within core chromosomes (CCs) or wholly dispensable (accessory) chromosomes (ACs). Fungal ACs and ARs typically accumulate mutations and structural rearrangements more rapidly over time than CCs, and many harbour genes relevant to host-pathogen interactions. These regions are of particular interest in plant pathology and include host-specific virulence factors and secondary metabolite synthesis gene clusters. This review outlines known ACs and ARs in fungal genomes, methods used for their detection, their common properties that differentiate them from the core genome, and what is currently known of their various roles in pathogenicity. Reports on the evolutionary processes generating and shaping AC/AR compartments are discussed, including repeat induced point mutation (RIP) and breakage-fusion-bridge (BFB) cycles. Previously, ACs have been studied extensively within key genera including Fusarium, Zymoseptoria and Alternaria, but are growing in their frequency of observation and perceived importance across a wider range of fungal species. Recent advances in sequencing technologies permit affordable genome assembly and re-sequencing of populations that will facilitate further discovery and routine screening of ACs.
Porosity, permeability and 3D fracture network characterisation of dolomite reservoir rock samples
Voorn, Maarten; Exner, Ulrike; Barnhoorn, Auke; Baud, Patrick; Reuschlé, Thierry
2015-01-01
With fractured rocks making up an important part of hydrocarbon reservoirs worldwide, detailed analysis of fractures and fracture networks is essential. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as having a limited resolution, providing only 2D and no internal structure information, being destructive on the samples and/or not being representative for full fracture networks. In this paper, we therefore explore the use of an additional method – non-destructive 3D X-ray micro-Computed Tomography (μCT) – to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. We process the 3D μCT data in this study by a Hessian-based fracture filtering routine and can successfully extract porosity, fracture aperture, fracture density and fracture orientations – in bulk as well as locally. Additionally, thin sections made from selected plug samples provide 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) towards more realistic reservoir conditions. This study shows that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that although there are limitations, several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Also permeability experiments under confining pressure provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner. PMID:26549935
Porosity, permeability and 3D fracture network characterisation of dolomite reservoir rock samples.
Voorn, Maarten; Exner, Ulrike; Barnhoorn, Auke; Baud, Patrick; Reuschlé, Thierry
2015-03-01
With fractured rocks making up an important part of hydrocarbon reservoirs worldwide, detailed analysis of fractures and fracture networks is essential. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as having a limited resolution, providing only 2D and no internal structure information, being destructive on the samples and/or not being representative for full fracture networks. In this paper, we therefore explore the use of an additional method - non-destructive 3D X-ray micro-Computed Tomography (μCT) - to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. We process the 3D μCT data in this study by a Hessian-based fracture filtering routine and can successfully extract porosity, fracture aperture, fracture density and fracture orientations - in bulk as well as locally. Additionally, thin sections made from selected plug samples provide 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) towards more realistic reservoir conditions. This study shows that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that although there are limitations, several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Also permeability experiments under confining pressure provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner.
NASA Technical Reports Server (NTRS)
Perez, Christopher E.; Berg, Melanie D.; Friendlich, Mark R.
2011-01-01
Motivation for this work is: (1) Accurately characterize digital signal processor (DSP) core single-event effect (SEE) behavior (2) Test DSP cores across a large frequency range and across various input conditions (3) Isolate SEE analysis to DSP cores alone (4) Interpret SEE analysis in terms of single-event upsets (SEUs) and single-event transients (SETs) (5) Provide flight missions with accurate estimate of DSP core error rates and error signatures.
NASA Astrophysics Data System (ADS)
Wardaya, P. D.; Noh, K. A. B. M.; Yusoff, W. I. B. W.; Ridha, S.; Nurhandoko, B. E. B.
2014-09-01
This paper discusses a new approach for investigating the seismic wave velocity of rock, specifically carbonates, as affected by their pore structures. While the conventional routine of seismic velocity measurement depends heavily on extensive laboratory experiments, the proposed approach adopts a digital rock physics view based on numerical experiments. Thus, instead of using a core sample, we use a thin-section image of carbonate rock to measure the effective seismic wave velocity when travelling through it. In the numerical experiment, thin section images act as the medium on which wave propagation is simulated. For the modeling, an advanced technique based on artificial neural networks was employed to build the velocity and density profile, replacing the image's RGB pixel values with the seismic velocity and density of each rock constituent. Then, an ultrasonic wave was simulated propagating through the thin-section image using a finite-difference time-domain method, under the assumption of an acoustic, isotropic medium. Effective velocities were drawn from the recorded signal and compared with the velocity predictions of the Wyllie time-average model and the Kuster-Toksoz rock physics model. To perform the modeling, image analysis routines were undertaken to quantify the pore aspect ratio that is assumed to represent the rock's pore structure. In addition, the porosity and mineral fraction required for velocity modeling were also quantified using an integrated neural-network and image-analysis technique. It was found that the Kuster-Toksoz model gives a closer prediction of the measured velocity than the Wyllie time-average model. We also conclude that the Wyllie time-average model, which does not incorporate the pore structure parameter, deviates significantly for samples having more than 40% porosity. Utilizing this approach we found a good agreement between the numerical experiment and the theoretically derived rock physics model for estimating the effective seismic wave velocity of rock.
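A bare-bones version of the finite-difference time-domain propagation described above, assuming the simplest constant-density, second-order acoustic wave equation and a per-pixel velocity map derived from the thin-section image; the grid spacing, source wavelet and velocity mapping are placeholders, not the study's actual values.

```python
import numpy as np

def fdtd_acoustic_2d(vel, nt, dt, dx, src_pos, src):
    """Second-order acoustic FDTD on a velocity map `vel` (m/s per pixel).
    `src` is a source time function injected at grid index `src_pos`; returns
    the final pressure field. Assumes constant density and rigid boundaries."""
    ny, nx = vel.shape
    p_prev = np.zeros((ny, nx))
    p = np.zeros((ny, nx))
    c2 = (vel * dt / dx) ** 2          # squared Courant factor, per pixel
    for it in range(nt):
        lap = np.zeros((ny, nx))
        lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] +
                           p[1:-1, 2:] + p[1:-1, :-2] - 4.0 * p[1:-1, 1:-1])
        p_next = 2.0 * p - p_prev + c2 * lap
        p_next[src_pos] += src[it]     # inject the source wavelet
        p_prev, p = p, p_next
    return p

# e.g. vel = 3000 + 2000 * (image > threshold) maps mineral vs. pore pixels to
# velocities; stability requires max(vel) * dt / dx <= 1 / sqrt(2) in 2D.
```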
An improved method for field extraction and laboratory analysis of large, intact soil cores
Tindall, J.A.; Hemmen, K.; Dowd, J.F.
1992-01-01
Various methods have been proposed for the extraction of large, undisturbed soil cores and for subsequent analysis of fluid movement within the cores. The major problems associated with these methods are expense, cumbersome field extraction, and inadequate simulation of unsaturated flow conditions. A field and laboratory procedure is presented that is economical, convenient, and simulates unsaturated and saturated flow without interface flow problems and can be used on a variety of soil types. In the field, a stainless steel core barrel is hydraulically pressed into the soil (30-cm diam. and 38 cm high), the barrel and core are extracted from the soil, and after the barrel is removed from the core, the core is then wrapped securely with flexible sheet metal and a stainless mesh screen is attached to the bottom of the core for support. In the laboratory the soil core is set atop a porous ceramic plate over which a soil-diatomaceous earth slurry has been poured to assure good contact between plate and core. A cardboard cylinder (mold) is fastened around the core and the empty space filled with paraffin wax. Soil cores were tested under saturated and unsaturated conditions using a hanging water column for potentials ???0. Breakthrough curves indicated that no interface flow occurred along the edge of the core. This procedure proved to be reliable for field extraction of large, intact soil cores and for laboratory analysis of solute transport.
Ontology of gaps in content-based image retrieval.
Deserno, Thomas M; Antani, Sameer; Long, Rodney
2009-04-01
Content-based image retrieval (CBIR) is a promising technology to enrich the core functionality of picture archiving and communication systems (PACS). CBIR has a potential for making a strong impact in diagnostics, research, and education. Research as reported in the scientific literature, however, has not made significant inroads as medical CBIR applications incorporated into routine clinical medicine or medical research. The cause is often attributed (without supporting analysis) to the inability of these applications in overcoming the "semantic gap." The semantic gap divides the high-level scene understanding and interpretation available with human cognitive capabilities from the low-level pixel analysis of computers, based on mathematical processing and artificial intelligence methods. In this paper, we suggest a more systematic and comprehensive view of the concept of "gaps" in medical CBIR research. In particular, we define an ontology of 14 gaps that addresses the image content and features, as well as system performance and usability. In addition to these gaps, we identify seven system characteristics that impact CBIR applicability and performance. The framework we have created can be used a posteriori to compare medical CBIR systems and approaches for specific biomedical image domains and goals and a priori during the design phase of a medical CBIR application, as the systematic analysis of gaps provides detailed insight in system comparison and helps to direct future research.
Benefits of utilizing CellProfiler as a characterization tool for U-10Mo nuclear fuel
Collette, R.; Douglas, J.; Patterson, L.; ...
2015-05-01
Automated image processing techniques have the potential to aid in the performance evaluation of nuclear fuels by eliminating judgment calls that may vary from person-to-person or sample-to-sample. Analysis of in-core fuel performance is required for design and safety evaluations related to almost every aspect of the nuclear fuel cycle. This study presents a methodology for assessing the quality of uranium-molybdenum fuel images and describes image analysis routines designed for the characterization of several important microstructural properties. The analyses are performed in CellProfiler, an open-source program designed to enable biologists without training in computer vision or programming to automatically extract cellular measurements from large image sets. The quality metric scores an image based on three parameters: the illumination gradient across the image, the overall focus of the image, and the fraction of the image that contains scratches. The metric presents the user with the ability to 'pass' or 'fail' an image based on a reproducible quality score. Passable images may then be characterized through a separate CellProfiler pipeline, which enlists a variety of common image analysis techniques. The results demonstrate the ability to reliably pass or fail images based on the illumination, focus, and scratch fraction of the image, followed by automatic extraction of morphological data with respect to fission gas voids, interaction layers, and grain boundaries.
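The pass/fail idea can be illustrated with generic image-quality measures. The sketch below uses the relative spread of a heavily smoothed background as the illumination gradient and the variance of the Laplacian as a focus score; these are common proxies, not the actual CellProfiler pipeline, the scratch-fraction test is omitted, and the thresholds are placeholders.

```python
import numpy as np
from scipy import ndimage

def image_quality(img, grad_max=0.2, focus_min=50.0):
    """Toy pass/fail screen: illumination gradient from a smoothed background,
    focus from the variance of the Laplacian of the image."""
    img = np.asarray(img, dtype=float)
    background = ndimage.gaussian_filter(img, sigma=50)
    illum_gradient = (background.max() - background.min()) / background.mean()
    focus = ndimage.laplace(img).var()
    passed = (illum_gradient < grad_max) and (focus > focus_min)
    return passed, illum_gradient, focus
```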
NASA Technical Reports Server (NTRS)
Dietz, J. B.
1973-01-01
The environmental heat flux routine, version 4 (EHFR-4), is a generalized computer program which calculates the steady state and/or transient thermal environments experienced by a space system during lunar surface, deep space, or thermal vacuum chamber operation. The specific environments possible for EHFR analysis include: lunar plain, lunar crater, combined lunar plain and crater, lunar plain in the region of spacecraft surfaces, intervehicular, deep space in the region of spacecraft surfaces, and thermal vacuum chamber generation. The EHFR was used for Extravehicular Mobility Unit (EMU) environment analysis of the Apollo 11-17 missions, EMU manned and unmanned thermal vacuum qualification testing, and EMU-LRV interface environmental analyses.
NASA Astrophysics Data System (ADS)
Kinash, N.; Cook, A.; Sawyer, D.; Heber, R.
2017-12-01
In May 2017 the University of Texas led a drilling and pressure coring expedition in the northern Gulf of Mexico, UT-GOM2-01. The holes were located in Green Canyon Block 955, where the Gulf of Mexico Joint Industry Project Leg II identified an approximately 100 m thick hydrate-filled coarse-grained levee unit in 2009. Two separate wells were drilled into this unit: Holes H002 and H005. In Hole H002, a cutting shoe drill bit was used to collect the pressure cores, and only 1 of the 8 cores collected was pressurized during recovery. The core recovery in Hole H002 was generally poor, about 34%, while the only pressurized core had 45% recovery. In Hole H005, a face bit was used during pressure coring where 13 cores were collected and 9 cores remained pressurized. Core recovery in Hole H005 was much higher, at about 75%. The type of bit was not the only difference between the holes, however. Drilling mud was used throughout the drilling and pressure coring of Hole H002, while only seawater was used during the first 80 m of pressure cores collected in Hole H005. Herein we focus on lithologic analysis of Hole H002 with the goal of documenting and understanding core recovery in Hole H002 to compare with Hole H005. X-ray Computed Tomography (XCT) images were collected by Geotek on pressurized cores, mostly from Hole H005, and at Ohio State on unpressurized cores, mostly from Hole H002. The XCT images of unpressurized cores show minimal sedimentary structures and layering, unlike the XCT images acquired on the pressurized, hydrate-bearing cores. Only small sections of the unpressurized cores remained intact. The unpressurized cores appear to have two prominent facies: 1) silt that did not retain original sedimentary fabric and often was loose within the core barrel, and 2) dense mud sections with some sedimentary structures and layering present. On the XCT images, drilling mud appears to be concentrated on the sides of cores, but also appears in layers and fractures within intact core sections. On microscope images, the drilling mud also appears to saturate the pores in some silt intervals. Further analysis of the unpressurized cores is planned, including X-ray diffraction, grain size analysis and porosity measurements. These results will be compared to the pressurized cores to understand if further lithologic factors could have affected core recovery.
Improvement of Speckle Contrast Image Processing by an Efficient Algorithm.
Steimers, A; Farnung, W; Kohl-Bareis, M
2016-01-01
We demonstrate an efficient algorithm for the temporal and spatial based calculation of speckle contrast for the imaging of blood flow by laser speckle contrast analysis (LASCA). It reduces the numerical complexity of necessary calculations, facilitates a multi-core and many-core implementation of the speckle analysis and enables an independence of temporal or spatial resolution and SNR. The new algorithm was evaluated for both spatial and temporal based analysis of speckle patterns with different image sizes and amounts of recruited pixels as sequential, multi-core and many-core code.
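For reference, the quantity computed in LASCA, spatial speckle contrast K = σ/μ over a sliding window, can be obtained without per-window loops by using box-filtered moments. This is a sketch of the general quantity only, not the authors' optimized multi-core/many-core code.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_speckle_contrast(img, window=7):
    """Spatial LASCA: K = local std / local mean over a sliding window,
    computed from box-filtered first and second moments."""
    img = np.asarray(img, dtype=float)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    var = np.clip(mean_sq - mean * mean, 0.0, None)   # guard tiny negative values
    return np.sqrt(var) / mean
```

The temporal variant is analogous, with the mean and variance taken over a stack of frames at each pixel instead of over a spatial window.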
Gordetsky, Jennifer B; Schultz, Luciana; Porter, Kristin K; Nix, Jeffrey W; Thomas, John V; Del Carmen Rodriguez Pena, Maria; Rais-Bahrami, Soroush
2018-06-01
Magnetic resonance (MR)/ultrasound fusion-targeted biopsy (TB) routinely samples multiple cores from each MR lesion of interest. Pathologists can evaluate the extent of cancer involvement and grade using an individual core (IC) or aggregate (AG) method, which could potentially lead to differences in reporting. We reviewed patients who underwent TB followed by radical prostatectomy (RP). TB cores were evaluated for grade and tumor extent by 2 methods. In the IC method, the grade for each TB lesion was based on the core with the highest Gleason score. Tumor extent for each TB was based on the core with the highest percent of tumor involvement. In the AG method, the tumor from all cores within each TB lesion was aggregated to determine the final composite grade and percentage of tumor involvement. Each method was compared with MR lesional volume, MR lesional density (lesion volume/prostate volume), and RP. Fifty-five patients underwent TB followed by RP. Extent of tumor by the AG method showed a better correlation with target lesion volume (r = 0.27, P = .022) and lesional density (r = 0.32, P = .008) than did the IC method (r = 0.19 [P = .103] and r = 0.22 [P = .062]), respectively. Extent of tumor on TB was associated with extraprostatic extension on RP by the AG method (P = .04), but not by the IC method. This association was significantly higher in patients with a grade group (GG) of 3 or higher (P = .03). A change in cancer grade occurred in 3 patients when comparing methods (2 downgraded GG3 to GG2, 1 downgraded GG4 to GG3 by the AG method). For multiple cores obtained via TB, the AG method better correlates with target lesion volume, lesional density, and extraprostatic extension. Copyright © 2018 Elsevier Inc. All rights reserved.
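One plausible reading of the two reporting methods, for the extent-of-tumor part only, is sketched below with invented core lengths; it is illustrative arithmetic, not the pathology protocol itself.

```python
def tumor_extent(core_lengths_mm, tumor_lengths_mm):
    """Per-lesion tumor extent under the two reporting methods:
    IC = highest single-core percent involvement,
    AG = aggregate percent involvement pooled across all cores of the lesion."""
    ic = max(t / c for t, c in zip(tumor_lengths_mm, core_lengths_mm)) * 100
    ag = sum(tumor_lengths_mm) / sum(core_lengths_mm) * 100
    return ic, ag

# Hypothetical example: three 12 mm cores containing 6, 1 and 0 mm of tumor
# tumor_extent([12, 12, 12], [6, 1, 0]) -> (50.0, about 19.4)
```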
Linear Discriminant Analysis on a Spreadsheet.
ERIC Educational Resources Information Center
Busbey, Arthur Bresnahan III
1989-01-01
Described is a software package, "Trapeze," within which a routine called LinDis can be used. Discussed are teaching methods, the linear discriminant model and equations, the LinDis worksheet, and an example. The setup for this routine is included. (CW)
Vroblesky, Don A.
2008-01-01
Analysis of the volatile organic compound content of tree cores is an inexpensive, rapid, simple approach to examining the distribution of subsurface volatile organic compound contaminants. The method has been shown to detect several volatile petroleum hydrocarbons and chlorinated aliphatic compounds associated with vapor intrusion and ground-water contamination. Tree cores, which are approximately 3 inches long, are obtained by using an increment borer. The cores are placed in vials and sealed. After a period of equilibration, the cores can be analyzed by headspace analysis gas chromatography. Because the roots are exposed to volatile organic compound contamination in the unsaturated zone or shallow ground water, the volatile organic compound concentrations in the tree cores are an indication of the presence of subsurface volatile organic compound contamination. Thus, tree coring can be used to detect and map subsurface volatile organic compound contamination. For comparison of tree-core data at a particular site, it is important to maintain consistent methods for all aspects of tree-core collection, handling, and analysis. Factors affecting the volatile organic compound concentrations in tree cores include the type of volatile organic compound, the tree species, the rooting depth, ground-water chemistry, the depth to the contaminated horizon, concentration differences around the trunk related to variations in the distribution of subsurface volatile organic compounds, concentration differences with depth of coring related to volatilization loss through the bark and possibly other unknown factors, dilution by rain, seasonal influences, sorption, vapor-exchange rates, and within-tree volatile organic compound degradation.
Deeny, Sarah R; Steventon, Adam
2015-08-01
Socrates described a group of people chained up inside a cave, who mistook shadows of objects on a wall for reality. This allegory comes to mind when considering 'routinely collected data'-the massive data sets, generated as part of the routine operation of the modern healthcare service. There is keen interest in routine data and the seemingly comprehensive view of healthcare they offer, and we outline a number of examples in which they were used successfully, including the Birmingham OwnHealth study, in which routine data were used with matched control groups to assess the effect of telephone health coaching on hospital utilisation.Routine data differ from data collected primarily for the purposes of research, and this means that analysts cannot assume that they provide the full or accurate clinical picture, let alone a full description of the health of the population. We show that major methodological challenges in using routine data arise from the difficulty of understanding the gap between patient and their 'data shadow'. Strategies to overcome this challenge include more extensive data linkage, developing analytical methods and collecting more data on a routine basis, including from the patient while away from the clinic. In addition, creating a learning health system will require greater alignment between the analysis and the decisions that will be taken; between analysts and people interested in quality improvement; and between the analysis undertaken and public attitudes regarding appropriate use of data. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
X-ray CT core imaging of Oman Drilling Project on D/V CHIKYU
NASA Astrophysics Data System (ADS)
Michibayashi, K.; Okazaki, K.; Leong, J. A. M.; Kelemen, P. B.; Johnson, K. T. M.; Greenberger, R. N.; Manning, C. E.; Harris, M.; de Obeso, J. C.; Abe, N.; Hatakeyama, K.; Ildefonse, B.; Takazawa, E.; Teagle, D. A. H.; Coggon, J. A.
2017-12-01
We obtained X-ray computed tomography (X-ray CT) images for all cores (GT1A, GT2A, GT3A and BT1A) in Oman Drilling Project Phase 1 (OmanDP cores), since X-ray CT scanning is a routine measurement of the IODP measurement plan onboard Chikyu, which enables the non-destructive observation of the internal structure of core samples. X-ray CT images provide information about chemical compositions and densities of the cores and is useful for assessing sample locations and the quality of the whole-round samples. The X-ray CT scanner (Discovery CT 750HD, GE Medical Systems) on Chikyu scans and reconstructs the image of a 1.4 m section in 10 minutes and produces a series of scan images, each 0.625 mm thick. The X-ray tube (as an X-ray source) and the X-ray detector are installed inside of the gantry at an opposing position to each other. The core sample is scanned in the gantry with the scanning rate of 20 mm/sec. The distribution of attenuation values mapped to an individual slice comprises the raw data that are used for subsequent image processing. Successive two-dimensional (2-D) slices of 512 x 512 pixels yield a representation of attenuation values in three-dimensional (3-D) voxels of 512 x 512 by 1600 in length. Data generated for each core consist of core-axis-normal planes (XY planes) of X-ray attenuation values with dimensions of 512 × 512 pixels in 9 cm × 9 cm cross-section, meaning at the dimensions of a core section, the resolution is 0.176 mm/pixel. X-ray intensity varies as a function of X-ray path length and the linear attenuation coefficient (LAC) of the target material is a function of the chemical composition and density of the target material. The basic measure of attenuation, or radiodensity, is the CT number given in Hounsfield units (HU). CT numbers of air and water are -1000 and 0, respectively. Our preliminary results show that CT numbers of OmanDP cores are well correlated to gamma ray attenuation density (GRA density) as a function of chemical composition and mineral density, so that their profiles with respect to the core depth provide quick lithological information such as mineral identification and phase boundary etc. Moreover, X-ray CT images can be used for 3-D fabric analyses of the whole core even after core cutting into halves for individual analyses.
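The CT-number convention quoted above (water = 0 HU, air = -1000 HU) follows directly from the standard Hounsfield definition; a small helper makes the relation explicit. This is the generic formula only, not the Geotek/Chikyu processing chain.

```python
def hounsfield(mu, mu_water, mu_air=0.0):
    """CT number in Hounsfield units from linear attenuation coefficients:
    HU = 1000 * (mu - mu_water) / (mu_water - mu_air),
    so water maps to 0 HU and air to -1000 HU."""
    return 1000.0 * (mu - mu_water) / (mu_water - mu_air)
```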
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, Matthew S.; Fiskum, Sandra K.; Baldwin, David L.
This data package contains the K Basin sludge characterization results obtained by Pacific Northwest National Laboratory during processing and analysis of four sludge core samples collected from Engineered Container SCS-CON-210 in 2010 as requested by CH2M Hill Plateau Remediation Company. Sample processing requirements, analytes of interest, detection limits, and quality control sample requirements are defined in the KBC-33786, Rev. 2. The core processing scope included reconstitution of a sludge core sample distributed among four to six 4-L polypropylene bottles into a single container. The reconstituted core sample was then mixed and subsampled to support a variety of characterization activities. Additional core sludge subsamples were combined to prepare a container composite. The container composite was fractionated by wet sieving through a 2,000 micron mesh and a 500-micron mesh sieve. Each sieve fraction was sampled to support a suite of analyses. The core composite analysis scope included density determination, radioisotope analysis, and metals analysis, including the Waste Isolation Pilot Plant Hazardous Waste Facility Permit metals (with the exception of mercury). The container composite analysis included most of the core composite analysis scope plus particle size distribution, particle density, rheology, and crystalline phase identification. A summary of the received samples, core sample reconstitution and subsampling activities, container composite preparation and subsampling activities, physical properties, and analytical results are presented. Supporting data and documentation are provided in the appendices. There were no cases of sample or data loss and all of the available samples and data are reported as required by the Quality Assurance Project Plan/Sampling and Analysis Plan.
ENKI - An Open Source environmental modelling platform
NASA Astrophysics Data System (ADS)
Kolberg, S.; Bruland, O.
2012-04-01
The ENKI software framework for implementing spatio-temporal models is now released under the LGPL license. Originally developed for evaluation and comparison of distributed hydrological model compositions, ENKI can be used for simulating any time-evolving process over a spatial domain. The core approach is to connect a set of user specified subroutines into a complete simulation model, and provide all administrative services needed to calibrate and run that model. This includes functionality for geographical region setup, all file I/O, calibration and uncertainty estimation etc. The approach makes it easy for students, researchers and other model developers to implement, exchange, and test single routines and various model compositions in a fixed framework. The open-source license and modular design of ENKI will also facilitate rapid dissemination of new methods to institutions engaged in operational water resource management. ENKI uses a plug-in structure to invoke separately compiled subroutines, separately built as dynamic-link libraries (dlls). The source code of an ENKI routine is highly compact, with a narrow framework-routine interface allowing the main program to recognise the number, types, and names of the routine's variables. The framework then exposes these variables to the user within the proper context, ensuring that distributed maps coincide spatially, time series exist for input variables, states are initialised, GIS data sets exist for static map data, manually or automatically calibrated values for parameters etc. By using function calls and memory data structures to invoke routines and facilitate information flow, ENKI provides good performance. For a typical distributed hydrological model setup in a spatial domain of 25000 grid cells, 3-4 time steps simulated per second should be expected. Future adaptation to parallel processing may further increase this speed. New modifications to ENKI include a full separation of API and user interface, making it possible to run ENKI from GIS programs and other software environments. ENKI currently compiles under Windows and Visual Studio only, but ambitions exist to remove the platform and compiler dependencies.
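The plug-in idea, routines that declare their variables through a narrow interface so the framework can allocate maps, align regions and drive execution, can be sketched in a few lines. This is an illustrative Python analogue of that design, not ENKI's actual C++/DLL interface; all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Routine:
    """A plug-in routine declares its inputs, states and outputs by name, so the
    framework can expose them to the user and wire routines together."""
    name: str
    inputs: list
    states: list
    outputs: list
    step: callable = field(default=None)   # called once per time step

def run_model(routines, forcing, n_steps):
    """Framework side: call each routine once per time step, passing a shared
    dictionary of named variables (distributed maps, time series, parameters)."""
    data = dict(forcing)
    for t in range(n_steps):
        for r in routines:
            r.step(data, t)
    return data
```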
Damman, Peter; Clayton, Tim; Wallentin, Lars; Lagerqvist, Bo; Fox, Keith A A; Hirsch, Alexander; Windhausen, Fons; Swahn, Eva; Pocock, Stuart J; Tijssen, Jan G P; de Winter, Robbert J
2012-02-01
To perform a patient-pooled analysis of a routine invasive versus a selective invasive strategy in elderly patients with non-ST segment elevation acute coronary syndrome. A meta-analysis was performed of patient-pooled data from the FRISC II-ICTUS-RITA-3 (FIR) studies. (Un)adjusted HRs were calculated by Cox regression, with adjustments for variables associated with age and outcomes. The main outcome was 5-year cardiovascular death or myocardial infarction (MI) following routine invasive versus selective invasive management. Regarding the 5-year composite of cardiovascular death or MI, the routine invasive strategy was associated with a lower hazard in patients aged 65-74 years (HR 0.72, 95% CI 0.58 to 0.90) and those aged ≥75 years (HR 0.71, 95% CI 0.55 to 0.91), but not in those aged <65 years (HR 1.11, 95% CI 0.90 to 1.38), p=0.001 for interaction between treatment strategy and age. The interaction was driven by an excess of early MIs in patients <65 years of age; there was no heterogeneity between age groups concerning cardiovascular death. The benefits were smaller for women than for men (p=0.009 for interaction). After adjustment for other clinical risk factors the HRs remained similar. The current analysis of the FIR dataset shows that the long-term benefit of the routine invasive strategy over the selective invasive strategy is attenuated in younger patients aged <65 years and in women by the increased risk of early events which seem to have no consequences for long-term cardiovascular mortality. No other clinical risk factors were able to identify patients with differential responses to a routine invasive strategy. Trial registration http://www.controlled-trials.com/ISRCTN82153174 (ICTUS), http://www.controlled-trials.com/ISRCTN07752711 (RITA-3).
qtcm 0.1.2: A Python Implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation model
NASA Astrophysics Data System (ADS)
Lin, J. W.-B.
2008-10-01
Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated in interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
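The run-time flexibility described, reordering or swapping subroutines between time steps, can be illustrated with plain Python. The routine names in the usage comments are invented, not qtcm's actual Fortran entry points.

```python
def advance(routines, state, n_steps):
    """Run one model integration: call the currently configured list of
    routines, in order, once per time step."""
    for _ in range(n_steps):
        for routine in routines:
            routine(state)
    return state

# Hypothetical usage: the call list is an ordinary Python list, so its order and
# contents can be changed at run time, between runs or even between calls.
# routines = [advect, convect, radiate, output_diagnostics]
# state = advance(routines, state, n_steps=240)
# routines.insert(2, my_experimental_scheme)   # swap in a new piece interactively
```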
qtcm 0.1.2: a Python implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation Model
NASA Astrophysics Data System (ADS)
Lin, J. W.-B.
2009-02-01
Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated in interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
NASA Astrophysics Data System (ADS)
Lin, J. W. B.
2015-12-01
Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auchère, F.; Froment, C.; Bocchialini, K.
Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myhra, S., E-mail: sverre.myhra@materials.ox.ac.uk; Chakalova, R.; Falzone, N.
A method for detection and characterization of single MeV α-particle and recoil tracks in PMMA photoresist by atomic force microscopy (AFM) analysis has been demonstrated. The energy deposition along the track is shown to lead to a latent pattern in the resist due to contrast reversal. It has been shown that the pattern, consisting of conical spikes, can be developed by conventional processing as a result of the dissolution rate of poly(methyl methacrylate) (PMMA) being greater than that for the modified material in the cylindrical volume of the track core. The spikes can be imaged and counted by routine AFM analysis. Investigations by angular-resolved near-grazing incidence reveal additional tracks that correspond to recoil tracks. The observations have been correlated with modelling, and shown to be in qualitative agreement with prevailing descriptions of collision cascades. The results may be relevant to technologies that are based on detection and characterization of single energetic ions. In particular, the direct visualization of the collision cascade may allow more accurate estimates of the actual interaction volume, which in turn will permit more precise assessment of dose distribution of α-emitting radionuclides used for targeted radiotherapy. The results could also be relevant to other diagnostic or process technologies based on interaction of energetic ions with matter.
On the Fourier and Wavelet Analysis of Coronal Time Series
NASA Astrophysics Data System (ADS)
Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.
2016-07-01
Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
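A sketch of the kind of test discussed: a periodogram compared against a theoretical lag-1 autoregressive (red noise) background, with the confidence threshold taken from the chi-squared distribution with two degrees of freedom. This mirrors the usual Torrence & Compo recipe in spirit only; the normalization and the AR(1) fit below are simplifying assumptions, not the authors' code.

```python
import numpy as np
from scipy.stats import chi2

def rednoise_significance(x, dt=1.0, conf=0.95):
    """Periodogram of a mean-removed series against an AR(1) background.
    Returns frequencies, power, and the red-noise confidence threshold."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = x.size
    power = np.abs(np.fft.rfft(x)) ** 2 / n
    freq = np.fft.rfftfreq(n, d=dt)
    # Lag-1 autocorrelation estimates the AR(1) coefficient.
    alpha = np.corrcoef(x[:-1], x[1:])[0, 1]
    # Theoretical AR(1) spectrum, scaled to match the mean periodogram level.
    ar1 = (1.0 - alpha**2) / (1.0 - 2.0 * alpha * np.cos(2.0 * np.pi * freq * dt) + alpha**2)
    ar1 *= power[1:].mean() / ar1[1:].mean()
    # Each periodogram estimate is distributed as background * chi2(2 dof) / 2.
    threshold = ar1 * chi2.ppf(conf, 2) / 2.0
    return freq, power, threshold
```

Peaks rising above `threshold` would be the candidates for genuine periodicity under this simplified background model.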
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in corresponding simulation models. Software frameworks that allow for a seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models were implemented and successfully applied during the last years. We will present the JAMS core concepts and give an overview of models, simulation components and support tools available for that framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
Used Solvent Testing and Reclamation. Volume 1. Cold-Cleaning Solvents
1988-12-01
spectrometer, and specific gravity meter involve buying routine cleaning supplies, and should not exceed $50. Consequently, these methods were ... in addition to routine cleaning supplies. The K13V measurement requires periodic supplies of Kauri-butanol solution. TLC analysis requires glass
Analysis of aerobatic aircraft noise using the FAA's Integrated Noise Model
DOT National Transportation Integrated Search
2012-09-30
This project has three main objectives. The first objective is to model noise from complete aerobatic routines for a range of aircraft. The second is to compare modeled and previously measured aircraft noise from complete aerobatic routines for a ran...
A Full Virial Analysis of the Prestellar Cores in the Ophiuchus Molecular Cloud
NASA Astrophysics Data System (ADS)
Pattle, Kate; Ward-Thompson, Derek
We use SCUBA-2, HARP C18O J= 3 -> 2, Herschel and IRAM N2H+ J= 1 -> 0 observations of the Ophiuchus molecular cloud to identify and characterise the properties of the starless cores in the region. The SCUBA-2, HARP and Herschel data were taken as part of the JCMT and Herschel Gould Belt Surveys. We determine masses and temperatures and perform a full virial analysis on our cores, and find that our cores are all either bound or virialised, with gravitational energy and external pressure energy on average of similar importance in confining the cores. There is wide variation from region to region, with cores in the region influenced by B stars (Oph A) being substantially gravitationally bound, and cores in the most quiescent region (Oph C) being pressure-confined. We observe dissipation of turbulence in all our cores, and find that this dissipation is more effective in regions which do not contain outflow-driving protostars. Full details of this analysis are presented by Pattle et al. (2015).
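The bookkeeping behind such a virial analysis can be summarized with the standard uniform-density expressions; the sketch below assumes a spherical core, ignores magnetic terms, and the exact formulation used by the authors may differ.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def virial_terms(mass, radius, sigma, p_ext):
    """Energy terms for a spherical core of mass (kg), radius (m), total velocity
    dispersion sigma (m/s) and external pressure p_ext (Pa)."""
    omega_k = 1.5 * mass * sigma**2           # internal (thermal + turbulent) kinetic energy
    omega_g = -0.6 * G * mass**2 / radius     # gravitational energy of a uniform sphere
    omega_p = -4.0 * np.pi * radius**3 * p_ext  # surface (external pressure) term
    # 2*omega_k + omega_g + omega_p ~ 0 for a core in approximate virial balance;
    # the ratio below is ~1 at equilibrium and < 1 when gravity plus external
    # pressure dominate the internal support.
    ratio = 2.0 * omega_k / abs(omega_g + omega_p)
    return omega_k, omega_g, omega_p, ratio
```

Comparing the relative sizes of `omega_g` and `omega_p` is what distinguishes gravitationally bound cores from pressure-confined ones in the sense used above.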
Core Physics and Kinetics Calculations for the Fissioning Plasma Core Reactor
NASA Technical Reports Server (NTRS)
Butler, C.; Albright, D.
2007-01-01
Highly efficient, compact nuclear reactors would provide high specific impulse spacecraft propulsion. This analysis and numerical simulation effort has focused on the technical feasibility issues related to the nuclear design characteristics of a novel reactor design. The Fissioning Plasma Core Reactor (FPCR) is a shockwave-driven gaseous-core nuclear reactor, which uses magnetohydrodynamic effects to generate electric power to be used for propulsion. The nuclear design of the system depends on two major calculations: core physics calculations and kinetics calculations. Presently, core physics calculations have concentrated on the use of the MCNP4C code. However, initial results from other codes such as COMBINE/VENTURE and SCALE4a are also shown. Several significant modifications were made to the ISR-developed QCALC1 kinetics analysis code. These modifications include testing the state of the core materials, an improvement to the calculation of the material properties of the core, the addition of an adiabatic core temperature model and improvement of the first order reactivity correction model. The accuracy of these modifications has been verified, and the accuracy of the point-core kinetics model used by the QCALC1 code has also been validated. Previously calculated kinetics results for the FPCR were described in the ISR report, "QCALC1: A code for FPCR Kinetics Model Feasibility Analysis" dated June 1, 2002.
iSAP: Interactive Sparse Astronomical Data Analysis Packages
NASA Astrophysics Data System (ADS)
Fourt, O.; Starck, J.-L.; Sureau, F.; Bobin, J.; Moudden, Y.; Abrial, P.; Schmitt, J.
2013-03-01
iSAP consists of three programs, written in IDL, which together are useful for spherical data analysis. MR/S (MultiResolution on the Sphere) contains routines for wavelet, ridgelet and curvelet transform on the sphere, and applications such as denoising on the sphere using wavelets and/or curvelets, Gaussianity tests and Independent Component Analysis on the Sphere. MR/S has been designed for the PLANCK project, but can be used for many other applications. SparsePol (Polarized Spherical Wavelets and Curvelets) has routines for polarized wavelet, polarized ridgelet and polarized curvelet transform on the sphere, and applications such as denoising on the sphere using wavelets and/or curvelets, Gaussianity tests and blind source separation on the Sphere. SparsePol has been designed for the PLANCK project. MS-VSTS (Multi-Scale Variance Stabilizing Transform on the Sphere), designed initially for the FERMI project, is useful for spherical mono-channel and multi-channel data analysis when the data are contaminated by a Poisson noise. It contains routines for wavelet/curvelet denoising, wavelet deconvolution, multichannel wavelet denoising and deconvolution.
Economic analysis of routine neuromonitoring of recurrent laryngeal nerve in total thyroidectomy.
Sanabria, Álvaro; Ramírez, Adonis
2015-01-01
Thyroidectomy is a common surgery. Routine searching of the recurrent laryngeal nerve is the most important strategy to avoid palsy. Neuromonitoring has been recommended to decrease recurrent laryngeal nerve palsy. To assess if neuromonitoring of recurrent laryngeal nerve during thyroidectomy is cost-effective in a developing country. We designed a decision analysis to assess the cost-effectiveness of recurrent laryngeal nerve neuromonitoring. For probabilities, we used data from a meta-analysis. Utility was measured using preference values. We considered direct costs. We conducted a deterministic and a probabilistic analysis. We did not find differences in utility between arms. The frequency of recurrent laryngeal nerve injury was 1% in the neuromonitor group and 1.6% for the standard group. Thyroidectomy without monitoring was the less expensive alternative. The incremental cost-effectiveness ratio was COP$ 9,112,065. Routine neuromonitoring in total thyroidectomy with low risk of recurrent laryngeal nerve injury is neither cost-useful nor cost-effective in the Colombian health system.
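For reference, the incremental cost-effectiveness ratio (ICER) reported above is defined as the extra cost per extra unit of effectiveness of one strategy over another; a trivial helper makes the definition explicit (illustrative only, with no attempt to reproduce the study's numbers). When the effectiveness difference is negligible, as reported here, the ratio is uninformative and the cheaper strategy dominates.

```python
def icer(cost_new, effect_new, cost_std, effect_std):
    """Incremental cost-effectiveness ratio: additional cost per additional unit
    of effectiveness (e.g. per QALY) of the new strategy over the standard one."""
    return (cost_new - cost_std) / (effect_new - effect_std)
```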
Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy
2016-02-01
Because the number and diversity of genetically modified (GM) crops has significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
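The simplest version of the underlying read-count calculation can be written down directly: if a fraction f of reads is expected to originate from the target sequence, the number of reads needed to observe it with a given confidence follows from the binomial model. This is a generic sketch under that independence assumption, not the authors' statistical framework.

```python
import math

def reads_needed(target_fraction, confidence=0.95, min_hits=1):
    """Smallest read count N such that P(at least min_hits target reads) >= confidence,
    assuming reads are independent draws with probability target_fraction."""
    if min_hits == 1:
        # Closed form: 1 - (1 - f)^N >= c  =>  N >= log(1 - c) / log(1 - f)
        return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - target_fraction))
    # General case: increase N until the binomial tail reaches the target confidence.
    n = min_hits
    while True:
        p_below = sum(math.comb(n, k) * target_fraction**k * (1 - target_fraction)**(n - k)
                      for k in range(min_hits))
        if 1.0 - p_below >= confidence:
            return n
        n += 1

# e.g. reads_needed(1e-4) is about 30000 reads to see one target read with 95% confidence.
```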
Economic Analysis of Classical Swine Fever Surveillance in the Netherlands.
Guo, X; Claassen, G D H; Oude Lansink, A G J M; Loeffen, W; Saatkamp, H W
2016-06-01
Classical swine fever (CSF) is a highly contagious pig disease that causes economic losses and impaired animal welfare. Improving the surveillance system for CSF can help to ensure early detection of the virus, thereby providing a better initial situation for controlling the disease. Economic analysis is required to compare the benefits of improved surveillance with the costs of implementing a more intensive system. This study presents a comprehensive economic analysis of CSF surveillance in the Netherlands, taking into account the specialized structure of Dutch pig production, differences in virulence of CSF strains and a complete list of possible surveillance activities. The starting point of the analysis is the current Dutch surveillance system (i.e. the default surveillance-setup scenario), including the surveillance activities 'daily clinical observation by the farmer', 'veterinarian inspection after a call', 'routine veterinarian inspection', 'pathology in AHS', 'PCR on tonsil in AHS', 'PCR on grouped animals in CVI' and 'confirmatory PCR by NVWA'. Alternative surveillance-setup scenarios were proposed by adding 'routine serology in slaughterhouses', 'routine serology on sow farms' and 'PCR on rendered animals'. The costs and benefits for applying the alternative surveillance-setup scenarios were evaluated by comparing the annual mitigated economic losses because of intensified CSF surveillance with the annual additional surveillance costs. The results of the cost-effectiveness analysis show that the alternative surveillance-setup scenarios with 'PCR on rendered animals' are effective for the moderately virulent CSF strain, whereas the scenarios with 'routine serology in slaughterhouses' or 'routine serology on sow farms' are effective for the low virulent strain. Moreover, the current CSF surveillance system in the Netherlands is cost-effective for both moderately virulent and low virulent CSF strains. The results of the cost-benefit analysis for the moderately virulent CSF strain indicate that the current surveillance system in the Netherlands is adequate. From an economic perspective, there is little to be gained from intensifying surveillance. © 2014 Blackwell Verlag GmbH.
CDKL5 deficiency entails sleep apneas in mice.
Lo Martire, Viviana; Alvente, Sara; Bastianini, Stefano; Berteotti, Chiara; Silvani, Alessandro; Valli, Alice; Viggiano, Rocchina; Ciani, Elisabetta; Zoccoli, Giovanna
2017-08-01
A recently discovered neurodevelopmental disorder caused by the mutation of the cyclin-dependent kinase-like 5 gene (CDKL5) entails complex autistic-like behaviours similar to Rett syndrome, but its impact upon physiological functions remains largely unexplored. Sleep-disordered breathing is common and potentially life-threatening in patients with Rett syndrome; however, evidence is limited in children with CDKL5 disorder, and is lacking altogether in adults. The aim of this study was to test whether the breathing pattern during sleep differs between adult Cdkl5 knockout (Cdkl5-KO) and wild-type (WT) mice. Using whole-body plethysmography, sleep and breathing were recorded non-invasively for 8 h during the light period. Sleep apneas occurred more frequently in Cdkl5-KO than in WT mice. A receiver operating characteristic (ROC) analysis discriminated Cdkl5-KO significantly from WT mice based on sleep apnea occurrence. These data demonstrate that sleep apneas are a core feature of CDKL5 disorder and a respiratory biomarker of CDKL5 deficiency in mice, and suggest that sleep-disordered breathing should be evaluated routinely in CDKL5 patients. © 2017 European Sleep Research Society.
SEDA: A software package for the Statistical Earthquake Data Analysis
NASA Astrophysics Data System (ADS)
Lombardi, A. M.
2017-03-01
In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less attention has been given to graphical appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a consistent set of tools for ETAS, allowing the estimation of parameters, the testing of the model on data, the simulation of catalogs, the identification of sequences and the calculation of forecasts. The specific features of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
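Since the main part of SEDAv1.0 is devoted to ETAS modeling, the standard temporal ETAS conditional intensity that such tools fit is worth recalling; the form below is the widely used Ogata (1988) formulation, quoted for context rather than from the SEDA manual:

\lambda(t \mid H_t) = \mu + \sum_{t_i < t} K \, e^{\alpha (m_i - m_0)} \, (t - t_i + c)^{-p}

where \mu is the background rate, the sum runs over past events with occurrence times t_i and magnitudes m_i above the completeness magnitude m_0, and K, \alpha, c and p are the parameters estimated from the catalog, typically by maximum likelihood; simulation and forecasting then draw event times from this intensity.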
Lelorain, Sophie; Bachelet, Adeline; Bertin, Nicole; Bourgoin, Maryline
2017-09-01
Therapeutic patient education is effective for various patient outcomes; however, healthcare professionals sometimes lack the motivation to carry out patient education. Surprisingly, this issue has rarely been addressed in research. Therefore, this study explores healthcare professionals' perceived barriers to and motivation for therapeutic patient education. Healthcare professionals, mainly nurses, working in different French hospitals were interviewed. Thematic content analysis was performed. Findings indicated that a lack of skills and knowledge, and disillusionment with the effectiveness of therapeutic patient education, were features of a demotivated attitude. In contrast, a positive attitude was observed when therapeutic patient education met a need to work differently and more effectively. A key factor motivating professionals was the integration of therapeutic patient education into routine care within a multidisciplinary team. To keep healthcare professionals motivated, managers should ensure that therapeutic patient education is implemented in accordance with its core principles: a patient-centered approach within a trained multidisciplinary team. In the latter case, therapeutic patient education is viewed as an efficient and rewarding way to work with patients, which significantly motivates healthcare professionals. © 2017 John Wiley & Sons Australia, Ltd.
Beyond journalism: Theorizing the transformation of journalism.
Deuze, Mark; Witschge, Tamara
2018-02-01
Journalism has enjoyed a rich and relatively stable history of professionalization. Scholars coming from a variety of disciplines have theorized this history, forming a consistent body of knowledge codified in national and international handbooks and canonical readers. However, recent work and analysis suggest that the supposed core of journalism and the assumed consistency of the inner workings of news organizations are problematic starting points for journalism studies. In this article, we challenge the consensual (self-)presentation of journalism – in terms of its occupational ideology, its professional culture, and its sedimentation in routines and organizational structures (cf. the newsroom) – in the context of its reconfiguration as a post-industrial, entrepreneurial, and atypical way of working and of being at work. We outline a way beyond individualist or institutional approaches to do justice to the current complex transformation of the profession. We propose a framework to bring together these approaches in a dialectic attempt to move through and beyond journalism as it has traditionally been conceptualized and practiced, allowing for a broader definition and understanding of the myriad of practices that make up journalism.
NASA Astrophysics Data System (ADS)
van Buuren, A.; Gerrits, L.; Teisman, G. R.
2010-11-01
This article analyzes the relationship between the processes of policy making, management and research and the way in which the Westerschelde estuary developed between 1985 and 2006. The Westerschelde has three core functions: economically, it makes the port of Antwerp accessible; ecologically, it generates habitats for certain unique species; and in terms of safety, its morphology helps prevent the hinterland from being flooded. We analyze how the processes of policymaking, management and analysis focused on these three aspects, and how they in turn affected the physical system of the Westerschelde. We proceed to develop a framework for evaluating the policy making, management and research and how this impacts the Westerschelde. We apply this framework to twenty years of policy making on, management of, and research about the Westerschelde. We conclude that policy, management and research, due to learning effects, take the dynamics of the Westerschelde into account to a greater extent than they have in the past, but there remains a real probability that old routines will return.
Bosomworth, Karyn; Owen, Christine; Curnin, Steven
2017-04-01
The mounting frequency and intensity of natural hazards, alongside growing interdependencies between social-technical and ecological systems, are placing increased pressure on emergency management. This is particularly true at the strategic level of emergency management, which involves planning for and managing non-routine, high-consequence events. Drawing on the literature, a survey, and interviews and workshops with Australia's senior emergency managers, this paper presents an analysis of five core challenges that these pressures are creating for strategic-level emergency management. It argues that emphasising 'emergency management' as a primary adaptation strategy is a retrograde step that ignores the importance of addressing socio-political drivers of vulnerabilities. Three key suggestions are presented that could assist the country's strategic-level emergency management in tackling these challenges: (i) reframe emergency management as a component of disaster risk reduction rather than them being one and the same; (ii) adopt a network governance approach; and (iii) further develop the capacities of strategic-level emergency managers. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.
Gassas, Adam; Krueger, Joerg; Alvi, Saima; Sung, Lillian; Hitzler, Johanne; Lieberman, Lani
2014-12-01
Despite the success of central nervous system (CNS) directed therapy in pediatric acute lymphoblastic leukemia (ALL), relapse involving the CNS continues to be observed in 5-10% of children when utilizing standard intrathecal prophylactic chemotherapy. While most pediatric ALL treatment protocols mandate regular lumbar punctures (LP) for the intrathecal injection of chemotherapy, the value of routine cytological analysis of cerebrospinal fluid (CSF) during therapy is unknown. Our objective was to assess the diagnostic value of routine CSF analysis during ALL therapy. To allow for at least 10 years of follow up from ALL diagnosis, children (0-18 years) with ALL diagnosed and treated at SickKids, Toronto, Canada between 1994-2004 were studied. Medical records of patients with CNS relapse were examined to determine whether CNS relapse was diagnosed based on cytology of a routinely obtained CSF sample, a CSF sample obtained because of signs and symptoms or a CSF sample obtained after the diagnosis of a bone marrow relapse. Of 494 children treated for ALL, 31 (6.6%) developed a relapse of ALL involving the CNS. Twenty-two had an isolated CNS relapse and nine had a combined bone marrow and CNS relapse. Among patients with isolated CNS relapse, 73% (16/22) were diagnosed based on routine CSF samples obtained from asymptomatic children. Conversely, 89% (8/9) of children with combined bone marrow and CNS relapse presented with symptoms and signs that prompted CSF examination. Routine CSF examination at the time of LP for intrathecal chemotherapy is useful in detecting CNS relapse. © 2014 Wiley Periodicals, Inc.
Possible Rickettsia massiliae Infection in Greece: an Imported Case.
Chochlakis, Dimosthenis; Bongiorni, Christine; Partalis, Nikolaos; Tselentis, Yannis; Psaroulaki, Anna
2016-07-22
Tick-borne rickettsioses are endemic in Greece; however, until recently, only Rickettsia typhi and R. conorii were tested routinely in human samples arriving at the National Reference Center. During the last few years, the identification of different rickettsia species in ticks led to the introduction of other spotted fever group rickettsiae in routine analysis. Under the new scheme, R. massiliae is now tested routinely in human samples; herein, we describe a human case of this infection.
Users manual for the IMA program. Appendix C: Profile design program listing
NASA Technical Reports Server (NTRS)
1991-01-01
The source code for the Profile Design Program (PDP) for the Impulsive Mission Analysis (IMA) program is divided into several files. In a similar manner, the FORTRAN listings of the PDP's subroutines and function routines are organized into several groups in this appendix. Within each group, the FORTRAN listings are ordered alphabetically by routine name. Names and brief descriptions of each routine are listed in the same order as the Fortran listings.
Brown, Cameron S.; Zhang, Hongbin; Kucukboyaci, Vefa; ...
2016-09-07
VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was used to simulate a typical pressurized water reactor (PWR) full core response with 17×17 fuel assemblies for a main steam line break (MSLB) accident scenario with the most reactive rod cluster control assembly stuck out of the core. The accident scenario was initiated at hot zero power (HZP) at the end of the first fuel cycle, with return-to-power state points that were determined by a system analysis code, and the most limiting state point was chosen for core analysis. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks' nonparametric statistical approach. In this way, 59 full core simulations were performed to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. The results show that this typical PWR core remains within MDNBR safety limits for the MSLB accident.
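The 59-run figure quoted above follows from Wilks' nonparametric tolerance-limit formula: for a one-sided, first-order limit (the single most limiting run is reported) with coverage probability \beta at confidence level \gamma, the smallest admissible sample size n satisfies

1 - \beta^{\,n} \ge \gamma, \qquad \text{i.e.} \qquad n \ge \frac{\ln(1-\gamma)}{\ln(\beta)},

so for \beta = \gamma = 0.95 one obtains n \ge \ln(0.05)/\ln(0.95) \approx 58.4, hence n = 59 code runs for the 95/95 MDNBR tolerance limit. This is the standard result, restated here only for context.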
NASA Astrophysics Data System (ADS)
Clay, M. P.; Buaria, D.; Gotoh, T.; Yeung, P. K.
2017-10-01
A new dual-communicator algorithm with very favorable performance characteristics has been developed for direct numerical simulation (DNS) of turbulent mixing of a passive scalar governed by an advection-diffusion equation. We focus on the regime of high Schmidt number (Sc), where because of low molecular diffusivity the grid-resolution requirements for the scalar field are stricter than those for the velocity field by a factor √Sc. Computational throughput is improved by simulating the velocity field on a coarse grid of Nv³ points with a Fourier pseudo-spectral (FPS) method, while the passive scalar is simulated on a fine grid of Nθ³ points with a combined compact finite difference (CCD) scheme which computes first and second derivatives at eighth-order accuracy. A static three-dimensional domain decomposition and a parallel solution algorithm for the CCD scheme are used to avoid the heavy communication cost of memory transposes. A kernel is used to evaluate several approaches to optimize the performance of the CCD routines, which account for 60% of the overall simulation cost. On the petascale supercomputer Blue Waters at the University of Illinois, Urbana-Champaign, scalability is improved substantially with a hybrid MPI-OpenMP approach in which a dedicated thread per NUMA domain overlaps communication calls with computational tasks performed by a separate team of threads spawned using OpenMP nested parallelism. At a target production problem size of 8192³ (0.5 trillion) grid points on 262,144 cores, CCD timings are reduced by 34% compared to a pure-MPI implementation. Timings for 16384³ (4 trillion) grid points on 524,288 cores encouragingly maintain scalability greater than 90%, although the wall clock time is too high for production runs at this size. Performance monitoring with CrayPat for problem sizes up to 4096³ shows that the CCD routines can achieve nearly 6% of the peak flop rate. The new DNS code is built upon two existing FPS and CCD codes. With the grid ratio Nθ/Nv = 8, the disparity in the computational requirements for the velocity and scalar problems is addressed by splitting the global communicator MPI_COMM_WORLD into disjoint communicators for the velocity and scalar fields, respectively. Inter-communicator transfer of the velocity field from the velocity communicator to the scalar communicator is handled with discrete send and non-blocking receive calls, which are overlapped with other operations on the scalar communicator. For production simulations at Nθ = 8192 and Nv = 1024 on 262,144 cores for the scalar field, the DNS code achieves 94% strong scaling relative to 65,536 cores and 92% weak scaling relative to Nθ = 1024 and Nv = 128 on 512 cores.
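The dual-communicator layout described above (velocity ranks and scalar ranks carved out of MPI_COMM_WORLD, with non-blocking inter-communicator transfers) can be illustrated with a minimal MPI sketch in C++. This is a generic sketch under assumed names and sizes (nv_ranks, slab, the 1:8 rank split); it is not code from the DNS described in the abstract.

// Minimal sketch: split MPI_COMM_WORLD into disjoint velocity and scalar
// communicators, then move a velocity-field slab from a velocity rank to a
// scalar rank with a non-blocking receive that can be overlapped with work.
#include <mpi.h>
#include <vector>
#include <algorithm>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int world_rank = 0, world_size = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    const int nv_ranks = std::max(1, world_size / 9);   // e.g. 1 velocity rank per 8 scalar ranks
    const int color = (world_rank < nv_ranks) ? 0 : 1;  // 0 = velocity group, 1 = scalar group
    MPI_Comm sub_comm;                                   // disjoint communicator for this group
    MPI_Comm_split(MPI_COMM_WORLD, color, world_rank, &sub_comm);

    std::vector<double> slab(1024);                      // stand-in for a velocity-field slab
    if (world_size > nv_ranks) {                         // need at least one rank in each group
        if (world_rank == nv_ranks) {                    // first scalar rank: post non-blocking receive
            MPI_Request req;
            MPI_Irecv(slab.data(), static_cast<int>(slab.size()), MPI_DOUBLE,
                      0, 0, MPI_COMM_WORLD, &req);
            // ... overlap other work on the scalar communicator here ...
            MPI_Wait(&req, MPI_STATUS_IGNORE);
        } else if (world_rank == 0) {                    // first velocity rank: send the slab
            MPI_Send(slab.data(), static_cast<int>(slab.size()), MPI_DOUBLE,
                     nv_ranks, 0, MPI_COMM_WORLD);
        }
    }

    MPI_Comm_free(&sub_comm);
    MPI_Finalize();
    return 0;
}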
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
Voltage-spike analysis for a free-running parallel inverter
NASA Technical Reports Server (NTRS)
Lee, F. C. Y.; Wilson, T. G.
1974-01-01
Unwanted and sometimes damaging high-amplitude voltage spikes occur during each half cycle in many transistor saturable-core inverters at the moment when the core saturates and the transistors switch. The analysis shows that spikes are an intrinsic characteristic of certain types of inverters even with negligible leakage inductance and a purely resistive load. The small but unavoidable after-saturation inductance of the saturable-core transformer plays an essential role in creating these undesired high-voltage spikes. State-plane analysis provides insight into the complex interaction between core and transistors, and shows the circuit parameters upon which the magnitude of these spikes depends.
Particle-in-cell simulation study on halo formation in anisotropic beams
NASA Astrophysics Data System (ADS)
Ikegami, Masanori
2000-11-01
In a recent paper (M. Ikegami, Nucl. Instr. and Meth. A 435 (1999) 284), we investigated halo formation processes in transversely anisotropic beams based on the particle-core model. The effect of simultaneous excitation of two normal modes of core oscillation, i.e., high- and low-frequency modes, was examined. In the present study, self-consistent particle simulations are performed to confirm the results obtained in the particle-core analysis. In these simulations, it is confirmed that the particle-core analysis can predict the halo extent accurately even in anisotropic situations. Furthermore, we find that the halo intensity is enhanced in some cases where two normal modes of core oscillation are simultaneously excited as expected in the particle-core analysis. This result is of practical importance because pure high-frequency mode oscillation has frequently been assumed in preceding halo studies. The dependence of halo intensity on the 2:1 fixed point locations is also discussed.
Damage Tolerance of Sandwich Plates With Debonded Face Sheets
NASA Technical Reports Server (NTRS)
Sankar, Bhavani V.
2001-01-01
A nonlinear finite element analysis was performed to simulate axial compression of sandwich beams with debonded face sheets. The load versus end-shortening diagrams were generated for a variety of specimens used in a previous experimental study. The energy release rate at the crack tip was computed using the J-integral, and plotted as a function of the load. A detailed stress analysis was performed and the critical stresses in the face sheet and the core were computed. The core was also modeled as an isotropic elastic-perfectly plastic material and a nonlinear post-buckling analysis was performed. A Graeco-Latin factorial plan was used to study the effects of debond length, face sheet and core thicknesses, and core density on the load carrying capacity of the sandwich composite. It has been found that a linear buckling analysis is inadequate in determining the maximum load a debonded sandwich beam can carry. A nonlinear post-buckling analysis combined with an elastoplastic model of the core is required to predict the compression behavior of debonded sandwich beams.
Routine admission laboratory testing for general medical patients.
Hubbell, F A; Frye, E B; Akin, B V; Rucker, L
1988-06-01
We evaluated the usefulness of commonly ordered routine admission laboratory tests in 301 patients admitted consecutively to the internal medicine wards of a university teaching hospital. Using a consensus analysis approach, three Department of Medicine faculty members reviewed the charts of admitted patients to determine the impact of the test results on patient care. The evaluated tests were the urinalysis, hematocrit, white blood cell count, platelet count, six-factor automated multiple analysis (serum sodium, potassium, chloride, bicarbonate, glucose, and blood urea nitrogen), prothrombin time, partial thromboplastin time, chest x-ray, and electrocardiogram. Forty-five percent of the 3,684 tests were ordered for patients without recognizable medical indications. Twelve percent of these routine tests were abnormal, 5% led to additional laboratory testing, but only 0.5% led to change in the treatment of patients. We conclude that the impact of routine admission laboratory testing on patient care is very small and that there is little justification for ordering tests solely because of hospital admission.
Cost-effectiveness of conjugate meningococcal vaccination strategies in the United States.
Shepard, Colin W; Ortega-Sanchez, Ismael R; Scott, R Douglas; Rosenstein, Nancy E
2005-05-01
The US Food and Drug Administration approved a meningococcal conjugate A/C/Y/W-135 vaccine (MCV-4) for use in persons aged 11 to 55 years in January 2005; licensure for use in younger age groups is expected in 2 to 4 years. To evaluate and compare the projected health and economic impact of MCV-4 vaccination of US adolescents, toddlers, and infants. Cost-effectiveness analysis from a societal perspective based on data from Active Bacterial Core Surveillance (ABCs) and other published and unpublished sources. Sensitivity analyses in which key input measures were varied over plausible ranges were performed. A hypothetical 2003 US population cohort of children 11 years of age and a 2003 US birth cohort. Hypothetical routine vaccination of adolescents (1 dose at 11 years of age), toddlers (1 dose at 1 year of age), and infants (3 doses at 2, 4, and 6 months of age). Each vaccination scenario was compared with a "no-vaccination" scenario. Meningococcal cases and deaths prevented, cost per case prevented, cost per life-year saved, and cost per quality-adjusted life-year saved. Routine MCV-4 vaccination of US adolescents (11 years of age) would prevent 270 meningococcal cases and 36 deaths in the vaccinated cohort over 22 years, a decrease of 46% in the expected burden of disease. Before program costs are counted, adolescent vaccination would reduce direct disease costs by $18 million and decrease productivity losses by $50 million. At a cost per vaccination (average public-private price per dose plus administration fees) of $82.50, adolescent vaccination would cost society $633,000 per meningococcal case prevented and $121,000 per life-year saved. Key variables influencing results were disease incidence, case-fatality ratio, and cost per vaccination. The cost-effectiveness of toddler vaccination is essentially equivalent to adolescent vaccination, whereas infant vaccination would be much less cost-effective. Routine MCV-4 vaccination of US children would reduce the burden of disease in vaccinated cohorts but at a relatively high net societal cost. The projected cost-effectiveness of adolescent vaccination approaches that of recently adopted childhood vaccines under conditions of above-average meningococcal disease incidence or at a lower cost per vaccination.
Mediaprocessors in medical imaging for high performance and flexibility
NASA Astrophysics Data System (ADS)
Managuli, Ravi; Kim, Yongmin
2002-05-01
New high performance programmable processors, called mediaprocessors, have been emerging since the early 1990s for various digital media applications, such as digital TV, set-top boxes, desktop video conferencing, and digital camcorders. Modern mediaprocessors, e.g., TI's TMS320C64x and Hitachi/Equator Technologies MAP-CA, can offer high performance utilizing both instruction-level and data-level parallelism. During this decade, with continued performance improvement and cost reduction, we believe that the mediaprocessors will become a preferred choice in designing imaging and video systems due to their flexibility in incorporating new algorithms and applications via programming and faster-time-to-market. In this paper, we will evaluate the suitability of these mediaprocessors in medical imaging. We will review the core routines of several medical imaging modalities, such as ultrasound and DR, and present how these routines can be mapped to mediaprocessors and their resultant performance. We will analyze the architecture of several leading mediaprocessors. By carefully mapping key imaging routines, such as 2D convolution, unsharp masking, and 2D FFT, to the mediaprocessor, we have been able to achieve comparable (if not better) performance to that of traditional hardwired approaches. Thus, we believe that future medical imaging systems will benefit greatly from these advanced mediaprocessors, offering significantly increased flexibility and adaptability, reducing the time-to-market, and improving the cost/performance ratio compared to the existing systems while meeting the high computing requirements.
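As a point of reference for the kind of core imaging routine named above, a direct 2D convolution in plain C++ is sketched below; this scalar version is the generic baseline that a mediaprocessor implementation would vectorize and parallelize, and the function and parameter names are illustrative only.

// Direct (naive) 2D convolution of a single-channel image with an odd-sized
// square kernel; border pixels are handled by clamping coordinates to the edge.
#include <vector>
#include <algorithm>

std::vector<float> convolve2d(const std::vector<float>& img, int width, int height,
                              const std::vector<float>& kernel, int ksize /* odd */) {
    std::vector<float> out(img.size(), 0.0f);
    const int r = ksize / 2;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float acc = 0.0f;
            for (int ky = -r; ky <= r; ++ky) {
                for (int kx = -r; kx <= r; ++kx) {
                    const int sy = std::clamp(y + ky, 0, height - 1);   // clamp-to-edge border
                    const int sx = std::clamp(x + kx, 0, width - 1);
                    acc += img[sy * width + sx] * kernel[(ky + r) * ksize + (kx + r)];
                }
            }
            out[y * width + x] = acc;
        }
    }
    return out;
}

Unsharp masking, also mentioned above, can be expressed with the same routine by convolving with a blur kernel and adding a weighted difference between the original and blurred images.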
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yung-Chen Andrew; Engelhard, Mark H.; Baer, Donald R.
2016-03-07
Spectral modeling of photoelectrons can serve as a valuable tool when combined with X-ray photoelectron spectroscopy (XPS) analysis. Herein, a new version of the NIST Simulation of Electron Spectra for Surface Analysis (SESSA 2.0) software, capable of directly simulating spherical multilayer NPs, was applied to model citrate-stabilized Au/Ag-core/shell nanoparticles (NPs). The NPs were characterized using XPS and scanning transmission electron microscopy (STEM) to determine the composition and morphology of the NPs. The Au/Ag-core/shell NPs were observed to be polydispersed in size, non-circular, and to contain off-centered Au-cores. Using the average NP dimensions determined from STEM analysis, SESSA spectral modeling indicated that washed Au/Ag-core/shell NPs were stabilized with a 0.8 nm l…
Yang, Lei; Zhou, Weihua; Xue, Kaihua; Wei, Rupeng; Ling, Zheng
2018-05-01
The enormous potential as an alternative energy resource has made natural gas hydrates a material of intense research interest. Their exploration and sample characterization require a quick and effective analysis of the hydrate-bearing cores recovered under in situ pressures. Here a novel Pressure Core Ultrasonic Test System (PCUTS) for on-board analysis of sediment cores containing gas hydrates at in situ pressures is presented. The PCUTS is designed to be compatible with an on-board pressure core transfer device and a long gravity-piston pressure-retained corer. It provides several advantages over laboratory core analysis including quick and non-destructive detection, in situ and successive acoustic property acquisition, and remission of sample storage and transportation. The design of the unique assembly units to ensure the in situ detection is demonstrated, involving the U-type protecting jackets, transducer precession device, and pressure stabilization system. The in situ P-wave velocity measurements make the detection of gas hydrate existence in the sediments possible on-board. Performance tests have verified the feasibility and sensitivity of the ultrasonic test unit, showing the dependence of P-wave velocity on gas hydrate saturation. The PCUTS has been successfully applied for analysis of natural samples containing gas hydrates recovered from the South China Sea. It is indicated that on-board P-wave measurements could provide a quick and effective understanding of the hydrate occurrence in natural samples, which can assist further resource exploration, assessment, and subsequent detailed core analysis.
Optimization of the Brillouin operator on the KNL architecture
NASA Astrophysics Data System (ADS)
Dürr, Stephan
2018-03-01
Experiences with optimizing the matrix-times-vector application of the Brillouin operator on the Intel KNL processor are reported. Without adjustments to the memory layout, performance figures of 360 Gflop/s in single and 270 Gflop/s in double precision are observed. This is with Nc = 3 colors, Nv = 12 right-hand sides, Nthr = 256 threads, on lattices of size 32³ × 64, using exclusively OMP pragmas. Interestingly, the same routine performs quite well on Intel Core i7 architectures, too. Some observations on the much harder Wilson fermion matrix-times-vector optimization problem are added.
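The matrix-times-vector pattern referred to above can be sketched generically with OpenMP pragmas; the block below is a plain threaded complex matrix-vector product over several right-hand sides, given only to illustrate the OMP-pragma style of parallelization (names, sizes and layout are assumptions), not the Brillouin kernel itself.

// Generic OpenMP sketch: y[v] = A * x[v] for nrhs right-hand sides, with a dense
// complex n x n matrix A stored row-major. Compile with -fopenmp (or equivalent).
#include <complex>
#include <vector>
#include <cstddef>

using cplx = std::complex<double>;

void matvec_multi_rhs(const std::vector<cplx>& A, int n,
                      const std::vector<std::vector<cplx>>& x,
                      std::vector<std::vector<cplx>>& y) {   // y must be pre-sized like x
    const int nrhs = static_cast<int>(x.size());
    // Collapse the (rhs, row) loops so all threads get work even for few right-hand sides.
    #pragma omp parallel for collapse(2) schedule(static)
    for (int v = 0; v < nrhs; ++v) {
        for (int i = 0; i < n; ++i) {
            cplx acc(0.0, 0.0);
            for (int j = 0; j < n; ++j)
                acc += A[static_cast<std::size_t>(i) * n + j] * x[v][j];
            y[v][i] = acc;
        }
    }
}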
Investigating La Réunion Hot Spot From Crust to Core
NASA Astrophysics Data System (ADS)
Barruol, Guilhem; Sigloch, Karin
2013-06-01
Whether volcanic intraplate hot spots are underlain by deep mantle plumes continues to be debated 40 years after the hypothesis was proposed by Morgan [1972]. Arrivals of buoyant plume heads may have been among the most disruptive agents in Earth's history, initiating continental breakup, altering global climate, and triggering mass extinctions. Further, with the temporary shutdown of European air traffic in 2010 caused by the eruption of Eyjafjallajökull, a geologically routine eruption in the tail end of the presumed Iceland plume, the world witnessed an intrusion of hot spot activity into modern-day life.
Gholami, Hadi; Anyika, Mercy; Zhang, Jun; Vasileiou, Chrysoula; Borhan, Babak
2016-06-27
The absolute stereochemistry of cyanohydrins, derived from ketones and aldehydes, is obtained routinely, in a microscale and derivatization-free manner, upon their complexation with Zn-MAPOL, a zincated porphyrin host with a binding pocket comprised of a biphenol core. The host-guest complex leads to observable exciton-coupled circular dichroism (ECCD), the sign of which is easily correlated to the absolute stereochemistry of the bound cyanohydrin. A working model, based on the ECCD signal of cyanohydrins with known configuration, is proposed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Impact of culture on the expression of pain and suffering.
Wein, Simon
2011-10-01
A primary human challenge is how to alleviate suffering and loss. One way is through culture. The core characteristics of culture are symbols, sharing and groups. These three factors enable society to help the individual cope with loss. In the modern age traditional culture is disintegrating and is being replaced. Often it is outstanding individuals who provide the impetus and tools with which to change the culture and to adapt to new challenges. One lesson to be drawn from the discussion is the idea of using our culture more pro-actively to routinely contemplate loss, ageing and death.
Granovsky, Alexander A
2015-12-21
We present a new, very efficient semi-numerical approach for the computation of state-specific nuclear gradients of a generic state-averaged multi-configuration self consistent field wavefunction. Our approach eliminates the costly coupled-perturbed multi-configuration Hartree-Fock step as well as the associated integral transformation stage. The details of the implementation within the Firefly quantum chemistry package are discussed and several sample applications are given. The new approach is routinely applicable to geometry optimization of molecular systems with 1000+ basis functions using a standalone multi-core workstation.
[High altitude related health problems in travel medicine].
Neumayr, Andreas
2013-06-01
Giving travel advice to travellers visiting high-altitude destinations around the globe is a daily routine in travel clinics. However, with the classical focus on vaccinations, traveller's diarrhoea, mosquito protection and malaria prophylaxis, altitude-related health problems are often neglected during counselling. The importance of communicating these problems when giving travel advice is impressively reflected by the high prevalence of altitude-related health problems among tourists visiting high-lying tourist destinations. This article aims to provide an overview of the core aspects of acclimatization to altitude and the prevention of altitude-related health problems, and to address, by way of example, destinations that commonly pose problems in practice.
Temporal information processing in short- and long-term memory of patients with schizophrenia.
Landgraf, Steffen; Steingen, Joerg; Eppert, Yvonne; Niedermeyer, Ulrich; van der Meer, Elke; Krueger, Frank
2011-01-01
Cognitive deficits of patients with schizophrenia have been largely recognized as core symptoms of the disorder. One neglected factor that contributes to these deficits is the comprehension of time. In the present study, we assessed temporal information processing and manipulation from short- and long-term memory in 34 patients with chronic schizophrenia and 34 matched healthy controls. On the short-term memory temporal-order reconstruction task, an incidental or intentional learning strategy was deployed. Patients showed worse overall performance than healthy controls. The intentional learning strategy led to dissociable performance improvement in both groups. Whereas healthy controls improved on a performance measure (serial organization), patients improved on an error measure (inappropriate semantic clustering) when using the intentional instead of the incidental learning strategy. On the long-term memory script-generation task, routine and non-routine events of everyday activities (e.g., buying groceries) had to be generated in either chronological or inverted temporal order. Patients were slower than controls at generating events in the chronological routine condition only. They also committed more sequencing and boundary errors in the inverted conditions. The number of irrelevant events was higher in patients in the chronological, non-routine condition. These results suggest that patients with schizophrenia imprecisely access temporal information from short- and long-term memory. In short-term memory, processing of temporal information led to a reduction in errors rather than, as was the case in healthy controls, to an improvement in temporal-order recall. When accessing temporal information from long-term memory, patients were slower and committed more sequencing, boundary, and intrusion errors. Together, these results suggest that time information can be accessed and processed only imprecisely by patients, providing evidence for impaired time comprehension. This could contribute to symptomatic cognitive deficits and strategic inefficiency in schizophrenia.
Roberts, Tracy E; Tsourapas, Angelos; Sutcliffe, Lorna; Cassell, Jackie; Estcourt, Claudia
2012-02-01
To undertake a cost-consequence analysis to assess two new models of partner notification (PN), known as Accelerated Partner Therapy (APT Hotline and APT Pharmacy), as compared with routine patient referral PN, for sex partners of people with chlamydia, gonorrhoea and non-gonococcal urethritis. Comparison of costs and outcomes alongside an exploratory trial involving two genitourinary medicine clinics and six community pharmacies. Index patients selected the PN method (APT Hotline, APT Pharmacy or routine PN) for their partners. Clinics and pharmacies recorded cost and resource use data including duration of consultation and uptake of treatment pack. Cost data were collected prospectively for two out of three interventions, and data were synthesised and compared in terms of effectiveness and costs. Routine PN had the lowest average cost per partner treated (approximately £46) compared with either APT Hotline (approximately £54) or APT Pharmacy (approximately £53) strategies. The cost-consequence analysis revealed that APT strategies were more costly but also more effective at treating partners compared to routine PN. The hotline strategy costs more than both the alternative PN strategies. If we accept that strategies which identify and treat partners the fastest are likely to be the most effective in reducing reinfection and onward transmission, then APT Hotline appears an effective PN strategy by treating the highest number of partners in the shortest duration. Whether the additional benefit is worth the additional cost cannot be determined in this preliminary analysis. These data will be useful for informing development of future randomised controlled trials of APT.
Introduction in IND and recursive partitioning
NASA Technical Reports Server (NTRS)
Buntine, Wray; Caruana, Rich
1991-01-01
This manual describes the IND package for learning tree classifiers from data. The package is an integrated C and C shell re-implementation of tree learning routines such as CART, C4, and various MDL and Bayesian variations. The package includes routines for experiment control, interactive operation, and analysis of tree building. The manual introduces the system and its many options, gives a basic review of tree learning, contains a guide to the literature and a glossary, and lists the manual pages for the routines and instructions on installation.
Tank 241-B-108, cores 172 and 173 analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L., Fluoro Daniel Hanford
1997-03-04
The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples that were prepared for analysis by an acid adjustment of the direct subsample are indicated by a `D` in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a fusion digest are indicated by an `F` in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a water digest are indicated by a `W` or an `I` in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, fusion and water digests were performed for a second time. Precision and accuracy improved with the repreparation of the Core 173 composite. Analyses with the repreparation of Lower Half Segment 2 of Core 173 did not show improvement and suggest sample heterogeneity. Results from both preparations are included in Table 3.
Pteros: fast and easy to use open-source C++ library for molecular analysis.
Yesylevskyy, Semen O
2012-07-15
An open-source Pteros library for molecular modeling and analysis of molecular dynamics trajectories for the C++ programming language is introduced. Pteros provides a number of routine analysis operations ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies development of computationally intensive trajectory analysis algorithms. The Pteros programming interface is very simple and intuitive, while the source code is well documented and easily extendible. Pteros is available for free under the open-source Artistic License from http://sourceforge.net/projects/pteros/. Copyright © 2012 Wiley Periodicals, Inc.
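To give a flavor of the simple and intuitive programming interface described above, a minimal usage sketch follows. It is written from a general reading of the library's documentation; the header path and the System/Selection class and method names are assumptions that may differ between Pteros versions, so treat it as an illustrative sketch rather than verified example code.

// Minimal sketch (unverified against a specific Pteros version): load a
// structure, make a textual selection, and query it.
#include <iostream>
#include "pteros/pteros.h"   // assumed umbrella header

int main() {
    pteros::System sys("protein.pdb");                 // read a structure file
    auto ca = sys.select("name CA");                   // VMD-like textual selection (assumed API)
    std::cout << "Number of C-alpha atoms: " << ca.size() << "\n";
    std::cout << "Geometric center: " << ca.center().transpose() << "\n";  // Eigen vector (assumed)
    return 0;
}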
Low Velocity Blunt Impact on Lightweight Composite Sandwich Panels
NASA Astrophysics Data System (ADS)
Chan, Monica Kar
There is an increased desire to incorporate more composite sandwich structures into modern aircrafts. Because in-service aircrafts routinely experience impact damage during maintenance due to ground vehicle collision, dropped equipment, or foreign object damage (FOD) impact, it is necessary to understand their impact characteristics, particularly when blunt impact sources create internal damage with little or no external visibility. The objective of this investigation is to explore damage formation in lightweight composite sandwich panels due to low-velocity impacts of variable tip radius and energy level. The correlation between barely visible external dent formation and internal core damage was explored as a function of impact tip radius. A pendulum impactor was used to impact composite sandwich panels having honeycomb core while held in a 165 mm square window fixture. The panels were impacted by hardened steel tips with radii of 12.7, 25.4, 50.8, and 76.2 mm at energy levels ranging from 2 to 14 J. Experimental data showed little dependence of external dent depth on tip radius at very low energies of 2 to 6 J, and thus, there was also little variation in visibility due to tip radius. Four modes of internal core damage were identified. Internal damage span and depth were dependent on impact tip radius. Damage depth was also radius-dependent, but stabilized at constant depth independent of kinetic energy. Internal damage span increased with increasing impact energy, but not with increasing tip radius, suggesting a relationship between maximum damage tip radius with core density/size.
Tank 241-AP-105, cores 208, 209 and 210, analytical results for the final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nuzum, J.L.
1997-10-24
This document is the final laboratory report for Tank 241-AP-105. Push mode core segments were removed from Risers 24 and 28 between July 2, 1997, and July 14, 1997. Segments were received and extruded at 222-S Laboratory. Analyses were performed in accordance with Tank 241-AP-105 Push Mode Core Sampling and Analysis Plan (TSAP) (Hu, 1997) and Tank Safety Screening Data Quality Objective (DQO) (Dukelow et al., 1995). None of the subsamples submitted for total alpha activity (AT), differential scanning calorimetry (DSC) analysis, or total organic carbon (TOC) analysis exceeded the notification limits as stated in TSAP and DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation Systems Technical Basis Group, and are not considered in this report. Appearance and Sample Handling: Two cores, each consisting of four segments, were expected from Tank 241-AP-105. Three cores were sampled, and complete cores were not obtained. TSAP states core samples should be transported to the laboratory within three calendar days from the time each segment is removed from the tank. This requirement was not met for all cores. Attachment 1 illustrates subsamples generated in the laboratory for analysis and identifies their sources. This reference also relates tank farm identification numbers to their corresponding 222-S Laboratory sample numbers.
Nash, J. Thomas; Frishman, David
1983-01-01
Analytical results for 61 elements in 370 samples from the Ranger Mine area are reported. Most of the rocks come from drill core in the Ranger No. 1 and Ranger No. 3 deposits, but 20 samples are from unmineralized drill core more than 1 km from ore. Statistical tests show that the elements Mg, Fe, F, Be, Co, Li, Ni, Pb, Sc, Th, Ti, V, Cl, As, Br, Au, Ce, Dy, La, Sc, Eu, Tb, Yb, and Tb have positive association with uranium, and Si, Ca, Na, K, Sr, Ba, Ce, and Cs have negative association. For most lithologic subsets Mg, Fe, Li, Cr, Ni, Pb, V, Y, Sm, Sc, Eu, and Yb are significantly enriched in ore-bearing rocks, whereas Ca, Na, K, Sr, Ba, Mn, Ce, and Cs are significantly depleted. These results are consistent with petrographic observations on altered rocks. Lithogeochemistry can aid exploration, but for these rocks requires methods that are expensive and not amenable to routine use.
An efficient solver for large structured eigenvalue problems in relativistic quantum chemistry
NASA Astrophysics Data System (ADS)
Shiozaki, Toru
2017-01-01
We report an efficient program for computing the eigenvalues and symmetry-adapted eigenvectors of very large quaternionic (or Hermitian skew-Hamiltonian) matrices, using which structure-preserving diagonalisation of matrices of dimension N > 10,000 is now routine on a single computer node. Such matrices appear frequently in relativistic quantum chemistry owing to the time-reversal symmetry. The implementation is based on a blocked version of the Paige-Van Loan algorithm, which allows us to use the Level 3 BLAS subroutines for most of the computations. Taking advantage of the symmetry, the program is faster by up to a factor of 2 than state-of-the-art implementations of complex Hermitian diagonalisation; diagonalising a 12,800 × 12,800 matrix took 42.8 (9.5) and 85.6 (12.6) minutes with 1 CPU core (16 CPU cores) using our symmetry-adapted solver and Intel Math Kernel Library's ZHEEV that is not structure-preserving, respectively. The source code is publicly available under the FreeBSD licence.
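For comparison with the timings above, the non-structure-preserving baseline mentioned (MKL's ZHEEV) is the standard LAPACK dense Hermitian eigensolver; a minimal call through the LAPACKE C interface looks roughly like the sketch below, where the matrix dimension and storage layout are illustrative assumptions.

// Minimal LAPACKE sketch: full eigendecomposition of a dense Hermitian matrix.
#define LAPACK_COMPLEX_CPP          // ask lapacke.h to use std::complex types
#include <complex>
#include <vector>
#include <cstddef>
#include <lapacke.h>

int main() {
    const lapack_int n = 1000;                                              // illustrative dimension
    std::vector<lapack_complex_double> a(static_cast<std::size_t>(n) * n);  // Hermitian matrix, column-major
    std::vector<double> w(static_cast<std::size_t>(n));                     // eigenvalues, ascending order
    // ... fill 'a' with the Hermitian matrix of interest here ...
    lapack_int info = LAPACKE_zheev(LAPACK_COL_MAJOR, 'V', 'U', n,
                                    a.data(), n, w.data());
    // info == 0 on success; on exit 'a' holds the orthonormal eigenvectors.
    return (info == 0) ? 0 : 1;
}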
Results from Testing of Two Rotary Percussive Drilling Systems
NASA Technical Reports Server (NTRS)
Kriechbaum, Kristopher; Brown, Kyle; Cady, Ian; von der Heydt, Max; Klein, Kerry; Kulczycki, Eric; Okon, Avi
2010-01-01
The developmental test program for the MSL (Mars Science Laboratory) rotary percussive drill examined the e ect of various drill input parameters on the drill pene- tration rate. Some of the input parameters tested were drill angle with respect to gravity and percussive impact energy. The suite of rocks tested ranged from a high strength basalt to soft Kaolinite clay. We developed a hole start routine to reduce high sideloads from bit walk. The ongoing development test program for the IMSAH (Integrated Mars Sample Acquisition and Handling) rotary percussive corer uses many of the same rocks as the MSL suite. An additional performance parameter is core integrity. The MSL development test drill and the IMSAH test drill use similar hardware to provide rotation and percussion. However, the MSL test drill uses external stabilizers, while the IMSAH test drill does not have external stabilization. In addition the IMSAH drill is a core drill, while the MSL drill uses a solid powdering bit. Results from the testing of these two related drilling systems is examined.
2018-01-01
The heat exchange properties of aircrew clothing including a Constant Wear Immersion Suit (CWIS), and the environmental conditions in which heat strain would impair operational performance, were investigated. The maximum evaporative potential (im/clo) of six clothing ensembles (three with a flight suit (FLY) and three with a CWIS) of varying undergarment layers were measured with a heated sweating manikin. Biophysical modelling estimated the environmental conditions in which body core temperature would elevate above 38.0°C during routine flight. The im/clo was reduced with additional undergarment layers, and was more restricted in CWIS compared to FLY ensembles. A significant linear relationship (r2 = 0.98, P<0.001) was observed between im/clo and the highest wet-bulb globe temperature in which the flight scenario could be completed without body core temperature exceeding 38.0°C. These findings provide a valuable tool for clothing manufacturers and mission planners for the development and selection of CWIS’s for aircrew. PMID:29723267
Recommended patient-reported core set of symptoms to measure in prostate cancer treatment trials.
Chen, Ronald C; Chang, Peter; Vetter, Richard J; Lukka, Himansu; Stokes, William A; Sanda, Martin G; Watkins-Bruner, Deborah; Reeve, Bryce B; Sandler, Howard M
2014-07-01
The National Cancer Institute (NCI) Symptom Management and Health-Related Quality of Life Steering Committee convened four working groups to recommend core sets of patient-reported outcomes to be routinely incorporated in clinical trials. The Prostate Cancer Working Group included physicians, researchers, and a patient advocate. The group's process included 1) a systematic literature review to determine the prevalence and severity of symptoms, 2) a multistakeholder meeting sponsored by the NCI to review the evidence and build consensus, and 3) a postmeeting expert panel synthesis of findings to finalize recommendations. Five domains were recommended for localized prostate cancer: urinary incontinence, urinary obstruction and irritation, bowel-related symptoms, sexual dysfunction, and hormonal symptoms. Four domains were recommended for advanced prostate cancer: pain, fatigue, mental well-being, and physical well-being. Additional domains for consideration include decisional regret, satisfaction with care, and anxiety related to prostate cancer. These recommendations have been endorsed by the NCI for implementation. © The Author 2014. Published by Oxford University Press. All rights reserved.
Riberto, M; Chiappetta, L M; Lopes, K A; Chiappetta, L R
2014-04-01
Low back pain is a leading cause of disability in Brazil. The multiple aspects of disability in these patients require comprehensive tools for their assessment. The International Classification of Functioning, Disability, and Health (ICF) core set for low back pain is designed to comprehensively describe the experience of such patients with their functioning. This study aimed to describe functioning and contextual factors and to empirically validate the ICF core set for low back pain. Cross sectional study. Three outpatient clinics in Manaus, Maceio and São Paulo, Brazil. Population. 135 low back pain outpatients under rehabilitation. Data concerning diagnosis, personal features, and the 78 ICF core set categories for low back pain were collected from clinical charts, physical examinations, tests, and interviews with patients from rehabilitation services in three parts of Brazil. 7.7% of the categories (6 body functions and 10 activity and participation) were affected in less than 20% of the sample, and were thus considered not validated. Pain and other sensations related to the musculoskeletal system were the body most frequently impaired functions. Mobility and domestic life were the chapters of activity and limitation most often described as limited. All environmental factors were qualified as either facilitators or barriers and acted as modulators of disability. The comprehensive ICF core sets for low back pain can be used to describe the living experience of such individuals, although efforts to make it operational and enhance the reproducibility of the results are needed to warrant its reliable routine use. This study highlights the importance of a complete assessment of chronic low back pain and demonstrate the need for multidisciplinary approach.
Frank, Leslie Smith; Frank, James L; March, David; Makari-Judson, Grace; Barham, Ruth B; Mertens, Wilson C
2007-01-01
To determine whether therapeutic touch administered at the time of stereotactic core biopsy of suspicious breast lesions results in a reduction in anxiety and pain. Randomized, patient-blinded, controlled trial of either Krieger-Kunz therapeutic touch administered by a trained practitioner or a sham intervention mimicking therapeutic touch delivered during core biopsy. Stereotactic breast biopsy unit of a comprehensive breast center. Women with mammographically detected, nonpalpable breast lesions requiring biopsy. Changes in pain and anxiety measured by visual analog scales immediately before and after stereotactic core biopsy. A total of 82 patients were accrued: 42 received actual therapeutic touch and 40 sham therapeutic touch. No significant differences were found between the arms for age, ethnicity, educational background, or other demographic data. The sham arm had a preponderance of left breast biopsies (48% vs 58%; P = 0.07) and received a slightly higher volume of epinephrine-containing local anesthetic (6.5 +/- 6.1 vs 4.5 +/- 4.5 mL; P = 0.09). Therapeutic touch patients were more likely to have an upper breast lesion location (57% vs 53%; P = 0.022). No significant differences between the arms were seen regarding postbiopsy pain (P = 0.95), anxiety (P = 0.66), fearfulness, or physiological parameters. Similarly, no differences were seen between the arms when change in parameters from prebiopsy to postbiopsy was considered for any of the psychological or physiological variables measured. These findings persisted when confounding variables were controlled for. Women undergoing stereotactic core breast biopsy received no significant benefit from therapeutic touch administered during the procedure. Therapeutic touch cannot be routinely recommended for patients in this setting.
NASA Astrophysics Data System (ADS)
Murray, R.; Lascu, I.; Plank, C.
2007-12-01
Deming Lake is a small (<1 square km), deep (about 17 m), meromictic kettle lake situated near the prairie-forest boundary in Itasca State Park, MN. Because of the lake's location and morphology, the accumulated sediments comprise a high-resolution record of limnological and ecological changes in response to Holocene climate variations. We used a shore-perpendicular transect of three cores (located in littoral, mid-slope, and profundal settings) and ground penetrating radar (GPR) profiles to investigate Holocene lake-level variability at Deming. Cores were sampled continuously at a 1-2 cm resolution and sediment composition (in terms of percent organic matter, carbonate material, and minerogenic residue) was determined via loss on ignition (LOI). Isothermal remanent magnetization (IRM) and anhysteretic remanent magnetization (ARM) were used as proxies of magnetic mineral concentration and grain size. Four lithostratigraphic units were identified and correlated between cores based on these analyses. Changes in GPR facies corroborate the correlation between the two shallow cores. In order to inform our interpretation of down-core variations in magnetic properties and LOI values in terms of variations in lake depth, a suite of over 70 modern sediment samples was collected from the basin and analyzed. LOI compositional variability across the basin was high, with no clear trends related to depth or distance from shore. A sharp decrease in minerogenic content was observed at depths consistent with a predicted wave base of 0.5 m, but aside from this trend it appears the steep slopes of much of the basin promote gravity-driven slumping and mixing of sediments at depth. In the profundal sediments IRM values are routinely 5% higher than in the slope and littoral environments, while ARM/IRM ratios indicate an increase in magnetic grain size with water depth. We infer that an increase in coarse organic material in the shallow-water cores of Deming records a period of aridity (associated with a lake-level decrease of less than 2 m based on GPR profiles) and/or increased water clarity during the regionally expansive mid-Holocene dry period. We do not see clear evidence of late-Holocene lake-level change of a significant magnitude (i.e., >1 m). While remanence measurements (especially IRM) often correlate with the LOI residue, interference in the IRM resulting from the dissolution of magnetic minerals casts uncertainty on the reliability of our magnetic measurements as a signal of climate-driven limnological change. Additional measurements must be performed before definite interpretations about the lake-level changes at Deming can be made. We suggest that future studies look more closely at the near-shore record (water depths <1 m), as our results indicate shoreline migration in response to moisture-balance fluctuations during the last 1000 years (as recorded at numerous sites in the Great Plains and upper Midwest) may have been subtle.
3D Magnetic Field Analysis of a Turbine Generator Stator Core-end Region
NASA Astrophysics Data System (ADS)
Wakui, Shinichi; Takahashi, Kazuhiko; Ide, Kazumasa; Takahashi, Miyoshi; Watanabe, Takashi
In this paper we calculated magnetic flux density and eddy current distributions of a 71 MVA turbine generator stator core-end using three-dimensional numerical magnetic field analysis. The magnetic flux densities and eddy current densities in the stator core-end region obtained by the analysis for the no-load and three-phase short-circuit conditions show good agreement with the measurements. Furthermore, the differences in eddy current and eddy current loss in the stator core-end region for various load conditions are shown numerically. As a result, the facing was found to decrease the eddy current loss of the end plate by about 84%.
A Nursing Intelligence System to Support Secondary Use of Nursing Routine Data
Rauchegger, F.; Ammenwerth, E.
2015-01-01
Background: Nursing care is facing exponential growth of information from nursing documentation. This amount of electronically available data collected routinely opens up new opportunities for secondary use. Objectives: To present a case study of a nursing intelligence system for reusing routinely collected nursing documentation data for multiple purposes, including quality management of nursing care. Methods: The SPIRIT framework for systematically planning the reuse of clinical routine data was leveraged to design a nursing intelligence system, which was then implemented using open source tools in a large university hospital group following the spiral model of software engineering. Results: The nursing intelligence system is now in routine use and updated regularly, and includes over 40 million data sets. It allows the outcome and quality analysis of data related to the nursing process. Conclusions: Following a systematic approach for planning and designing a solution for reusing routine care data appeared to be successful. The resulting nursing intelligence system is useful in practice now, but remains malleable for future changes. PMID:26171085
A Gendered Lifestyle-Routine Activity Approach to Explaining Stalking Victimization in Canada.
Reyns, Bradford W; Henson, Billy; Fisher, Bonnie S; Fox, Kathleen A; Nobles, Matt R
2016-05-01
Research into stalking victimization has proliferated over the last two decades, but several research questions related to victimization risk remain unanswered. Accordingly, the present study utilized a lifestyle-routine activity theoretical perspective to identify risk factors for victimization. Gender-based theoretical models also were estimated to assess the possible moderating effects of gender on the relationship between lifestyle-routine activity concepts and victimization risk. Based on an analysis of a representative sample of more than 15,000 residents of Canada from the Canadian General Social Survey (GSS), results suggested conditional support for lifestyle-routine activity theory and for the hypothesis that predictors of stalking victimization may be gender based. © The Author(s) 2015.
Rasch Analysis of the Routines-Based Interview Implementation Checklist
ERIC Educational Resources Information Center
Boavida, Tânia; Akers, Kate; McWilliam, R. A.; Jung, Lee Ann
2015-01-01
The Routines-Based Interview (RBI) is useful for developing functional outcomes/goals, for establishing strong relationships with families, and for assessing the family's true needs. In this study, the authors investigated the psychometric properties of the RBI Implementation Checklist, conducted by 120 early intervention professionals,…
Isaac, Arnold Emerson; Sinha, Sitabhra
2015-10-01
The representation of proteins as networks of interacting amino acids, referred to as protein contact networks (PCN), and their subsequent analyses using graph theoretic tools, can provide novel insights into the key functional roles of specific groups of residues. We have characterized the networks corresponding to the native states of 66 proteins (belonging to different families) in terms of their core-periphery organization. The resulting hierarchical classification of the amino acid constituents of a protein arranges the residues into successive layers - having higher core order - with increasing connection density, ranging from a sparsely linked periphery to a densely intra-connected core (distinct from the earlier concept of protein core defined in terms of the three-dimensional geometry of the native state, which has least solvent accessibility). Our results show that residues in the inner cores are more conserved than those at the periphery. Underlining the functional importance of the network core, we see that the receptor sites for known ligand molecules of most proteins occur in the innermost core. Furthermore, the association of residues with structural pockets and cavities in binding or active sites increases with the core order. From mutation sensitivity analysis, we show that the probability of deleterious or intolerant mutations also increases with the core order. We also show that stabilization centre residues are in the innermost cores, suggesting that the network core is critically important in maintaining the structural stability of the protein. A publicly available Web resource for performing core-periphery analysis of any protein whose native state is known has been made available by us at http://www.imsc.res.in/~sitabhra/proteinKcore/index.html.
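A minimal sketch of the core-periphery idea described above, using a generic k-core decomposition from networkx rather than the authors' published routine; the contact criterion (C-alpha distance below 8 Å) and the toy coordinates are assumptions for illustration only.

```python
# Sketch of a k-core (core-periphery) decomposition of a protein contact
# network. Generic illustration with networkx; the cutoff and coordinates
# are assumed, not taken from the study above.
import itertools
import networkx as nx
import numpy as np

def contact_network(ca_coords, cutoff=8.0):
    """Nodes are residues; edges connect residue pairs whose C-alpha atoms
    lie within `cutoff` angstroms of each other."""
    G = nx.Graph()
    G.add_nodes_from(range(len(ca_coords)))
    for i, j in itertools.combinations(range(len(ca_coords)), 2):
        if np.linalg.norm(ca_coords[i] - ca_coords[j]) < cutoff:
            G.add_edge(i, j)
    return G

# Toy coordinates standing in for real C-alpha positions from a PDB file.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 30, size=(100, 3))
G = contact_network(coords)

# Core number of each residue: the largest k such that the residue belongs
# to the k-core (maximal subgraph in which every node has degree >= k).
core_number = nx.core_number(G)
k_max = max(core_number.values())
inner_core = [r for r, k in core_number.items() if k == k_max]
print(f"innermost core order: {k_max}, residues: {inner_core}")
```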
Advanced Materials and Solids Analysis Research Core (AMSARC)
The Advanced Materials and Solids Analysis Research Core (AMSARC), centered at the U.S. Environmental Protection Agency's (EPA) Andrew W. Breidenbach Environmental Research Center in Cincinnati, Ohio, is the foundation for the Agency's solids and surfaces analysis capabilities. ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kryanev, A. V.; Udumyan, D. K.; Kurchenkov, A. Yu., E-mail: s327@vver.kiae.ru
2014-12-15
Problems associated with determining the power distribution in the VVER-440 core on the basis of a neutron-physics calculation and data from in-core monitors are considered. A new mathematical scheme is proposed for this on the basis of a metric analysis. In relation to the existing mathematical schemes, the scheme in question improves the accuracy and reliability of the resulting power distribution.
Görlitz, Anja; Ebert, Thomas; Bauer, Daniel; Grasl, Matthäus; Hofer, Matthias; Lammerding-Köppel, Maria; Fabry, Götz
2015-01-01
Recent developments in medical education have created increasing challenges for medical teachers, which is why the majority of German medical schools already offer educational and instructional skills training for their teaching staff. However, to date no framework of educational core competencies for medical teachers exists that might serve as guidance for the qualification of the teaching faculty. Against the background of the discussion about competency-based medical education and based upon the international literature, the GMA Committee for Faculty and Organizational Development in Teaching developed a model of core teaching competencies for medical teachers. This framework is designed not only to provide guidance with regard to individual qualification profiles but also to support further advancement of the content, training formats and evaluation of faculty development initiatives and thus to establish uniform quality criteria for such initiatives in German-speaking medical schools. The model comprises a framework of six competency fields, subdivided into competency components and learning objectives. Additional examples of their use in medical teaching scenarios illustrate and clarify each specific teaching competency. The model has been designed for routine application in medical schools and is intended to be complemented successively by additional competencies for teachers with special duties and responsibilities in a future step.
A microcomputer-based whole-body counter for personnel routine monitoring.
Chou, H P; Tsai, T M; Lan, C Y
1993-05-01
The paper describes a cost-effective NaI(Tl) whole-body counter developed for routine examinations of worker intakes at an isotope production facility. Signal processing, data analysis and system operation are microcomputer-controlled for minimum human interaction. The pulse height analyzer is implemented as a microcomputer add-on card for easy manipulation. The scheme for radionuclide analysis is designed for fast execution according to a knowledge base established from background samples and phantom experiments in conjunction with a multivariate regression analysis. Long-term stability and calibration with standards and in vivo measurements are reported.
Porosity, permeability, and capillary pressure core analysis (GMC 385)
Alaska Division of Geological & Geophysical Surveys (DGGS)
Shimer, G.
2011
Porosity, permeability, and capillary pressure core analysis results (2,124'-2,193'). Keywords: Oil and Gas; Permeability; Porosity.
ERIC Educational Resources Information Center
Alase, Abayomi
2017-01-01
This interpretative phenomenological analysis (IPA) study investigated and interpreted the Common Core State Standards program (the phenomenon) that has been the dominating topic of discussions amongst educators all across the country since the inauguration of the program in the 2014/2015 school session. Common Core State Standards (CCSS) was a…
TRAC-BF1 thermal-hydraulic, ANSYS stress analysis for core shroud cracking phenomena
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoop, U.; Feltus, M.A.; Baratta, A.J.
1996-12-31
The U.S. Nuclear Regulatory Commission sent Generic Letter 94-03 informing all licensees about the intergranular stress corrosion cracking (IGSCC) of core shrouds found in both Dresden unit 1 and Quad Cities unit 1. The letter directed all licensees to perform a safety analysis of their boiling water reactor (BWR) units. Two transients of special concern for the core shroud safety analysis are the main steam line break (MSLB) and the recirculation line break.
GALFIT-CORSAIR: Implementing the Core-Sérsic Model Into GALFIT
NASA Astrophysics Data System (ADS)
Bonfini, Paolo
2014-10-01
We introduce GALFIT-CORSAIR: a publicly available, fully retro-compatible modification of the 2D fitting software GALFIT (v.3) which adds an implementation of the core-Sérsic model. We demonstrate the software by fitting the images of NGC 5557 and NGC 5813, which have been previously identified as core-Sérsic galaxies by their 1D radial light profiles. These two examples are representative of different dust obscuration conditions, and of bulge/disk decomposition. To perform the analysis, we obtained deep Hubble Legacy Archive (HLA) mosaics in the F555W filter (~V-band). We successfully reproduce the results of the previous 1D analysis, modulo the intrinsic differences between the 1D and the 2D fitting procedures. The code and the analysis procedure described here have been developed for the first coherent 2D analysis of a sample of core-Sérsic galaxies, which will be presented in a forthcoming paper. As the 2D analysis provides better constraints on multi-component fitting, and is fully seeing-corrected, it will yield complementary constraints on the missing mass in depleted galaxy cores.
Organisation of services for managing ADHD.
Coghill, D R
2017-10-01
There is considerable variation in practice, both between and within countries, in the management of attention deficit hyperactivity disorder (ADHD). Whilst there is no one optimal model of service organisation, there are general principles of care that can be introduced to reduce this variability. There are frequent debates and discussions about which professional group is best placed to manage ADHD at different points in the life cycle. Who delivers care is, however, less important than ensuring that training schemes provide adequate exposure, training and experience in both the core and non-core skills required to provide a comprehensive package of care. Most evidence-based guidelines recommend a multi-modal, multi-professional and multi-agency approach. Many also promote the use of both stepped care and shared care approaches for the management of ADHD. As most of those with ADHD continue to have ADHD-related problems into adulthood, it is important to consider how best to transition care into adulthood and think about who should deliver care to adults with ADHD. Young people with ADHD should generally be transferred to adult mental health services if they continue to have significant symptoms of ADHD or other coexisting conditions that require treatment. Unfortunately, services for adults with ADHD remain relatively scarce across much of the world and some adult psychiatrists remain unsure of the diagnosis and uncertain about the appropriate use of ADHD medications in adults, but there is a strong case for increased services for adults. ADHD is on the one hand easy to treat; it is much more difficult to treat well. Although optimised care for ADHD requires routine measurement of outcomes, this often does not happen in routine clinical practice. Focusing on optimising symptoms and minimising adverse effects can significantly improve both short- and long-term outcomes.
Nitrogen partitioning during core-mantle differentiation
NASA Astrophysics Data System (ADS)
Speelmanns, I. M.; Schmidt, M. W.; Liebske, C.
2016-12-01
This study investigates nitrogen partitioning between metal and silicate melts as relevant for core segregation during the accretion of planetesimals into the Earth. On present day Earth, N belongs to the most important elements, as it is one of the key constituents of our atmosphere and forms the basis of life. However, the geochemistry of N, i.e. its distribution and isotopic fractionation between Earth's deep reservoirs, is not well constrained. In order to determine the partitioning behaviour of N, a centrifuging piston cylinder was used to equilibrate and then gravitationally separate metal-silicate melt pairs at 1250 °C and 1 GPa over the range of oxygen fugacities thought to have prevailed during core segregation (IW-4 to IW). Complete segregation of the two melts was reached within 3 hours at 1000 g, the interface showing a nice meniscus. The applied double capsule technique, using an outer metallic and inner non-metallic (mostly graphite) capsule, minimizes volatile loss over the course of the experiment compared to single non-metallic capsules. The two quenched melts were cut apart, cleaned on the outside, and N concentrations of the melts were analysed on bulk samples by an elemental analyser. Nevertheless, the low amount of sample material and the N yield in the high pressure experiments required the development of new analytical routines. Despite these experimental and analytical difficulties, we were able to determine a D_N^(metal/silicate) of 13±0.25 at IW-1, N partitioning into the core-forming metal. The few available literature data [1],[2] suggest that N changes its compatibility, favoring the silicate melt or magma ocean, at around IW-2.5. In order to assess how much N may effectively be contained in the core and the silicate Earth, experiments characterizing N behaviour over the entire range of core formation conditions are well under way. [1] Kadik et al., (2011) Geochemistry International 49.5: 429-438. [2] Roskosz et al., (2013) GCA 121: 15-28.
Birdwell, Justin E.; Boehlke, Adam; Paxton, Stanley T.; Whidden, Katherine J.; Pearson, Ofori N.
2017-01-01
The Eagle Ford shale is a major continuous oil and gas resource play in southcentral Texas and a source for other oil accumulations in the East Texas Basin. As part of the U.S. Geological Survey’s (USGS) petroleum system assessment and research efforts, a coring program to obtain several immature, shallow cores from near the outcrop belt in central Texas has been undertaken. The first of these cores, USGS Gulf Coast #1 West Woodway, was collected near Waco, Texas, in September 2015 and has undergone extensive geochemical and mineralogical characterization using routine methods to ascertain variations in the lithologies and chemofacies present in the Eagle Ford at this locale. Approximately 270 ft of core was examined for this study, focusing on the Eagle Ford Group interval between the overlying Austin Chalk and underlying Buda Limestone (~20 ft of each). Based on previous work to identify the stratigraphy of the Eagle Ford Group in the Waco area and elsewhere (Liro et al., 1994; Robison, 1997; Ratcliffe et al., 2012; Boling and Dworkin, 2015; Fairbanks et al., 2016, and references therein), several lithological units were expected to be present, including the Pepper Shale (or Woodbine), the Lake Waco Formation (or Lower Eagle Ford, including the Bluebonnet, Cloice, and Bouldin or Flaggy Cloice members), and the South Bosque Member (Upper Eagle Ford). The results presented here indicate that there are three major chemofacies present in the cored interval, which are generally consistent with previous descriptions of the Eagle Ford Group in this area. The relatively high-resolution sampling (every two ft above the Buda, 432.8 ft depth, and below the Austin Chalk, 163.5 ft depth) provides great detail in terms of geochemical and mineralogical properties supplementing previous work on immature Eagle Ford Shale near the outcrop belt.
[Management of aflibercept in routine clinical practice].
Cabrera López, F
2015-03-01
Aflibercept is a new anti-VEGF drug that, unlike ranibizumab and bevacizumab, blocks both VEGF-A and placental growth factor. Moreover, it binds with much greater strength and affinity to human VEGF-A165 than other endogenous VEGF receptors, conferring it with a more extended effect and allowing a lower frequency of intravitreal injections. This facilitates the adoption of fixed treatment regimens other than monthly, or individualized regimens such as "treat and extend". Aflibercept is indicated for the treatment of neovascular (exudative) age-related macular degeneration (ARMD), visual alteration due to macular edema secondary to central retinal vein occlusion (CRVO), and visual alteration due to diabetic macular edema (DME). The present article reviews the management of aflibercept in routine clinical practice, based on the specifications of its new core data sheet, which includes all the therapeutic indications in which its use has been approved, and evaluates the distinct alternatives and treatment regimens after the initial loading doses. Copyright © 2015 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.
NASA Technical Reports Server (NTRS)
Rawson, R. F.; Hamilton, R. E.; Liskow, C. L.; Dias, A. R.; Jackson, P. L.
1981-01-01
An analysis of synthetic aperture radar data of SP Mountain was undertaken to demonstrate the use of digital image processing techniques to aid in geologic interpretation of SAR data. These data were collected with the ERIM X- and L-band airborne SAR using like- and cross-polarizations. The resulting signal films were used to produce computer compatible tapes, from which four-channel imagery was generated. Slant range-to-ground range and range-azimuth-scale corrections were made in order to facilitate image registration; intensity corrections were also made. Manual interpretation of the imagery showed that L-band represented the geology of the area better than X-band. Several differences between the various images were also noted. Further digital analysis of the corrected data was done for enhancement purposes. This analysis included application of an MSS differencing routine and development of a routine for removal of relief displacement. It was found that accurate registration of the SAR channels is critical to the effectiveness of the differencing routine. Use of the relief displacement algorithm on the SP Mountain data demonstrated the feasibility of the technique.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bessho, Yasunori; Yokomizo, Osamu; Yoshimoto, Yuichiro
1997-03-01
Development and qualification results are described for a three-dimensional, time-domain core dynamics analysis program for commercial boiling water reactors (BWRs). The program allows analysis of the reactor core with a detailed mesh division, which eliminates calculational ambiguity in the nuclear-thermal-hydraulic stability analysis caused by reactor core regional division. During development, emphasis was placed on high calculational speed and large memory size as attained by the latest supercomputer technology. The program consists of six major modules, namely a core neutronics module, a fuel heat conduction/transfer module, a fuel channel thermal-hydraulic module, an upper plenum/separator module, a feedwater/recirculation flow module, and a control system module. Its core neutronics module is based on the modified one-group neutron kinetics equation with the prompt jump approximation and with six delayed neutron precursor groups. The module is used to analyze one fuel bundle of the reactor core with one mesh (region). The fuel heat conduction/transfer module solves the one-dimensional heat conduction equation in the radial direction with ten nodes in the fuel pin. The fuel channel thermal-hydraulic module is based on separated three-equation, two-phase flow equations with the drift flux correlation, and it analyzes one fuel bundle of the reactor core with one channel to evaluate flow redistribution between channels precisely. Thermal margin is evaluated by using the GEXL correlation, for example, in the module.
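The program above solves a full three-dimensional modified one-group problem; as a minimal sketch of the kinetics idea it uses, the following zero-dimensional analogue applies the prompt jump approximation with six delayed neutron precursor groups. The delayed-neutron constants are illustrative textbook thermal U-235 values, and the generation time and reactivity step are assumed.

```python
# Zero-dimensional prompt-jump point kinetics with six delayed groups.
# Constants and the +10 cent reactivity step are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])  # decay constants, 1/s
beta = beta_i.sum()
Lambda = 4.0e-5  # prompt neutron generation time, s (assumed)

def rho(t):
    return 0.1 * beta if t >= 0.0 else 0.0  # step reactivity insertion of +10 cents

def power(C, t):
    # Prompt jump approximation: dn/dt ~ 0  =>  n = Lambda * sum(lam_i*C_i) / (beta - rho)
    return Lambda * np.dot(lam_i, C) / (beta - rho(t))

def precursors(t, C):
    n = power(C, t)
    return beta_i / Lambda * n - lam_i * C

# Equilibrium precursor concentrations for unit power before the step.
C0 = beta_i * 1.0 / (Lambda * lam_i)
sol = solve_ivp(precursors, (0.0, 10.0), C0, dense_output=True)

for t in (0.0, 1.0, 5.0, 10.0):
    print(f"t = {t:5.1f} s, relative power = {power(sol.sol(t), t):.3f}")
```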
[The family in the Pediatric Unit: living with rules and hospital routines].
Xavier, Daiani Modernel; Gomes, Giovana Calcagno; Santos, Silvana Sidney Costa; Lunardi, Valéria Lerch; Pintanel, Aline Campelo; Erdmann, Alacoque Lorenzini
2014-01-01
The study aimed to understand, from a Foucauldian perspective, how the family caregiver of the child deals with the rules and routines of the hospital. This is a descriptive qualitative study, conducted in the second half of 2011, with Grounded Theory as the methodological framework. It was developed in the pediatric unit of a university hospital in southern Brazil, with eighteen family caregivers. Data collection was performed through semi-structured interviews, and the analysis through open, axial and selective coding. It was noticed that the family tends to conform to the rules and routines of the hospital, but recognizes the importance of their flexibility, exercising resistance, whether through dialogue or by transgressing such rules and routines, in search of autonomy, when they realize that these do not address their needs. It is important that rules and routines enable family practices and spaces of freedom, autonomy and resistance.
Eigenvalue routines in NASTRAN: A comparison with the Block Lanczos method
NASA Technical Reports Server (NTRS)
Tischler, V. A.; Venkayya, Vipperla B.
1993-01-01
The NASA STRuctural ANalysis (NASTRAN) program is one of the most extensively used engineering applications software in the world. It contains a wealth of matrix operations and numerical solution techniques, and they were used to construct efficient eigenvalue routines. The purpose of this paper is to examine the current eigenvalue routines in NASTRAN and to make efficiency comparisons with a more recent implementation of the Block Lanczos algorithm by Boeing Computer Services (BCS). This eigenvalue routine is now available in the BCS mathematics library as well as in several commercial versions of NASTRAN. In addition, CRAY maintains a modified version of this routine on their network. Several example problems, with a varying number of degrees of freedom, were selected primarily for efficiency bench-marking. Accuracy is not an issue, because they all gave comparable results. The Block Lanczos algorithm was found to be extremely efficient, in particular, for very large size problems.
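As an illustration of the class of problem benchmarked above, the sketch below solves a generalized structural eigenproblem K x = λ M x for its lowest modes with SciPy's ARPACK-based, Lanczos-type solver in shift-invert mode; this is not NASTRAN or the BCS Block Lanczos library, and the tridiagonal "stiffness" and diagonal "mass" matrices are toy stand-ins.

```python
# Lowest structural modes of K x = lambda M x via a Lanczos-type (ARPACK)
# eigensolver. Toy matrices; not the NASTRAN or BCS implementations.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

ndof = 2000
main = 2.0 * np.ones(ndof)
off = -1.0 * np.ones(ndof - 1)
K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")  # stiffness
M = sp.diags(np.full(ndof, 1.0e-3), format="csc")                 # lumped mass

# Ten smallest eigenvalues via shift-invert about sigma = 0.
vals, vecs = eigsh(K, k=10, M=M, sigma=0.0, which="LM")
freqs = np.sqrt(vals) / (2.0 * np.pi)  # natural frequencies if the units are SI
print("lowest natural frequencies [Hz]:", np.round(freqs, 3))
```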
Motion of the Mantle in the Translational Modes of the Earth and Mercury
NASA Technical Reports Server (NTRS)
Grinfeld, Pavel; Wisdom, Jack
2005-01-01
Slichter modes refer to the translational motion of the inner core with respect to the outer core and the mantle. The polar Slichter mode is the motion of the inner core along the axis of rotation. Busse presented an analysis of the polar mode which yielded an expression for its period. Busse's analysis included the assumption that the mantle was stationary. This approximation is valid for planets with small inner cores, such as the Earth whose inner core is about 1/60 of the total planet mass. On the other hand, many believe that Mercury's core may be enormous. If so, the motion of the mantle should be expected to produce a significant effect. We present a formal framework for including the motion of the mantle in the analysis of the translational motion of the inner core. We analyze the effect of the motion of the mantle on the Slichter modes for a non-rotating planet with an inner core of arbitrary size. We omit the effects of viscosity in the outer core, magnetic effects, and solid tides. Our approach is perturbative and is based on a linearization of Euler's equations for the motion of the fluid and Newton's second law for the motion of the inner core. We find an analytical expression for the period of the Slichter mode. Our result agrees with Busse's in the limiting case of small inner core. We present the unexpected result that even for Mercury the motion of the mantle does not significantly change the period of oscillation.
Yang, Bo; Xu, Zhe-Qi; Zhang, Hao; Xu, Feng-Ying; Shi, Xue-Yin; Zou, Zui; Ling, Chang-Quan; Tang, Ling
2014-01-01
Yunnan Baiyao (YNBY) is widely used to treat rhexis haemorrhage and ulcer in China. This meta-analysis was conducted to determine the efficacy of YNBY on local haemostasis and ulcer treatment. Randomized controlled trials were included if they assessed the effects of YNBY with/without routine drugs versus the same routine drugs on haemorrhage or ulcer, after searching major databases. Data were validated, extracted and synthesized using relative risk (RR) for dichotomous data with random effects models. Fifty-five studies involving 5,150 patients were identified. (1) YNBY alone for haemorrhage (RR = 1.16; 95% CI 1.06 to 1.28). (2) YNBY alone for antiulcer (RR = 1.26; 95% CI 1.03 to 1.53); certain effects were found on ulcerative colitis (RR = 1.22) and skin ulcer (RR = 1.20) in subgroup analysis. (3) YNBY plus routine haemostatic drugs for haemorrhage (RR = 1.23; 95% CI 1.17 to 1.29), with a significant funnel plot asymmetry (Begg's test, p = 0). (4) YNBY plus routine antiulcer drugs for antiulcer (RR = 1.18; 95% CI 1.05 to 1.33). The treatment effect in the 2nd and 4th groups was unstable when RCTs at high risk of bias were excluded. Great heterogeneity and possible publication bias were found among the trials, which precludes firm conclusions. The existing data showed that YNBY alone was helpful in treating uterine haemorrhage, ulcerative colitis and skin ulcer. YNBY plus routine antiulcer drugs was more effective in treating ulcerative colitis than antiulcer drugs alone. PMID:24753739
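A minimal sketch of the random-effects pooling of relative risks used in this kind of synthesis (DerSimonian-Laird method); the event counts below are made-up placeholders, not data from the cited trials.

```python
# Random-effects pooling of relative risks (DerSimonian-Laird).
# Trial counts are hypothetical placeholders.
import numpy as np
from scipy import stats

# (events_treatment, n_treatment, events_control, n_control) per trial
trials = np.array([
    [30, 50, 22, 50],
    [45, 80, 35, 82],
    [18, 40, 12, 38],
])

a, n1, c, n2 = trials.T
log_rr = np.log((a / n1) / (c / n2))
var_lrr = 1.0 / a - 1.0 / n1 + 1.0 / c - 1.0 / n2  # variance of log RR

# Heterogeneity (Cochran's Q) and between-trial variance tau^2.
w_fixed = 1.0 / var_lrr
mu_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
Q = np.sum(w_fixed * (log_rr - mu_fixed) ** 2)
df = len(trials) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

# Random-effects pooled estimate and 95% CI.
w_re = 1.0 / (var_lrr + tau2)
mu_re = np.sum(w_re * log_rr) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
z = stats.norm.ppf(0.975)
rr, lo, hi = np.exp(mu_re), np.exp(mu_re - z * se_re), np.exp(mu_re + z * se_re)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f}), tau^2 = {tau2:.3f}")
```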
A New Capability for Nuclear Thermal Propulsion Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, Benjamin W.; Nuclear and Radiological Engineering Department, University of Florida, Gainesville, FL 32611; Kapernick, Richard J.
2007-01-30
This paper describes a new capability for Nuclear Thermal Propulsion (NTP) design that has been developed, and presents the results of some analyses performed with this design tool. The purpose of the tool is to design to specified mission and material limits, while maximizing system thrust to weight. The head end of the design tool utilizes the ROCket Engine Transient Simulation (ROCETS) code to generate a system design and system design requirements as inputs to the core analysis. ROCETS is a modular system level code which has been used extensively in the liquid rocket engine industry for many years. Themore » core design tool performs high-fidelity reactor core nuclear and thermal-hydraulic design analysis. At the heart of this process are two codes TMSS-NTP and NTPgen, which together greatly automate the analysis, providing the capability to rapidly produce designs that meet all specified requirements while minimizing mass. A PERL based command script, called CORE DESIGNER controls the execution of these two codes, and checks for convergence throughout the process. TMSS-NTP is executed first, to produce a suite of core designs that meet the specified reactor core mechanical, thermal-hydraulic and structural requirements. The suite of designs consists of a set of core layouts and, for each core layout specific designs that span a range of core fuel volumes. NTPgen generates MCNPX models for each of the core designs from TMSS-NTP. Iterative analyses are performed in NTPgen until a reactor design (fuel volume) is identified for each core layout that meets cold and hot operation reactivity requirements and that is zoned to meet a radial core power distribution requirement.« less
NASA Technical Reports Server (NTRS)
Kandula, Max; Pearce, Daniel
1989-01-01
A steady incompressible three-dimensional (3-D) viscous flow analysis was conducted for the Space Shuttle Main Propulsion External Tank (ET)/Orbiter (ORB) propellant feed line quick separable 17-inch disconnect flapper valves for liquid oxygen (LO2) and liquid hydrogen (LH2). The main objectives of the analysis were to predict and correlate the hydrodynamic stability of the flappers and pressure drop with available water test data. Computational Fluid Dynamics (CFD) computer codes were procured at no cost from the public domain, and were modified and extended to carry out the disconnect flow analysis. The grid generator codes SVTGD3D and INGRID were obtained. NASA Ames Research Center supplied the flow solution code INS3D, and the color graphics code PLOT3D. A driver routine was developed to automate the grid generation process. Components such as pipes, elbows, and flappers can be generated with simple commands, and flapper angles can be varied easily. The flow solver INS3D code was modified to treat interior flappers, and other interfacing routines were developed, which include a turbulence model, a force/moment routine, a time-step routine, and initial and boundary conditions. In particular, an under-relaxation scheme was implemented to enhance the solution stability. Major physical assumptions and simplifications made in the analysis include the neglect of linkages, slightly reduced flapper diameter, and smooth solid surfaces. A grid size of 54 x 21 x 25 was employed for both the LO2 and LH2 units. Mixing length theory applied to turbulent shear flow in pipes formed the basis for the simple turbulence model. Results of the analysis are presented for LO2 and LH2 disconnects.
Antoine, Clemence; Benfari, Giovanni; Michelena, Hector I; Malouf, Joseph F; Nkomo, Vuyisile T; Thapa, Prabin; Enriquez-Sarano, Maurice
2018-05-31
Background - Echocardiographic quantitation of degenerative mitral regurgitation (DMR) is recommended whenever possible in clinical guidelines but is criticized and its scalability to routine clinical practice doubted. We hypothesized that echocardiographic DMR quantitation, performed in routine clinical practice by multiple practitioners, independently predicts long-term survival and thus is essential to DMR management. Methods - We included patients diagnosed with isolated mitral valve prolapse 2003-2011 and any degree of MR quantified by any physician/sonographer in routine clinical practice. Clinical/echocardiographic data acquired at diagnosis were retrieved electronically. The endpoint was mortality under medical treatment, analyzed by the Kaplan-Meier method and proportional-hazards models. Results - The cohort included 3914 patients (55% male) aged 62±17 years, with left ventricular ejection fraction (LVEF) 63±8% and routinely measured effective regurgitant orifice area (EROA) 19 [0-40] mm². During follow-up (6.7±3.1 years) 696 patients died under medical management and 1263 underwent mitral surgery. In multivariate analysis, routinely measured EROA was associated with mortality (adjusted hazard ratio 1.19 [1.13-1.24], p<0.0001, per 10 mm²) independently of LVEF and end-systolic diameter, symptoms and age/comorbidities. The association between routinely measured EROA and mortality persisted with competing-risk modeling (adjusted hazard ratio 1.15 [1.10-1.20] per 10 mm², p<0.0001), in patients without guideline-based Class I/II surgical triggers (adjusted hazard ratio 1.19 [1.10-1.28] per 10 mm², p<0.0001), and in all subgroups examined (all p<0.01). Spline curve analysis showed that, compared with general-population mortality, excess mortality appears for moderate DMR (EROA ≥20 mm²), becomes notable at EROA ≥30 mm², and steadily increases with higher EROA levels above the 40 mm² threshold. Conclusions - Echocardiographic DMR quantitation is scalable to routine practice and is independently associated with clinical outcome. Routinely measured EROA is strongly associated with long-term survival under medical treatment. Excess mortality vs. the general population appears in the "moderate" DMR range and steadily increases with higher EROA. Hence, individual EROA values should be integrated into therapeutic considerations, in addition to categorical DMR grading.
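A minimal sketch of the kind of survival model reported above: a Cox proportional-hazards fit with EROA rescaled so the hazard ratio is expressed per 10 mm², using the lifelines package. The DataFrame columns and toy records are hypothetical placeholders, not the study data.

```python
# Cox proportional-hazards fit with EROA per 10 mm^2 (toy data, lifelines).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [6.2, 3.1, 8.0, 5.5, 2.4, 7.7, 4.9, 6.8, 5.0, 9.1],
    "died":           [0,   1,   0,   1,   1,   0,   1,   1,   0,   0],
    "eroa_mm2":       [12,  45,  8,   38,  25,  40,  15,  60,  35,  22],
    "age":            [58,  74,  61,  69,  80,  55,  63,  77,  70,  66],
})
df["eroa_per10"] = df["eroa_mm2"] / 10.0  # hazard ratio expressed per 10 mm^2

cph = CoxPHFitter()
cph.fit(df[["followup_years", "died", "eroa_per10", "age"]],
        duration_col="followup_years", event_col="died")
cph.print_summary()  # adjusted hazard ratios (exp(coef)) with 95% CIs
```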
Hackl, W O; Ammenwerth, E
2016-01-01
Secondary use of clinical routine data is receiving an increasing amount of attention in biomedicine and healthcare. However, building and analysing integrated clinical routine data repositories are nontrivial, challenging tasks. As in most evolving fields, recognized standards, well-proven methodological frameworks, or accurately described best-practice approaches for the systematic planning of solutions for secondary use of routine medical record data are missing. We propose a conceptual best-practice framework and procedure model for the systematic planning of intelligent reuse of integrated clinical routine data (SPIRIT). SPIRIT was developed based on a broad literature overview and further refined in two case studies with different kinds of clinical routine data, including process-oriented nursing data from a large hospital group and high-volume multimodal clinical data from a neurologic intensive care unit. SPIRIT aims at tailoring secondary use solutions to specific needs of single departments without losing sight of the institution as a whole. It provides a general conceptual best-practice framework consisting of three parts: First, a secondary use strategy for the whole organization is determined. Second, comprehensive analyses are conducted from two different viewpoints to define the requirements regarding a clinical routine data reuse solution at the system level from the data perspective (BOTTOM UP) and at the strategic level from the future users' perspective (TOP DOWN). An obligatory clinical context analysis (IN BETWEEN) facilitates refinement, combination, and integration of the different requirements. The third part of SPIRIT is dedicated to implementation, which comprises design and realization of clinical data integration and management as well as data analysis solutions. The SPIRIT framework is intended to be used to systematically plan the intelligent reuse of clinical routine data for multiple purposes, which often was not intended when the primary clinical documentation systems were implemented. SPIRIT helps to overcome this gap. It can be applied in healthcare institutions of any size or specialization and allows a stepwise setup and evolution of holistic clinical routine data reuse solutions.
Introduction to IND and recursive partitioning, version 1.0
NASA Technical Reports Server (NTRS)
Buntine, Wray; Caruana, Rich
1991-01-01
This manual describes the IND package for learning tree classifiers from data. The package is an integrated C and C shell re-implementation of tree learning routines such as CART, C4, and various MDL and Bayesian variations. The package includes routines for experiment control, interactive operation, and analysis of tree building. The manual introduces the system and its many options, gives a basic review of tree learning, contains a guide to the literature and a glossary, lists the manual pages for the routines, and instructions on installation.
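IND itself is a C / C-shell package; as a modern stand-in, the sketch below runs the same kind of experiment with scikit-learn's CART-style decision tree (learn a classifier from data, evaluate it on held-out examples). It does not reproduce IND's MDL or Bayesian tree variants or its experiment-control routines.

```python
# Tree learning experiment with a CART-style classifier (scikit-learn),
# shown as an analogue of what the IND package automates; not IND's API.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
tree.fit(X_train, y_train)

print(f"held-out accuracy: {tree.score(X_test, y_test):.3f}")
print(export_text(tree))  # plain-text rendering of the learned tree
```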
Thermal and flow analysis subroutines for the SINDA-version 9 computer routine
NASA Technical Reports Server (NTRS)
Oren, J. A.; Williams, D. R.
1973-01-01
Fluid flow analysis, special thermal analysis and input/output capabilities of the MOTAR routine were incorporated into the SINDA routine. All the capabilities were added in the form of user subroutines so that they may be added to different versions of SINDA with a minimum of programmer effort. Two modifications were made to the existing subroutines of SINDA/8 to incorporate the above subroutines. These were: (1) A modification to the preprocessor to permit actual values of array numbers, conductor numbers, node numbers or constant numbers supplied as array data to be converted to relative numbers. (2) Modifications to execution subroutine CNFAST to make it compatible with the radiant interchange user subroutine, RADIR. This modified version of SINDA has been designated SINDA/version 9. A detailed discussion of the methods used for the capabilities added is presented. The modifications for the SINDA subroutines are described, as well as user subroutines. All subroutines added or modified are listed.
Kokaly, Raymond F.
2011-01-01
This report describes procedures for installing and using the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software. PRISM provides a framework to conduct spectroscopic analysis of measurements made using laboratory, field, airborne, and space-based spectrometers. Using PRISM functions, the user can compare the spectra of materials of unknown composition with reference spectra of known materials. This spectroscopic analysis allows the composition of the material to be identified and characterized. Among its other functions, PRISM contains routines for the storage of spectra in database files, import/export of ENVI spectral libraries, importation of field spectra, correction of spectra to absolute reflectance, arithmetic operations on spectra, interactive continuum removal and comparison of spectral features, correction of imaging spectrometer data to ground-calibrated reflectance, and identification and mapping of materials using spectral feature-based analysis of reflectance data. This report provides step-by-step instructions for installing the PRISM software and running its functions.
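PRISM is written in IDL; the Python sketch below only illustrates the standard continuum-removal operation it offers (divide a reflectance spectrum by its upper convex hull) so absorption features can be compared between spectra. The synthetic spectrum is an assumption for demonstration, not PRISM code.

```python
# Continuum removal of a reflectance spectrum via its upper convex hull.
import numpy as np

def continuum_removed(wavelength, reflectance):
    """Return reflectance divided by its upper convex hull (the continuum)."""
    pts = list(zip(wavelength, reflectance))
    hull = []
    for p in pts:  # Andrew's monotone chain, upper hull only
        while len(hull) >= 2:
            (ox, oy), (ax, ay) = hull[-2], hull[-1]
            if (ax - ox) * (p[1] - oy) - (ay - oy) * (p[0] - ox) >= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    hx, hy = zip(*hull)
    continuum = np.interp(wavelength, hx, hy)
    return reflectance / continuum

# Synthetic spectrum: sloping continuum with one absorption feature near 2.2 um.
wl = np.linspace(0.4, 2.5, 500)
refl = 0.3 + 0.1 * wl - 0.08 * np.exp(-((wl - 2.2) / 0.05) ** 2)
cr = continuum_removed(wl, refl)
print("band depth at 2.2 um:", round(1.0 - cr.min(), 3))
```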
Carvalho, J J; Jerónimo, P C A; Gonçalves, C; Alpendurada, M F
2008-11-01
European Council Directive 98/83/EC on the quality of water intended for human consumption brought a new challenge for water-quality control routine laboratories, mainly on pesticides analysis. Under the guidelines of ISO/IEC 17025:2005, a multiresidue method was developed, validated, implemented in routine, and studied with real samples during a one-year period. The proposed method enables routine laboratories to handle a large number of samples, since 28 pesticides of 14 different chemical groups can be quantitated in a single procedure. The method comprises a solid-phase extraction step and subsequent analysis by liquid chromatography-mass spectrometry (LC-MS-MS). The accuracy was established on the basis of participation in interlaboratory proficiency tests, with encouraging results (majority |z-score| <2), and the precision was consistently analysed over one year. The limits of quantitation (below 0.050 microg L(-1)) are in agreement with the enforced threshold value for pesticides of 0.10 microg L(-1). Overall method performance is suitable for routine use according to accreditation rules, taking into account the data collected over one year.
Factors Associated with Routine Dental Attendance among Aboriginal Australians.
Amarasena, Najith; Kapellas, Kostas; Skilton, Michael R; Maple-Brown, Louise J; Brown, Alex; Bartold, Mark; O'Dea, Kerin; Celermajer, David; Jamieson, Lisa M
2016-01-01
To determine factors associated with routine dental attendance in Aboriginal Australians. Data of 271 Aboriginal adults residing in Australia's Northern Territory were used. Routine dental attendance was defined as last visiting a dentist less than one year ago or visiting a dentist for a check-up. Both bivariate and multivariable analytical techniques were used. While 27% visited a dentist in the past year, 29% of these visited for a check-up. In bivariate analysis, being female, low psychological distress, and low clinical attachment loss (CAL) were associated with visiting a dentist within last year. Being aged younger than 39 years, male, no oral health impairment, being caries-free, low CAL, and low apolipoprotein B were associated with visiting for a check-up. Clinical attachment loss remained associated with visiting a dentist less than one year ago while being younger than 39 years and having no oral health impairment remained associated with usually visiting for a check-up in multivariable analysis. Younger age, no oral health impairment, and low CAL were associated with routine dental attendance among Indigenous Australians.
Lee, Ki-Sun; Shin, Joo-Hee; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Won-Chang; Shin, Sang-Wan; Lee, Jeong-Yol
2017-01-01
The aim of this study was to evaluate the biomechanical behavior and long-term safety of high performance polymer PEKK as an intraradicular dental post-core material through comparative finite element analysis (FEA) with other conventional post-core materials. A 3D FEA model of a maxillary central incisor was constructed. A cyclic loading force of 50 N was applied at an angle of 45° to the longitudinal axis of the tooth at the palatal surface of the crown. For comparison with traditionally used post-core materials, three materials (gold, fiberglass, and PEKK) were simulated to determine their post-core properties. PEKK, with a lower elastic modulus than root dentin, showed comparably high failure resistance and a more favorable stress distribution than conventional post-core material. However, the PEKK post-core system showed a higher probability of debonding and crown failure under long-term cyclic loading than the metal or fiberglass post-core systems.
NASA Astrophysics Data System (ADS)
Shahiruddin; Singh, Dharmendra K.; Hassan, M. A.
2018-02-01
A comparative study of a five-ring solid-core and a nitrobenzene-filled hollow-core photonic crystal fiber (PCF) is presented. Considering the same structure, one is used as solid silica and the other is filled with nitrobenzene in the core. The paper elaborates the confinement loss, dispersion properties and birefringence of an index-guiding PCF with asymmetric cladding, designed and analyzed by the finite-element method. The proposed structure shows low confinement loss in the case of solid silica, negative dispersion in the nitrobenzene-filled hollow-core PCF, and high birefringence in both cases. The calculated values show flat zero confinement loss in the 0.7 µm to 1.54 µm range; flat zero dispersion is achieved in the solid core and -2000 ps/km-nm in the nitrobenzene-filled hollow-core PCF; and high birefringence on the order of 10^-3 in the nitrobenzene-filled hollow-core PCF. Results show the relative analysis at different air fill fractions.
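In finite-element PCF studies of this kind, confinement loss is commonly obtained from the imaginary part of the complex effective index of the guided mode (computed with a perfectly matched layer); a minimal sketch of that conversion is below, with placeholder wavelength and Im(n_eff) values rather than results from the paper above.

```python
# Confinement loss from the imaginary part of the effective index:
#     Lc [dB/m] = (20 / ln 10) * (2*pi / lambda) * Im(n_eff)
# Input values are placeholders.
import math

def confinement_loss_db_per_m(wavelength_m, im_neff):
    k0 = 2.0 * math.pi / wavelength_m  # free-space wavenumber, 1/m
    return (20.0 / math.log(10.0)) * k0 * im_neff

lam = 1.55e-6      # operating wavelength, m (assumed)
im_neff = 1.0e-11  # imaginary part of effective index (assumed)
print(f"confinement loss: {confinement_loss_db_per_m(lam, im_neff):.3e} dB/m")
```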
Frerichs, Inéz; Amato, Marcelo B P; van Kaam, Anton H; Tingay, David G; Zhao, Zhanqi; Grychtol, Bartłomiej; Bodenstein, Marc; Gagnon, Hervé; Böhm, Stephan H; Teschner, Eckhard; Stenqvist, Ola; Mauri, Tommaso; Torsani, Vinicius; Camporota, Luigi; Schibler, Andreas; Wolf, Gerhard K; Gommers, Diederik; Leonhardt, Steffen; Adler, Andy
2017-01-01
Electrical impedance tomography (EIT) has undergone 30 years of development. Functional chest examinations with this technology are considered clinically relevant, especially for monitoring regional lung ventilation in mechanically ventilated patients and for regional pulmonary function testing in patients with chronic lung diseases. As EIT becomes an established medical technology, it requires consensus examination, nomenclature, data analysis and interpretation schemes. Such consensus is needed to compare, understand and reproduce study findings from and among different research groups, to enable large clinical trials and, ultimately, routine clinical use. Recommendations of how EIT findings can be applied to generate diagnoses and impact clinical decision-making and therapy planning are required. This consensus paper was prepared by an international working group, collaborating on the clinical promotion of EIT called TRanslational EIT developmeNt stuDy group. It addresses the stated needs by providing (1) a new classification of core processes involved in chest EIT examinations and data analysis, (2) focus on clinical applications with structured reviews and outlooks (separately for adult and neonatal/paediatric patients), (3) a structured framework to categorise and understand the relationships among analysis approaches and their clinical roles, (4) consensus, unified terminology with clinical user-friendly definitions and explanations, (5) a review of all major work in thoracic EIT and (6) recommendations for future development (193 pages of online supplements systematically linked with the chief sections of the main document). We expect this information to be useful for clinicians and researchers working with EIT, as well as for industry producers of this technology. PMID:27596161
SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerns, J; Yaldo, D
Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam's Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.
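As a generic illustration of one analysis such a QA library automates (not the library's actual API), the sketch below locates picket positions in a synthetic picket-fence profile and reports each picket's offset from its nominal position; the pixel size, picket spacing, and injected 0.3 mm error are assumptions.

```python
# Picket-fence-style analysis on a synthetic 1D profile (illustrative only).
import numpy as np
from scipy.signal import find_peaks

pixel_mm = 0.336                       # assumed EPID pixel size at isocenter
nominal_mm = np.arange(-60, 61, 20.0)  # nominal picket centers, mm

# Synthetic profile: Gaussian pickets, one deliberately shifted by 0.3 mm.
x_mm = (np.arange(1024) - 511.5) * pixel_mm
profile = np.zeros_like(x_mm)
for i, c in enumerate(nominal_mm):
    shift = 0.3 if i == 3 else 0.0
    profile += np.exp(-0.5 * ((x_mm - (c + shift)) / 1.5) ** 2)

# Find peaks, then refine each picket center with an intensity-weighted centroid.
peaks, _ = find_peaks(profile, height=0.5, distance=int(10.0 / pixel_mm))
for p, nominal in zip(peaks, nominal_mm):
    window = slice(max(p - 8, 0), p + 9)
    centroid = np.average(x_mm[window], weights=profile[window])
    print(f"picket at {nominal:+6.1f} mm: offset {centroid - nominal:+.3f} mm")
```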
U.S. Naval Observatory VLBI Analysis Center
NASA Technical Reports Server (NTRS)
Boboltz, David A.; Fey, Alan L.; Geiger, Nicole; Dieck, Chris; Hall, David M.
2013-01-01
This report summarizes the activities of the VLBI Analysis Center at the United States Naval Observatory for the 2012 calendar year. Over the course of the year, Analysis Center personnel continued analysis and timely submission of IVS-R4 databases for distribution to the IVS. During the 2012 calendar year, the USNO VLBI Analysis Center produced two VLBI global solutions designated as usn2012a and usn2012b. Earth orientation parameters (EOP) based on this solution and updated by the latest diurnal (IVS-R1 and IVS-R4) experiments were routinely submitted to the IVS. Sinex files based upon the bi-weekly 24-hour experiments were also submitted to the IVS. During the 2012 calendar year, Analysis Center personnel continued a program to use the Very Long Baseline Array (VLBA) operated by the NRAO for the purpose of measuring UT1-UTC. Routine daily 1-hour duration Intensive observations were initiated using the VLBA antennas at Pie Town, NM and Mauna Kea, HI. High-speed network connections to these two antennas are now routinely used for electronic transfer of VLBI data over the Internet to a USNO point of presence. A total of 270 VLBA Intensive experiments were observed and electronically transferred to and processed at USNO in 2012.
NASA Astrophysics Data System (ADS)
Sboev, A. G.; Ilyashenko, A. S.; Vetrova, O. A.
1997-02-01
The method of buckling evaluation, realized in the Monte Carlo code MCS, is described. This method was applied for calculational analysis of the well known light water experiments TRX-1 and TRX-2. The analysis of this comparison shows that there is no agreement among Monte Carlo calculations obtained in different ways: the MCS calculations with given experimental bucklings; the MCS calculations with bucklings evaluated on the basis of full core MCS direct simulations; the full core MCNP and MCS direct simulations; and the MCNP and MCS calculations where the results of cell calculations are corrected by coefficients taking into account the leakage from the core. Also, the buckling values evaluated by full core MCS calculations differed from the experimental ones, especially in the case of TRX-1, where this difference corresponded to a 0.5 percent increase of the Keff value.
Lashkari, A; Khalafi, H; Kazeminejad, H
2013-05-01
In this work, kinetic parameters of Tehran research reactor (TRR) mixed cores have been calculated. The mixed core configurations are made by replacement of the low enriched uranium control fuel elements with highly enriched uranium control fuel elements in the reference core. The MTR_PC package, a nuclear reactor analysis tool, is used to perform the analysis. Simulations were carried out to compute the effective delayed neutron fraction and prompt neutron lifetime. Calculation of kinetic parameters is necessary for reactivity and power excursion transient analysis. The results of this research show that the effective delayed neutron fraction decreases and the prompt neutron lifetime increases with fuel burn-up. Also, by increasing the number of highly enriched uranium control fuel elements in the reference core, the prompt neutron lifetime increases, but the effective delayed neutron fraction does not show any considerable change.
Effective delayed neutron fraction and prompt neutron lifetime of Tehran research reactor mixed-core
Lashkari, A.; Khalafi, H.; Kazeminejad, H.
2013-01-01
In this work, kinetic parameters of Tehran research reactor (TRR) mixed cores have been calculated. The mixed core configurations are made by replacement of the low enriched uranium control fuel elements with highly enriched uranium control fuel elements in the reference core. The MTR_PC package, a nuclear reactor analysis tool, is used to perform the analysis. Simulations were carried out to compute the effective delayed neutron fraction and prompt neutron lifetime. Calculation of kinetic parameters is necessary for reactivity and power excursion transient analysis. The results of this research show that the effective delayed neutron fraction decreases and the prompt neutron lifetime increases with fuel burn-up. Also, by increasing the number of highly enriched uranium control fuel elements in the reference core, the prompt neutron lifetime increases, but the effective delayed neutron fraction does not show any considerable change. PMID:24976672
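The role of these two kinetic parameters in transient analysis can be illustrated with a one-delayed-group point-kinetics integration, sketched below in Python. The parameter values are generic placeholders, not the TRR mixed-core values reported in the paper.

# One-delayed-group point kinetics, explicit Euler integration.
# Parameter values are generic placeholders, not the TRR values.
beta_eff = 0.0075   # effective delayed neutron fraction (hypothetical)
Lam = 6.0e-5        # prompt neutron generation time, s (hypothetical)
lam = 0.08          # delayed-neutron precursor decay constant, 1/s (hypothetical)
rho = 0.001         # step reactivity insertion (absolute, < beta_eff)

dt, t_end = 1.0e-5, 1.0
n = 1.0                             # relative power
c = beta_eff / (Lam * lam)          # equilibrium precursor concentration
for _ in range(int(t_end / dt)):
    dn = ((rho - beta_eff) / Lam) * n + lam * c
    dc = (beta_eff / Lam) * n - lam * c
    n, c = n + dt * dn, c + dt * dc
print(f"relative power after {t_end} s: {n:.2f}")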
A parallel solver for huge dense linear systems
NASA Astrophysics Data System (ADS)
Badia, J. M.; Movilla, J. L.; Climente, J. I.; Castillo, M.; Marqués, M.; Mayo, R.; Quintana-Ortí, E. S.; Planelles, J.
2011-11-01
HDSS (Huge Dense Linear System Solver) is a Fortran Application Programming Interface (API) that facilitates the parallel solution of very large dense systems for scientists and engineers. The API uses parallelism to solve the systems efficiently on a wide range of parallel platforms, from clusters of processors to massively parallel multiprocessors. It exploits out-of-core strategies, leveraging secondary memory to solve huge linear systems on the order of 100,000 equations. The API is based on the parallel linear algebra library PLAPACK and on its Out-Of-Core (OOC) extension POOCLAPACK. Both PLAPACK and POOCLAPACK use the Message Passing Interface (MPI) as the communication layer and BLAS to perform the local matrix operations. The API provides a friendly interface to users, hiding almost all the technical aspects related to the parallel execution of the code and the use of secondary memory to solve the systems. In particular, the API can automatically select the best way to store and solve the systems, depending on the dimension of the system, the number of processes, and the main memory of the platform. Experimental results on several parallel platforms report high performance, reaching more than 1 TFLOP with 64 cores to solve a system with more than 200 000 equations and more than 10 000 right-hand side vectors.
New version program summary
Program title: Huge Dense System Solver (HDSS)
Catalogue identifier: AEHU_v1_1
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHU_v1_1.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 87 062
No. of bytes in distributed program, including test data, etc.: 1 069 110
Distribution format: tar.gz
Programming language: Fortran90, C
Computer: Parallel architectures: multiprocessors, computer clusters
Operating system: Linux/Unix
Has the code been vectorized or parallelized?: Yes, includes MPI primitives.
RAM: Tested for up to 190 GB
Classification: 6.5
External routines: MPI (http://www.mpi-forum.org/), BLAS (http://www.netlib.org/blas/), PLAPACK (http://www.cs.utexas.edu/~plapack/), POOCLAPACK (ftp://ftp.cs.utexas.edu/pub/rvdg/PLAPACK/pooclapack.ps) (code for PLAPACK and POOCLAPACK is included in the distribution).
Catalogue identifier of previous version: AEHU_v1_0
Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 533
Does the new version supersede the previous version?: Yes
Nature of problem: Huge scale dense systems of linear equations, Ax=B, beyond standard LAPACK capabilities.
Solution method: The linear systems are solved by means of parallelized routines based on the LU factorization, using efficient secondary storage algorithms when the available main memory is insufficient.
Reasons for new version: In many applications we need to guarantee a high accuracy in the solution of very large linear systems, and we can do it by using double-precision arithmetic.
Summary of revisions: Version 1.1 can be used to solve linear systems using double-precision arithmetic. New version of the initialization routine. The user can choose the kind of arithmetic and the values of several parameters of the environment.
Running time: About 5 hours to solve a system with more than 200 000 equations and more than 10 000 right-hand side vectors using double-precision arithmetic on an eight-node commodity cluster with a total of 64 Intel cores.
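As a small-scale, in-core analogue of the solver's workflow, the Python sketch below factorizes a dense matrix once and reuses the LU factors across many right-hand sides; HDSS applies the same idea in Fortran, in parallel, and out-of-core for systems several orders of magnitude larger. The dimensions and test matrix are arbitrary.

import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Factorize A once, then reuse the LU factors for many right-hand sides.
# Sizes are tiny here; HDSS applies the same idea out-of-core and in parallel.
rng = np.random.default_rng(0)
n, nrhs = 500, 50                                  # illustrative dimensions
A = rng.standard_normal((n, n)) + n * np.eye(n)    # well-conditioned test matrix
B = rng.standard_normal((n, nrhs))

lu, piv = lu_factor(A)             # O(n^3) work, done once
X = lu_solve((lu, piv), B)         # cheap back-substitution per right-hand side
print("max residual:", np.max(np.abs(A @ X - B)))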
NASA Technical Reports Server (NTRS)
Vonhermann, Pieter; Pintz, Adam
1994-01-01
This manual describes the use of the ANSCARES program to prepare a neutral file of FEM stress results taken from ANSYS Release 5.0, in the format needed by the CARES/LIFE ceramics reliability program. It is intended for use by experienced users of ANSYS and CARES. Knowledge of compiling and linking FORTRAN programs is also required. Maximum use is made of existing routines (from other CARES interface programs and ANSYS routines) to extract the finite element results and prepare the neutral file for input to the reliability analysis. FORTRAN and machine language routines, as described, are used to read the ANSYS results file. Sub-element stresses are computed and written to a neutral file using FORTRAN subroutines which are nearly identical to those used in the NASCARES (MSC/NASTRAN to CARES) interface.
Source, conveyance and fate of suspended sediments following Hurricane Irene. New England, USA
Yellen, Brian; Woodruff, Jon D.; Kratz, Laura N.; Mabee, Steven B.; Morrison, Jonathan; Martini, Anna M.
2014-01-01
Hurricane Irene passed directly over the Connecticut River valley in late August, 2011. Intense precipitation and high antecedent soil moisture resulted in record flooding, mass wasting and fluvial erosion, allowing for observations of how these rare but significant extreme events affect a landscape still responding to Pleistocene glaciation and associated sediment emplacement. Clays and silts from upland glacial deposits, once suspended in the stream network, were routed directly to the mouth of the Connecticut River, resulting in record-breaking sediment loads fifteen times greater than predicted from the pre-existing rating curve. Denudation was particularly extensive in mountainous areas. We calculate that sediment yield during the event from the Deerfield River, a steep tributary comprising 5% of the entire Connecticut River watershed, exceeded, at minimum, 10–40 years of routine sediment discharge and accounted for approximately 40% of the total event sediment discharge from the Connecticut River. A series of surface sediment cores taken in floodplain ponds adjacent to the tidal section of the Connecticut River before and after the event provides insight into differences in sediment sourcing and routing for the Irene event compared to periods of more routine flooding. Relative to routine conditions, sedimentation from Irene was anomalously inorganic, fine grained, and enriched in elements commonly found in chemically immature glacial tills and glaciolacustrine material. These unique sedimentary characteristics document the crucial role played by extreme precipitation from tropical disturbances in denuding this landscape.
Source, conveyance and fate of suspended sediments following Hurricane Irene. New England, USA
NASA Astrophysics Data System (ADS)
Yellen, B.; Woodruff, J. D.; Kratz, L. N.; Mabee, S. B.; Morrison, J.; Martini, A. M.
2014-12-01
Hurricane Irene passed directly over the Connecticut River valley in late August, 2011. Intense precipitation and high antecedent soil moisture resulted in record flooding, mass wasting and fluvial erosion, allowing for observations of how these rare but significant extreme events affect a landscape still responding to Pleistocene glaciation and associated sediment emplacement. Clays and silts from upland glacial deposits, once suspended in the stream network, were routed directly to the mouth of the Connecticut River, resulting in record-breaking sediment loads fifteen times greater than predicted from the pre-existing rating curve. Denudation was particularly extensive in mountainous areas. We calculate that sediment yield during the event from the Deerfield River, a steep tributary comprising 5% of the entire Connecticut River watershed, exceeded, at minimum, 10-40 years of routine sediment discharge and accounted for approximately 40% of the total event sediment discharge from the Connecticut River. A series of surface sediment cores taken in floodplain ponds adjacent to the tidal section of the Connecticut River before and after the event provides insight into differences in sediment sourcing and routing for the Irene event compared to periods of more routine flooding. Relative to routine conditions, sedimentation from Irene was anomalously inorganic, fine grained, and enriched in elements commonly found in chemically immature glacial tills and glaciolacustrine material. These unique sedimentary characteristics document the crucial role played by extreme precipitation from tropical disturbances in denuding this landscape.
Preliminary structural design of a lunar transfer vehicle aerobrake. M.S. Thesis
NASA Technical Reports Server (NTRS)
Bush, Lance B.
1992-01-01
An aerobrake concept for a Lunar transfer vehicle was weight optimized through the use of the Taguchi design method, structural finite element analyses and structural sizing routines. Six design parameters were chosen to represent the aerobrake structural configuration. The design parameters included honeycomb core thickness, diameter to depth ratio, shape, material, number of concentric ring frames, and number of radial frames. Each parameter was assigned three levels. The minimum weight aerobrake configuration resulting from the study was approximately half the weight of the average of all twenty-seven experimental configurations. The parameters having the most significant impact on the aerobrake structural weight were identified.
SOFIP: A Short Orbital Flux Integration Program
NASA Technical Reports Server (NTRS)
Stassinopoulos, E. G.; Hebert, J. J.; Butler, E. L.; Barth, J. L.
1979-01-01
A computer code was developed to evaluate the space radiation environment encountered by geocentric satellites. The Short Orbital Flux Integration Program (SOFIP) is a compact routine of modular compositions, designed mostly with structured programming techniques in order to provide core and time economy and ease of use. The program in its simplest form produces for a given input trajectory a composite integral orbital spectrum of either protons or electrons. Additional features are available separately or in combination with the inclusion of the corresponding (optional) modules. The code is described in detail, and the function and usage of the various modules are explained. A program listing and sample outputs are attached.
Hemodynamic and Thermal Responses to Head and Neck Cooling in Men and Women
NASA Technical Reports Server (NTRS)
Ku, Yu-Tsuan E.; Montgomery, Leslie D.; Carbo, Jorge E.; Webbon, Bruce W.
1995-01-01
Personal cooling systems are used to alleviate symptoms of multiple sclerosis and to prevent increased core temperature during daily activities. Configurations of these systems include passive ice vests and circulating liquid cooling garments (LCGs) in the form of vests, cooling caps, and combined head and neck cooling systems. However, little information is available on the amount of heat that can be extracted from the body with these systems or on the physiologic changes produced by their routine operation. The objective of this study was to determine the operating characteristics and the physiologic changes produced by short-term use of one commercially available thermal control system.
Electrical transmission line diametrical retainer
Hall, David R.; Hall, Jr., H. Tracy; Pixton, David; Dahlgren, Scott; Sneddon, Cameron; Briscoe, Michael; Fox, Joe
2004-12-14
The invention is a mechanism for retaining an electrical transmission line. In one embodiment of the invention it is a system for retaining an electrical transmission line within downhole components. In accordance with one aspect of the invention, the system includes a plurality of downhole components, such as sections of pipe in a drill string. The system also includes a coaxial cable running between the first and second end of a drill pipe, the coaxial cable having a conductive tube and a conductive core within it. The invention allows the electrical transmission line to withstand the tension and compression of drill pipe during routine drilling cycles.
Benchmarking routine psychological services: a discussion of challenges and methods.
Delgadillo, Jaime; McMillan, Dean; Leach, Chris; Lucock, Mike; Gilbody, Simon; Wood, Nick
2014-01-01
Policy developments in recent years have led to important changes in the level of access to evidence-based psychological treatments. Several methods have been used to investigate the effectiveness of these treatments in routine care, with different approaches to outcome definition and data analysis. To present a review of challenges and methods for the evaluation of evidence-based treatments delivered in routine mental healthcare. This is followed by a case example of a benchmarking method applied in primary care. High, average and poor performance benchmarks were calculated through a meta-analysis of published data from services working under the Improving Access to Psychological Therapies (IAPT) Programme in England. Pre-post treatment effect sizes (ES) and confidence intervals were estimated to illustrate a benchmarking method enabling services to evaluate routine clinical outcomes. High, average and poor performance ES for routine IAPT services were estimated to be 0.91, 0.73 and 0.46 for depression (using PHQ-9) and 1.02, 0.78 and 0.52 for anxiety (using GAD-7). Data from one specific IAPT service exemplify how to evaluate and contextualize routine clinical performance against these benchmarks. The main contribution of this report is to summarize key recommendations for the selection of an adequate set of psychometric measures, the operational definition of outcomes, and the statistical evaluation of clinical performance. A benchmarking method is also presented, which may enable a robust evaluation of clinical performance against national benchmarks. Some limitations concerned significant heterogeneity among data sources, and wide variations in ES and data completeness.
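A minimal Python sketch of the benchmarking calculation is given below: it computes an uncontrolled pre-post effect size for a hypothetical service and compares it against the benchmark values quoted above. The effect-size convention (mean improvement divided by the baseline standard deviation) and the bootstrap confidence interval are illustrative choices and may differ from the exact method used in the paper.

import numpy as np

def prepost_es(pre, post):
    # Uncontrolled pre-post effect size: mean improvement / SD of baseline scores.
    # One common convention in this literature; the paper's formula may differ.
    return (np.mean(pre) - np.mean(post)) / np.std(pre, ddof=1)

rng = np.random.default_rng(0)
pre = rng.normal(16, 5, 300)            # hypothetical baseline PHQ-9 scores
post = pre - rng.normal(5, 4, 300)      # hypothetical post-treatment scores

es = prepost_es(pre, post)
idx = rng.integers(0, len(pre), size=(2000, len(pre)))   # paired bootstrap resamples
boot = np.array([prepost_es(pre[i], post[i]) for i in idx])
lo, hi = np.percentile(boot, [2.5, 97.5])

benchmarks = {"poor": 0.46, "average": 0.73, "high": 0.91}   # depression (PHQ-9) benchmarks from the abstract
print(f"service ES = {es:.2f} (95% CI {lo:.2f} to {hi:.2f}); benchmarks = {benchmarks}")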
Compression After Impact on Honeycomb Core Sandwich Panels with Thin Facesheets, Part 2: Analysis
NASA Technical Reports Server (NTRS)
Mcquigg, Thomas D.; Kapania, Rakesh K.; Scotti, Stephen J.; Walker, Sandra P.
2012-01-01
A two-part research study has been completed on the topic of compression after impact (CAI) of thin facesheet honeycomb core sandwich panels. The research has focused on both experiments and analysis in an effort to establish and validate a new understanding of the damage tolerance of these materials. Part 2, the subject of the current paper, is focused on the analysis, which corresponds to the CAI testing described in Part 1. Of interest are sandwich panels, with aerospace applications, which consist of very thin, woven S2-fiberglass (with MTM45-1 epoxy) facesheets adhered to a Nomex honeycomb core. Two sets of materials, which were identical with the exception of the density of the honeycomb core, were tested in Part 1. The results highlighted the need for analysis methods which take into account multiple failure modes. A finite element model (FEM) is developed here, in Part 2. A commercial implementation of the Multicontinuum Failure Theory (MCT) for progressive failure analysis (PFA) in composite laminates, Helius:MCT, is included in this model. The inclusion of PFA in the present model provided a new, unique ability to account for multiple failure modes. In addition, significant impact damage detail is included in the model. A sensitivity study, used to assess the effect of each damage parameter on overall analysis results, is included in an appendix. Analysis results are compared to the experimental results for each of the 32 CAI sandwich panel specimens tested to failure. The failure of each specimen is predicted using the high-fidelity, physics-based analysis model developed here, and the results highlight key improvements in the understanding of honeycomb core sandwich panel CAI failure. Finally, a parametric study highlights the strength benefits compared to mass penalty for various core densities.
Tracking Student Progression through the Core Curriculum. CCRC Analytics
ERIC Educational Resources Information Center
Hodara, Michelle; Rodriguez, Olga
2013-01-01
This report demonstrates useful methods for examining student progression through the core curriculum. The authors carry out analyses at two colleges in two different states, illustrating students' overall progression through the core curriculum and the relationship of this "core" progression to their college outcomes. By means of this analysis,…
Prostate needle biopsy processing: a survey of laboratory practice across Europe.
Varma, Murali; Berney, Daniel M; Algaba, Ferran; Camparo, Philippe; Compérat, Eva; Griffiths, David F R; Kristiansen, Glen; Lopez-Beltran, Antonio; Montironi, Rodolfo; Egevad, Lars
2013-02-01
To determine the degree of variation in the handling of prostate needle biopsies (PNBx) in laboratories across Europe. A web-based survey was emailed to members of the European Network of Uropathology and the British Association of Urological Pathologists. Responses were received from 241 laboratories in 15 countries. PNBx were generally taken by urologists (93.8%) or radiologists (23.7%), but in 8.7% were also taken by non-medical personnel such as radiographers, nurses or biomedical assistants. Of the responding laboratories, 40.8% received cores in separate containers, 42.3% processed one core/block, 54.2% examined three levels/block, 49.4% examined one H&E section/level and 56.1% retained spare sections for potential immunohistochemistry. Of the laboratories, 40.9% retained unstained spares for over a year while 36.2% discarded spares within 1 month of reporting. Only two (0.8%) respondents routinely performed immunohistochemistry on all PNBx. There were differences in laboratory practice between the UK and the rest of Europe (RE). Procurement of PNBx by non-medical personnel was more common in the UK. RE laboratories more commonly received each core in a separate container, processed one core/block, examined fewer levels/block and examined more H&E sections/level. RE laboratories also retained spares for potential immunohistochemistry less often and for shorter periods. Use of p63 as the sole basal cell marker was more common in RE. There are marked differences in procurement, handling and processing of PNBx in laboratories across Europe. These data can help the development of best practice guidelines.
The Influence of Oxygen and Sulfur on Uranium Partitioning Into the Core
NASA Astrophysics Data System (ADS)
Moore, R. D., Jr.; Van Orman, J. A.; Hauck, S. A., II
2017-12-01
Uranium, along with K and Th, may provide substantial long-term heating in planetary cores, depending on the magnitude of their partitioning into the metal during differentiation. In general, non-metallic light elements are known to have a large influence on the partitioning of trace elements, and the presence of sulfur is known to enhance the partitioning of uranium into the metal. Data from the steelmaking literature indicate that oxygen also enhances the solubility of uranium in liquid iron alloys. Here we present experimental data on the partitioning of U between immiscible liquids in the Fe-S-O system, and use these data along with published metal-silicate partitioning data to calibrate a quantitative activity model for U in the metal. We also determined partition coefficients for Th, K, Nb, Nd, Sm, and Yb, but were unable to fully constrain activity models for these elements with available data. A Monte Carlo fitting routine was used to calculate U-S, U-O, and U-S-O interaction coefficients, and their associated uncertainties. We find that the combined interaction of uranium with sulfur and oxygen is predominant, with S and O together enhancing the solubility of uranium to a far greater degree than either element in isolation. This suggests that uranium complexes with sulfite or sulfate species in the metal. For a model Mars core composition containing 14 at% S and 5 at% O, the metal/silicate partition coefficient for U is predicted to be an order of magnitude larger than for a pure Fe-Ni core.
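The structure of such an activity model can be sketched with a Wagner-style interaction-parameter expansion, as below. The functional form is standard; the coefficient values are invented placeholders, not the fitted U-S, U-O, and U-S-O coefficients from this study.

# Wagner-style interaction-parameter expansion for the activity coefficient
# of U in the metal; coefficient values are hypothetical, not the fitted ones.
def ln_gamma_U(x_S, x_O, eps_US=-8.0, eps_UO=-6.0, eps_USO=-30.0):
    return eps_US * x_S + eps_UO * x_O + eps_USO * x_S * x_O

x_S, x_O = 0.14, 0.05   # model Mars core composition quoted above (atom fractions)
# A lower activity coefficient in the metal favors stronger U partitioning into it.
print("ln(gamma_U), S only :", ln_gamma_U(x_S, 0.0))
print("ln(gamma_U), S and O:", ln_gamma_U(x_S, x_O))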
Peterson, Kari; Cole-Dai, Jihong; Brandis, Derek; Cox, Thomas; Splett, Scott
2015-10-01
An ion chromatography-electrospray ionization-tandem mass spectrometry (IC-ESI-MS/MS) method has been developed for rapid and accurate measurement of perchlorate in polar snow and ice core samples in which perchlorate concentrations are expected to be as low as 0.1 ng L(-1). Separation of perchlorate from major inorganic species in snow is achieved with an ion chromatography system interfaced to an AB SCIEX triple quadrupole mass spectrometer operating in multiple reaction monitoring mode. Under optimized conditions, the limit of detection and lower limit of quantification without pre-concentration have been determined to be 0.1 and 0.3 ng L(-1), respectively, with a linear dynamic range of 0.3-10.0 ng L(-1) in routine measurement. These represent improvements over previously reported methods using similar analytical techniques. The improved method allows fast, accurate, and reproducible perchlorate quantification down to the sub-ng L(-1) level and will facilitate perchlorate measurement in the study of natural perchlorate production with polar ice cores in which perchlorate concentrations are anticipated to vary in the low and sub-ng L(-1) range. Initial measurements of perchlorate in ice core samples from central Greenland show that typical perchlorate concentrations in snow dated prior to the Industrial Revolution are about 0.8 ng L(-1), while perchlorate concentrations are significantly higher in recent (post-1980) snow, suggesting that anthropogenic sources are a significant contributor to perchlorate in the current environment.
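The way detection and quantification limits of this kind are typically derived from a calibration line and blank replicates can be sketched as follows; the calibration and blank data are invented, and the method's actual reported limits are 0.1 and 0.3 ng L(-1).

import numpy as np

# LOD = 3*s_blank/slope, LOQ = 10*s_blank/slope from a calibration line and
# blank replicates. All data are invented; the method's reported limits are
# 0.1 and 0.3 ng/L.
conc = np.array([0.3, 1.0, 3.0, 10.0])            # perchlorate standards, ng/L
signal = np.array([45.0, 150.0, 452.0, 1498.0])   # hypothetical MS/MS peak areas
slope, intercept = np.polyfit(conc, signal, 1)

blanks = np.array([2.1, 1.8, 2.5, 2.0, 2.3, 1.9, 2.2])   # hypothetical blank signals
s_blank = blanks.std(ddof=1)
print(f"LOD ~ {3 * s_blank / slope:.3f} ng/L, LOQ ~ {10 * s_blank / slope:.3f} ng/L")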
Filippov, Andrey A; Sergueev, Kirill V; He, Yunxiu; Huang, Xiao-Zhe; Gnade, Bryan T; Mueller, Allen J; Fernandez-Prada, Carmen M; Nikolich, Mikeljon P
2011-01-01
Bacteriophages specific for Yersinia pestis are routinely used for plague diagnostics and could be an alternative to antibiotics in case of drug-resistant plague. A major concern of bacteriophage therapy is the emergence of phage-resistant mutants. The use of phage cocktails can overcome this problem but only if the phages exploit different receptors. Some phage-resistant mutants lose virulence and therefore should not complicate bacteriophage therapy. The purpose of this work was to identify Y. pestis phage receptors using site-directed mutagenesis and trans-complementation and to determine potential attenuation of phage-resistant mutants for mice. Six receptors for eight phages were found in different parts of the lipopolysaccharide (LPS) inner and outer core. The receptor for R phage was localized beyond the LPS core. Most spontaneous and defined phage-resistant mutants of Y. pestis were attenuated, showing increases in LD₅₀ and time to death. The loss of different LPS core biosynthesis enzymes resulted in the reduction of Y. pestis virulence and there was a correlation between the degree of core truncation and the impact on virulence. The yrbH and waaA mutants completely lost their virulence. We identified Y. pestis receptors for eight bacteriophages. Nine phages together use at least seven different Y. pestis receptors, which makes some of them promising for formulation of plague therapeutic cocktails. Most phage-resistant Y. pestis mutants become attenuated and thus should not pose a serious problem for bacteriophage therapy of plague. LPS is a critical virulence factor of Y. pestis.
Filippov, Andrey A.; Sergueev, Kirill V.; He, Yunxiu; Huang, Xiao-Zhe; Gnade, Bryan T.; Mueller, Allen J.; Fernandez-Prada, Carmen M.; Nikolich, Mikeljon P.
2011-01-01
Background Bacteriophages specific for Yersinia pestis are routinely used for plague diagnostics and could be an alternative to antibiotics in case of drug-resistant plague. A major concern of bacteriophage therapy is the emergence of phage-resistant mutants. The use of phage cocktails can overcome this problem but only if the phages exploit different receptors. Some phage-resistant mutants lose virulence and therefore should not complicate bacteriophage therapy. Methodology/Principal Findings The purpose of this work was to identify Y. pestis phage receptors using site-directed mutagenesis and trans-complementation and to determine potential attenuation of phage-resistant mutants for mice. Six receptors for eight phages were found in different parts of the lipopolysaccharide (LPS) inner and outer core. The receptor for R phage was localized beyond the LPS core. Most spontaneous and defined phage-resistant mutants of Y. pestis were attenuated, showing increases in LD50 and time to death. The loss of different LPS core biosynthesis enzymes resulted in the reduction of Y. pestis virulence and there was a correlation between the degree of core truncation and the impact on virulence. The yrbH and waaA mutants completely lost their virulence. Conclusions/Significance We identified Y. pestis receptors for eight bacteriophages. Nine phages together use at least seven different Y. pestis receptors, which makes some of them promising for formulation of plague therapeutic cocktails. Most phage-resistant Y. pestis mutants become attenuated and thus should not pose a serious problem for bacteriophage therapy of plague. LPS is a critical virulence factor of Y. pestis. PMID:21980477
Waveguides for performing enzymatic reactions
Levene; Michael J. , Korlach; Jonas , Turner; Stephen W. , Craighead; Harold G. , Webb; Watt W.
2007-11-06
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.
2007-02-20
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
ANSYS-based birefringence property analysis of side-hole fiber induced by pressure and temperature
NASA Astrophysics Data System (ADS)
Zhou, Xinbang; Gong, Zhenfeng
2018-03-01
In this paper, we theoretically investigate the influence of pressure and temperature on the birefringence of side-hole fibers with different hole shapes using the finite element analysis method. The physical mechanism of the birefringence of the side-hole fiber is discussed in the presence of different external pressures and temperatures. The strain field distribution and birefringence values of circular-core, rectangular-core, and triangular-core side-hole fibers are presented. Our analysis shows that the triangular-core side-hole fiber has low temperature sensitivity, which weakens the cross-sensitivity of temperature and strain. Additionally, an optimized structure design of the side-hole fiber is presented that can be used for sensing applications.
Results of core analysis (ambient porosity and grain density ...)
Alaska Division of Geological & Geophysical Surveys, GMC 217
Wilson and Associates, and Geocore
1993
Geochemical analysis of core (3340'-3625') from the BP Exploration (Alaska) Inc. ...
Alaska Division of Geological & Geophysical Surveys, GMC 335
ExxonMobil
2006
Petrophysics of low-permeability medina sandstone, northwestern Pennsylvania, Appalachian Basin
Castle, J.W.; Byrnes, A.P.
1998-01-01
Petrophysical core testing combined with geophysical log analysis of low-permeability, Lower Silurian sandstones of the Appalachian basin provides guidelines and equations for predicting gas producibility. Permeability values are predictable from the borehole logs by applying empirically derived equations based on correlation between in-situ porosity and in-situ effective gas permeability. An Archie-form equation provides reasonable accuracy of log-derived water saturations because of saturated brine salinities and low clay content in the sands. Although measured porosity and permeability average less than 6% and 0.1 mD, infrequent values as high as 18% and 1,048 mD occur. Values of effective gas permeability at irreducible water saturation (Swi) range from 60% to 99% of routine values for the highest permeability rocks to several orders of magnitude less for the lowest permeability rocks. Sandstones having porosity greater than 6% and effective gas permeability greater than 0.01 mD exhibit Swi less than 20%. With decreasing porosity, Swi sharply increases to values near 40% at 3 porosity%. Analysis of cumulative storage and flow capacity indicates zones with porosity greater than 6% generally contain over 90% of flow capacity and hold a major portion of storage capacity. For rocks with Swi < 20%, gas relative permeabilities exceed 45%. Gas relative permeability and hydrocarbon volume decrease rapidly with increasing Swi as porosity drops below 6%. At Swi above 40%, gas relative permeabilities are less than approximately 10%.
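The porosity-permeability prediction step can be sketched as a simple log-linear regression, as below. The data points and fitted coefficients are invented for illustration and are not the empirically derived equations of the study.

import numpy as np

# Fit log10(k) as a linear function of porosity from core data, then predict
# permeability from log-derived porosity. Data and coefficients are invented.
phi = np.array([2.5, 4.0, 6.0, 8.0, 10.0, 14.0])     # core porosity, %
k = np.array([0.002, 0.01, 0.08, 0.5, 3.0, 60.0])    # gas permeability, mD

slope, intercept = np.polyfit(phi, np.log10(k), 1)   # log10(k) = intercept + slope*phi

def predict_k(porosity_percent):
    return 10.0 ** (intercept + slope * porosity_percent)

print(f"predicted k at 6% porosity ~ {predict_k(6.0):.3f} mD")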
Tian, Yong; Shen, Huiyan; Wang, Qiang; Liu, Aifeng; Gao, Wei; Chen, Xu-Wei; Chen, Ming-Li; Zhao, Zongshan
2018-06-13
High temporal resolution component analysis remains a great challenge at the frontier of atmospheric aerosol research. Here, an online, high time resolution method for monitoring soluble sulfate and sulfur trioxide in atmospheric aerosols was developed by integrating a membrane-based parallel plate denuder, a particle collector, and a liquid waveguide capillary cell into a flow injection analysis system. A BaCl₂ solution (containing HCl, glycerin, and ethanol) was used to quantitatively transform sulfate into a well-dispersed BaSO₄ suspension for turbidimetric detection. The time resolution for monitoring soluble sulfate and sulfur trioxide was 15 h⁻¹. The limits of detection were 86 and 7.3 μg L⁻¹ (S/N = 3) with 20 and 200 μL SO₄²⁻ solution injections, respectively. Both the interday and intraday precision values (relative standard deviation) were less than 6.0%. The analytical results for the certified reference materials (GBW(E)08026 and GNM-M07117-2013) were identical to the certified values (no significant difference at the 95% confidence level). The validity and practicability of the developed device were also evaluated on a firecracker day and a routine day, clearly revealing the continuous variation of atmospheric sulfate and sulfur trioxide in both interday and intraday studies.
ERIC Educational Resources Information Center
Darwazeh, Afnan N.
A class of 37 in-service government school teachers from the Nablus (Palestine) district was studied to investigate the effects of 18 hours of training in Instructional Designer's Competencies (IDC) on teachers' planning routine, and their students' academic achievement. A questionnaire measured IDC in five domains: analysis, design,…
Playing Aggression: The Social Construction of the "Sassy Girl" in a Peer Culture Play Routine
ERIC Educational Resources Information Center
Contemporary Issues in Early Childhood, 2013
2013-01-01
The ethnographic analysis presented here examines a play routine that centered on two four-year-old female children constructing and being "sassy girls". Data was gathered over the course of six months in one preschool classroom by acting as a participant observer, videotaping, audiotaping, and conducting formal and informal interviews…
Barium and calcium analyses in sediment cores using µ-XRF core scanners
NASA Astrophysics Data System (ADS)
Acar, Dursun; Çaǧatay, Namık; Genç, S. Can; Eriş, K. Kadir; Sarı, Erol; Uçarkus, Gülsen
2017-04-01
Barium and Ca are used as proxies for organic productivity in paleoceanographic studies. With its heavy atomic weight (137.33 u), barium is easily detectable at small concentrations (several ppm) in marine sediments using XRF methods, including analysis by µ-XRF core scanners. Calcium has an intermediate atomic weight (40.078 u) but is a major element in the earth's crust and in sediments and sedimentary rocks, and hence is also easily detectable by µ-XRF techniques. Normally, µ-XRF elemental analysis of cores is carried out on split half cores or 1-2 cm thick u-channels at their original moisture. Sediment cores show variations in water content (and porosity) along their length, which in turn produce variations in the XRF counts of the elements and cause errors in the elemental concentrations. We compared µ-XRF elemental analysis of split half cores, subsampled 1 cm thick u-channels at original moisture, and 0.3 mm thin-film slices of the core, both wet and after air drying under a humidity-protecting mylar film. We found a considerable increase in the counts of most elements, in particular Ba and Ca, when we used the 0.3 mm thin, dried slice. In the case of Ba, the counts increased to about three times those of the analysis made with wet, 1 cm thick u-channels. The higher Ba and Ca counts are mainly due to the possible precipitation of Ba as barite and Ca as gypsum from the oxidation of Fe-sulphides and the evaporation of pore waters. Such secondary barite and gypsum precipitation would be especially serious in anoxic sediment units, such as sapropels, that contain considerable Fe-sulphides and bio-barite. Researchers should therefore be cautious of such secondary precipitation on core surfaces when analyzing cores that have long been exposed to atmospheric conditions.
Han, Changhee; Burn-Nunes, Laurie J; Lee, Khanghyun; Chang, Chaewon; Kang, Jung-Ho; Han, Yeongcheol; Hur, Soon Do; Hong, Sungmin
2015-08-01
An improved decontamination method and ultraclean analytical procedures have been developed to minimize Pb contamination of processed glacial ice cores and to achieve reliable determination of Pb isotopes in North Greenland Eemian Ice Drilling (NEEM) deep ice core sections with concentrations at the sub-picogram per gram level. A PL-7 (Fuso Chemical) silica-gel activator has replaced the previously used colloidal silica activator produced by Merck and has been shown to provide sufficiently enhanced ion beam intensity for Pb isotope analysis for a few tens of picograms of Pb. Considering the quantities of Pb contained in the NEEM Greenland ice core and a sample weight of 10 g used for the analysis, the blank contribution from the sample treatment was observed to be negligible. The decontamination and analysis of the artificial ice cores and selected NEEM Greenland ice core sections confirmed the cleanliness and effectiveness of the overall analytical process. Copyright © 2015 Elsevier B.V. All rights reserved.
Reliability database development for use with an object-oriented fault tree evaluation program
NASA Technical Reports Server (NTRS)
Heger, A. Sharif; Harringtton, Robert J.; Koen, Billy V.; Patterson-Hine, F. Ann
1989-01-01
A description is given of the development of a fault-tree analysis method using object-oriented programming. In addition, the authors discuss the programs that have been developed or are under development to connect a fault-tree analysis routine to a reliability database. To assess the performance of the routines, a relational database simulating one of the nuclear power industry databases has been constructed. For a realistic assessment of the results of this project, the use of one of the existing nuclear power reliability databases is planned.
1986-05-01
Objective Analysis of Tropical Cyclone Intensity, Strength, and Size Using Routine Aircraft Reconnaissance (thesis; report no. AFIT/CI/NR-86-28T)
NASA Astrophysics Data System (ADS)
Viirman, Olov
2015-11-01
This paper investigates the teaching practices used by university mathematics teachers when lecturing, a topic within university mathematics education research that is gaining increasing interest. In the study, a view of mathematics teaching as a discursive practice is taken, and Sfard's commognitive framework is used to investigate the teaching practices of seven Swedish university mathematics teachers on the topic of functions. The present paper looks at the discourse of mathematics teaching, presenting a categorization of the didactical routines into three categories: explanation, motivation and question posing routines. All of these are present in the discourses of all seven teachers, but within these general categories, a number of different sub-categories of routines are found, used in different ways and to different extents by the various teachers. The explanation routines include known mathematical facts, summary and repetition, different representations, everyday language, and concretization and metaphor; the motivation routines include reference to utility, the nature of mathematics, humour and result focus; and the question posing routines include control questions, asking for facts, enquiries and rhetorical questions. This categorization of question posing routines, for instance, complements those already found in the literature. In addition to providing a valuable insight into the teaching of functions at the university level, the categorizations presented in the study can also be useful for investigating the teaching of other mathematical topics.
Magnetic resonance imaging in laboratory petrophysical core analysis
NASA Astrophysics Data System (ADS)
Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.
2013-05-01
Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time, the industry-standard metric in well-logging, at the laboratory scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume. Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating wettability. The history of MRI in petrophysics is reviewed and future directions considered, including advanced data processing techniques such as compressed sensing reconstruction and Bayesian inference analysis of under-sampled data. Although this review focuses on rock core analysis, the techniques described are applicable in a wider context to porous media in general, such as cements, soils, ceramics, and catalytic materials.
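The core processing step behind a laboratory T2 distribution, inverting a multi-exponential echo decay, can be sketched with regularized non-negative least squares, as below. The synthetic decay, T2 grid, and regularization weight are illustrative; production codes use more sophisticated inversion, including the Bayesian and compressed-sensing methods mentioned above.

import numpy as np
from scipy.optimize import nnls

# Invert a synthetic CPMG echo decay into a T2 distribution using
# non-negative least squares with simple Tikhonov regularization.
t = np.linspace(1e-3, 2.0, 400)                       # echo times, s
T2_true, amp = np.array([0.02, 0.3]), np.array([0.6, 0.4])
decay = (amp * np.exp(-t[:, None] / T2_true)).sum(axis=1)
decay += 0.01 * np.random.default_rng(0).standard_normal(t.size)

T2_grid = np.logspace(-3, 1, 64)                      # candidate relaxation times, s
K = np.exp(-t[:, None] / T2_grid[None, :])            # multi-exponential kernel
alpha = 0.1                                           # regularization weight (illustrative)
A = np.vstack([K, alpha * np.eye(T2_grid.size)])
b = np.concatenate([decay, np.zeros(T2_grid.size)])
f, _ = nnls(A, b)                                     # regularized T2 amplitude distribution
print("T2 at distribution peak ~", round(float(T2_grid[np.argmax(f)]), 3), "s")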
Vroblesky, Don A.; Willey, Richard E.; Clifford, Scott; Murphy, James J.
2008-01-01
This study examined volatile organic compound concentrations in cores from trees and shrubs for use as indicators of vadose-zone contamination or potential vapor intrusion by volatile organic compounds into buildings at the Durham Meadows Superfund Site, Durham, Connecticut. The study used both (1) real-time tree- and shrub-core analysis, which involved field heating the core samples for 5 to 10 minutes prior to field analysis, and (2) delayed analysis, which involved allowing the gases in the cores to equilibrate with the headspace gas in the sample vials unheated for 1 to 2 days prior to analysis. General correspondence was found between the two approaches, indicating that preheating and field analysis of vegetation cores is a viable approach to real-time monitoring of subsurface volatile organic compounds. In most cases, volatile organic compounds in cores from trees and shrubs at the Merriam Manufacturing Company property showed a general correspondence to the distribution of volatile organic compounds detected in a soil-gas survey, despite the fact that most of the soil-gas survey data in close proximity to the relevant trees were collected about 3 years prior to the tree-core collection. Most of the trees cored at the Durham Meadows Superfund Site, outside of the Merriam Manufacturing Company property, contained no volatile organic compounds and were in areas where indoor air sampling and soil-gas sampling showed little or no volatile organic compound concentrations. An exception was tree DM11, which contained barely detectable concentrations of trichloroethene near a house where previous investigations found low concentrations of trichloroethene (0.13 to 1.2 parts per billion by volume) in indoor air and 7.7 micrograms per liter of trichloroethene in the ground water. The barely detectable concentration of trichloroethene in tree DM11 and the lack of volatile organic compound detection in nearby tree DM10 (adjacent to the well having 7.7 micrograms of trichloroethene) may be attributable to the relatively large depth to water (17.6 feet), the relatively low soil-vapor trichloroethene concentration, and the large amount of rainfall during and preceding the tree-coring event. The data indicate that real-time and delayed analyses of tree cores are viable approaches to examining subsurface volatile organic compound soil-gas or vadose-zone contamination at the Durham Meadows Superfund Site and other similar sites. Thus, the methods may have application for determining the potential for vapor intrusion into buildings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael A. Pope
2011-10-01
The Deep Burn (DB) Project is a U.S. Department of Energy sponsored feasibility study of Transuranic Management using high burnup fuel in the high temperature helium cooled reactor (HTR). The DB Project consists of seven tasks: project management, core and fuel analysis, spent fuel management, fuel cycle integration, TRU fuel modeling, TRU fuel qualification, and HTR fuel recycle. In Phase II of the Project, we conducted nuclear analysis of TRU destruction/utilization in the HTR prismatic block design (Task 2.1), deep burn fuel/TRISO microanalysis (Task 2.3), and synergy with fast reactors (Task 4.2). Task 2.1 covers the core physics design, thermo-hydraulic CFD analysis, and the thermofluid and safety analysis (low pressure conduction cooling, LPCC) of the HTR prismatic block design. Task 2.3 covers the analysis of the structural behavior of TRISO fuel containing TRU at very high burnup levels, i.e. exceeding 50% FIMA. Task 4.2 includes the self-cleaning HTR based on recycle of HTR-generated TRU in the same HTR. Chapter IV contains the design and analysis results of the 600MWth DB-HTR core physics, with the cycle length, the average discharged burnup, heavy metal and plutonium consumptions, radial and axial power distributions, and temperature reactivity coefficients. It also contains the analysis results of the 450MWth DB-HTR core physics and the analysis of the decay heat of a TRU-loaded DB-HTR core. The evaluation of the hot spot fuel temperature of the fuel block in the DB-HTR (Deep-Burn High Temperature Reactor) core under full operating power conditions is described in Chapter V. The investigated designs are the 600MWth and 460MWth DB-HTRs. In Chapter VI, the thermo-fluid and safety of the 600MWth DB-HTRs are analyzed to investigate the thermal-fluid design performance at steady state and the passive safety performance during an LPCC event. Chapter VII describes the analysis results of the TRISO fuel microanalysis of the 600MWth and 450MWth DB-HTRs. The TRISO fuel microanalysis covers the gas pressure buildup in a coated fuel particle (CFP), including helium production, the thermo-mechanical behavior of a CFP, the failure probabilities of CFPs, the temperature distribution in a CFP, and the fission product (FP) transport in a CFP and in graphite. Chapter VIII contains the core design and analysis of a sodium-cooled fast reactor (SFR) combined with the deep burn HTR; it considers a synergistic combination of the DB-MHR and an SFR burner for a safe and efficient transmutation of the TRUs from LWRs. Chapter IX describes the design and analysis results of the self-cleaning (or self-recycling) HTR core. The analysis considered zero and 5-year cooling times of the spent LWR fuels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Francesco Venneri; Chang-Keun Jo; Jae-Man Noh
2010-09-01
The Deep Burn (DB) Project is a U.S. Department of Energy sponsored feasibility study of Transuranic Management using high burnup fuel in the high temperature helium cooled reactor (HTR). The DB Project consists of seven tasks: project management, core and fuel analysis, spent fuel management, fuel cycle integration, TRU fuel modeling, TRU fuel qualification, and HTR fuel recycle. In Phase II of the Project, we conducted nuclear analysis of TRU destruction/utilization in the HTR prismatic block design (Task 2.1), deep burn fuel/TRISO microanalysis (Task 2.3), and synergy with fast reactors (Task 4.2). The Task 2.1 covers the core physics design, thermo-hydraulic CFD analysis, and the thermofluid and safety analysis (low pressure conduction cooling, LPCC) of the HTR prismatic block design. Task 2.3 covers the analysis of the structural behavior of TRISO fuel containing TRU at very high burnup levels, i.e. exceeding 50% FIMA. Task 4.2 includes the self-cleaning HTR based on recycle of HTR-generated TRU in the same HTR. Chapter IV contains the design and analysis results of the 600MWth DB-HTR core physics, with the cycle length, the average discharged burnup, heavy metal and plutonium consumptions, radial and axial power distributions, and temperature reactivity coefficients. It also contains the analysis results of the 450MWth DB-HTR core physics and the analysis of the decay heat of a TRU-loaded DB-HTR core. The evaluation of the hot spot fuel temperature of the fuel block in the DB-HTR (Deep-Burn High Temperature Reactor) core under full operating power conditions is described in Chapter V. The investigated designs are the 600MWth and 460MWth DB-HTRs. In Chapter VI, the thermo-fluid and safety of the 600MWth DB-HTRs are analyzed to investigate the thermal-fluid design performance at steady state and the passive safety performance during an LPCC event. Chapter VII describes the analysis results of the TRISO fuel microanalysis of the 600MWth and 450MWth DB-HTRs. The TRISO fuel microanalysis covers the gas pressure buildup in a coated fuel particle (CFP), including helium production, the thermo-mechanical behavior of a CFP, the failure probabilities of CFPs, the temperature distribution in a CFP, and the fission product (FP) transport in a CFP and in graphite. Chapter VIII contains the core design and analysis of a sodium-cooled fast reactor (SFR) combined with the deep burn HTR; it considers a synergistic combination of the DB-MHR and an SFR burner for a safe and efficient transmutation of the TRUs from LWRs. Chapter IX describes the design and analysis results of the self-cleaning (or self-recycling) HTR core. The analysis considered zero and 5-year cooling times of the spent LWR fuels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Nathan; Faucett, Christopher; Haskin, Troy Christopher
Following the conclusion of the first phase of the crosswalk analysis, one of the key unanswered questions was whether or not the deviations found would persist during a partially recovered accident scenario, similar to the one that occurred at TMI-2. In particular, this analysis aims to compare the impact of core degradation morphology on the quenching models inherent within the two codes and on the coolability of debris during partially recovered accidents. A primary motivation for this study is the development of insights into how uncertainties in core damage progression models impact the ability to assess the potential for recovery of a degraded core. These quench and core recovery models are of the most interest when there is a significant amount of core damage but intact and degraded fuel still remain in the core region or the lower plenum. Accordingly, this analysis presents a spectrum of partially recovered accident scenarios by varying both water injection timing and rate to highlight the impact of core degradation phenomena on recovered accident scenarios. The analysis uses the newly released MELCOR 2.2 rev. 9665 and MAAP5, Version 5.04. These code versions incorporate a significant number of modifications that have been driven by analyses and forensic evidence obtained from the Fukushima-Daiichi reactor site.
Extended capability of the integrated transport analysis suite, TASK3D-a, for LHD experiment
NASA Astrophysics Data System (ADS)
Yokoyama, M.; Seki, R.; Suzuki, C.; Sato, M.; Emoto, M.; Murakami, S.; Osakabe, M.; Tsujimura, T. Ii.; Yoshimura, Y.; Ido, T.; Ogawa, K.; Satake, S.; Suzuki, Y.; Goto, T.; Ida, K.; Pablant, N.; Gates, D.; Warmer, F.; Vincenzi, P.; Simulation Reactor Research Project, Numerical; LHD Experiment Group
2017-12-01
The integrated transport analysis suite, TASK3D-a (Analysis), has been developed to be capable of routine whole-discharge analyses of plasmas confined in three-dimensional (3D) magnetic configurations such as the LHD. The routine dynamic energy balance analysis for NBI-heated plasmas was made possible in the first version released in September 2012. The suite has been further extended by implementing additional modules for neoclassical transport and ECH deposition in 3D configurations. A module has also been added for creating systematic data for the International Stellarator-Heliotron Confinement and Profile Database. Improvement of the neutral beam injection modules for multiple-ion-species plasmas and loose coupling with a large simulation code are also highlights of recent developments.
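A stripped-down version of the dynamic energy balance such a suite performs on every discharge is sketched below: the energy confinement time is evaluated as stored energy divided by net heating power. The traces are synthetic and the formula omits the species- and channel-resolved terms that TASK3D-a actually treats.

import numpy as np

# tau_E(t) = W_p / (P_abs - dW_p/dt). Traces are synthetic, single-channel,
# and omit the species-resolved terms handled by the real suite.
t = np.linspace(0.0, 3.0, 301)                                   # time, s
P_abs = np.where((t > 0.3) & (t < 2.5), 8.0, 0.1)                # absorbed NBI power, MW (hypothetical)
W_p = 0.8 * (1 - np.exp(-np.clip(t - 0.3, 0, None) / 0.2)) * (t < 2.5)   # stored energy, MJ

dWdt = np.gradient(W_p, t)
with np.errstate(divide="ignore", invalid="ignore"):
    tau_E = W_p / (P_abs - dWdt)
flat_top = (t > 1.0) & (t < 2.0)
print(f"tau_E during flat-top ~ {np.nanmedian(tau_E[flat_top]) * 1e3:.0f} ms")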
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-11-13
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-01-30
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via a wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.
Synthesis of parallel and antiparallel core-shell triangular nanoparticles
NASA Astrophysics Data System (ADS)
Bhattacharjee, Gourab; Satpati, Biswarup
2018-04-01
Core-shell triangular nanoparticles were synthesized by seed-mediated growth. Using a triangular gold (Au) nanoparticle as the template, we have grown a silver (Ag) shell to obtain a core-shell nanoparticle. By changing the chemistry, we have grown two types of core-shell structures, one in which the core and shell share the same symmetry and one in which they have opposite symmetry. Both the core and core-shell nanoparticles were characterized using transmission electron microscopy (TEM) and energy dispersive X-ray spectroscopy (EDX) to determine the crystal structure and composition of the synthesized core-shell nanoparticles. From diffraction pattern analysis and energy-filtered TEM (EFTEM), we have confirmed that the crystal facets in the core are responsible for the two-dimensional growth of the core-shell nanostructures.
NASA Astrophysics Data System (ADS)
Ansari, Saleem A.; Haroon, Muhammad; Rashid, Atif; Kazmi, Zafar
2017-02-01
Extensive calculations and measurements of flow-induced vibrations (FIV) of reactor internals were made in a PWR plant to assess the structural integrity of the reactor core support structure against coolant flow. The work was done to meet the requirements of the Fukushima Response Action Plan (FRAP) for enhancement of reactor safety and of the regulatory guide RG-1.20. For the core surveillance measurements, the Reactor Internals Vibration Monitoring System (IVMS) was developed based on detailed neutron noise analysis of the flux signals from the four ex-core neutron detectors. The natural frequencies, displacements and mode shapes of the reactor core barrel (CB) motion were determined with the help of the IVMS. Random pressure fluctuations in the reactor coolant flow due to turbulence forcing were identified as the predominant cause of beam-mode deflection of the CB. Dynamic FIV calculations were also made to supplement the core surveillance measurements. The calculation package employed computational fluid dynamics, mode-shape analysis, calculation of the power spectral densities of the flow and pressure fields, and the structural response to random flow excitation forces. The dynamic loads and stiffness of the hold-down spring that keeps the core structure in position against the upward coolant thrust were also determined by noise measurements. In addition, the boron concentration in the primary coolant at any time of the core cycle has been determined with the IVMS.
Long, James M.; Schaffler, James J.
2013-01-01
RATIONALE The otoliths of the inner ear of fishes record the environment of their surrounding water throughout their life. For paddlefish (Polyodon spathula), otoliths have not been routinely used by scientists since their detriments were outlined in the early 1940s. We sought to determine if paddlefish otoliths were useful for resolving elemental information contained within. METHODS Adult paddlefish were collected from two wild, self-sustaining populations in Oklahoma reservoirs in the Arkansas River basin. Juveniles were obtained from a hatchery in the Red River basin of Oklahoma. Otoliths were removed and laser ablation, inductively coupled plasma mass spectrometry (ICP-MS) was used to quantify eight elements (Li, Mg, Mn, Rb, Sr, Y, Ba, and Pb) along the core and edge portions, which were analyzed for differences between otolith regions and among paddlefish sources. RESULTS Differences were found among samples for six of the eight elements examined. Otoliths from Red River basin paddlefish born in a hatchery had significantly lower amounts of Mg and Mn, but higher levels of Rb than otoliths from wild paddlefish in the Arkansas River basin. Concentrations of Y, Sr, and Ba were reduced on the edges of adult paddlefish from both reservoirs compared with the cores. CONCLUSIONS This research shows the utility of using an ICP-MS analysis of paddlefish otoliths. Future research that seeks to determine sources of paddlefish production, such as which reservoir tributaries are most important for reproduction or what proportion of the population is composed of wild versus hatchery-produced individuals, appears promising. Published in 2013. This article is a U.S. Government work and is in the public domain in the USA.
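The comparisons described above (differences between otolith regions and among paddlefish sources) lend themselves to a simple per-element analysis of variance. The sketch below is illustrative only: the Mg concentrations are hypothetical, and the exact statistical test used in the study may differ.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical Mg concentrations (ppm) in otolith cores from three sources;
# the study compared eight elements between core/edge regions and among sources.
hatchery_red_river = np.array([18.2, 17.9, 19.1, 18.5])
wild_reservoir_1   = np.array([24.3, 25.1, 23.8, 24.9])
wild_reservoir_2   = np.array([23.5, 24.8, 25.2, 24.1])

f_stat, p_value = f_oneway(hatchery_red_river, wild_reservoir_1, wild_reservoir_2)
print(f"one-way ANOVA for Mg: F = {f_stat:.1f}, p = {p_value:.4f}")
```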
Residence time distribution measurements in a pilot-scale poison tank using radiotracer technique.
Pant, H J; Goswami, Sunil; Samantray, J S; Sharma, V K; Maheshwari, N K
2015-09-01
Various types of systems are used to control the reactivity and shutdown of a nuclear reactor during emergency and routine shutdown operations. Injection of boron solution (borated water) into the core of a reactor is one of the commonly used methods during emergency operation. A pilot-scale poison tank was designed and fabricated to simulate injection of boron poison into the core of a reactor along with coolant water. In order to design a full-scale poison tank, it was desired to characterize the flow of liquid from the tank. Residence time distribution (RTD) measurement and analysis was adopted to characterize the flow dynamics. A radiotracer technique was applied to measure the RTD of the aqueous phase in the tank, using Bromine-82 as the radiotracer. RTD measurements were carried out with two different modes of operation of the tank and at different flow rates. In Mode-1, the radiotracer was instantaneously injected at the inlet and monitored at the outlet, whereas in Mode-2, the tank was filled with radiotracer and its concentration was measured at the outlet. From the measured RTD curves, mean residence times (MRTs), dead volume and the fraction of liquid pumped in with time were determined. The treated RTD curves were modeled using suitable mathematical models. An axial dispersion model with a high degree of backmixing was found suitable to describe the flow when operated in Mode-1, whereas a tanks-in-series model with backmixing was found suitable to describe the flow of the poison in the tank when operated in Mode-2. The results were utilized to scale up and design a full-scale poison tank for a nuclear reactor. Copyright © 2015 Elsevier Ltd. All rights reserved.
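For readers unfamiliar with how a mean residence time is obtained from a measured RTD curve, a minimal sketch follows. It assumes only a sampled outlet tracer concentration C(t); the simple exponential response is illustrative, not the paper's measurements.

```python
import numpy as np

def mean_residence_time(t, c):
    """MRT = first moment / zeroth moment of the tracer response curve,
    evaluated with the trapezoidal rule on discretely sampled data."""
    t = np.asarray(t, dtype=float)
    c = np.asarray(c, dtype=float)
    return np.trapz(t * c, t) / np.trapz(c, t)

# Illustrative synthetic outlet response (seconds, arbitrary concentration units):
t = np.linspace(0.0, 600.0, 601)
c = (t / 120.0) * np.exp(-t / 120.0)
print(f"MRT ~ {mean_residence_time(t, c):.0f} s")
```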
Ganesh, Shankar; Chhabra, Deepak; Kumari, Nitika
2016-10-17
Studies have shown that farming is associated with many agricultural workers experiencing low back pain (LBP). The rehabilitation of these workers should facilitate their functioning, activities and level of participation in an adequate way. The objectives of this study were to identify the health components associated with LBP and to evaluate the effectiveness of interventions in returning agricultural workers with LBP to their vocation, using tools based on the International Classification of Functioning, Disability and Health (ICF). Thirty-one full-time agricultural workers from 3 different Indian states were prospectively assessed using the ICF core set for LBP. The ICF core sets permitted analysis of limitations of function from both the participants' and the rehabilitation team's perspectives. Each ICF category was rated using an ICF qualifier. The components identified were linked to the ICF categorical profile and assessment sheet. The clinicians identified the global, service-program and cycle goals based on the ICF. The participants' functioning was followed over a 4-month period. After the intervention, the participants were able to carry out their routine activities without increases in pain. However, on returning to active farming, participants noted few improvements in the components d410 (changing basic body position), d415 (maintaining body position), d430 (lifting and carrying objects), d465 (moving around using equipment), d850 (remunerative employment) and d859 (work and employment, other specified and unspecified). The results of the study indicate that the current interventions for LBP are not effective in returning agricultural workers with LBP in India to pain-free farming. There is an urgent need to individualize the health needs of agricultural workers.
Thompson, Joanne; Coleman, Rob; Colwell, Brigitte; Freeman, Jenny; Greenfield, Diana; Holmes, Karen; Mathers, Nigel; Reed, Malcolm
2014-02-01
The process of breast cancer follow-up has psychosocial benefits for patients, notably reassurance, although attending hospital appointments can increase anxiety. Discharge from hospital follow-up can also provoke anxiety, as many patients seek reassurance from continued specialist follow-up. Inevitably, owing to increased survival and associated resource issues, opportunities for follow-up and support will be reduced. We delivered and evaluated an intervention that supported the transition from cancer patient to cancer survivor for breast cancer patients being discharged to primary care. We delivered and evaluated a pilot of a patient-centred group intervention, 'Preparing Patients for Discharge', aimed at reducing distress. Between January and September 2008, 172 participants were recruited and 74 (43%) expressed an interest in participating in the intervention; 32 of 74 took part and participated in its evaluation using a semi-structured evaluation questionnaire, standardized measures [Hospital Anxiety and Depression Scale (HADS) and Clinical Outcomes in Routine Evaluation (CORE)] and independent qualitative interviews. The qualitative analysis of questionnaire data indicated that the key factors were 1) shared experience, 2) support and reassurance, and 3) positive views about cancer and being discharged. The interview data revealed that the intervention enabled participants to share experiences, focus on emotional needs, and have open discussions about recurrence, while increasing confidence in being discharged and in using alternative support services. However, no significant differences were found in pre- and post-intervention scores on the HADS and CORE. Providing a structured group intervention for breast cancer patients offers an early opportunity to support cancer survivors and to facilitate and encourage self-management. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2008-07-15
The meeting papers discuss research and test reactor fuel performance, manufacturing and testing. Some of the main topics are: conversion from HEU to LEU in different reactors and the corresponding problems and activities; flux performance and core lifetime analysis with HEU and LEU fuels; physics and safety characteristics; measurement of gamma field parameters in cores with LEU fuel; nondestructive analysis of RERTR fuel; thermal-hydraulic analysis; fuel interactions; transient analyses and thermal hydraulics for HEU and LEU cores; microstructure of research reactor fuels; post-irradiation analysis and performance; computer codes and other related problems.
Georgiadis, G S; Charalampidis, D G; Argyriou, C; Georgakarakos, E I; Lazarides, M K
2015-05-01
Existing guidelines suggest routine use of pre-operative color Doppler ultrasound (DUS) vessel mapping before the creation of arteriovenous fistulae (AVF); however, there is controversy about its benefit over traditional clinical examination or selective ultrasound use. This was a systematic review and meta-analysis of randomized controlled trials (RCTs) comparing routine DUS mapping before the creation of AVF with patients for whom the decision for AVF placement was based on clinical examination and selective ultrasound use. A search of MEDLINE/PubMed, SCOPUS, and the Cochrane Library was carried out in June 2014. The analyzed outcome measures were the immediate failure rate and the early/midterm adequacy of the fistula for hemodialysis. Additionally, assessment of the methodological quality of the included studies was carried out. Five studies (574 patients) were analyzed. A random effects model was used to pool the data. The pooled odds ratio (OR) for the immediate failure rate was 0.32 (95% confidence interval [CI] 0.17-0.60; p < .01), which was significantly in favor of the DUS mapping group. The pooled OR for the early/midterm adequacy for hemodialysis was 0.66 (95% CI 0.42-1.03; p = .06), with a trend in favor of the DUS mapping group; however, subgroup analysis revealed that routine DUS mapping was more beneficial than selective DUS (p < .05). The available evidence, based mainly on moderate quality RCTs, suggests that the pre-operative clinical examination should always be supplemented with routine DUS mapping before AVF creation. This policy avoids negative surgical explorations and significantly reduces the immediate AVF failure rate. Copyright © 2015 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
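A random-effects pooled odds ratio of the kind reported above is typically obtained with DerSimonian-Laird weighting of per-study log odds ratios. The sketch below shows the mechanics on hypothetical 2x2 counts; it is not a reconstruction of the five analyzed trials.

```python
import numpy as np

def pooled_or_random_effects(events_a, total_a, events_b, total_b):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    Arm a/b are the two comparison groups (e.g. routine DUS mapping vs.
    clinical examination with selective DUS); inputs are per-study counts."""
    a = np.asarray(events_a, float); n1 = np.asarray(total_a, float)
    c = np.asarray(events_b, float); n2 = np.asarray(total_b, float)
    b, d = n1 - a, n2 - c
    a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5   # continuity correction
    log_or = np.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d               # within-study variances
    w = 1 / var                                       # fixed-effect weights
    q = np.sum(w * (log_or - np.sum(w * log_or) / np.sum(w)) ** 2)
    tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1 / (var + tau2)                         # random-effects weights
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return np.exp(pooled), (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))

# Hypothetical per-study immediate-failure counts, for illustration only:
or_hat, (lo, hi) = pooled_or_random_effects([3, 2, 4], [60, 55, 70],
                                            [9, 7, 11], [58, 57, 72])
print(f"pooled OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```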
Blasimann, Angela; Eberle, Simon; Scuderi, Manuel Markus
2018-03-01
Soccer is regarded as a highly intensive sport with an elevated injury rate, and male adults are the players with the highest injury incidence. Accordingly, the importance of core muscle strengthening for injury prevention has increased in the past few years. To date, core muscle strengthening plays an important role in different prevention programs, such as the "FIFA 11+". The aim of this systematic review was to investigate the effect of core muscle strengthening, including at least the well-known and simple exercises "plank" and "side plank", on the injury rate in male adult soccer players. The databases PubMed, PEDro, Cochrane Library, SPORTDiscus and Cinahl were searched systematically. Included studies had to comprise exercises for the core muscles as an intervention (as part of a prevention program) for adult male soccer players. The control group had to continue their usual exercise routine. The exercises "plank" and "side plank" were mandatory elements of the training program. The number of injuries and/or the injury rate (per 1000 hours) were defined as outcomes. The quality of the included studies was assessed with the PEDro scale and the Risk of Bias tool. Seven studies with 2491 participants in total could be included. Two studies found a significant decrease in the injury rate in the intervention group (p < 0.05 and p < 0.001, respectively). In two studies, no significance level was reported, but the training showed preventive effects in the intervention group. In the remaining three studies, no significant changes in the injury rate were found (p > 0.05). The seven included studies differed greatly with respect to the applied methods, the chosen interventions and the obtained results. Furthermore, the core muscles were never trained separately but were always part of a program containing other preventive elements. Therefore, it was difficult to compare the studies. However, prevention programs including strengthening exercises for the core muscles tend to affect the injury rate positively. Based on the literature found, the research question cannot be answered definitively. In the future, further studies are needed to investigate the effect of isolated core muscle training on the injury rate of soccer players. © Georg Thieme Verlag KG Stuttgart · New York.
Gap analysis: a method to assess core competency development in the curriculum.
Fater, Kerry H
2013-01-01
To determine the extent to which safety and quality improvement core competency development occurs in an undergraduate nursing program. Rapid change and increased complexity of health care environments demands that health care professionals are adequately prepared to provide high quality, safe care. A gap analysis compared the present state of competency development to a desirable (ideal) state. The core competencies, Nurse of the Future Nursing Core Competencies, reflect the ideal state and represent minimal expectations for entry into practice from pre-licensure programs. Findings from the gap analysis suggest significant strengths in numerous competency domains, deficiencies in two competency domains, and areas of redundancy in the curriculum. Gap analysis provides valuable data to direct curriculum revision. Opportunities for competency development were identified, and strategies were created jointly with the practice partner, thereby enhancing relevant knowledge, attitudes, and skills nurses need for clinical practice currently and in the future.
Brightness analysis of an electron beam with a complex profile
NASA Astrophysics Data System (ADS)
Maesaka, Hirokazu; Hara, Toru; Togawa, Kazuaki; Inagaki, Takahiro; Tanaka, Hitoshi
2018-05-01
We propose a novel analysis method to obtain the core, bright part of an electron beam with a complex phase-space profile. This method is useful for evaluating the performance of simulation data from a linear accelerator (linac), such as an x-ray free-electron laser (XFEL) machine, since the phase-space distribution of a linac electron beam is not simple compared with a Gaussian beam in a synchrotron. In this analysis, the brightness of the undulator radiation is calculated and the core of the electron beam is determined by maximizing this brightness. We successfully extracted core electrons from a complex beam profile of XFEL simulation data that was not expressed by a set of slice parameters. FEL simulations showed that the FEL intensity was largely retained even after extracting the core part. Consequently, the FEL performance can be estimated by this analysis without time-consuming FEL simulations.
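The idea of defining the beam core as the particle subset that maximizes a brightness figure of merit can be mimicked in a few lines of code. The sketch below is a strong simplification: it ranks particles by a crude normalized phase-space amplitude and maximizes N/(eps_x*eps_y), whereas the paper maximizes the brightness of the undulator radiation itself.

```python
import numpy as np

def core_by_brightness(x, xp, y, yp, keep_min=0.2):
    """Illustrative core extraction: trim particles with the largest
    normalized transverse amplitude and keep the fraction that maximizes
    a simplified brightness figure of merit B ~ N / (eps_x * eps_y)."""
    def emit(u, up):                       # rms emittance sqrt(det cov)
        return np.sqrt(max(np.linalg.det(np.cov(u, up)), 1e-30))

    n = len(x)
    amp = (x / x.std())**2 + (xp / xp.std())**2 + (y / y.std())**2 + (yp / yp.std())**2
    order = np.argsort(amp)                # innermost particles first
    best_b, best_idx = -np.inf, order
    for frac in np.linspace(keep_min, 1.0, 81):
        idx = order[: int(frac * n)]
        b = len(idx) / (emit(x[idx], xp[idx]) * emit(y[idx], yp[idx]))
        if b > best_b:
            best_b, best_idx = b, idx
    return best_idx

# Synthetic beam with a bright core plus a diffuse halo:
rng = np.random.default_rng(0)
core = rng.normal(0, 1.0, (4, 8000))
halo = rng.normal(0, 5.0, (4, 2000))
x, xp, y, yp = np.hstack([core, halo])
idx = core_by_brightness(x, xp, y, yp)
print(f"kept {len(idx)} of {len(x)} particles as the core")
```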
Publications - GMC 313 | Alaska Division of Geological & Geophysical Surveys
Saltmarsh, Art, and Core Laboratories, 2004, Porosity and permeability core analysis ('-13,075.0') from the Pan American Redoubt Shoal State (29690) #1 well.
Validity of Computer Adaptive Tests of Daily Routines for Youth with Spinal Cord Injury
Haley, Stephen M.
2013-01-01
Objective: To evaluate the accuracy of computer adaptive tests (CATs) of daily routines for child- and parent-reported outcomes following pediatric spinal cord injury (SCI) and to evaluate the validity of the scales. Methods: One hundred ninety-six daily routine items were administered to 381 youths and 322 parents. Pearson correlations, intraclass correlation coefficients (ICC), and 95% confidence intervals (CI) were calculated to evaluate the accuracy of simulated 5-item, 10-item, and 15-item CATs against the full-item banks and to evaluate concurrent validity. Independent samples t tests and analysis of variance were used to evaluate the ability of the daily routine scales to discriminate between children with tetraplegia and paraplegia and among 5 motor groups. Results: ICC and 95% CI demonstrated that simulated 5-, 10-, and 15-item CATs accurately represented the full-item banks for both child- and parent-report scales. The daily routine scales demonstrated discriminative validity, except between 2 motor groups of children with paraplegia. Concurrent validity of the daily routine scales was demonstrated through significant relationships with the FIM scores. Conclusion: Child- and parent-reported outcomes of daily routines can be obtained using CATs with the same relative precision of a full-item bank. Five-item, 10-item, and 15-item CATs have discriminative and concurrent validity. PMID:23671380
Improving care coordination using organisational routines.
Prætorius, Thim
2016-01-01
The purpose of this paper is to systematically apply theory of organisational routines to standardised care pathways. The explanatory power of routines is used to address open questions in the care pathway literature about their coordinating and organising role, the way they change and can be replicated, the way they are influenced by the organisation and the way they influence health care professionals. Theory of routines is systematically applied to care pathways in order to develop theoretically derived propositions. Care pathways mirror routines by being recurrent, collective and embedded and specific to an organisation. In particular, care pathways resemble standard operating procedures that can give rise to recurrent collective action patterns. In all, 11 propositions related to five categories are proposed by building on these insights: care pathways and coordination, change, replication, the organisation and health care professionals. Research limitations/implications - The paper is conceptual and uses care pathways as illustrative instances of hospital routines. The propositions provide a starting point for empirical research. The analysis highlights implications that health care professionals and managers have to consider in relation to coordination, change, replication, the way the organisation influences care pathways and the way care pathways influence health care professionals. Originality/value - Theory on organisational routines offers fundamental, yet unexplored, insights into hospital processes, including in particular care coordination.
S2LET: A code to perform fast wavelet analysis on the sphere
NASA Astrophysics Data System (ADS)
Leistedt, B.; McEwen, J. D.; Vandergheynst, P.; Wiaux, Y.
2013-10-01
We describe S2LET, a fast and robust implementation of the scale-discretised wavelet transform on the sphere. Wavelets are constructed through a tiling of the harmonic line and can be used to probe spatially localised, scale-dependent features of signals on the sphere. The reconstruction of a signal from its wavelet coefficients is made exact here through the use of a sampling theorem on the sphere. Moreover, a multiresolution algorithm is presented to capture all information of each wavelet scale in the minimal number of samples on the sphere. In addition, S2LET supports the HEALPix pixelisation scheme, in which case the transform is not exact but nevertheless achieves good numerical accuracy. The core routines of S2LET are written in C and have interfaces in Matlab, IDL and Java. Real signals can be written to and read from FITS files and plotted as Mollweide projections. The S2LET code is made publicly available, is extensively documented, and ships with several examples in the four languages supported. At present the code is restricted to axisymmetric wavelets but will be extended to directional, steerable wavelets in a future release.
NASA Astrophysics Data System (ADS)
Issautier, Karine; Ongala-Edoumou, Samuel; Moncuquet, Michel
2016-04-01
The quasi-thermal noise (QTN) method consists of measuring the electrostatic fluctuations produced by the thermal motion of the ambient particles. This noise is detected with a sensitive wave receiver and measured at the terminals of a passive electric antenna immersed in a stable plasma. Analysis of the so-called QTN provides in situ measurements, mainly the total electron density with good accuracy and the electron temperature, in a large number of space media. We create a preliminary electron database to analyse the anti-correlation between electron density and temperature deduced from WIND perigees in the Earth's plasmasphere. We analyse the radio power spectra measured by the Thermal Noise Receiver (TNR), using the 100-m long dipole antenna on board the WIND spacecraft. We develop a systematic routine to determine the electron density, core and halo temperatures and the magnitude of the magnetic field based on the QTN in Bernstein modes. Indeed, the spectra are weakly banded between gyroharmonics below the upper hybrid frequency, from which we derive the local electron density. From the gyrofrequency determination, we obtain an independent measure of the magnetic field magnitude, which is in close agreement with the onboard magnetometer.
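The density and field determinations mentioned above rest on two standard relations: f_uh² = f_pe² + f_ce², with n_e[cm⁻³] ≈ (f_pe[kHz]/8.98)², and f_ce ≈ 28 Hz per nT of magnetic field. A small sketch, with illustrative (non-WIND) numbers:

```python
import numpy as np

def electron_density_cm3(f_uh_khz, f_ce_khz):
    """Electron density from the upper hybrid and electron gyro frequencies:
    f_uh^2 = f_pe^2 + f_ce^2 and n_e[cm^-3] ~ (f_pe[kHz] / 8.98)^2."""
    f_pe = np.sqrt(f_uh_khz**2 - f_ce_khz**2)
    return (f_pe / 8.98) ** 2

def magnetic_field_nt(f_ce_khz):
    """|B| in nT from the electron gyrofrequency (f_ce ~ 28 Hz per nT)."""
    return f_ce_khz * 1e3 / 28.0

# Illustrative numbers only (not WIND data):
print(electron_density_cm3(f_uh_khz=120.0, f_ce_khz=20.0))   # ~ 174 cm^-3
print(magnetic_field_nt(f_ce_khz=20.0))                      # ~ 714 nT
```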
Combustion and Engine-Core Noise
NASA Astrophysics Data System (ADS)
Ihme, Matthias
2017-01-01
The implementation of advanced low-emission aircraft engine technologies and the reduction of noise from airframe, fan, and jet exhaust have made noise contributions from an engine core increasingly important. Therefore, meeting future ambitious noise-reduction goals requires the consideration of engine-core noise. This article reviews progress on the fundamental understanding, experimental analysis, and modeling of engine-core noise; addresses limitations of current techniques; and identifies opportunities for future research. After identifying core-noise contributions from the combustor, turbomachinery, nozzles, and jet exhaust, they are examined in detail. Contributions from direct combustion noise, originating from unsteady combustion, and indirect combustion noise, resulting from the interaction of flow-field perturbations with mean-flow variations in turbine stages and nozzles, are analyzed. A new indirect noise-source contribution arising from mixture inhomogeneities is identified by extending the theory. Although typically omitted in core-noise analysis, the impact of mean-flow variations and nozzle-upstream perturbations on the jet-noise modulation is examined, providing potential avenues for future core-noise mitigation.
Subannual layer variability in Greenland firn cores
NASA Astrophysics Data System (ADS)
Kjær, Helle Astrid; Vallelonga, Paul; Vinther, Bo; Winstrup, Mai; Simonsen, Marius; Maffezzoli, Niccoló; Jensen, Camilla Marie
2017-04-01
Ice cores are used to infer information about the past, and modern techniques allow for high-resolution (< cm) continuous flow analysis (CFA) of the ice. Such analysis is often used to identify annual layers to constrain the dating of ice cores, but it can also be extended to provide information on sub-annual deposition patterns. In this study we use available high-resolution data from multiple shallow cores around Greenland to investigate, from 1800 AD to today, the seasonality and trends in the components most often measured continuously: sodium, insoluble dust, calcium, ammonium and conductivity (or acidity). We evaluate the similarities and differences between the records and discuss their causes, ranging from differences in sources, transport, deposition and post-deposition effects to differences in measurement setup. Further, we add to the array of cores already published with measurements from the newly drilled ReCAP ice core from a coastal ice cap in eastern Greenland and from a shallow core drilled at the high-accumulation site at the Greenland South Dome.
UMTRA Project water sampling and analysis plan, Durango, Colorado. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-09-01
Planned, routine ground water sampling activities at the US Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project site in Durango, Colorado, are described in this water sampling and analysis plan. The plan identifies and justifies the sampling locations, analytical parameters, detection limits, and sampling frequency for the routine monitoring stations at the site. The ground water data are used to characterize the site ground water compliance strategies and to monitor contaminants of potential concern identified in the baseline risk assessment (DOE, 1995a). The regulatory basis for routine ground water monitoring at UMTRA Project sites is derived from the US EPA regulations in 40 CFR Part 192 (1994) and EPA standards of 1995 (60 FR 2854). Sampling procedures are guided by the UMTRA Project standard operating procedures (SOP) (JEG, n.d.), the Technical Approach Document (TAD) (DOE, 1989), and the most effective technical approach for the site.
NASA Technical Reports Server (NTRS)
Horst, Richard L.; Mahaffey, David L.; Munson, Robert C.
1989-01-01
The present Phase 2 small business innovation research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general-purpose, computerized workload assessment system that can function in simulators such as the ACFS.
Dygut, Jacek; Kalinowska, Barbara; Banach, Mateusz; Piwowar, Monika; Konieczny, Leszek; Roterman, Irena
2016-10-18
The presented analysis concerns the inter-domain and inter-protein interface in protein complexes. We propose extending the traditional understanding of the protein domain as a function of local compactness with an additional criterion which refers to the presence of a well-defined hydrophobic core. Interface areas in selected homodimers vary with respect to their contribution to shared as well as individual (domain-specific) hydrophobic cores. The basic definition of a protein domain, i.e., a structural unit characterized by tighter packing than its immediate environment, is extended in order to acknowledge the role of a structured hydrophobic core, which includes the interface area. The hydrophobic properties of interfaces vary depending on the status of the interacting domains. In this context we can distinguish: (1) shared hydrophobic cores (spanning the whole dimer); (2) individual hydrophobic cores present in each monomer, irrespective of whether the dimer contains a shared core. Analysis of interfaces in dystrophin and utrophin indicates the presence of an additional quasi-domain with a prominent hydrophobic core, consisting of fragments contributed by both monomers. In addition, we have also attempted to determine the relationship between the type of interface (as categorized above) and the biological function of each complex. This analysis is entirely based on the fuzzy oil drop model.
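For orientation, fuzzy-oil-drop-type analyses commonly compare an observed per-residue hydrophobicity profile O with an idealized Gaussian profile T and a uniform reference R via Kullback-Leibler divergence, summarized as a relative distance RD, where RD < 0.5 is usually read as the presence of a well-ordered hydrophobic core. The sketch below illustrates that bookkeeping only; the exact normalization and criteria follow the original publications, not this code.

```python
import numpy as np

def relative_distance(observed, theoretical, random_ref=None):
    """RD = D_KL(O|T) / (D_KL(O|T) + D_KL(O|R)) for per-residue
    hydrophobicity profiles (assumed strictly positive), each
    normalized to sum to 1 before the divergence is evaluated."""
    def kl(p, q):
        p = np.asarray(p, float); q = np.asarray(q, float)
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log2(p / q)))

    o, t = observed, theoretical
    r = np.full(len(o), 1.0 / len(o)) if random_ref is None else random_ref
    d_t, d_r = kl(o, t), kl(o, r)
    return d_t / (d_t + d_r)

# Toy profiles: an observed distribution close to the Gaussian-like ideal
obs = [0.05, 0.15, 0.30, 0.30, 0.15, 0.05]
ideal = [0.04, 0.14, 0.32, 0.32, 0.14, 0.04]
print(f"RD = {relative_distance(obs, ideal):.2f}")   # well below 0.5
```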
Development and Validation of the Pediatric Diabetes Routines Questionnaire for Adolescents.
Pierce, Jessica S; Jordan, Sara S; Arnau, Randolph C
2018-04-06
This study describes the development and psychometric evaluation of an adolescent self-report version of the Pediatric Diabetes Routines Questionnaire (PDRQ:A), a measure of diabetes-specific routines for youth with type 1 diabetes, and further validation of the parent-version (PDRQ:P) in an adolescent sample. Participants included 120 parent-adolescent dyads (ages 12-17) and an additional 24 parents who completed measures of diabetes-specific adolescent routines, general adolescent routines, diabetes self-care, and family support of youth diabetes care. The PDRQ:P/A demonstrated good internal consistency, test-retest reliability, and parent-child agreement, and adequate validity coefficients. Confirmatory factor analysis supported a one-factor model. Promising results were obtained. The PDRQ:P/A is a clinically feasible parent- and self-report measure that can provide valuable information regarding how frequently adolescents engage in their diabetes management tasks in a consistent manner. Addition of an adolescent report format will enhance the utility of the measure for clinical and research use.
Research on Shock Responses of Three Types of Honeycomb Cores
NASA Astrophysics Data System (ADS)
Peng, Fei; Yang, Zhiguang; Jiang, Liangliang; Ren, Yanting
2018-03-01
The shock responses of three kinds of honeycomb cores have been investigated and analyzed based on explicit dynamics analysis. According to the real geometric configuration and the current main manufacturing methods of aluminum alloy honeycomb cores, finite element models of honeycomb cores with three different cellular configurations (a conventional hexagonal honeycomb core, a rectangular honeycomb core and an auxetic honeycomb core with negative Poisson's ratio) were established through an FEM parametric modeling method based on Python and Abaqus. In order to highlight the impact response characteristics of the above three honeycomb cores, a 5 mm thick panel with the same mass and material was taken as a reference. The analysis results showed that the peak values of the longitudinal acceleration history curves of the three honeycomb cores were lower than those of the aluminum alloy panel at all three reference points under the loading of a longitudinal pulse pressure load with a peak value of 1 MPa and a pulse width of 1 μs. It could be concluded that, due to the complex reflection and diffraction of the shock-induced stress wave in honeycomb structures, the impact energy was redistributed, which led to a decrease in the peak values of the longitudinal acceleration at the measuring points of the honeycomb cores relative to the panel.
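As a flavor of what "parametric modeling" means here, the sketch below generates the cell-center lattice of a regular hexagonal honeycomb from a cell size and grid counts. It is a generic geometry helper, not the authors' Abaqus scripting code.

```python
import numpy as np

def hexagon_cell_centers(cell_size, n_rows, n_cols):
    """Centers of a regular hexagonal honeycomb lattice (illustrative).
    cell_size is the across-flats width; alternate rows are offset by
    half a pitch to produce the staggered honeycomb packing."""
    pitch_x = cell_size
    pitch_y = cell_size * np.sqrt(3.0) / 2.0
    centers = []
    for j in range(n_rows):
        x_offset = 0.5 * pitch_x if j % 2 else 0.0
        for i in range(n_cols):
            centers.append((i * pitch_x + x_offset, j * pitch_y))
    return np.array(centers)

print(hexagon_cell_centers(cell_size=6.0, n_rows=3, n_cols=4).shape)  # (12, 2)
```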
Zero-mode clad waveguides for performing spectroscopy with confined effective observation volumes
Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.
2005-07-12
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
Waveguides for performing spectroscopy with confined effective observation volumes
Levene, Michael J.; Korlach, Jonas; Turner, Stephen W.; Craighead, Harold G.; Webb, Watt W.
2006-03-14
The present invention is directed to a method and an apparatus for analysis of an analyte. The method involves providing a zero-mode waveguide which includes a cladding surrounding a core where the cladding is configured to preclude propagation of electromagnetic energy of a frequency less than a cutoff frequency longitudinally through the core of the zero-mode waveguide. The analyte is positioned in the core of the zero-mode waveguide and is then subjected, in the core of the zero-mode waveguide, to activating electromagnetic radiation of a frequency less than the cut-off frequency under conditions effective to permit analysis of the analyte in an effective observation volume which is more compact than if the analysis were carried out in the absence of the zero-mode waveguide.
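The "cutoff frequency" in these claims can be made concrete with the standard circular-waveguide estimate: the longest wavelength that propagates in the lowest (TE11) mode is roughly 1.706 times the core diameter, scaled by the refractive index of the fill medium. The numbers below are illustrative and are not taken from the patents.

```python
def te11_cutoff_wavelength_nm(diameter_nm, n_medium=1.33):
    """Approximate longest guided (cutoff) vacuum wavelength of a circular
    waveguide's TE11 mode: lambda_c ~ 1.706 * d * n.  Longer wavelengths
    (lower frequencies) cannot propagate; only an evanescent field enters
    the core, which is the zero-mode-waveguide operating regime."""
    return 1.706 * diameter_nm * n_medium

diameter = 70.0                                  # nm, a sub-wavelength aperture
lam_c = te11_cutoff_wavelength_nm(diameter)
print(f"cutoff wavelength ~ {lam_c:.0f} nm")
if lam_c < 488.0:
    print("488 nm excitation is below the cutoff frequency: it decays "
          "evanescently, confining the effective observation volume")
```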
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mouradian, E.M.
1966-02-16
A thermal analysis is carried out to determine the temperature distribution throughout a SNAP 10A reactor core, particularly in the vicinity of the grid plates, during atmospheric reentry. The transient temperature distribution of the grid plate indicates when sufficient melting occurs for the fuel elements to be released and continue their descent individually.
Fast Image Subtraction Using Multi-cores and GPUs
NASA Astrophysics Data System (ADS)
Hartung, Steven; Shukla, H.
2013-01-01
Many important image processing techniques in astronomy require a massive number of computations per pixel. Among them is an image differencing technique known as Optimal Image Subtraction (OIS), which is very useful for detecting and characterizing transient phenomena. Like many image processing routines, OIS computations increase proportionally with the number of pixels being processed, and the number of pixels in need of processing is increasing rapidly. Utilizing many-core graphical processing unit (GPU) technology in a hybrid conjunction with multi-core CPU and computer clustering technologies, this work presents a new astronomy image processing pipeline architecture. The chosen OIS implementation focuses on the 2nd order spatially-varying kernel with the Dirac delta function basis, a powerful image differencing method that has seen limited deployment in part because of the heavy computational burden. This tool can process standard image calibration and OIS differencing in a fashion that is scalable with the increasing data volume. It employs several parallel processing technologies in a hierarchical fashion in order to best utilize each of their strengths. The Linux/Unix based application can operate on a single computer, or on an MPI configured cluster, with or without GPU hardware. With GPU hardware available, even low-cost commercial video cards, the OIS convolution and subtraction times for large images can be accelerated by up to three orders of magnitude.
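A minimal, non-GPU sketch of the delta-function-basis kernel fit at the heart of OIS is shown below. It fits a single constant kernel by linear least squares; the pipeline described above additionally lets the kernel coefficients vary as 2nd-order polynomials of image position and offloads the heavy convolution work to GPUs.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def fit_difference_kernel(ref, sci, half=2):
    """Constant OIS kernel with a Dirac-delta basis: solve least squares
    for K so that (ref convolved with K) ~ sci, then return K and the
    difference image over the valid (non-border) region."""
    k = 2 * half + 1
    windows = sliding_window_view(ref, (k, k))        # (H-k+1, W-k+1, k, k)
    a = windows.reshape(-1, k * k)                    # one column per kernel pixel
    b = sci[half:-half, half:-half].reshape(-1)
    coeffs, *_ = np.linalg.lstsq(a, b, rcond=None)
    model = (a @ coeffs).reshape(windows.shape[:2])
    diff = sci[half:-half, half:-half] - model
    return coeffs.reshape(k, k)[::-1, ::-1], diff     # flip: correlation -> convolution

# Toy check: the science frame is a scaled, 1-pixel-shifted copy of the reference.
rng = np.random.default_rng(1)
ref = rng.normal(size=(64, 64))
sci = 0.9 * np.roll(ref, 1, axis=0)
kernel, diff = fit_difference_kernel(ref, sci)
print(kernel.round(2))                      # ~0.9 in a single off-center pixel
print(f"residual rms = {diff.std():.2e}")   # ~0: the shift and scale are captured
```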
Müller, Martina S; Roelofs, Yvonne; Erikstad, Kjell Einar; Groothuis, Ton G G
2012-01-01
Animals and plants routinely produce more offspring than they can afford to rear. Mothers can favour certain young by conferring on them competitive advantages such as a leading position in the birth sequence, more resources or hormones. Avian mothers create hatching asynchrony within a clutch and at the same time bestow the eggs with different concentrations of androgens that may enhance or counteract the competitive advantage experienced by early-hatching "core" young. In siblicidal birds, core young assume a dominant social position in the nest due to their size advantage and when threatened with starvation fatally attack subdominant later-hatching "marginal" young. A role for maternal androgens in siblicidal aggression has frequently been suggested but never tested. We studied this in the facultatively siblicidal black-headed kittiwake. We found that marginal eggs contain higher instead of lower concentrations of androgens than core eggs. Surprisingly, exposure to experimentally elevated yolk androgens increased sibling aggression and dominance, even though in nature marginal eggs never produce dominant chicks. We propose the "adoption facilitation hypothesis" to explain this paradox. This cliff-nesting colonial species has a high adoption rate: ejected marginal kittiwake chicks frequently fall into other nests containing chicks of similar or smaller size and exposure to yolk androgens might help them integrate themselves into a foster nest.
The circadian rhythm of core temperature: effects of physical activity and aging.
Weinert, Dietmar; Waterhouse, Jim
2007-02-28
The circadian rhythm of core temperature depends upon several interacting rhythms, of both endogenous and exogenous origin, but an understanding of the process requires these two components to be separated. Constant routines remove the exogenous (masking) component at source, but they are severely limited in their application. By contrast, several purification methods have successfully reduced the masking component of overt circadian rhythms measured in field circumstances. One important, but incidental, outcome from these methods is that they enable a quantitative estimate of masking effects to be obtained. It has been shown that these effects of activity upon the temperature rhythm show circadian rhythmicity, and more detailed investigations of this have aided our understanding of thermoregulation and the genesis of the circadian rhythm of core temperature itself. The observed circadian rhythm of body temperature varies with age; in comparison with adults, it is poorly developed in the neonate and deteriorates in the aged subject. Comparing masked and purified data enables the reasons for these differences--whether due to the body clock, the effector pathways or organs, or irregularities due to the individual's lifestyle--to begin to be understood. Such investigations stress the immaturity of the circadian rhythm in the human neonate and its deterioration in elderly compared with younger subjects, but they also indicate the robustness of the body clock itself into advanced age, at least in mice.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural / terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified and, thus we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
UV-VIS absorption spectroscopy: Lambert-Beer reloaded
NASA Astrophysics Data System (ADS)
Mäntele, Werner; Deniz, Erhan
2017-02-01
UV-VIS absorption spectroscopy is used in almost every spectroscopy laboratory for routine analysis or research. All spectroscopists rely on the Lambert-Beer Law but many of them are less aware of its limitations. This tutorial discusses typical problems in routine spectroscopy that come along with technical limitations or careless selection of experimental parameters. Simple rules are provided to avoid these problems.
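As a reminder of the law the tutorial builds on, and of one commonly cited limitation (stray light reaching the detector at high absorbance), the relations can be written as follows; the stray-light form is a textbook illustration, not a formula quoted from the tutorial.

```latex
% Lambert-Beer law for a dilute, non-scattering sample
A \;=\; -\log_{10}\!\frac{I}{I_0} \;=\; \varepsilon\, c\, l

% With a stray-light fraction s (relative to I_0) reaching the detector,
% the measured absorbance saturates instead of growing linearly with c:
A_{\mathrm{meas}} \;=\; -\log_{10}\!\frac{I/I_0 + s}{1 + s}
\;\;\longrightarrow\;\; -\log_{10} s \qquad (I \to 0)
```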
A framework for integration of scientific applications into the OpenTopography workflow
NASA Astrophysics Data System (ADS)
Nandigam, V.; Crosby, C.; Baru, C.
2012-12-01
The NSF-funded OpenTopography facility provides online access to Earth science-oriented high-resolution LIDAR topography data, online processing tools, and derivative products. The underlying cyberinfrastructure employs a multi-tier service-oriented architecture comprising an infrastructure tier, a processing services tier, and an application tier. The infrastructure tier consists of storage and compute resources as well as supporting databases. The services tier consists of the set of processing routines, each deployed as a Web service. The applications tier provides client interfaces to the system (e.g. the portal). We propose a "pluggable" infrastructure design that will allow new scientific algorithms and processing routines developed and maintained by the community to be integrated into the OpenTopography system so that the wider earth science community can benefit from their availability. All core components in OpenTopography are available as Web services using a customized open-source Opal toolkit. The Opal toolkit provides mechanisms to manage and track job submissions with the help of a back-end database, and it allows monitoring of job and system status through charting tools. All core components in OpenTopography have been developed, maintained and wrapped as Web services using Opal by OpenTopography developers. However, as the scientific community develops new processing and analysis approaches, this integration approach does not scale efficiently. Most of the new scientific applications will have their own active development teams performing regular updates, maintenance and other improvements. It would be optimal to have each application co-located where its developers can continue to work on it actively, while still making it accessible within the OpenTopography workflow for processing capabilities. We will utilize a software framework for remote integration of these scientific applications into the OpenTopography system. This will be accomplished by virtually extending the OpenTopography service over the various infrastructures running these scientific applications and processing routines. This involves packaging and distributing a customized instance of the Opal toolkit that will wrap the software application as an Opal-based web service and integrate it into the OpenTopography framework. We plan to make this as automated as possible. A structured specification of service inputs and outputs, along with metadata annotations encoded in XML, can be utilized to automate the generation of user interfaces, with appropriate tool tips and user help features, and the generation of other internal software. The OpenTopography Opal toolkit will also include the customizations that enable security authentication, authorization and the ability to write application usage and job statistics back to the OpenTopography databases. This usage information could then be reported to the original service providers and used for auditing and performance improvements. This pluggable framework will enable application developers to continue enhancing their applications while making the latest iteration available in a timely manner to the earth sciences community. It will also help us establish an overall framework that other scientific application providers will be able to use going forward.
Hydro-ball in-core instrumentation system and method of operation
Tower, Stephen N.; Veronesi, Luciano; Braun, Howard E.
1990-01-01
A hydro-ball in-core instrumentation system employs detector strings each comprising a wire having radiation sensitive balls affixed diametrically at spaced positions therealong and opposite tip ends of which are transportable by fluid drag through interior passageways. In the passageways primary coolant is caused to flow selectively in first and second opposite directions for transporting the detector strings from stored positions in an exterior chamber to inserted positions within the instrumentation thimbles of the fuel rod assemblies of a pressure vessel, and for return. The coolant pressure within the detector passageways is the same as that within the vessel; face contact, disconnectable joints between sections of the interior passageways within the vessel facilitate assembly and disassembly of the vessel for refueling and routine maintenance operations. The detector strings may pass through a very short bend radius thereby minimizing space requirements for the connections of the instrumentation system to the vessel and concomitantly the vessel containment structure. Improved radiation mapping and a significant reduction in potential exposure of personnel to radiation are provided. Both top head and bottom head penetration embodiments are disclosed.
Bartlett, John M S; Campbell, Fiona M; Ibrahim, Merdol; Thomas, Jeremy; Wencyk, Pete; Ellis, Ian; Kay, Elaine; Connolly, Yvonne; O'Grady, Anthony; Barnett, Sarah; Starczynski, Jane; Cunningham, Paul; Miller, Keith
2010-02-01
To assess a new HER2 fluorescence in situ hybridization (FISH) test and report on multicentre intrasite and intersite variation. HER2 results were scored from 45 breast cancers in eight laboratories using the Kreatech Poseidon HER2 FISH probe (Kreatech Diagnostics, Amsterdam, the Netherlands). Overall, 80.9% of cores were successfully analysed. Mean intrasite variation for HER2 ratio assessment was low (4.74%). Intersite variation in ratio was in line with previous reports (11.9+/-0.8%) for both reference and non-reference laboratories; only one laboratory displayed significantly higher intersite variation (P=0.009) than the remaining seven laboratories. The overall incidence of misclassification of cores was <1.3%, demonstrating an excellent level of concordance (>98.7%) across all eight laboratories, irrespective of whether they were 'reference' or 'routine diagnostic' laboratories. The Kreatech Poseidon HER2 FISH test is robust and reproducible. Highly quantitatively reproducible FISH results were obtained from eight 'diagnostic' and 'reference' laboratories; however, continued quality assessments are essential to good performance.
Agreement between auricular and rectal measurements of body temperature in healthy cats.
Sousa, Marlos G; Carareto, Roberta; Pereira-Junior, Valdo A; Aquino, Monally C C
2013-04-01
Measurement of body temperature is a routine part of the clinical assessment of a patient. However, this procedure may be time-consuming and stressful to most animals because the standard site of temperature acquisition remains the rectal mucosa. Although an increasing number of clinicians have been using auricular temperature to estimate core body temperature, evidence is still lacking regarding agreement between these two methods in cats. In this investigation, we evaluated the agreement between temperatures measured in the rectum and ear in 29 healthy cats over a 2-week period. Temperatures were measured in the rectum (using digital and mercury-in-glass thermometers) and ear once a day for 14 consecutive days, producing 406 temperature readings for each thermometer. Mean temperature and confidence intervals were similar between methods, and Bland-Altman plots showed small biases and narrow limits of agreement acceptable for clinical purposes. The interobserver variability was also checked, which indicated a strong correlation between two near-simultaneous temperature readings. Results are consistent with auricular thermometry being a reliable alternative to rectal thermometry for assessing core body temperature in healthy cats.
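Agreement of the kind reported above is conventionally summarized with a Bland-Altman bias and 95% limits of agreement. The sketch below shows the computation on hypothetical paired readings; the values are not the study's data.

```python
import numpy as np

def bland_altman(rectal, auricular):
    """Bias and 95% limits of agreement between two temperature methods."""
    rectal = np.asarray(rectal, float)
    auricular = np.asarray(auricular, float)
    diff = auricular - rectal
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings in deg C, for illustration only:
rectal    = [38.4, 38.6, 38.2, 38.9, 38.5, 38.7]
auricular = [38.3, 38.5, 38.3, 38.7, 38.4, 38.6]
bias, (lo, hi) = bland_altman(rectal, auricular)
print(f"bias {bias:+.2f} C, limits of agreement {lo:+.2f} to {hi:+.2f} C")
```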