Estimation variance bounds of importance sampling simulations in digital communication systems
NASA Technical Reports Server (NTRS)
Lu, D.; Yao, K.
1991-01-01
In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
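The variance comparison at the heart of this abstract can be made concrete with a minimal sketch (not taken from the report): estimating a small tail probability, a stand-in for a bit error rate, by direct Monte Carlo and by importance sampling with a mean-shifted Gaussian biasing density and likelihood-ratio weights. The threshold, shift and sample sizes are arbitrary illustrations, and the empirical variances computed here play the role of the quantities the derived bounds are meant to bracket.

    import numpy as np

    rng = np.random.default_rng(0)

    def direct_mc_ber(threshold, sigma, n):
        # Direct Monte Carlo estimate of P(noise > threshold) and its variance.
        noise = rng.normal(0.0, sigma, n)
        p_hat = (noise > threshold).mean()
        var_hat = p_hat * (1.0 - p_hat) / n          # binomial estimator variance
        return p_hat, var_hat

    def is_ber(threshold, sigma, shift, n):
        # Importance sampling: draw from a Gaussian shifted toward the error
        # region and reweight each sample by the likelihood ratio f(x)/g(x).
        x = rng.normal(shift, sigma, n)                              # biased density g
        w = np.exp((shift**2 - 2.0 * shift * x) / (2.0 * sigma**2))  # f(x)/g(x)
        wi = np.where(x > threshold, w, 0.0)                         # weighted indicator
        return wi.mean(), wi.var(ddof=1) / n

    # Rare-event example: the true probability is Q(5) ~ 2.9e-7, so direct MC
    # with 1e5 trials usually returns 0, while IS resolves it with low variance.
    p_mc, v_mc = direct_mc_ber(threshold=5.0, sigma=1.0, n=100_000)
    p_is, v_is = is_ber(threshold=5.0, sigma=1.0, shift=5.0, n=100_000)
    print("direct MC:", p_mc, v_mc)
    print("importance sampling:", p_is, v_is)
    # The improvement ratio discussed in the abstract compares v_mc with v_is.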
An efficient sampling technique for sums of bandpass functions
NASA Technical Reports Server (NTRS)
Lawton, W. M.
1982-01-01
A well-known sampling theorem states that a bandlimited function can be completely determined by its values at a uniformly placed set of points whose density is at least twice the highest frequency component of the function (the Nyquist rate). A less familiar but important sampling theorem states that a bandlimited narrowband function can be completely determined by its values at a properly chosen, nonuniformly placed set of points whose density is at least twice the passband width. This allows for efficient digital demodulation of narrowband signals, which are common in sonar, radar, and radio interferometry, without the side effect of signal group delay from an analog demodulator. This theorem was extended by developing a technique which allows a finite sum of bandlimited narrowband functions to be determined by its values at a properly chosen, nonuniformly placed set of points whose density can be made arbitrarily close to the sum of the passband widths.
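As a hedged, simpler relative of the nonuniform scheme developed here, the classical uniform bandpass-sampling condition already shows how far below the baseband Nyquist rate one can sample a single narrowband function; the band edges in the example are invented.

    import math

    def valid_bandpass_rates(f_lo, f_hi):
        # Uniform bandpass sampling is alias-free when, for some integer n with
        # 1 <= n <= floor(f_hi / (f_hi - f_lo)):  2*f_hi/n <= fs <= 2*f_lo/(n-1).
        bw = f_hi - f_lo
        ranges = []
        for n in range(1, math.floor(f_hi / bw) + 1):
            lo = 2.0 * f_hi / n
            hi = 2.0 * f_lo / (n - 1) if n > 1 else float("inf")
            if lo <= hi:
                ranges.append((n, lo, hi))
        return ranges

    # Example: a 200 kHz-wide passband centered near 10.1 MHz.
    for n, lo, hi in valid_bandpass_rates(10.0e6, 10.2e6):
        print(f"n={n}: fs between {lo/1e6:.3f} and {hi/1e6:.3f} MHz")
    # The largest n admits rates near twice the passband width (0.4 MHz),
    # far below the 20.4 MHz demanded by the baseband Nyquist rate.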
NASA Astrophysics Data System (ADS)
Clarkson, William I.; Calamida, Annalisa; Sahu, Kailash C.; Gennaro, Mario; Brown, Thomas M.; Avila, Roberto J.; Rich, R. Michael; Debattista, Victor P.
2018-01-01
We report results from a pilot study using archival Hubble Space Telescope imaging observations in seven filters over a multi-year time-baseline to probe the co-dependence of chemical abundance and kinematics, using proper motion-based rotation curves selected on relative metallicity. With spectroscopic studies suggesting the metallicity distribution of the Bulge may be bimodal, we follow a data-driven approach to classify stars as belonging to the metal-rich or metal-poor ends of the observed relative photometric metallicity distribution, with classification implemented using standard unsupervised learning techniques. We detect clear differences in both slope and amplitude of the proper motion-based rotation curve as traced by the more “metal-rich” and “metal-poor” samples. The sense of the discrepancy is qualitatively in agreement with both recent observational and theoretical indications; the “metal-poor” sample does indeed show a weaker rotation signature. This is the first study to dissect the proper motion rotation curve of the Bulge by chemical abundance using main-sequence targets, which are orders of magnitude more common on the sky than bright giants. These techniques thus offer a pencil-beam complement to wide-field studies that use more traditional tracer populations.
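A minimal sketch of the kind of data-driven split and rotation-curve comparison described, assuming k-means as the unsupervised classifier and a straight-line fit as the slope summary; the synthetic catalogue, column choices, units and the two-component split are illustrative assumptions, not the study's pipeline.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)

    # Hypothetical per-star catalogue: a relative photometric metallicity index,
    # a distance proxy (kpc) and a longitudinal proper motion (mas/yr).
    n, half = 4000, 2000
    met = np.concatenate([rng.normal(-0.3, 0.15, half),   # "metal-poor" stars
                          rng.normal(+0.2, 0.15, half)])  # "metal-rich" stars
    dist = rng.uniform(6.0, 10.0, n)
    slope_true = np.where(np.arange(n) < half, 0.3, 0.5)  # weaker rotation if metal-poor
    pm_l = slope_true * (dist - 8.0) + rng.normal(0.0, 0.5, n)

    # Data-driven two-component split of the observed metallicity distribution.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(met.reshape(-1, 1))

    # Compare the slope of a simple "rotation curve" (proper motion vs. distance)
    # for the two groups via straight-line fits.
    for k in range(2):
        sel = labels == k
        slope, _ = np.polyfit(dist[sel], pm_l[sel], 1)
        print(f"group {k}: mean metallicity index {met[sel].mean():+.2f}, "
              f"rotation-curve slope {slope:.2f} mas/yr/kpc")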
Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C
NASA Technical Reports Server (NTRS)
Natesh, R.; Guyer, T.; Stringfellow, G. B.
1982-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.
Evaluating structural connectomics in relation to different Q-space sampling techniques.
Rodrigues, Paulo; Prats-Galino, Alberto; Gallardo-Pujol, David; Villoslada, Pablo; Falcon, Carles; Prckovska, Vesna
2013-01-01
Brain networks are becoming forefront research in neuroscience. Network-based analysis of the functional and structural connectomes can lead to powerful imaging markers for brain diseases. However, the structural connectome can be constructed from different acquisition and reconstruction techniques whose information content and mutual differences have not yet been properly studied in a unified framework. The variations of the structural connectome, if not properly understood, can lead to dangerous conclusions when performing these types of studies. In this work we present an evaluation of the structural connectome by analysing and comparing graph-based measures on real data acquired by the three most important Diffusion Weighted Imaging techniques: DTI, HARDI and DSI. We thus come to several important conclusions, demonstrating that even though the different techniques show differences in the anatomy of the reconstructed fibers, the respective connectomes show variations of 20%.
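A minimal sketch of the kind of graph-based comparison performed, using networkx on stand-in connectivity matrices; the node count, sparsity, and choice of measures are assumptions, not the study's protocol.

    import numpy as np
    import networkx as nx

    def random_connectome(rng, n_nodes=90, keep=0.1):
        # Stand-in for a structural connectivity matrix: symmetric, sparse, weighted.
        w = rng.random((n_nodes, n_nodes))
        w = np.where(rng.random((n_nodes, n_nodes)) < keep, w, 0.0)
        upper = np.triu(w, 1)
        return upper + upper.T

    def graph_measures(conn):
        G = nx.from_numpy_array(conn)            # zero entries produce no edge
        return {
            "density": nx.density(G),
            "weighted clustering": nx.average_clustering(G, weight="weight"),
            "global efficiency": nx.global_efficiency(G),   # binary topology
        }

    def percent_variation(a, b):
        return 100.0 * abs(a - b) / ((a + b) / 2.0)

    rng = np.random.default_rng(0)
    conn_a = random_connectome(rng)   # stands in for, e.g., a DTI-based connectome
    conn_b = random_connectome(rng)   # stands in for, e.g., a HARDI-based connectome
    ma, mb = graph_measures(conn_a), graph_measures(conn_b)
    for key in ma:
        print(f"{key}: {percent_variation(ma[key], mb[key]):.1f}% variation")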
Basic tree-ring sample preparation techniques for aging aspen
Lance A. Asherin; Stephen A. Mata
2001-01-01
Aspen is notoriously difficult to age because of its light-colored wood and faint annual growth rings. Careful preparation and processing of aspen ring samples can overcome these problems, yield accurate age and growth estimates, and concisely date disturbance events present in the tree-ring record. Proper collection of aspen wood is essential in obtaining usable ring...
Wear measurement of the cutting edge of superhard turning tools using TLA technique
NASA Astrophysics Data System (ADS)
Vasváry, L.; Ditrói, F.; Takács, S.; Szabó, Z.; Szűcs, J.; Kundrák, J.; Mahunka, I.
1994-03-01
Wear measurement on superhard boron nitride and artificial diamond turning tools was performed using the thin layer activation (TLA) technique. The samples were irradiated in two different geometries to improve the sensitivity of the method and to change the region of wear to be investigated. The most suitable irradiation parameters and nuclear reactions were investigated for both kinds of tools.
A technique for extracting blood samples from mice in fire toxicity tests
NASA Technical Reports Server (NTRS)
Bucci, T. J.; Hilado, C. J.; Lopez, M. T.
1976-01-01
The extraction of adequate blood samples from moribund and dead mice has been a problem because of the small quantity of blood in each animal and the short time available between the animals' death and coagulation of the blood. These difficulties are particularly critical in fire toxicity tests because removal of the test animals while observing proper safety precautions for personnel is time-consuming. Techniques for extracting blood samples from mice were evaluated, and a technique was developed to obtain up to 0.8 ml of blood from a single mouse after death. The technique involves rapid exposure and cutting of the posterior vena cava and accumulation of blood in the peritoneal space. Blood samples of 0.5 ml or more from individual mice have been consistently obtained as much as 16 minutes after apparent death. Results of carboxyhemoglobin analyses of blood appeared reproducible and consistent with carbon monoxide concentrations in the exposure chamber.
Lampi, Tiina; Dekker, Hannah; Ten Bruggenkate, Chris M; Schulten, Engelbert A J M; Mikkonen, Jopi J W; Koistinen, Arto; Kullaa, Arja M
2018-01-01
The aim of this study was to define the acid-etching technique for bone samples embedded in polymethyl methacrylate (PMMA) in order to visualize the osteocyte lacuno-canalicular network (LCN) for scanning electron microscopy (SEM). Human jaw bone tissue samples (N = 18) were collected from a study population consisting of patients who had received dental implant surgery. After collection, the bone samples were fixed in 70% ethanol and the non-decalcified samples were embedded routinely in PMMA. The PMMA-embedded specimens were acid-etched in either 9 or 37% phosphoric acid (PA) and prepared for SEM for further analysis. PMMA-embedded bone specimens acid-etched with 9% PA gave the most informative and favorable visualization of the LCN for observation by SEM. Etching of PMMA-embedded specimens is recommended to start with a 30 s or 40 s etching duration in order to find the proper etching duration for the samples examined. Visualizing osteocytes and the LCN provides a tool to study bone structure that reflects changes in bone metabolism and diseases related to bone tissue. With a proper etching protocol for non-decalcified samples and the use of scanning electron microscopy, it is possible to visualize the morphology of osteocytes and the network supporting the vitality of bone tissue.
Nurses' knowledge of inhaler technique in the inpatient hospital setting.
De Tratto, Katie; Gomez, Christy; Ryan, Catherine J; Bracken, Nina; Steffen, Alana; Corbridge, Susan J
2014-01-01
High rates of inhaler misuse in patients with chronic obstructive pulmonary disease and asthma contribute to hospital readmissions and increased healthcare cost. The purpose of this study was to examine inpatient staff nurses' self-perception of their knowledge of proper inhaler technique compared with demonstrated technique, and the frequency of providing patients with inhaler technique teaching during hospitalization and at discharge. This was a prospective, descriptive study conducted at a 495-bed urban academic medical center in the Midwest United States with a convenience sample of 100 nurses working on inpatient medical units. Participants completed a 5-item, 4-point Likert-scale survey evaluating self-perception of inhaler technique knowledge, frequency of providing patient education, and responsibility for providing education. Participants demonstrated inhaler technique to the investigators using both a metered dose inhaler (MDI) and a Diskus device inhaler, and performance was measured via a validated checklist. Overall misuse rates were high for both MDI and Diskus devices. There was poor correlation between perceived ability and investigator-measured performance of inhaler technique. Frequency of education during hospitalization and at discharge was related to measured level of performance for the Diskus device but not for the MDI. Nurses are a key component of patient education in the hospital; however, nursing staff lack adequate knowledge of inhaler technique. Identifying gaps in nursing knowledge regarding proper inhaler technique and patient education about proper inhaler technique is important to design interventions that may positively impact patient outcomes. Interventions could include one-on-one education, Web-based education, unit-based education, or hospital-wide competency-based education. All should include return demonstration of appropriate technique.
Interpretation of Blood Microbiology Results - Function of the Clinical Microbiologist.
Kristóf, Katalin; Pongrácz, Júlia
2016-04-01
The proper use and interpretation of blood microbiology results may be one of the most challenging and most important functions of clinical microbiology laboratories. Effective implementation of this function requires careful consideration of specimen collection and processing, pathogen detection techniques, and prompt and precise reporting of identification and susceptibility results. The responsibility of the treating physician is the proper formulation of the analytical request and providing the laboratory with complete and precise patient information, which are indispensable prerequisites for proper testing and interpretation. The clinical microbiologist can offer advice concerning the differential diagnosis, sampling techniques and detection methods to facilitate diagnosis. Rapid detection methods are essential, since the sooner a pathogen is detected, the better the patient's chance of being cured. Besides the gold-standard blood culture technique, microbiological methods that decrease the time to a relevant result are more and more utilized today. In the case of certain pathogens, the pathogen can be identified directly from the blood culture bottle after propagation, using serological or automated/semi-automated systems, molecular methods, or MALDI-TOF MS (matrix-assisted laser desorption-ionization time of flight mass spectrometry). Molecular biology methods are also suitable for the rapid detection and identification of pathogens from aseptically collected blood samples. Another important duty of the microbiology laboratory is to notify the treating physician immediately about all relevant information when a positive sample is detected. The clinical microbiologist may provide important guidance regarding the clinical significance of blood isolates, since one-third to one-half of blood culture isolates are contaminants or isolates of unknown clinical significance. To fully exploit the benefits of blood culture and other (non-culture-based) diagnostics, the microbiologist and the clinician should interact directly.
NASA Astrophysics Data System (ADS)
Takiue, Makoto; Fujii, Haruo; Ishikawa, Hiroaki
1984-12-01
2,5-Diphenyloxazole (PPO) has been proposed as a wavelength shifter for Cherenkov counting. Since PPO is not soluble in water, we have introduced the fluor into water in the form of micelles using a PPO-ethanol system. This technique makes it possible to obtain a high Cherenkov counting efficiency under stable sample conditions, attributable to the proper spectrometric features of the PPO. The 32P Cherenkov counting efficiency (68.4%) obtained with this technique is as large as that measured with a conventional Cherenkov technique.
Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather
2018-04-01
Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical Abstract This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.
Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z
2015-12-01
Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants, usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior analytical technique was determined with reference to pre-defined criteria. To discern the effectiveness of this superior technique, various factors that might influence performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, the chronoamperometry technique showed a significant level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which was fairly adequate in all matrices. This study is a step towards proper disinfection monitoring, and it confidently assists engineers with chlorine dioxide disinfection system planning and management.
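One way such an accuracy/precision comparison can be scored is sketched below with invented replicate readings against a 1.00 mg/L reference standard; the numbers and acceptance metrics are illustrations, not the study's data or criteria.

    import statistics

    def accuracy_and_precision(replicates, reference):
        # Relative bias (%) and relative standard deviation (%) for one technique
        # at one reference chlorine dioxide concentration.
        mean = statistics.mean(replicates)
        bias = 100.0 * (mean - reference) / reference
        rsd = 100.0 * statistics.stdev(replicates) / mean
        return bias, rsd

    # Invented replicate readings (mg/L) at a 1.00 mg/L standard.
    readings = {
        "chronoamperometry": [1.01, 0.99, 1.00, 1.02, 0.98],
        "DPD":               [1.07, 1.10, 1.04, 1.09, 1.06],
    }
    for technique, reps in readings.items():
        bias, rsd = accuracy_and_precision(reps, reference=1.00)
        print(f"{technique}: bias {bias:+.1f}%, RSD {rsd:.1f}%")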
Chew, K S; Mohd Hashairi, F; Jusoh, A F; Aziz, A A; Nik Hisamuddin, N A R; Siti Asma, H
2013-08-01
Although a vital test, blood culture is often plagued by problems of contamination and false results, especially in a chaotic emergency department setting. The objective of this pilot study was to find out the level of understanding of good blood culture sampling practice among healthcare staff in the emergency department of Hospital Universiti Sains Malaysia (HUSM). All healthcare staff in the emergency department of HUSM who consented to this study were given a set of self-administered anonymous questionnaires to fill in. More than half (53.1%) of the 64 participants were emergency medicine residents. The majority of them (75%) had been working in emergency medicine at HUSM for more than 2 years. More than half of them were able to answer correctly the amount of blood volume needed for culture in adult and pediatric patients. When asked what factors are required to improve the true yield as well as to reduce the risk of culture contamination, the four commonest answers given were observing proper aseptic technique during blood sampling, donning sterile gloves, proper hand scrubbing, and ensuring the sterility of the equipment. This study suggests that there is a lack of proper knowledge of good blood culture sampling practice among healthcare staff in the emergency department.
Hendricks, Sharief; O'connor, Sam; Lambert, Michael; Brown, James; Burger, Nicholas; Mc Fie, Sarah; Readhead, Clint; Viljoen, Wayne
2015-01-01
In rugby union, understanding the techniques and events leading to concussions is important because of the nature of the injury and the severity and potential long-term consequences, particularly in junior players. Proper contact technique is a prerequisite for successful participation in rugby and is a major factor associated with injury. However, the execution of proper contact technique and its relationship to injury has yet to be studied in matches. Therefore, the aim of this study was to compare contact techniques leading to concussion with a representative sample of similarly matched non-injury (NI) contact events. Injury surveillance was conducted at the 2011-2013 under-18 Craven Week Rugby tournaments. Video footage of 10 concussive events (5 tackle, 4 ruck and 1 aerial collision) and 83 NI events were identified (19 tackle, 61 ruck and 3 aerial collisions). Thereafter, each phase of play was analysed using standardised technical proficiency criteria. Overall score for ruck proficiency in concussive events was 5.67 (out of a total of 15) vs. 6.98 for NI events (n = 54) (effect size = 0.52, small). Overall average score for tackler proficiency was 7.25 (n = 4) and 6.67 (n = 15) for injury and NI tackles, respectively (out of 16) (effect size = 0.19, trivial). This is the first study to compare concussion injury contact technique to a player-matched sample of NI contact techniques. Certain individual technical criteria had an effect towards an NI outcome, and others had an effect towards a concussive event, highlighting that failure to execute certain techniques may substantially increase the opportunity for concussion.
Bhagwat, Swarupa Nikhil; Sharma, Jayashree H; Jose, Julie; Modi, Charusmita J
2015-01-01
The routine immunohematological tests can be performed by automated as well as manual techniques. These techniques have advantages and disadvantages inherent to them. The present study aims to compare the results of manual and automated techniques for blood grouping and crossmatching so as to validate the automated system effectively. A total of 1000 samples were subjected to blood grouping by the conventional tube technique (CTT) and the automated microplate LYRA system on the Techno TwinStation. A total of 269 samples (multitransfused patients and multigravida females) were compared for 927 crossmatches by the CTT in the indirect antiglobulin phase against the column agglutination technique (CAT) performed on the Techno TwinStation. For blood grouping, the study showed a concordance in results for 942/1000 samples (94.2%), discordance for 4/1000 samples (0.4%) and uninterpretable results for 54/1000 samples (5.4%). On resolution, the uninterpretable results reduced to 49/1000 samples (4.9%), with 951/1000 samples (95.1%) showing concordant results. For crossmatching, the automated CAT showed concordant results in 887/927 (95.6%) and discordant results in 3/927 (0.32%) crossmatches as compared to the CTT. A total of 37/927 (3.9%) crossmatches were not interpretable by the automated technique. The automated system shows a high concordance of results with the CTT and hence can be brought into routine use. However, the high proportion of uninterpretable results emphasizes that proper training and standardization are needed prior to its use.
NASA Astrophysics Data System (ADS)
Jacobsen, Jerrold J.; Houston Jetzer, Kelly; Patani, Néha; Zimmerman, John; Zweerink, Gerald
1995-07-01
Significant attention is paid to the proper technique for reading a meniscus. Video shows meniscus-viewing techniques for colorless and dark liquids and the consequences of not reading a meniscus at eye level. Lessons are provided on approaching the end point, focusing on end point colors produced via different commonly used indicators. The concept of a titration curve is illustrated by means of a pH meter. Carefully recorded images of the entire range of meniscus values in a buret, pipet, and graduated cylinder are included so that you can show your students, in lecture or pre-lab discussion, any meniscus and discuss how to read the buret properly. These buret meniscus values are very carefully recorded at the rate of one video frame per hundredth of a milliliter, so that an image showing any given meniscus value can be obtained. These images can be easily incorporated into a computer-based multimedia environment for testing or meniscus-reading exercises. Two of the authors have used this technique and found the exercise to be very well received by their students. Video on side two shows nearly 100 "bloopers", demonstrating both the right way and wrong ways to do tasks associated with titration. This material can be used in a variety of situations: to show students the correct way to do something; to test students by asking them "What is this person doing wrong?"; or to develop multimedia, computer-based lessons. The contents of Titration Techniques are listed below:
Side 1
Titration: what it is. A simple titration; Acid-base titration animation; A brief redox titration; Redox titration animation; A complete acid-base titration.
Titration techniques. Hand technique variations; Stopcock; Using a buret to measure liquid volumes; Wait before reading meniscus; Dirty and clean burets; Read meniscus at eye level (see Fig. 1); Meniscus viewing techniques--light colored liquids; Meniscus viewing techniques--dark liquids; Using a magnetic stirrer; Rough titration; Significant figures; Approaching the end point; End point colors; Titration with a pH meter; Titration curves; Colors of indicators.
Meniscus values. Buret meniscus values; Pipet meniscus values; Graduated cylinder meniscus values.
Side 2
"Bloopers". Introducing the people; Titration animation; Inspecting the buret; Rinsing the buret with water; Preparing a solid sample; Obtaining a liquid sample; Delivering a liquid sample with a Mohr pipet; Pipetting a liquid sample with a Mohr pipet; Rinsing the Mohr pipet with sample; Using the Mohr pipet to transfer sample; Delivering a liquid sample with a volumetric pipet; Pipetting a liquid sample with a volumetric pipet; Rinsing the volumetric pipet with sample; Using the volumetric pipet to transfer sample; Obtaining the titrant; Rinsing the buret with titrant; Filling the buret with titrant; Adding the indicator; The initial reading; Beginning the titration; Delivering titrant; The final reading.
Figure 3. Near the end point a single drop of titrant can cause a lasting color change.
The impact of nonuniform sampling on stratospheric ozone trends derived from occultation instruments
NASA Astrophysics Data System (ADS)
Damadeo, Robert P.; Zawodny, Joseph M.; Remsberg, Ellis E.; Walker, Kaley A.
2018-01-01
This paper applies a recently developed technique for deriving long-term trends in ozone from sparsely sampled data sets to multiple occultation instruments simultaneously without the need for homogenization. The technique can compensate for the nonuniform temporal, spatial, and diurnal sampling of the different instruments and can also be used to account for biases and drifts between instruments. These problems have been noted in recent international assessments as being a primary source of uncertainty that clouds the significance of derived trends. Results show potential recovery trends of ~2-3 % decade⁻¹ in the upper stratosphere at midlatitudes, which are similar to other studies, and also how sampling biases present in these data sets can create differences in derived recovery trends of up to ~1 % decade⁻¹ if not properly accounted for. Limitations inherent to all techniques (e.g., relative instrument drifts) and their impacts (e.g., trend differences up to ~2 % decade⁻¹) are also described and a potential path forward towards resolution is presented.
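A schematic sketch of a regression of this general kind, fitting a linear trend, an annual cycle and per-instrument offset and drift terms simultaneously to an irregularly sampled record; it is not the authors' implementation, and the synthetic record, regressor choices and instrument split are illustrative only.

    import numpy as np

    def fit_trend(t_years, ozone, instrument_id):
        # Ordinary least squares with trend, annual harmonics, and per-instrument
        # offset/drift columns (the first instrument serves as the reference).
        t = np.asarray(t_years, float)
        y = np.asarray(ozone, float)
        inst = np.asarray(instrument_id)
        tc = t - t.mean()
        cols = [np.ones_like(tc), tc, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)]
        for k in np.unique(inst)[1:]:
            cols.append((inst == k).astype(float))   # relative bias
            cols.append((inst == k) * tc)            # relative drift
        beta, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
        return 100.0 * (10.0 * beta[1]) / y.mean()   # trend in % per decade

    # Illustrative synthetic record sampled irregularly by two instruments.
    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(2000.0, 2016.0, 400))
    inst = (t > 2007.0).astype(int)
    y = 5.0 + 0.01 * (t - 2000.0) + 0.2 * np.sin(2 * np.pi * t) + 0.1 * inst \
        + rng.normal(0.0, 0.1, t.size)
    print(f"recovered trend: {fit_trend(t, y, inst):.2f} % per decade")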
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow several operations to be integrated, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees.
Role of microextraction sampling procedures in forensic toxicology.
Barroso, Mário; Moreno, Ivo; da Fonseca, Beatriz; Queiroz, João António; Gallardo, Eugenia
2012-07-01
The last two decades have provided analysts with more sensitive technology, enabling scientists from all analytical fields to see what they were not able to see just a few years ago. This increased sensitivity has allowed drug detection at very low concentrations and testing in unconventional samples (e.g., hair, oral fluid and sweat), which, in addition to having low analyte concentrations, has also led to a reduction in sample size. Along with this reduction, and as a result of the use of excessive amounts of potentially toxic organic solvents (with the subsequent environmental pollution and costs associated with their proper disposal), there has been a growing tendency to use miniaturized sampling techniques. These sampling procedures reduce organic solvent consumption to a minimum and at the same time provide a rapid, simple and cost-effective approach. In addition, it is possible to achieve at least some degree of automation when using these techniques, which will enhance sample throughput. These miniaturized sample preparation techniques may be roughly categorized into solid-phase and liquid-phase microextraction, depending on the nature of the analyte. This paper reviews recently published literature on the use of microextraction sampling procedures, with a special focus on the field of forensic toxicology.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
Clustering on very small scales from a large, complete sample of confirmed quasar pairs
NASA Astrophysics Data System (ADS)
Eftekharzadeh, Sarah; Myers, Adam D.; Djorgovski, Stanislav G.; Graham, Matthew J.; Hennawi, Joseph F.; Mahabal, Ashish A.; Richards, Gordon T.
2016-06-01
We present by far the largest sample of spectroscopically confirmed binary quasars with proper transverse separations of 17.0 ≤ R_prop ≤ 36.6 h⁻¹ kpc. Our sample, which is an order of magnitude larger than previous samples, is selected from Sloan Digital Sky Survey (SDSS) imaging over an area corresponding to the SDSS 6th data release (DR6). Our quasars are targeted using a Kernel Density Estimation (KDE) technique, and confirmed using long-slit spectroscopy on a range of facilities. Our most complete sub-sample of 44 binary quasars with g < 20.85 extends across angular scales of 2.9" < Δθ < 6.3", and is targeted from a parent sample that would be equivalent to a full spectroscopic survey of nearly 300,000 quasars. We determine the projected correlation function of quasars (W̄p) over proper transverse scales of 17.0 ≤ R_prop ≤ 36.6 h⁻¹ kpc, and also in 4 bins of scale within this complete range. To investigate the redshift evolution of quasar clustering on small scales, we make the first self-consistent measurement of the projected quasar correlation function in 4 bins of redshift over 0.4 ≤ z ≤ 2.3.
Arul, Pitchaikaran; Pushparaj, Magesh; Pandian, Kanmani; Chennimalai, Lingasamy; Rajendran, Karthika; Selvaraj, Eniya; Masilamani, Suresh
2018-01-01
An important component of laboratory medicine is the preanalytical phase. Since the laboratory report plays a major role in patient management, more importance should be given to the quality of laboratory tests. The present study was undertaken to find the prevalence and types of preanalytical errors at a tertiary care hospital in South India. In this cross-sectional study, a total of 118,732 samples (62,474 from the outpatient department [OPD] and 56,258 from the inpatient department [IPD]) were received in the hematology laboratory. These samples were analyzed for preanalytical errors such as misidentification, incorrect vials, inadequate samples, clotted samples, diluted samples, and hemolyzed samples. The overall prevalence of preanalytical errors was 513 samples, which is 0.43% of the total number of samples received. The most common preanalytical error observed was inadequate samples, followed by clotted samples. Overall frequencies (both OPD and IPD) of preanalytical errors such as misidentification, incorrect vials, inadequate samples, clotted samples, diluted samples, and hemolyzed samples were 0.02%, 0.05%, 0.2%, 0.12%, 0.02%, and 0.03%, respectively. The present study concluded that incorrect phlebotomy techniques due to lack of awareness are the main reason for preanalytical errors. These can be avoided by proper communication and coordination between the laboratory and wards, proper training and continuing medical education programs for laboratory and paramedical staff, and knowledge of the intervening factors that can influence laboratory results.
Effects of a pulsed Nd:YAG laser on enamel and dentin
NASA Astrophysics Data System (ADS)
Myers, Terry D.
1990-06-01
Enamel and dentin samples were exposed extraorally to a pulsed neodymium yttrium aluminum garnet (Nd:YAG) laser. The lased samples were observed using both scanning electron microscopy and histological techniques to determine the effects of the laser. The present study has provided the following points: (1) properly treated, enamel can be laser etched to a depth comparable to that achieved with phosphoric acid etching; and (2) both carious and noncarious dentin can be vaporized by the Nd:YAG laser. No cracking or chipping of any enamel or dentin sample was observed histologically or under the SEM.
NASA Astrophysics Data System (ADS)
Schooneveld, E. M.; Mayers, J.; Rhodes, N. J.; Pietropaolo, A.; Andreani, C.; Senesi, R.; Gorini, G.; Perelli-Cippo, E.; Tardocchi, M.
2006-09-01
This article reports a novel experimental technique, namely, the foil cycling technique, developed on the VESUVIO spectrometer (ISIS spallation source) operating in the resonance detector configuration. It is shown that with a proper use of two foils of the same neutron absorbing material it is possible, in a double energy analysis process, to narrow the width of the instrumental resolution of a spectrometer operating in the resonance detector configuration and to achieve an effective subtraction of the neutron and gamma backgrounds. Preliminary experimental results, obtained from deep inelastic neutron scattering measurements on lead, zirconium hydride, and deuterium chloride samples, are presented.
Introduction to Field Water-Quality Methods for the Collection of Metals - 2007 Project Summary
Allen, Monica L.
2008-01-01
The U.S. Geological Survey (USGS), Region VI of the U.S. Environmental Protection Agency (USEPA), and the Osage Nation presented three 3-day workshops, in June-August 2007, entitled "Introduction to Field Water-Quality Methods for the Collection of Metals." The purpose of the workshops was to provide instruction to tribes within USEPA Region VI on various USGS surface-water measurement methods and water-quality sampling protocols for the collection of surface-water samples for metals analysis. Workshop attendees included members from over 22 tribes and pueblos. USGS instructors came from Oklahoma, New Mexico, and Georgia. Workshops were held in eastern and south-central Oklahoma and New Mexico and covered many topics including presampling preparation, water-quality monitors, and sampling for metals in surface water. Attendees spent one full classroom day learning the field methods used by the USGS Water Resources Discipline and learning about the complexity of obtaining valid water-quality and quality-assurance data. Lectures included (1) a description of metal contamination sources in surface water; (2) an introduction on how to select field sites, equipment, and laboratories for sample analysis; (3) collection of sediment in surface water; and (4) utilization of proper protocol and methodology for sampling metals in surface water. Attendees also were provided USGS sampling equipment for use during the field portion of the class so they had actual "hands-on" experience to take back to their own organizations. The final 2 days of the workshop consisted of field demonstrations of current USGS water-quality sample-collection methods. The hands-on training ensured that attendees were exposed to and experienced proper sampling procedures. Attendees learned integrated-flow techniques during sample collection, field-property documentation, and discharge measurements and calculations. They also used enclosed chambers for sample processing and collected quality-assurance samples to verify their techniques. Benefits of integrated water-quality sample-collection methods are varied. Tribal environmental programs now have the ability to collect data that are comparable across watersheds. The use of consistent sample collection, manipulation, and storage techniques will provide consistent quality data that will enhance the understanding of local water resources. The improved data quality also will help the USEPA better document the condition of the region's water. Ultimately, these workshops equipped tribes to use uniform sampling methods and to provide consistent quality data that are comparable across the region.
Ibrahim, Akram; Férachou, Denis; Sharma, Gargi; Singh, Kanwarpal; Kirouac-Turmel, Marie; Ozaki, Tsuneyuki
2016-01-01
Time-domain spectroscopy using coherent millimeter and sub-millimeter radiation (also known as terahertz radiation) is rapidly expanding its application, owing greatly to the remarkable advances in generating and detecting such radiation. However, many current techniques for coherent terahertz detection have limited dynamic range, thus making it difficult to perform some basic experiments that need to directly compare strong and weak terahertz signals. Here, we propose and demonstrate a novel technique based on cross-polarized spectral-domain interferometry to achieve ultra-high dynamic range electro-optic sampling measurement of coherent millimeter and sub-millimeter radiation. In our scheme, we exploit the birefringence in a single-mode polarization maintaining fiber in order to measure the phase change induced by the electric field of terahertz radiation in the detection crystal. With our new technique, we have achieved a dynamic range of 7 × 10⁶, which is 4 orders of magnitude higher than conventional electro-optic sampling techniques, while maintaining comparable signal-to-noise ratio. The present technique is foreseen to have great impact on experiments such as linear terahertz spectroscopy of optically thick materials (such as aqueous samples) and nonlinear terahertz spectroscopy, where the higher dynamic range is crucial for proper interpretation of experimentally obtained results.
Pillai, Anil Kumar; Silvers, William; Christensen, Preston; Riegel, Matthew; Adams-Huet, Beverley; Lingvay, Ildiko; Sun, Xiankai; Öz, Orhan K
2015-01-01
Advances in noninvasive imaging modalities have provided opportunities to study β cell function through imaging zinc release from insulin secreting β cells. Understanding the temporal secretory pattern of insulin and zinc corelease after a glucose challenge is essential for proper timing of administration of zinc sensing probes. Portal venous sampling is an essential part of pharmacological and nutritional studies in animal models. The purpose of this study was to compare two different percutaneous image-guided techniques: transhepatic ultrasound guided portal vein access and transsplenic fluoroscopy guided splenic vein access for ease of access, safety, and evaluation of temporal kinetics of insulin and zinc release into the venous effluent from the pancreas. Both techniques were safe, reproducible, and easy to perform. The mean time required to obtain desired catheter position for venous sampling was 15 minutes shorter using the transsplenic technique. A clear biphasic insulin release profile was observed in both techniques. Statistically higher insulin concentration but similar zinc release after a glucose challenge was observed from splenic vein samples, as compared to the ones from the portal vein. To our knowledge, this is the first report of percutaneous methods to assess zinc release kinetics from the porcine pancreas.
Marín, M-J; Figuero, E; González, I; O'Connor, A; Diz, P; Álvarez, M; Herrera, D; Sanz, M
2016-05-01
This study compared the prevalence and amounts of periodontal pathogens detected in bacteraemia samples after tooth brushing by means of four diagnostic techniques, three based on culture and one on a molecular technique. Blood samples were collected from thirty-six subjects with different periodontal status (17 healthy, 10 with gingivitis and 9 with periodontitis) at baseline and 2 minutes after tooth brushing. Each sample was analyzed by three culture-based methods [direct anaerobic culturing (DAC), hemo-culture (BACTEC), and lysis-centrifugation (LC)] and one molecular-based technique [quantitative polymerase chain reaction (qPCR)]. With culture, any bacterial isolate could be detected and quantified, while with qPCR only Porphyromonas gingivalis and Aggregatibacter actinomycetemcomitans were detected and quantified. Descriptive analyses, ANOVA, and chi-squared tests were performed. Neither BACTEC nor qPCR detected any type of bacteria in the blood samples. Only LC (2.7%) and DAC (8.3%) detected bacteraemia, although not in the same patients. Fusobacterium nucleatum was the most frequently detected bacterial species. The disparity in the results when the same samples were analyzed with four different microbiological detection methods highlights the need for a proper validation of the methodology used to detect periodontal pathogens in bacteraemia samples, particularly since periodontal pathogens were very seldom present in blood samples after tooth brushing.
An Analysis of Nondestructive Evaluation Techniques for Polymer Matrix Composite Sandwich Materials
NASA Technical Reports Server (NTRS)
Cosgriff, Laura M.; Roberts, Gary D.; Binienda, Wieslaw K.; Zheng, Diahua; Averbeck, Timothy; Roth, Donald J.; Jeanneau, Philippe
2006-01-01
Structural sandwich materials composed of triaxially braided polymer matrix composite material face sheets sandwiching a foam core are being utilized for applications including aerospace components and recreational equipment. Since full scale components are being made from these sandwich materials, it is necessary to develop proper inspection practices for their manufacture and in-field use. Specifically, nondestructive evaluation (NDE) techniques need to be investigated for analysis of components made from these materials. Hockey blades made from sandwich materials and a flat sandwich sample were examined with multiple NDE techniques including thermographic, radiographic, and shearographic methods to investigate damage induced in the blades and flat panel components. Hockey blades used during actual play and a flat polymer matrix composite sandwich sample with damage inserted into the foam core were investigated with each technique. NDE images from the samples were presented and discussed. Structural elements within each blade were observed with radiographic imaging. Damaged regions and some structural elements of the hockey blades were identified with thermographic imaging. Structural elements, damaged regions, and other material variations were detected in the hockey blades with shearography. Each technique's advantages and disadvantages were considered in making recommendations for inspection of components made from these types of materials.
Illera, Juan-Carlos; Silván, Gema; Cáceres, Sara; Carbonell, Maria-Dolores; Gerique, Cati; Martínez-Fernández, Leticia; Munro, Coralie; Casares, Miguel
2014-01-01
Monitoring ovarian cycles through hormonal analysis is important in order to improve breeding management of captive elephants, and non-invasive collection techniques are particularly interesting for this purpose. However, there are some practical difficulties in collecting proper samples, and easier and more practical methods may be an advantage for some institutions and/or some animals. This study describes the development and validation of an enzyme immunoassay (EIA) for progestins in salivary samples of African elephants, Loxodonta africana. Weekly urinary and salivary samples from five non-pregnant elephant cows aged 7-12 years were obtained for 28 weeks and analyzed using the EIA. Both techniques correlated positively (r = 0.799; P < 0.001), and the cycle characteristics obtained were identical. The results clearly show that ovarian cycles can be monitored by measuring progestins in salivary samples in the African elephant. This is a simple and non-invasive method that may be a practical alternative to other sampling methods used in the species.
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples and to address which portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
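As one concrete example of the QA/QC arithmetic referred to above, here is a minimal sketch (with invented numbers, not values from the lecture series) of a blank-based detection limit and a standard reference material recovery check.

    import statistics

    def detection_limit(blank_results, k=3.0):
        # Common rule of thumb: detection limit ~ k times the standard deviation
        # of repeated blank (or low-level spike) measurements; k = 3 is typical.
        return k * statistics.stdev(blank_results)

    def srm_recovery(measured, certified):
        # Percent recovery against a matrix-matched standard reference material.
        return 100.0 * measured / certified

    # Invented example values in mg/kg.
    blanks = [0.011, 0.009, 0.013, 0.010, 0.012, 0.008, 0.011]
    print(f"estimated detection limit: {detection_limit(blanks):.3f} mg/kg")
    print(f"SRM recovery: {srm_recovery(measured=24.1, certified=25.0):.1f}%")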
Casale, M; Oliveri, P; Armanino, C; Lanteri, S; Forina, M
2010-06-04
Four rapid and low-cost vanguard analytical systems (NIR and UV-vis spectroscopy, a headspace-mass-based artificial nose and a voltammetric artificial tongue), together with chemometric pattern recognition techniques, were applied and compared in addressing a food authentication problem: the distinction between wine samples from the same Italian oenological region according to the grape variety. Specifically, 59 certified samples belonging to the Barbera d'Alba and Dolcetto d'Alba appellations and collected from the same vintage (2007) were analysed. The instrumental responses, after proper data pre-processing, were used as fingerprints of the characteristics of the samples: the results from principal component analysis and linear discriminant analysis were discussed, comparing the capability of the four analytical strategies in addressing the problem studied.
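A minimal sketch of the chemometric pattern-recognition step, assuming scikit-learn with PCA compression followed by linear discriminant analysis under cross-validation; the synthetic fingerprint matrix and the 30/29 class split are placeholders, not the study's data.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical fingerprints: rows = wine samples, columns = instrumental
    # variables (e.g., NIR absorbances); labels = grape variety.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (30, 200)),
                   rng.normal(0.3, 1.0, (29, 200))])
    y = np.array([0] * 30 + [1] * 29)        # 0 = Barbera, 1 = Dolcetto

    # PCA compresses the correlated variables before LDA classification;
    # cross-validation gives an honest estimate of discrimination ability.
    model = make_pipeline(StandardScaler(), PCA(n_components=10),
                          LinearDiscriminantAnalysis())
    scores = cross_val_score(model, X, y, cv=5)
    print(f"cross-validated classification rate: {scores.mean():.2f}")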
Safety Precautions and Operating Procedures in an (A)BSL-4 Laboratory: 2. General Practices.
Mazur, Steven; Holbrook, Michael R; Burdette, Tracey; Joselyn, Nicole; Barr, Jason; Pusl, Daniela; Bollinger, Laura; Coe, Linda; Jahrling, Peter B; Lackemeyer, Matthew G; Wada, Jiro; Kuhn, Jens H; Janosko, Krisztina
2016-10-03
Work in a biosafety level 4 (BSL-4) containment laboratory requires time and great attention to detail. The same work that is done in a BSL-2 laboratory with non-high-consequence pathogens will take significantly longer in a BSL-4 setting. This increased time requirement is due to a multitude of factors that are aimed at protecting the researcher from laboratory-acquired infections, the work environment from potential contamination and the local community from possible release of high-consequence pathogens. Inside the laboratory, movement is restricted due to air hoses attached to the mandatory full-body safety suits. In addition, disinfection of every item that is removed from Class II biosafety cabinets (BSCs) is required. Laboratory specialists must be trained in the practices of the BSL-4 laboratory and must show high proficiency in the skills they are performing. The focus of this article is to outline proper procedures and techniques to ensure laboratory biosafety and experimental accuracy using a standard viral plaque assay as an example procedure. In particular, proper techniques to work safely in a BSL-4 environment when performing an experiment will be visually emphasized. These techniques include: setting up a Class II BSC for experiments, proper cleaning of the Class II BSC when finished working, waste management and safe disposal of waste generated inside a BSL-4 laboratory, and the removal of inactivated samples from inside a BSL-4 laboratory to the BSL-2 laboratory.
Quantitative phase imaging of living cells with a swept laser source
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2016-03-01
Digital holographic phase microscopy is a well-established quantitative phase imaging technique. However, interference artifacts from inside the system, typically induced by elements whose optical thickness is within the source coherence length, limit the imaging quality as well as the sensitivity. In this paper, a technique based on a swept laser source is presented. Spectra acquired at a number of wavelengths, after Fourier transform, can be used to identify the sources of the interference artifacts. With proper tuning of the optical pathlength difference between the sample and reference arms, it is possible to avoid these artifacts and achieve sensitivity below 0.3 nm. Performance of the proposed technique is examined in live cell imaging.
Temporally flickering nanoparticles for compound cellular imaging and super resolution
NASA Astrophysics Data System (ADS)
Ilovitsh, Tali; Danan, Yossef; Meir, Rinat; Meiri, Amihai; Zalevsky, Zeev
2016-03-01
This work presents the use of flickering nanoparticles for imaging biological samples. The method has high noise immunity and enables the detection of overlapping types of gold nanoparticles (GNPs) at significantly sub-diffraction distances, making it attractive for super-resolving localization microscopy techniques. The method uses a lock-in scheme in which the sample is imaged with time-modulated laser illumination, with the number of modulation frequencies matching the number of GNP types that label the sample, resulting in the excitation of temporal flickering of the scattered light at known frequencies. The final image, in which the GNPs are spatially separated, is obtained by post-processing that extracts the spectral components corresponding to the different modulation frequencies. This allows the simultaneous super-resolved imaging of multiple types of GNPs that label targets of interest within biological samples. Additionally, applying the K-factor image decomposition algorithm as a post-processing step can further improve the performance of the proposed approach.
NASA Technical Reports Server (NTRS)
Baker, G. R.; Fethe, T. P.
1975-01-01
Research in the application of remotely sensed data from LANDSAT or other airborne platforms to the efficient management of a large timber-based forest industry was divided into three phases: (1) establishment of a photo/ground sample correlation, (2) investigation of techniques for multi-spectral digital analysis, and (3) development of a semi-automated multi-level sampling system. To properly verify results, three distinct test areas were selected: (1) Jacksonville Mill Region, Lower Coastal Plain, Flatwoods; (2) Pensacola Mill Region, Middle Coastal Plain; and (3) Mississippi Mill Region, Middle Coastal Plain. The following conclusions were reached: (1) the probability of establishing an information base suitable for management requirements through a photo/ground double sampling procedure, alleviating the ground sampling effort, is encouraging; (2) known classification techniques must be investigated to ascertain the level of precision possible in separating the many densities involved; and (3) the multi-level approach must be related to an information system that is executable and feasible.
NASA Astrophysics Data System (ADS)
Younse, Paulo
Four sealing methods for encapsulating samples in 1 cm diameter thin-walled sample tubes were designed, along with a set of tests for characterization and evaluation of contamination prevention and sample preservation capability for the proposed Mars Sample Return (MSR) campaign. The sealing methods include a finned shape memory alloy (SMA) plug, an expanding torque plug, a contracting SMA ring cap, and an expanding SMA ring plug. Mechanical strength and hermeticity of the seal were measured using a helium leak detector. Robustness of the seal to Mars simulant dust, surface abrasion, and pressure differentials was tested. Survivability tests were run to simulate thermal cycles on Mars, vibration from a Mars Ascent Vehicle (MAV), and shock from Earth Entry Vehicle (EEV) landing. Material compatibility with potential sample minerals and organic molecules was studied to select proper tube and seal materials that would not lead to adverse reactions or contaminate the sample. Cleaning and sterilization techniques were executed on coupons made from the seal materials to assess compliance with planetary protection and contamination control. Finally, a method to cut a sealed tube for sample removal was designed and tested.
Gill, Samuel C; Lim, Nathan M; Grinaway, Patrick B; Rustenburg, Ariën S; Fass, Josh; Ross, Gregory A; Chodera, John D; Mobley, David L
2018-05-31
Accurately predicting protein-ligand binding affinities and binding modes is a major goal in computational chemistry, but even the prediction of ligand binding modes in proteins poses major challenges. Here, we focus on solving the binding mode prediction problem for rigid fragments. That is, we focus on computing the dominant placement, conformation, and orientations of a relatively rigid, fragment-like ligand in a receptor, and the populations of the multiple binding modes which may be relevant. This problem is important in its own right, but is even more timely given the recent success of alchemical free energy calculations. Alchemical calculations are increasingly used to predict binding free energies of ligands to receptors. However, the accuracy of these calculations is dependent on proper sampling of the relevant ligand binding modes. Unfortunately, ligand binding modes may often be uncertain, hard to predict, and/or slow to interconvert on simulation time scales, so proper sampling with current techniques can require prohibitively long simulations. We need new methods which dramatically improve sampling of ligand binding modes. Here, we develop and apply a nonequilibrium candidate Monte Carlo (NCMC) method to improve sampling of ligand binding modes. In this technique, the ligand is rotated and subsequently allowed to relax in its new position through alchemical perturbation before accepting or rejecting the rotation and relaxation as a nonequilibrium Monte Carlo move. When applied to a T4 lysozyme model binding system, this NCMC method shows over 2 orders of magnitude improvement in binding mode sampling efficiency compared to a brute force molecular dynamics simulation. This is a first step toward applying this methodology to pharmaceutically relevant binding of fragments and, eventually, drug-like molecules. We are making this approach available via our new Binding modes of ligands using enhanced sampling (BLUES) package which is freely available on GitHub.
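The following toy Python sketch illustrates the core of a nonequilibrium candidate Monte Carlo move of the kind described above: a gradual rotation interleaved with Langevin relaxation, accepted using the accumulated protocol work. It is a one-particle caricature, not the BLUES package, a T4 lysozyme system, or an alchemical protocol; the potential, parameters, and move size are all invented.

```python
# Minimal, hypothetical sketch of a nonequilibrium candidate Monte Carlo (NCMC)
# move: a "ligand" (one particle in 2D) is rotated gradually about the origin,
# with an overdamped Langevin relaxation step between rotation increments; the
# protocol work accumulated during the rotation increments enters the
# acceptance test. A toy model only.
import numpy as np

rng = np.random.default_rng(1)
kT, dt, gamma = 1.0, 1e-3, 1.0

def potential(x):
    # Two "binding modes": minima near angles 0 and pi at radius 1.
    r = np.hypot(x[0], x[1])
    theta = np.arctan2(x[1], x[0])
    return 10.0 * (r - 1.0) ** 2 + 2.0 * (1.0 - np.cos(2.0 * theta))

def grad(x, eps=1e-6):
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2); e[i] = eps
        g[i] = (potential(x + e) - potential(x - e)) / (2 * eps)
    return g

def ncmc_move(x, total_angle=np.pi, n_steps=200):
    """One NCMC rotation move; returns (new_x, accepted)."""
    x_old, work = x.copy(), 0.0
    dtheta = total_angle / n_steps
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, -s], [s, c]])
    for _ in range(n_steps):
        u_before = potential(x)
        x = R @ x                      # perturbation: small rotation increment
        work += potential(x) - u_before
        # propagation: one Brownian dynamics step, contributes no protocol work
        x = x - dt * grad(x) / gamma + np.sqrt(2 * kT * dt / gamma) * rng.standard_normal(2)
    if rng.random() < np.exp(min(0.0, -work / kT)):
        return x, True
    return x_old, False

x = np.array([1.0, 0.0])               # start in one binding mode
accepted = sum(ncmc_move(x)[1] for _ in range(20))
print(f"accepted {accepted} of 20 toy NCMC rotation moves")
```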
Mapping Ocean Surface Topography with a Synthetic-Aperture Interferometry Radar
NASA Technical Reports Server (NTRS)
Fu, Lee-Lueng; Rodriguez, Ernesto
2006-01-01
We propose to apply the technique of synthetic aperture radar interferometry to the measurement of ocean surface topography at spatial resolution approaching 1 km. The measurement will have wide ranging applications in oceanography, hydrology, and marine geophysics. The oceanographic and related societal applications are briefly discussed in the paper. To meet the requirements for oceanographic applications, the instrument must be flown in an orbit with proper sampling of ocean tides.
Towards Mapping the Ocean Surface Topography at 1 km Resolution
NASA Technical Reports Server (NTRS)
Fu, Lee-Lueng; Rodriguez, Ernesto
2006-01-01
We propose to apply the technique of synthetic aperture radar interferometry to the measurement of ocean surface topography at spatial resolution approaching 1 km. The measurement will have wide ranging applications in oceanography, hydrology, and marine geophysics. The oceanographic and related societal applications are briefly discussed in the paper. To meet the requirements for oceanographic applications, the instrument must be flown in an orbit with proper sampling of ocean tides.
EM Propagation & Atmospheric Effects Assessment
2008-09-30
The split-step Fourier parabolic equation (SSPE) algorithm provides the complex amplitude and phase (group delay) of the continuous wave (CW) signal... the APM is based on the SSPE, we are implementing the more efficient Fourier synthesis technique to determine the transfer function. To this end a... needed in order to sample H(f) via the SSPE, and indeed with the proper parameters chosen, the two pulses can be resolved in the time window shown in
Sampling designs for HIV molecular epidemiology with application to Honduras.
Shepherd, Bryan E; Rossini, Anthony J; Soto, Ramon Jeremias; De Rivera, Ivette Lorenzana; Mullins, James I
2005-11-01
Proper sampling is essential to characterize the molecular epidemiology of human immunodeficiency virus (HIV). HIV sampling frames are difficult to identify, so most studies use convenience samples. We discuss statistically valid and feasible sampling techniques that overcome some of the potential for bias due to convenience sampling and ensure better representation of the study population. We employ a sampling design called stratified cluster sampling. This first divides the population into geographical and/or social strata. Within each stratum, a population of clusters is chosen from groups, locations, or facilities where HIV-positive individuals might be found. Some clusters are randomly selected within strata and individuals are randomly selected within clusters. Variation and cost help determine the number of clusters and the number of individuals within clusters that are to be sampled. We illustrate the approach through a study designed to survey the heterogeneity of subtype B strains in Honduras.
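A minimal sketch of the two-stage design described above follows; the sampling frame, stratum and cluster names, and sizes are all made up for illustration and do not correspond to the Honduras study.

```python
# Hedged sketch of stratified cluster sampling: strata -> random clusters
# within each stratum -> random individuals within each selected cluster.
import random

random.seed(42)
# Hypothetical sampling frame: strata of clusters, each cluster a list of IDs.
frame = {
    "north":   {f"clinic_N{i}": [f"N{i}-{j}" for j in range(50)] for i in range(8)},
    "south":   {f"clinic_S{i}": [f"S{i}-{j}" for j in range(40)] for i in range(6)},
    "capital": {f"site_C{i}": [f"C{i}-{j}" for j in range(80)] for i in range(10)},
}

def stratified_cluster_sample(frame, clusters_per_stratum=3, per_cluster=10):
    sample = []
    for stratum, clusters in frame.items():
        chosen = random.sample(list(clusters), k=min(clusters_per_stratum, len(clusters)))
        for name in chosen:
            members = clusters[name]
            picked = random.sample(members, k=min(per_cluster, len(members)))
            sample.extend((stratum, name, pid) for pid in picked)
    return sample

sample = stratified_cluster_sample(frame)
print(len(sample), "individuals sampled; first three:", sample[:3])
```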
Investigation of digital encoding techniques for television transmission
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1983-01-01
Composite color television signals are sampled at four times the color subcarrier frequency and transformed using intraframe two-dimensional Walsh functions. It is shown that, by properly sampling a composite color signal and employing a Walsh transform, the YIQ time signals which sum to produce the composite color signal can be represented, in the transform domain, by three component signals in space. By suitable zonal quantization of the transform coefficients, the YIQ signals can be processed independently to achieve data compression and obtain the same results as component coding. Computer simulations of three bandwidth compressors operating at 1.09, 1.53 and 1.8 bits/sample are presented. The above results can also be applied to the PAL color system.
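The short sketch below illustrates the general idea of Walsh-transform coding with zonal retention of coefficients on a synthetic 8x8 block; it is not the authors' simulator, and the block size, zone shape, and quantizer are arbitrary choices made for illustration.

```python
# Illustrative sketch of transform coding with a 2D Walsh-Hadamard transform
# and zonal coefficient retention, using a synthetic 8x8 block in place of a
# sampled composite color signal. (scipy's Hadamard matrix is in natural order;
# a production coder would reorder rows to sequency/Walsh order before zoning.)
import numpy as np
from scipy.linalg import hadamard

N = 8
H = hadamard(N).astype(float)

def wht2(block):
    return H @ block @ H / N          # forward 2D transform

def iwht2(coeffs):
    return H @ coeffs @ H / N         # self-inverse with this 1/N scaling

rng = np.random.default_rng(3)
block = rng.integers(0, 256, size=(N, N)).astype(float)

coeffs = wht2(block)
zone = np.add.outer(np.arange(N), np.arange(N)) < 5   # keep a low-index "zone"
compressed = np.where(zone, np.round(coeffs), 0.0)    # crude zonal quantization
reconstructed = iwht2(compressed)

kept = zone.sum()
err = np.sqrt(np.mean((block - reconstructed) ** 2))
print(f"kept {kept}/{N*N} coefficients, RMS error {err:.1f} grey levels")
```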
An improved switching converter model. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Shortt, D. J.
1982-01-01
The nonlinear modeling and analysis of dc-dc converters in the continuous and discontinuous modes was done by averaging and discrete sampling techniques. A model was developed by combining these two techniques. This model, the discrete average model, accurately predicts the envelope of the output voltage and is easy to implement in circuit and state variable forms. The proposed model is shown to be dependent on the type of duty cycle control. The proper selection of the power stage model, between average and discrete average, is largely a function of the error processor in the feedback loop. The accuracy of the measurement data taken by a conventional technique is affected by the conditions under which the data are collected.
Remote sensing for oceanography: Past, present, future
NASA Technical Reports Server (NTRS)
Mcgoldrick, L. F.
1984-01-01
Oceanic dynamics has traditionally been investigated by sampling from instruments in situ, yielding quantitative measurements that are intermittent in both space and time; the ocean is undersampled. The need to obtain proper sampling of the averaged quantities treated in analytical and numerical models is at present the most significant limitation on advances in physical oceanography. Within the past decade, many electromagnetic techniques for the study of the Earth and planets have been applied to the study of the ocean. Now satellites promise nearly total coverage of the world's oceans using only a few days to a few weeks of observations. A review of early and present techniques applied to satellite oceanography and a description of some future systems to be launched into orbit during the remainder of this century are presented. Both scientific and technological capabilities are discussed.
Maternal–Child Microbiome: Specimen Collection, Storage and Implications for Research and Practice
Jordan, Sheila; Baker, Brenda; Dunn, Alexis; Edwards, Sara; Ferranti, Erin; Mutic, Abby D.; Yang, Irene; Rodriguez, Jeannie
2017-01-01
Background: The maternal microbiome is a key contributor to the development and outcomes of pregnancy and the health status of both mother and infant. Significant advances are occurring in the science of the maternal and child microbiome and hold promise in improving outcomes related to pregnancy complications, child development, and chronic health conditions of mother and child. Objectives: The purpose of the paper is to review site-specific considerations in the collection and storage of maternal and child microbiome samples and the implications for nursing research and practice. Approach: Microbiome sampling protocols were reviewed and synthesized. Precautions across sampling protocols were also noted. Results: Oral, vaginal, gut, placental, and breastmilk are viable sources for sampling the maternal and/or child microbiome. Prior to sampling, special considerations need to be addressed related to various factors, including current medications, health status, and hygiene practices. Proper storage of samples will avoid degradation of cellular and DNA structures vital for analysis. Discussion: Changes in the microbiome throughout the perinatal, postpartum and childhood periods are dramatic and significant to outcomes of the pregnancy and the long-term health of mother and child. Proper sampling techniques are required to produce reliable results from which evidence-based practice recommendations will be built. Ethical and practical issues surrounding study design and protocol development must also be considered when researching vulnerable groups such as pregnant women and infants. Nurses hold the responsibility to both perform the research and to translate findings from microbiome investigations for clinical use. PMID:28252577
Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K
2017-12-01
Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of the maxillary third molar, the Joedds technique, is introduced in this study and compared with the conventional technique. One hundred people were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t-test. Analysis of the 100 patients based on the study parameters showed that the novel Joedds technique produced minimal trauma to surrounding tissues, fewer tuberosity and root fractures, and an extraction time of <2 min compared with the other group of patients. This novel technique proved better than the conventional third molar extraction technique, with minimal complications, provided proper case selection and the right technique are used.
Verification of intravenous catheter placement by auscultation--a simple, noninvasive technique.
Lehavi, Amit; Rudich, Utay; Schechtman, Moshe; Katz, Yeshayahu Shai
2014-01-01
Verification of proper placement of an intravenous catheter may not always be simple. We evaluated the auscultation technique for this purpose. Twenty healthy volunteers were randomized to have an 18G catheter inserted intravenously in either the right (12) or left arm (8), and subcutaneously in the opposite arm. A standard stethoscope was placed over an area approximately 3 cm proximal to the tip of the catheter, in the presumed direction of the vein, to grade on a 0-6 scale the murmur heard while rapidly injecting 2 mL of NaCl 0.9% solution. The auscultation was evaluated by a blinded staff anesthesiologist. All 20 intravenous injections were evaluated as flow murmurs and were graded an average 5.65 (±0.98), whereas all 20 subcutaneous injections were evaluated as either crackles or no sound and were graded an average 2.00 (±1.38), without negative results. Sensitivity was calculated as 95%. Specificity and kappa could not be calculated due to an empty false-positive group. Being simple, handy and noninvasive, we recommend using the auscultation technique for verification of the proper placement of an intravenous catheter when uncertain of its position. Data obtained in our limited sample of healthy subjects need to be confirmed in the clinical setting.
Halal authenticity issues in meat and meat products.
Nakyinsige, Khadijah; Man, Yaakob Bin Che; Sazili, Awis Qurni
2012-07-01
In recent years, Muslims have become increasingly concerned about the meat they eat. Proper product description is crucial for consumers to make informed choices and to ensure fair trade, particularly in the ever-growing halal food market. Globally, Muslim consumers are concerned about a number of issues concerning meat and meat products, such as pork substitution, undeclared blood plasma, use of prohibited ingredients, pork intestine casings and non-halal methods of slaughter. Analytical techniques which are appropriate and specific have been developed to deal with particular issues. The most suitable technique for any particular sample is often determined by the nature of the sample itself. This paper sets out to identify what makes meat halal, highlight the halal authenticity issues that occur in meat and meat products and provide an overview of the possible analytical methods for halal authentication of meat and meat products. Copyright © 2012 Elsevier Ltd. All rights reserved.
Evaluation of a New Ensemble Learning Framework for Mass Classification in Mammograms.
Rahmani Seryasat, Omid; Haddadnia, Javad
2018-06-01
Mammography is the most common screening method for diagnosis of breast cancer. In this study, a computer-aided system for diagnosing benign and malignant masses in mammogram images was implemented. In the computer-aided diagnosis system, we first reduce the noise in the mammograms using an effective noise removal technique. After the noise removal, the mass in the region of interest is segmented using a deformable model. After the mass segmentation, a number of features are extracted from it. These features include features of the mass shape and border, tissue properties, and the fractal dimension. After extracting a large number of features, a proper subset must be chosen from among them. In this study, we make use of a new method based on a genetic algorithm for selection of a proper set of features. After determining the proper features, a classifier is trained. To classify the samples, a new architecture for combining the classifiers is proposed. In this architecture, easy and difficult samples are identified and trained using different classifiers. Finally, the proposed mass diagnosis system was tested on the mini-Mammographic Image Analysis Society (mini-MIAS) and Digital Database for Screening Mammography (DDSM) databases. The obtained results indicate that the proposed system can compete with state-of-the-art methods in terms of accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
SSAGES: Software Suite for Advanced General Ensemble Simulations.
Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J
2018-01-28
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques, including adaptive biasing force, string methods, and forward flux sampling, that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
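As a generic illustration of the kind of enhanced sampling such a package provides (this is not the SSAGES API or one of its supported engines), the toy sketch below runs plain metadynamics along a one-dimensional collective variable in a double-well potential and counts barrier crossings with and without the bias; all parameters are invented.

```python
# Generic toy of enhanced sampling: plain metadynamics on a 1D double well with
# an overdamped Langevin integrator. Gaussians deposited along the collective
# variable x flatten the barrier and speed up transitions between wells.
import numpy as np

rng = np.random.default_rng(2)
kT, dt, gamma = 1.0, 5e-3, 1.0
height, width, stride = 0.15, 0.2, 200          # bias deposition parameters
centers = []                                    # positions of deposited Gaussians

def dV(x):                                      # derivative of V(x) = 10*(x^2 - 1)^2
    return 40.0 * x * (x * x - 1.0)

def dbias(x):                                   # derivative of the deposited bias
    if not centers:
        return 0.0
    c = np.asarray(centers)
    return np.sum(-height * (x - c) / width**2 * np.exp(-0.5 * ((x - c) / width) ** 2))

def run(biased, n_steps=100_000):
    centers.clear()
    x, crossings, side = -1.0, 0, -1
    for step in range(n_steps):
        force = -dV(x) - (dbias(x) if biased else 0.0)
        x += dt * force / gamma + np.sqrt(2 * kT * dt / gamma) * rng.standard_normal()
        if biased and step % stride == 0:
            centers.append(x)
        if x * side < 0 and abs(x) > 0.5:       # count well-to-well transitions
            crossings, side = crossings + 1, -side
    return crossings                            # crude diagnostic, not a free energy

print("unbiased transitions:", run(False))
print("metadynamics transitions:", run(True))
```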
Koopman Mode Decomposition Methods in Dynamic Stall: Reduced Order Modeling and Control
2015-11-10
the flow phenomena by separating them into individual modes. The technique of Proper Orthogonal Decomposition (POD), see [Holmes: 1998], is a popular... sampled values h(k), k = 0, ..., 2M-1, of the exponential sum: 1. Solve the following linear system, where... 2. Compute all zeros z_j in D, j = 1, ..., M, of the Prony polynomial, i.e., calculate all eigenvalues of the associated companion matrix, and form f_j = log z_j for j = 1, ..., M, where log is the
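For reference, a compact implementation of the classical Prony steps sketched in the snippet above, applied to synthetic data with assumed exponents (no connection to the report's flow measurements):

```python
# Minimal sketch of the classical Prony procedure: from 2M samples h(k) of an
# exponential sum, solve a linear system for the Prony polynomial, take its
# roots z_j (eigenvalues of the companion matrix), and recover f_j = log z_j.
import numpy as np

M = 3
f_true = np.array([-0.05 + 2.0j, -0.10 + 0.7j, -0.02 - 1.3j])   # assumed exponents
c_true = np.array([1.0, 0.5, 0.8])
k = np.arange(2 * M)
h = (c_true[None, :] * np.exp(np.outer(k, f_true))).sum(axis=1)  # h(k), k = 0..2M-1

# Step 1: linear prediction system  sum_m a_m h(k+m) = -h(k+M),  k = 0..M-1
A = np.array([[h[k0 + m] for m in range(M)] for k0 in range(M)])
b = -h[M:2 * M]
a = np.linalg.solve(A, b)

# Step 2: roots of the Prony polynomial z^M + a_{M-1} z^{M-1} + ... + a_0
z = np.roots(np.concatenate(([1.0], a[::-1])))
f_est = np.log(z)

print("recovered:", np.sort_complex(f_est))
print("true:     ", np.sort_complex(f_true))
```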
Method for Hot Real-Time Sampling of Gasification Products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomeroy, Marc D
The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot-scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as catalyst poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and gas chromatographs with sulfur- and nitrogen-specific detectors can provide real-time analysis, giving operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistries. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products. They include minimizing sampling distance, effective filtering as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.
Wohlmeister, Denise; Vianna, Débora Renz Barreto; Helfer, Virginia Etges; Calil, Luciane Noal; Buffon, Andréia; Fuentefria, Alexandre Meneghello; Corbellini, Valeriano Antonio; Pilger, Diogo André
2017-10-01
Pathogenic Candida species are detected in clinical infections. CHROMagar™ is a phenotypic method used to identify Candida species, although it has limitations, which indicates the need for more sensitive and specific techniques. Fourier-transform infrared (FT-IR) spectroscopy is an analytical vibrational technique used to identify patterns in the metabolic fingerprint of biological matrixes, particularly whole microbial cell systems such as Candida sp., in association with classificatory chemometric algorithms. Soft Independent Modeling of Class Analogy (SIMCA) is one such algorithm, still little employed in microbiological classification. This study demonstrates the applicability of the FT-IR technique by specular reflectance, associated with SIMCA, to discriminate Candida species isolated from vaginal discharges and grown on CHROMagar™. The differences in the spectra of C. albicans, C. glabrata and C. krusei were suitable for use in the discrimination of these species, as observed by PCA. A SIMCA model was then constructed with standard samples of the three species using the spectral region of 1792-1561 cm⁻¹. All samples (n=48) were properly classified based on the chromogenic method using CHROMagar™ Candida. In total, 93.4% (n=45) of the samples were correctly and unambiguously classified (Class I). Two samples of C. albicans were classified correctly, though these could also have been C. glabrata (Class II). Also, one C. glabrata sample could have been classified as C. krusei (Class II). Concerning these three samples, one triplicate of each was included in Class II and two in Class I. Therefore, FT-IR associated with SIMCA can be used to identify samples of C. albicans, C. glabrata, and C. krusei grown on CHROMagar™ Candida, aiming to improve clinical applications of this technique. Copyright © 2017 Elsevier B.V. All rights reserved.
Impact of Oriented Clay Particles on X-Ray Spectroscopy Analysis
NASA Astrophysics Data System (ADS)
Lim, A. J. M. S.; Syazwani, R. N.; Wijeyesekera, D. C.
2016-07-01
Understanding how the mineralogy and microfabric of clayey soils control their engineering properties is very complex, which makes soil characterization difficult. Micromechanics of soils recognizes that the microstructure and mineralogy of clay have a significant influence on its engineering behaviour. To achieve a more reliable quantitative evaluation of clay mineralogy, a proper sample preparation technique for quantitative clay mineral analysis is necessary. This paper presents the quantitative evaluation of elemental analysis and chemical characterization of oriented and randomly oriented clay particles using X-ray spectroscopy. Three different types of clay, namely marine clay, bentonite and kaolin clay, were studied. The oriented samples were prepared by dispersing the clay in water and letting it settle on porous ceramic tiles while applying a relatively weak suction through a vacuum pump. Images from a Scanning Electron Microscope (SEM) were also used to compare the orientation patterns produced by the two sample preparation techniques. From the quantitative analysis of the X-ray spectroscopy, the oriented sampling method showed greater accuracy in identifying mineral deposits, because it produced better peak intensity in the spectrum and more mineral content could be identified compared with randomly oriented samples.
Jeong, Hyunjo; Barnard, Daniel; Cho, Sungjong; Zhang, Shuzeng; Li, Xiongbing
2017-11-01
This paper presents analytical and experimental techniques for accurate determination of the nonlinearity parameter (β) in thick solid samples. When piezoelectric transducers are used for β measurements, receiver calibration is required to determine the transfer function from which the absolute displacement can be calculated. The measured fundamental and second harmonic displacement amplitudes should be modified to account for beam diffraction and material absorption. All these issues are addressed in this study and the proposed technique is validated through β measurements of thick solid samples. A simplified self-reciprocity calibration procedure for a broadband receiver is described. The diffraction and attenuation corrections for the fundamental and second harmonics are explicitly derived. Aluminum alloy samples in five different thicknesses (4, 6, 8, 10, and 12 cm) are prepared and β measurements are made using the finite amplitude, through-transmission method. The effects of diffraction and attenuation corrections on β measurements are systematically investigated. When the diffraction and attenuation corrections are all properly made, the variation of β between samples of different thickness is found to be less than 3.2%. Copyright © 2017 Elsevier B.V. All rights reserved.
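For orientation, the sketch below evaluates the familiar plane-wave estimate β = 8A₂/(k²zA₁²) with placeholder multiplicative diffraction and attenuation correction factors; the actual correction integrals derived in the paper are not reproduced here, and all numerical values are invented examples.

```python
# Hedged sketch of the plane-wave estimate of the acoustic nonlinearity
# parameter, beta = 8*A2 / (k^2 * z * A1^2), with placeholder corrections.
import math

f1 = 5.0e6            # fundamental frequency [Hz] (assumed)
c = 6300.0            # longitudinal wave speed in aluminium [m/s] (approximate)
z = 0.08              # propagation distance / sample thickness [m] (example)
A1 = 10.0e-9          # fundamental displacement amplitude [m] (example)
A2 = 0.05e-9          # second-harmonic displacement amplitude [m] (example)

D_corr = 1.0          # diffraction correction factor (placeholder)
alpha_corr = 1.0      # attenuation correction factor (placeholder)

k = 2.0 * math.pi * f1 / c
beta = 8.0 * A2 / (k**2 * z * A1**2) * D_corr * alpha_corr
print(f"beta ~ {beta:.2f}")
```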
Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten
2018-01-01
Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques such as parametric and non-parametric approaches in PLS multi-group analysis only allow assessment of differences between parameters that are estimated for different subpopulations, the study at hand introduces a technique that also allows assessment of whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer to a reduced version of the well-established technology acceptance model.
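The simplified sketch below conveys the general logic, using ordinary least-squares path coefficients and synthetic data in place of PLS-SEM estimates: bootstrap the difference between two parameters estimated from the same sample and inspect the percentile confidence interval. It is an illustration of the idea, not the authors' procedure.

```python
# Illustrative (non-PLS) sketch: bootstrap the difference between two
# coefficients obtained from the same sample and check whether the percentile
# confidence interval covers zero. Synthetic data and plain OLS throughout.
import numpy as np

rng = np.random.default_rng(7)
n = 300
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 0.5 * x1 + 0.3 * x2 + rng.standard_normal(n)
data = np.column_stack([x1, x2, y])

def coef_difference(d):
    X = np.column_stack([np.ones(len(d)), d[:, 0], d[:, 1]])
    b = np.linalg.lstsq(X, d[:, 2], rcond=None)[0]
    return b[1] - b[2]                    # difference of the two path estimates

boot = np.array([coef_difference(data[rng.integers(0, n, n)])
                 for _ in range(2000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimate {coef_difference(data):.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```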
NASA Technical Reports Server (NTRS)
Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.
1983-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. An important correlation was obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements gave statistically significant information compared with other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using the quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.
NASA Technical Reports Server (NTRS)
Whiteman, David N.; Venable, Demetrius; Landulfo, Eduardo
2012-01-01
In a recent publication, LeBlanc and McDermid proposed a hybrid calibration technique for Raman water vapor lidar involving a tungsten lamp and radiosondes. Measurements made with the lidar telescope viewing the calibration lamp were used to stabilize the lidar calibration determined by comparison with radiosondes. The technique provided a significantly more stable calibration constant than radiosondes used alone. It involves the use of a calibration lamp in a fixed position in front of the lidar receiver aperture. We examine this configuration and find that it likely does not properly sample the full lidar system optical efficiency. While the technique is a useful addition to the use of radiosondes alone for lidar calibration, it is important to understand the scenarios under which it will not provide an accurate quantification of system optical efficiency changes. We offer examples of these scenarios.
Design Considerations of a Compounded Sterile Preparations Course
Petraglia, Christine; Mattison, Melissa J.
2016-01-01
Objective. To design a comprehensive learning and assessment environment for the practical application of compounded sterile preparations using a constructivist approach. Design. Compounded Sterile Preparations Laboratory is a required 1-credit course that builds upon the themes of training aseptic technique typically used in health system settings and threads application of concepts from other courses in the curriculum. Students used critical-thinking skills to devise appropriate strategies to compound sterile preparations. Assessment. Aseptic technique skills were assessed with objective, structured, checklist-based rubrics. Most students successfully completed practical assessments using appropriate technique (mean assessment grade=83.2%). Almost all students passed the practical media fill (98%) and gloved fingertip sampling (86%) tests on the first attempt; all passed on the second attempt. Conclusion. Employing a constructivist scaffold approach to teaching proper hygiene and aseptic technique prepared students to pass media fill and gloved fingertip tests and to perform well on practical compounding assessments. PMID:26941438
Effectiveness of Various Methods of Teaching Proper Inhaler Technique.
Axtell, Samantha; Haines, Seena; Fairclough, Jamie
2017-04-01
Objective: To compare the effectiveness of 4 different instructional interventions in training proper inhaler technique. Design: Randomized, noncrossover trial. Setting: Health fair and indigent clinic. Participants: Inhaler-naive adult volunteers who spoke and read English. Interventions: Subjects were assigned to complete the following: (1) read a metered dose inhaler (MDI) package insert pamphlet, (2) watch a Centers for Disease Control and Prevention (CDC) video demonstrating MDI technique, (3) watch a YouTube video demonstrating MDI technique, or (4) receive direct instruction of MDI technique from a pharmacist. Main outcome measure: Inhaler use competency (completion of all 7 prespecified critical steps). Results: Of the 72 subjects, 21 (29.2%) demonstrated competent inhaler technique. A statistically significant difference between pharmacist direct instruction and the remaining interventions, both combined (P < .0001) and individually (P ≤ .03), was evident. No statistically significant difference was detected among the remaining 3 intervention groups. The critical steps most frequently omitted or improperly performed were exhaling before inhalation and holding the breath after inhalation. Conclusion: A 2-minute pharmacist counseling session is more effective than the other interventions in successfully educating patients on proper inhaler technique. Pharmacists can play a pivotal role in reducing the implications of improper inhaler use.
Estimating propagation velocity through a surface acoustic wave sensor
Xu, Wenyuan; Huizinga, John S.
2010-03-16
Techniques are described for estimating the propagation velocity through a surface acoustic wave sensor. In particular, techniques which measure and exploit a proper segment of phase frequency response of the surface acoustic wave sensor are described for use as a basis of bacterial detection by the sensor. As described, use of velocity estimation based on a proper segment of phase frequency response has advantages over conventional techniques that use phase shift as the basis for detection.
Zhu, Liang; Schade, Gunnar Wolfgang; Nielsen, Claus Jørgen
2013-12-17
We demonstrate the capabilities and properties of Proton Transfer Reaction time-of-flight mass spectrometry (PTR-ToF-MS) for real-time monitoring of gaseous emissions from industrial-scale amine-based carbon capture processes. The benchmark monoethanolamine (MEA) was used as an example of the amines that need to be monitored at carbon capture facilities, and to describe how the measurements may be influenced by potentially interfering species in CO2 absorber stack discharges. On the basis of known or expected emission compositions, we investigated the PTR-ToF-MS MEA response as a function of sample flow humidity, ammonia, and CO2 abundances, and show that all can exhibit interferences, thus making accurate amine measurements difficult. This warrants proper sample pretreatment, and we show an example using dilution with bottled zero air of 1:20 to 1:10 to monitor stack gas concentrations at the CO2 Technology Center Mongstad (TCM), Norway. Observed emissions included many expected chemical species, dominated by ammonia and acetaldehyde, but also two new species previously not reported but emitted in significant quantities. With respect to concerns regarding amine emissions, we show that accurate amine quantification in the presence of water vapor, ammonia, and CO2 becomes feasible after proper sample dilution, thus making PTR-ToF-MS a viable technique for monitoring future carbon capture facility emissions without conventional laborious sample pretreatment.
O'Hara, R P; Palazotto, A N
2012-12-01
To properly model the structural dynamics of the forewing of the Manduca sexta species, it is critical that the material and structural properties of the biological specimen be understood. This paper presents the results of a morphological study that has been conducted to identify the material and structural properties of a sample of male and female Manduca sexta specimens. The average mass, area, shape, size and camber of the wing were evaluated using novel measurement techniques. Further emphasis is placed on studying the critical substructures of the wing: venation and membrane. The venation cross section is measured using detailed pathological techniques over the entire venation of the wing. The elastic modulus of the leading edge veins is experimentally determined using advanced non-contact structural dynamic techniques. The membrane elastic modulus is randomly sampled over the entire wing to determine global material properties for the membrane using nanoindentation. The data gathered from this morphological study form the basis for the replication of future finite element structural models and engineered biomimetic wings for use with flapping wing micro air vehicles.
Assessment of probability of detection of delaminations in fiber-reinforced composites
NASA Technical Reports Server (NTRS)
Chern, E. J.; Chu, H. P.; Yang, J. N.
1991-01-01
Delamination is one of the critical defects in composite materials and structures. An ultrasonic C-scan imaging technique, which maps out acoustic impedance-mismatched areas with respect to the sample coordinates, is particularly well suited for detecting and characterizing delaminations in composites. To properly interpret the results, it is necessary to correlate the indications with the detection limits and probability of detection (POD) of the ultrasonic C-scan imaging technique. Baseline information on the assessment of the POD of delaminations in composite materials and structures is very beneficial to the evaluation of spacecraft materials. In this study, we review the principle of POD, describe the laboratory set-up and procedure, and present the experimental results as well as an assessment of the POD of delaminations in fiber-reinforced composite panels using ultrasonic C-scan techniques.
Proper projective symmetry in LRS Bianchi type V spacetimes
NASA Astrophysics Data System (ADS)
Shabbir, Ghulam; Mahomed, K. S.; Mahomed, F. M.; Moitsheki, R. J.
2018-04-01
In this paper, we investigate proper projective vector fields of locally rotationally symmetric (LRS) Bianchi type V spacetimes using direct integration and algebraic techniques. Despite the non-degeneracy of the Riemann tensor eigenvalues, we classify proper Bianchi type V spacetimes and show that the above spacetimes do not admit proper projective vector fields; in all cases the projective vector fields are Killing vector fields.
NASA Astrophysics Data System (ADS)
Poirier, Marc; Gagnon, Martin; Tahan, Antoine; Coutu, André; Chamberland-lauzon, Joël
2017-01-01
In this paper, we present the application of cyclostationary modelling for the extrapolation of short stationary load strain samples measured in situ on hydraulic turbine blades. Long periods of measurement allow a wide range of fluctuations representative of long-term reality to be considered. However, sampling over short periods limits the dynamic strain fluctuations available for analysis. The purpose of the technique presented here is therefore to generate a representative signal containing the proper long-term characteristics and expected spectrum, starting from a much shorter signal period. The final objective is to obtain a strain history that can be used to estimate the long-term fatigue behaviour of hydroelectric turbine runners.
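As a simplified stand-in for this idea (not the authors' cyclostationary model), the sketch below extends a short synthetic "strain" record by building a longer random-phase surrogate whose amplitude spectrum is interpolated from the measured one; only the spectral part of the problem is illustrated, and all signal parameters are invented.

```python
# Hedged sketch: generate a longer surrogate signal with approximately the same
# power spectral density as a short measured record, via random-phase synthesis.
import numpy as np

rng = np.random.default_rng(4)
fs, t_short = 100.0, 60.0                        # 60 s of "measured" strain at 100 Hz
n_short = int(fs * t_short)
t = np.arange(n_short) / fs
measured = (50 * np.sin(2 * np.pi * 0.8 * t)     # synthetic stand-in for strain data
            + 15 * np.sin(2 * np.pi * 7.0 * t)
            + 5 * rng.standard_normal(n_short))

def surrogate(signal, n_out, rng):
    """Longer signal with an (approximately) matched amplitude spectrum."""
    amp = np.abs(np.fft.rfft(signal))
    f_short = np.fft.rfftfreq(len(signal), 1 / fs)
    f_long = np.fft.rfftfreq(n_out, 1 / fs)
    # Interpolate onto the denser frequency grid, rescale for the new length,
    # attach random phases and invert back to the time domain.
    amp_long = np.interp(f_long, f_short, amp) * np.sqrt(n_out / len(signal))
    phases = np.exp(2j * np.pi * rng.random(len(f_long)))
    return np.fft.irfft(amp_long * phases, n=n_out)

extended = surrogate(measured, n_out=int(fs * 600.0), rng=rng)   # 10 minutes
print(f"measured rms {measured.std():.1f}, surrogate rms {extended.std():.1f}")
```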
Yang, Yi-Feng
2014-02-01
This paper discusses the effects of transformational leadership on cooperative conflict resolution (management) by evaluating several alternative models related to the mediating roles of job satisfaction and change commitment. Samples of data from customer service personnel in Taiwan were analyzed. Based on the bootstrap sampling technique, an empirical study was carried out to yield the best-fitting model. The procedure of hierarchical nested model analysis was used, incorporating the methods of bootstrapping mediation, PRODCLIN2, and structural equation modeling (SEM) comparison. The analysis suggests that leadership that promotes integration (change commitment) and provides inspiration and motivation (job satisfaction), in the proper order, creates the means for cooperative conflict resolution.
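A minimal illustration of the bootstrapped mediation step (the indirect effect a×b) follows, on synthetic data; the variable names only echo the constructs above, and plain OLS stands in for the paper's SEM models and survey measures.

```python
# Hedged sketch of bootstrapping an indirect (mediated) effect a*b with a
# percentile confidence interval. Synthetic data and ordinary regression only.
import numpy as np

rng = np.random.default_rng(11)
n = 250
leadership = rng.standard_normal(n)
satisfaction = 0.6 * leadership + rng.standard_normal(n)          # mediator
resolution = 0.5 * satisfaction + 0.1 * leadership + rng.standard_normal(n)

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                      # x -> mediator
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]     # mediator -> y, controlling for x
    return a * b

boot = np.array([indirect_effect(leadership[i], satisfaction[i], resolution[i])
                 for i in (rng.integers(0, n, n) for _ in range(2000))])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect {indirect_effect(leadership, satisfaction, resolution):.3f}, "
      f"95% CI [{lo:.3f}, {hi:.3f}]")
```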
Concentration of gold in natural waters
McHugh, J.B.
1988-01-01
The purpose of this paper is to investigate the amount of gold present in natural waters. One hundred and thirty-two natural water samples were collected from various sources and analyzed for gold by the latest techniques. Background values for gold in natural waters range from <0.001 to 0.005 ppb, and anomalous values range from 0.010 to 2.8 ppb. Waters collected from mineralized areas have a mean gold value of 0.101 ppb, whereas waters collected from unmineralized areas have a mean of 0.002 ppb. Some of the high gold values reported in the earlier literature were probably due to interferences from high salt content in the sample and/or lack of proper filtration procedures. © 1988.
ERIC Educational Resources Information Center
Levesque, Luc
2014-01-01
Inaccurate measurements occur regularly in data acquisition as a result of improper sampling times. An understanding of proper sampling times when collecting data with an analogue-to-digital converter or video camera is crucial in order to avoid anomalies. A proper choice of sampling times should be based on the Nyquist sampling theorem. If the…
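The point about the Nyquist sampling theorem can be demonstrated in a few lines; the frequencies below are arbitrary illustration values, not taken from the article.

```python
# Small sketch: sampling a sinusoid below the Nyquist rate makes it
# indistinguishable from a lower-frequency alias.
import numpy as np

f_signal = 9.0                     # Hz
for f_sample in (25.0, 12.0):      # above and below the Nyquist rate (18 Hz)
    t = np.arange(0, 2.0, 1.0 / f_sample)
    x = np.sin(2 * np.pi * f_signal * t)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / f_sample)
    peak = freqs[np.argmax(spectrum)]
    print(f"fs = {f_sample:4.1f} Hz -> apparent frequency {peak:.1f} Hz")
```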
Participation in "Handwashing University" Promotes Proper Handwashing Techniques for Youth
ERIC Educational Resources Information Center
Fenton, Ginger; Radhakrishna, Rama; Cutter, Catherine Nettles
2010-01-01
A study was conducted to assess the effectiveness of the Handwashing University on teaching youth the benefits of proper handwashing. The Handwashing University is an interactive display with several successive stations through which participants move to learn necessary skills for proper handwashing. Upon completion of the Handwashing University,…
Power system frequency estimation based on an orthogonal decomposition method
NASA Astrophysics Data System (ADS)
Lee, Chih-Hung; Tsai, Men-Shen
2018-06-01
In recent years, several techniques have been proposed to estimate frequency variations in power systems. In order to properly identify power quality issues in asynchronously sampled signals that are contaminated with noise, flicker, and harmonic and inter-harmonic components, a good frequency estimator is needed that can estimate both the frequency and the rate of frequency change precisely. However, accurately estimating the fundamental frequency becomes a very difficult task without a priori information about the sampling frequency. In this paper, a better frequency evaluation scheme for power systems is proposed. This method employs a reconstruction technique in combination with orthogonal filters, which may maintain the required frequency characteristics of the orthogonal filters and improve the overall efficiency of power system monitoring through two-stage sliding discrete Fourier transforms. The results showed that this method can accurately estimate the power system frequency under different conditions, including asynchronously sampled signals contaminated by noise, flicker, and harmonic and inter-harmonic components. The proposed approach also provides high computational efficiency.
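As a much-reduced illustration of DFT-based frequency estimation (a generic textbook estimator, not the authors' two-stage reconstruction scheme), the sketch below infers the frequency from the phase advance of the dominant DFT bin between two frames offset by one sample; the signal parameters are invented.

```python
# Sketch: estimate a sinusoid's frequency from the phase increment of the
# dominant DFT bin between two windowed frames shifted by one sample.
import numpy as np

fs, f_true, N = 1000.0, 50.3, 512
rng = np.random.default_rng(5)
n = np.arange(N + 1)
x = np.sin(2 * np.pi * f_true * n / fs) + 0.05 * rng.standard_normal(N + 1)

w = np.hanning(N)                    # window to suppress spectral leakage
X0 = np.fft.rfft(w * x[:N])          # frame starting at sample 0
X1 = np.fft.rfft(w * x[1:N + 1])     # frame starting at sample 1
k = np.argmax(np.abs(X0))            # dominant bin
dphi = np.angle(X1[k] / X0[k])       # phase advance over one sample
f_est = dphi * fs / (2 * np.pi)
print(f"true {f_true:.2f} Hz, estimated {f_est:.2f} Hz")
```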
Türker-Kaya, Sevgi; Huck, Christian W
2017-01-20
Plant cells, tissues and organs are composed of various biomolecules arranged as structurally diverse units, which represent heterogeneity at the microscopic level. Molecular knowledge of those constituents and their localization within such complexity is crucial for both basic and applied plant sciences. In this context, infrared imaging techniques have advantages over conventional methods for investigating heterogeneous plant structures, providing quantitative and qualitative analyses together with the spatial distribution of the components. Thus, particularly with the use of proper analytical approaches and sampling methods, these technologies offer significant information for studies on plant classification, physiology, ecology, genetics, pathology and other related disciplines. This review aims to present a general perspective on near-infrared and mid-infrared imaging/microspectroscopy in plant research. It compares the potential of these methodologies along with their advantages and limitations. With regard to the organization of the document, the first section introduces the respective underlying principles, followed by instrumentation, sampling techniques, sample preparation, measurement, and an overview of spectral pre-processing and multivariate analysis. The last section reviews selected applications in the literature.
A Simple Configuration for Quantitative Phase Contrast Microscopy of Transmissible Samples
NASA Astrophysics Data System (ADS)
Sengupta, Chandan; Dasgupta, Koustav; Bhattacharya, K.
Phase microscopy attempts to visualize and quantify the phase distribution of samples which are otherwise invisible under the microscope without the use of stains. The two principal approaches to phase microscopy are essentially those of Fourier plane modulation and interferometric techniques. Although the former, first proposed by Zernike, had been the harbinger of phase microscopy, it was the latter that allowed for quantitative evaluation of phase samples. However, interferometric techniques are fraught with associated problems such as a complicated setup involving mirrors and beam-splitters, the need for a matched objective in the reference arm and also the need for vibration isolation. The present work proposes a single-element cube beam-splitter (CBS) interferometer combined with a microscope objective (MO) for interference microscopy. Because of the monolithic nature of the interferometer, the system is almost insensitive to vibrations and relatively simple to align. It will be shown that phase-shifting properties may also be introduced by suitable and proper use of polarizing devices. Initial results showing the quantitative three-dimensional phase profiles of simulated and actual biological specimens are presented.
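For context, the sketch below shows how a four-step phase-shifting sequence recovers a quantitative phase map from intensity frames; the frames are simulated, and the conventional shifts 0, π/2, π, 3π/2 are assumed rather than taken from the CBS configuration described above.

```python
# Sketch of four-step phase-shifting interferometry: with shifts 0, pi/2, pi,
# 3pi/2 the standard formula is phi = atan2(I4 - I2, I1 - I3). Simulated frames.
import numpy as np

ny, nx = 128, 128
yy, xx = np.mgrid[0:ny, 0:nx]
phi_true = 2.0 * np.exp(-(((xx - 64) ** 2 + (yy - 64) ** 2) / (2 * 20.0 ** 2)))  # a "cell"

A, B = 1.0, 0.8                                        # background and modulation
shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
frames = [A + B * np.cos(phi_true + d) for d in shifts]

I1, I2, I3, I4 = frames
phi_wrapped = np.arctan2(I4 - I2, I1 - I3)             # recovered (wrapped) phase
err = np.max(np.abs(phi_wrapped - phi_true))           # phi_true < pi, so no unwrapping needed
print(f"max reconstruction error: {err:.2e} rad")
```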
Space Interferometry Mission: Dynamical Observations of Galaxies (SIMDOG)
NASA Technical Reports Server (NTRS)
Shaya, Edward J.; Borne, Kirk D.; Nusser, Adi; Peebles, P. J. E.; Tonry, John; Tully, Brent R.; Vogel, Stuart; Zaritsky, Dennis
2004-01-01
The Space Interferometry Mission (SIM) will be used to obtain proper motions for a sample of 27 galaxies; these are the first proper motion measurements of galaxies beyond the satellite system of the Milky Way. SIM measurements lead to knowledge of the full 6-dimensional position and velocity vector of each galaxy. In conjunction with new gravitational flow models, the result will be the first total mass measurements of individual galaxies. The project includes development of powerful theoretical methods for orbital calculations. This SIM study will lead to vastly improved determinations of individual galaxy masses, halo sizes, and the fractional contribution of dark matter. Astronomers have struggled to calculate the orbits of galaxies with only position and redshift information. Traditional N-body techniques are unsuitable for an analysis backward in time from a present distribution if any components of velocity or position are not very precisely known.
Nanoparticles and capillary electrophoresis: A marriage with environmental impact.
Mebert, Andrea Mathilde; Tuttolomondo, Maria Victoria; Echazú, Maria Inés Alvarez; Foglia, Maria Lucia; Alvarez, Gisela Solange; Vescina, María Cristina; Santo-Orihuela, Pablo Luis; Desimone, Martín Federico
2016-08-01
The impact of nanomaterials on the environment and human health is a cause of great concern, and even though intensive studies are currently being carried out, there is still a lot to elucidate. The development of validated methods for the characterization and quantification of nanomaterials and their impact on the environment should be encouraged to achieve a proper, safe, and sustainable use of nanoparticles (NPs). Recently, CE emerged as a well-adapted technique for the analysis of environmental samples. This review presents the application of NPs together with CE systems for the analysis of environmental pollutants, as well as the application of CE techniques for the analysis of various types of NPs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
RED RUNAWAYS II: LOW-MASS HILLS STARS IN SDSS STRIPE 82
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yanqiong; Smith, Martin C.; Carlin, Jeffrey L., E-mail: zhangyq@shao.ac.cn, E-mail: msmith@shao.ac.cn
Stars ejected from the Galactic Center can be used to place important constraints on the Milky Way potential. Since existing hypervelocity stars are too distant to accurately determine orbits, we have conducted a search for nearby candidates using full three-dimensional velocities. Since the efficacy of such studies is often hampered by deficiencies in proper motion catalogs, we have chosen to utilize the reliable, high-precision Sloan Digital Sky Survey (SDSS) Stripe 82 proper motion catalog. Although we do not find any candidates with velocities in excess of the escape speed, we identify 226 stars on orbits that are consistent with Galactic Center ejection. This number is significantly larger than what we would expect for halo stars on radial orbits and cannot be explained by disk or bulge contamination. If we restrict ourselves to metal-rich stars, we find 29 candidates with [Fe/H] > −0.8 dex and 10 with [Fe/H] > −0.6 dex. Their metallicities are more consistent with what we expect for bulge ejecta, and so we believe these candidates are especially deserving of further study. We have supplemented this sample using our own radial velocities, developing an algorithm to use proper motions for optimizing candidate selection. This technique provides considerable improvement on the blind spectroscopic sample of SDSS, being able to identify candidates with an efficiency around 20 times better than a blind search.
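For orientation, the standard conversion used when turning catalog proper motions into space velocities is sketched below with placeholder numbers (not values from the Stripe 82 sample).

```python
# Sketch of the textbook conversion: tangential velocity [km/s] =
# 4.74 * proper motion [arcsec/yr] * distance [pc], combined with the radial
# velocity to give the total space motion. All inputs are example values.
import math

mu_ra, mu_dec = 0.030, -0.020   # proper motion components [arcsec/yr] (example)
distance_pc = 2500.0            # heliocentric distance [pc] (example)
v_radial = 180.0                # radial velocity [km/s] (example)

mu = math.hypot(mu_ra, mu_dec)
v_tan = 4.74 * mu * distance_pc
v_total = math.hypot(v_tan, v_radial)
print(f"v_tan = {v_tan:.0f} km/s, total space velocity = {v_total:.0f} km/s")
```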
Bowie, Dennis M.
1991-01-01
The difficult asthmatic patient should first be managed by confirming the diagnosis and eliminating any aggravating environmental or occupational factors, including medication use. Proper treatment requires rational addition of drugs in a logical sequence. It is most important to ensure proper inhaler technique, patient compliance, effective doctor-patient communication, and proper patient monitoring. PMID:21229079
ERIC Educational Resources Information Center
Swiggart, William H.; Ghulyan, Marine V.; Dewey, Charlene M.
2012-01-01
Controlled prescription drug (CPD) abuse is an increasing threat to patient safety, and health care providers (HCPs) are not adequately prepared nor do they routinely employ proper screening techniques. Using standardized patients (SPs) as an instructional strategy, physicians were trained on proper prescribing practices and SBIRT (Screening, Brief…
Analysis of the 148Gd and 154Dy Content in Proton-Irradiated Lead Targets.
Talip, Z; Pfister, S; Dressler, R; David, J C; Vögele, A; Vontobel, P; Michel, R; Schumann, D
2017-06-20
This work presents the determination of the 148Gd and 154Dy content in Pb targets irradiated by 220-2600 MeV protons. It includes the chemical separation of lanthanides, followed by the preparation of proper samples by the molecular plating technique for α-spectrometry measurements. The experimental cross section results were compared with theoretical predictions calculated with the INCL++-ABLA07 code. The comparisons showed satisfactory agreement for 148Gd (within a factor of two), while the measured 154Dy cross sections are higher than the theoretical values.
Method for Hot Real-Time Analysis of Pyrolysis Vapors at Pilot Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomeroy, Marc D
Pyrolysis oils contain more than 400 compounds, up to 60% of which do not re-volatilize for subsequent chemical analysis. Vapor chemical composition is also complicated as additional condensation reactions occur during quenching and collection of the product. Due to the complexity of the pyrolysis oil, and a desire to catalytically upgrade the vapor composition before condensation, online real-time analytical techniques such as Molecular Beam Mass Spectrometry (MBMS) are of great use. However, in order to properly sample hot pyrolysis vapors at the pilot scale, many challenges must be overcome.
Fourier Plane Image Combination by Feathering
NASA Astrophysics Data System (ADS)
Cotton, W. D.
2017-09-01
Astronomical objects frequently exhibit structure over a wide range of scales whereas many telescopes, especially interferometer arrays, only sample a limited range of spatial scales. To properly image these objects, images from a set of instruments covering the range of scales may be needed. These images then must be combined in a manner to recover all spatial scales. This paper describes the feathering technique for image combination in the Fourier transform plane. Implementations in several packages are discussed and example combinations of single dish and interferometric observations of both simulated and celestial radio emission are given.
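A toy numerical demonstration of the feathering idea on synthetic images follows; the taper widths, the handling of beams, and the flux-scale details of real packages are deliberately simplified, and the images are invented rather than observed.

```python
# Toy demonstration of Fourier-plane feathering: a simulated single-dish image
# retains only low spatial frequencies, a simulated interferometer image is
# missing the shortest spacings, and the two are blended in the Fourier plane
# using the single-dish "beam" response as the weighting function.
import numpy as np

n = 256
yy, xx = np.mgrid[0:n, 0:n] - n // 2
sky = np.exp(-(xx**2 + yy**2) / (2 * 40.0**2))          # extended emission
sky[n // 2 + 20, n // 2 - 15] += 5.0                    # compact source

def lowpass(shape, sigma_pix):
    """Fourier-plane response of a Gaussian beam of width sigma_pix."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    return np.exp(-2 * (np.pi * sigma_pix) ** 2 * (fx**2 + fy**2))

W_sd = lowpass(sky.shape, 15.0)                          # single-dish beam response
W_hole = lowpass(sky.shape, 40.0)                        # interferometer short-spacing hole
F_sky = np.fft.fft2(sky)
single_dish = np.fft.ifft2(F_sky * W_sd).real            # low-pass view of the sky
interferometer = np.fft.ifft2(F_sky * (1 - W_hole)).real # misses extended structure

# Feather: interferometer data weighted down where the single dish is reliable.
F_comb = np.fft.fft2(interferometer) * (1 - W_sd) + np.fft.fft2(single_dish)
combined = np.fft.ifft2(F_comb).real

for name, img in [("single dish", single_dish),
                  ("interferometer", interferometer),
                  ("feathered", combined)]:
    print(f"{name:14s} rms error: {np.sqrt(np.mean((img - sky) ** 2)):.4f}")
```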
An expert support system for breast cancer diagnosis using color wavelet features.
Issac Niwas, S; Palanisamy, P; Chibbar, Rajni; Zhang, W J
2012-10-01
Breast cancer diagnosis can be performed through pathologic assessment of breast tissue samples obtained, for example, by the core needle biopsy technique. The pathologist's analysis of such samples is crucial for the breast cancer patient. In this paper, the nuclei of tissue samples are investigated after decomposition by means of the Log-Gabor wavelet in the HSV color domain, and an algorithm is developed to compute the color wavelet features. These features are used for breast cancer diagnosis with a Support Vector Machine (SVM) classifier. The ability of a properly trained SVM to correctly classify patterns makes it particularly suitable for use in an expert system that aids in the diagnosis of cancer tissue samples. The results are compared with other multivariate classifiers such as the naïve Bayes classifier and an Artificial Neural Network. The overall accuracy of the proposed SVM-based method makes it useful for automation in cancer diagnosis.
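As a generic illustration of the classification step (not the authors' code), the sketch below trains an RBF-kernel SVM on precomputed feature vectors; the feature matrix, labels and dimensions are placeholders standing in for the Log-Gabor color wavelet features described above.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows are nuclei, columns are color wavelet features (hypothetical).
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 24))          # 200 samples, 24 wavelet features
y = rng.integers(0, 2, size=200)        # 0 = benign, 1 = malignant (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Scale the features, then fit the SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```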
Simultaneous extraction of proteins and metabolites from cells in culture
Sapcariu, Sean C.; Kanashova, Tamara; Weindl, Daniel; Ghelfi, Jenny; Dittmar, Gunnar; Hiller, Karsten
2014-01-01
Proper sample preparation is an integral part of all omics approaches, and can drastically impact the results of a wide number of analyses. As metabolomics and proteomics research approaches often yield complementary information, it is desirable to have a sample preparation procedure which can yield information for both types of analyses from the same cell population. This protocol explains a method for the separation and isolation of metabolites and proteins from the same biological sample, for simultaneous downstream use in metabolomics and proteomics analyses. In this way, two different levels of biological regulation can be studied in a single sample, minimizing the variance that would result from multiple experiments. This protocol can be used with both adherent and suspension cell cultures, and the extraction of metabolites from cellular medium is also detailed, so that cellular uptake and secretion of metabolites can be quantified. Advantages of this technique include: (1) it is inexpensive and quick to perform and does not require any kits; (2) it can be used on any cells in culture, including cell lines and primary cells extracted from living organisms; (3) a wide variety of analysis techniques can be applied, adding value to the metabolomics data obtained from a sample, which is of high value in experimental systems biology. PMID:26150938
Gas chromatographic concepts for the analysis of planetary atmospheres
NASA Technical Reports Server (NTRS)
Valentin, J. R.; Cullers, D. K.; Hall, K. W.; Krekorian, R. L.; Phillips, J. B.
1991-01-01
Over the last few years, new gas chromatographic (GC) concepts were developed for use on board spacecraft or any other restricted environments for determining the chemical composition of the atmosphere and surface material of various planetary bodies. Future NASA Missions include an entry probe that will be sent to Titan and various spacecraft that will land on Mars. In order to be able to properly respond to the mission science requirements and physical restrictions imposed on the instruments by these missions, GC analytical techniques are being developed. Some of these techniques include hardware and mathematical techniques that will improve GC sensitivity and increase the sampling rate of a GC descending through a planetary atmosphere. The technique of Multiplex Gas Chromatography (MGC) is an example of a technique that was studied in a simulated Titan atmosphere. In such an environment, the atmospheric pressure at instrument deployment is estimated to be a few torr. Thus, at such pressures, the small amount of sample that is acquired might not be enough to satisfy the detection requirements of the gas chromatograph. In MGC, many samples are pseudo-randomly introduced to the chromatograph without regard to elution of preceding components. The resulting data is then reduced using mathematical techniques such as cross-correlation of Fourier Transforms. Advantages realized from this technique include: improvement in detection limits of several orders of magnitude and increase in the number of analyses that can be conducted in a given period of time. Results proving the application of MGC at very low pressures emulating the same atmospheric pressures that a Titan Probe will encounter when the instruments are deployed are presented. The sample used contained hydrocarbons that are expected to be found in Titan's atmosphere. In addition, a new selective modulator was developed to monitor water under Martian atmospheric conditions. Since this modulator is selective only to water, the need for a GC column is eliminated. This results in further simplification of the instrument.
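A minimal sketch of the correlation-recovery idea behind multiplex chromatography, under simplifying assumptions (a pseudo-random binary injection sequence, a linear detector response, a circular convolution model, and a synthetic single-peak chromatogram); it is not the flight instrument's data-reduction code.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 512

# Pseudo-random binary injection sequence (1 = inject a sample at this time step).
s = (rng.random(n) < 0.5).astype(float)

# Synthetic single-injection chromatogram: one Gaussian peak at time step 40.
t = np.arange(n)
h = np.exp(-0.5 * ((t - 40) / 6.0) ** 2)

# Detector output: superposition of overlapping responses plus noise (circular model).
d = np.real(np.fft.ifft(np.fft.fft(s) * np.fft.fft(h))) + 0.05 * rng.normal(size=n)

# Recover the single-injection response by circular cross-correlation with the
# injection sequence, normalised by the number of injections; subtract the flat
# background left by the imperfect autocorrelation of a random 0/1 sequence.
recovered = np.real(np.fft.ifft(np.fft.fft(d) * np.conj(np.fft.fft(s)))) / s.sum()
recovered -= np.median(recovered)

print("peak position, true vs recovered:", h.argmax(), recovered.argmax())
```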
The effect of environmental performance and accounting characteristics to earnings informativeness
NASA Astrophysics Data System (ADS)
Herawaty, V.
2018-01-01
The objective of this empirical study is to analyze the influence of environmental performance and companies' accounting characteristics on earnings informativeness, proxied by the earnings response coefficient (ERC), for manufacturing companies listed on the Indonesia Stock Exchange that consistently followed the PROPER assessment in 2010-2014. One consideration for a company is to create a green environment reflecting its environmental measures, drawing investors to respond to the company's environmental performance. The data were obtained from the Indonesian Capital Market Directory (ICMD), the Indonesia Stock Exchange homepage, the companies' annual reports, and the decrees of the Minister of Environment. The sample used in this research consists of 27 publicly listed manufacturing companies on the Indonesia Stock Exchange that consistently followed PROPER in 2010-2014. Purposive sampling was used, and the analysis is based on multiple regression. The results show that environmental performance and profitability have a positive influence on earnings informativeness, while leverage has a negative influence. Growth opportunity, as a control variable, has a positive effect on earnings informativeness. This research shows that environmental performance matters, as observed through investors' reactions in the capital market.
NASA Astrophysics Data System (ADS)
Eftekharzadeh, S.; Myers, A. D.; Hennawi, J. F.; Djorgovski, S. G.; Richards, G. T.; Mahabal, A. A.; Graham, M. J.
2017-06-01
We present the most precise estimate to date of the clustering of quasars on very small scales, based on a sample of 47 binary quasars with magnitudes of g < 20.85 and proper transverse separations of ~25 h⁻¹ kpc. Our sample of binary quasars, which is about six times larger than any previous spectroscopically confirmed sample on these scales, is targeted using a kernel density estimation (KDE) technique applied to Sloan Digital Sky Survey (SDSS) imaging over most of the SDSS area. Our sample is 'complete' in that all of the KDE target pairs with 17.0 ≲ R ≲ 36.2 h⁻¹ kpc in our area of interest have been spectroscopically confirmed from a combination of previous surveys and our own long-slit observational campaign. We catalogue 230 candidate quasar pairs with angular separations of <8 arcsec, from which our binary quasars were identified. We determine the projected correlation function of quasars (W̄p) in four bins of proper transverse scale over the range 17.0 ≲ R ≲ 36.2 h⁻¹ kpc. The implied small-scale quasar clustering amplitude from the projected correlation function, integrated across our entire redshift range, is A = 24.1 ± 3.6 at ~26.6 h⁻¹ kpc. Our sample is the first spectroscopically confirmed sample of quasar pairs that is sufficiently large to study how quasar clustering evolves with redshift at ~25 h⁻¹ kpc. We find that empirical descriptions of how quasar clustering evolves with redshift at ~25 h⁻¹ Mpc also adequately describe the evolution of quasar clustering at ~25 h⁻¹ kpc.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Proper light. 29.112 Section 29.112 Agriculture... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.112 Proper light. Tobacco shall not be inspected or sampled for the purposes of the Act except when displayed in proper light for correct...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 2 2014-01-01 2014-01-01 false Proper light. 29.112 Section 29.112 Agriculture... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.112 Proper light. Tobacco shall not be inspected or sampled for the purposes of the Act except when displayed in proper light for correct...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 2 2013-01-01 2013-01-01 false Proper light. 29.112 Section 29.112 Agriculture... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.112 Proper light. Tobacco shall not be inspected or sampled for the purposes of the Act except when displayed in proper light for correct...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Proper light. 29.112 Section 29.112 Agriculture... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.112 Proper light. Tobacco shall not be inspected or sampled for the purposes of the Act except when displayed in proper light for correct...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 2 2012-01-01 2012-01-01 false Proper light. 29.112 Section 29.112 Agriculture... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.112 Proper light. Tobacco shall not be inspected or sampled for the purposes of the Act except when displayed in proper light for correct...
NASA Astrophysics Data System (ADS)
Karagiannis, Georgios Th.
2016-04-01
The development of non-destructive techniques is a reality in the field of conservation science. These techniques are usually not as accurate as analytical micro-sampling techniques; however, the proper development of soft-computing techniques can improve their accuracy. In this work, we propose a real-time, fast-acquisition spectroscopic mapping imaging system that operates from the ultraviolet to the mid-infrared (UV/Vis/nIR/mIR) region of the electromagnetic spectrum and is supported by a set of soft-computing methods to identify the materials present in a stratigraphic structure of paint layers. In particular, the system acquires spectra in diffuse-reflectance mode, scanning a Region-Of-Interest (ROI) over the wavelength range from 200 up to 5000 nm. A fuzzy c-means clustering algorithm, the soft-computing method adopted here, produces the mapping images. The method was evaluated on a Byzantine painted icon.
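A bare-bones fuzzy c-means sketch on synthetic spectra, to illustrate the kind of clustering used for the mapping step; the number of clusters, the fuzzifier m, and the synthetic data are assumptions, not the authors' settings.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means. X is (n_samples, n_features); returns (memberships, centers)."""
    rng = np.random.default_rng(seed)
    u = rng.random((X.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))       # standard membership update
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

# Synthetic "spectra": two groups with different mean reflectance levels.
rng = np.random.default_rng(3)
grp1 = rng.normal(0.2, 0.02, size=(50, 40))
grp2 = rng.normal(0.7, 0.02, size=(50, 40))
X = np.vstack([grp1, grp2])
u, centers = fuzzy_c_means(X, n_clusters=2)
print("hard labels of first and last samples:", u.argmax(axis=1)[[0, -1]])
```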
Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.
1998-01-01
The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface-water, ground-water, biological, precipitation, bed-sediment, bedload, suspended-sediment, and solid-phase samples. These techniques are briefly described in this report and are extensively documented. The reference documents listed in this report will be kept by the District librarian and District Water-Quality Specialist and updated regularly so that they are available to all District staff. Proper handling and documentation before, during, and after field activities are essential to ensure the integrity of the sample and to correct erroneous reporting of data results. Field sites are to be properly identified and entered into the data base before field data-collection activities begin. During field activities, field notes are to be completed and sample bottles appropriately labeled and stored. After field activities, all paperwork is to be completed promptly and samples transferred to the laboratory within allowable holding times. All equipment used by District personnel for the collection and processing of water-quality samples is to be properly operated, maintained, and calibrated by project personnel. This includes equipment for onsite measurement of water-quality characteristics (temperature, specific conductance, pH, dissolved oxygen, alkalinity, acidity, and turbidity) and equipment and instruments used for biological sampling. The District Water-Quality Specialist and District Laboratory Coordinator are responsible for preventive maintenance and calibration of equipment in the Ohio District laboratory.
The USGS National Water Quality Laboratory in Arvada, Colo., is the primary source of analytical services for most project work done by the Ohio District. Analyses done at the Ohio District laboratory are usually those that must be completed within a few hours of sample collection. Contract laboratories or other USGS laboratories are sometimes used instead of the NWQL or the Ohio District laboratory. When a contract laboratory is used, the projec
Application of biospeckles for assessment of structural and cellular changes in muscle tissue
NASA Astrophysics Data System (ADS)
Maksymenko, Oleksandr P.; Muravsky, Leonid I.; Berezyuk, Mykola I.
2015-09-01
A modified spatial-temporal speckle correlation technique for operational assessment of structural changes in muscle tissues after slaughtering is considered. A coefficient of biological activity is proposed as a quantitative indicator of structural changes and biochemical processes in biological tissues. The experimental results have shown that this coefficient properly evaluates the biological activity of pig and chicken muscle tissue samples. The degradation processes in muscle tissue during long-term storage in a refrigerator were studied by measuring the spatial-temporal dynamics of biospeckle patterns. The reduction of the bioactivity level of refrigerated muscle tissue samples, connected with the initiation of muscle fiber cracks and ruptures, reduction of sarcomeres, nuclei deformation, nuclear chromatin diminishing, and destruction of mitochondria, is analyzed.
Martyna, Agnieszka; Zadora, Grzegorz; Neocleous, Tereza; Michalska, Aleksandra; Dean, Nema
2016-08-10
Many chemometric tools are invaluable and have proven effective in data mining and substantial dimensionality reduction of highly multivariate data. This becomes vital for interpreting various physicochemical data due to rapid development of advanced analytical techniques, delivering much information in a single measurement run. This concerns especially spectra, which are frequently used as the subject of comparative analysis in e.g. forensic sciences. In the presented study the microtraces collected from the scenarios of hit-and-run accidents were analysed. Plastic containers and automotive plastics (e.g. bumpers, headlamp lenses) were subjected to Fourier transform infrared spectrometry and car paints were analysed using Raman spectroscopy. In the forensic context analytical results must be interpreted and reported according to the standards of the interpretation schemes acknowledged in forensic sciences using the likelihood ratio approach. However, for proper construction of LR models for highly multivariate data, such as spectra, chemometric tools must be employed for substantial data compression. Conversion from classical feature representation to distance representation was proposed for revealing hidden data peculiarities and linear discriminant analysis was further applied for minimising the within-sample variability while maximising the between-sample variability. Both techniques enabled substantial reduction of data dimensionality. Univariate and multivariate likelihood ratio models were proposed for such data. It was shown that the combination of chemometric tools and the likelihood ratio approach is capable of solving the comparison problem of highly multivariate and correlated data after proper extraction of the most relevant features and variance information hidden in the data structure. Copyright © 2016 Elsevier B.V. All rights reserved.
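The sketch below shows the within/between-variance step on synthetic feature vectors using linear discriminant analysis; the distance representation and the subsequent likelihood-ratio modelling described above are not reproduced, and the data shapes and labels are assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in for compressed spectral features: 3 "items" (e.g. plastic sources),
# 20 replicate measurements each, 10 features per measurement (all hypothetical).
rng = np.random.default_rng(7)
n_items, n_reps, n_feat = 3, 20, 10
item_means = rng.normal(scale=3.0, size=(n_items, n_feat))
X = np.vstack([m + rng.normal(scale=0.5, size=(n_reps, n_feat)) for m in item_means])
labels = np.repeat(np.arange(n_items), n_reps)

# LDA finds projections that minimise within-item variability relative to
# between-item variability; the projected scores could then feed an LR model.
lda = LinearDiscriminantAnalysis(n_components=2)
scores = lda.fit_transform(X, labels)
print("projected shape:", scores.shape)   # (60, 2)
```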
Automated liver sampling using a gradient dual-echo Dixon-based technique.
Bashir, Mustafa R; Dale, Brian M; Merkle, Elmar M; Boll, Daniel T
2012-05-01
Magnetic resonance spectroscopy of the liver requires input from a physicist or physician at the time of acquisition to ensure proper voxel selection, while in multiecho chemical shift imaging, numerous regions of interest must be manually selected in order to ensure analysis of a representative portion of the liver parenchyma. A fully automated technique could improve workflow by selecting representative portions of the liver prior to human analysis. Complete volumes from three-dimensional gradient dual-echo acquisitions with two-point Dixon reconstruction acquired at 1.5 and 3 T were analyzed in 100 subjects, using an automated liver sampling algorithm based on ratio pairs calculated from signal intensity image data as fat-only/water-only and log(in-phase/opposed-phase) on a voxel-by-voxel basis. Using different gridding variations of the algorithm, the average correctly sampled liver volume ranged from 527 to 733 mL. The average percentage of the sample located within the liver ranged from 95.4 to 97.1%, whereas the average incorrectly selected volume was 16.5-35.4 mL (2.9-4.6%). Average run time was 19.7-79.0 s. The algorithm consistently selected large samples of the hepatic parenchyma with small amounts of erroneous extrahepatic sampling, and run times were feasible for execution on an MRI system console during exam acquisition. Copyright © 2011 Wiley Periodicals, Inc.
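A toy sketch of the ratio-pair idea on synthetic voxel data: each voxel is characterised by (fat/water, log(in-phase/opposed-phase)) and flagged if both ratios fall in a plausible parenchyma range. The thresholds, array shapes and synthetic volumes are invented for illustration and are not the published algorithm's values.

```python
import numpy as np

def candidate_liver_mask(fat, water, in_phase, opposed_phase,
                         fw_range=(0.0, 0.3), log_range=(-0.1, 0.3)):
    """Flag voxels whose ratio pair (fat/water, log(IP/OP)) falls in a plausible
    liver-parenchyma range. The ranges here are illustrative placeholders."""
    eps = 1e-6
    r1 = fat / (water + eps)                       # fat-only / water-only ratio
    r2 = np.log((in_phase + eps) / (opposed_phase + eps))
    return ((r1 >= fw_range[0]) & (r1 <= fw_range[1]) &
            (r2 >= log_range[0]) & (r2 <= log_range[1]))

# Synthetic 3-D volumes standing in for Dixon reconstructions.
rng = np.random.default_rng(0)
shape = (32, 64, 64)
water = rng.uniform(0.5, 1.0, shape)
fat = rng.uniform(0.0, 0.4, shape)
in_phase, opposed_phase = water + fat, np.abs(water - fat)
mask = candidate_liver_mask(fat, water, in_phase, opposed_phase)
print("fraction of voxels flagged:", mask.mean())
```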
30 CFR 70.209 - Respirable dust samples; transmission by operator.
Code of Federal Regulations, 2010 CFR
2010-07-01
... operator shall not open or tamper with the seal of any filter cassette or alter the weight of any filter... accordance with § 70.202 (Certified person; sampling) shall properly complete the dust data card that is.... Respirable dust samples with data cards not properly completed will be voided by MSHA. (d) All respirable...
30 CFR 70.209 - Respirable dust samples; transmission by operator.
Code of Federal Regulations, 2011 CFR
2011-07-01
... operator shall not open or tamper with the seal of any filter cassette or alter the weight of any filter... accordance with § 70.202 (Certified person; sampling) shall properly complete the dust data card that is.... Respirable dust samples with data cards not properly completed will be voided by MSHA. (d) All respirable...
Sahajpal, Vivek; Goyal, S P
2010-06-01
The exhibits obtained in wildlife offence cases quite often present a challenging situation for the forensic expert. The selection of a proper analytical approach is vital for a successful analysis. A generalised forensic analysis approach should proceed from the use of non-destructive techniques (morphological and microscopic examination) to partially destructive and finally destructive techniques (DNA analysis). The findings of non-destructive techniques may sometimes be inconclusive, but they definitely help in steering further forensic analysis in the proper direction. We describe a recent case where a very small dried skin piece (<0.05 mg) with just one small trimmed guard hair (0.4 cm) on it was received for species identification. The single guard hair was examined microscopically to get an indication of the type of species. We also describe the extraction procedure with a low amount of sample, using an automated extraction method (Qiagen Biorobot EZ1) and PCR amplification of three mitochondrial genes (16s rRNA, 12s rRNA and cytochrome b) for species identification. Microscopic examination of the single hair indicated a viverrid species, but the initial DNA analysis with 16s rRNA (through NCBI BLAST) showed the highest homology (93%) with a hyaenid species (Hyaena hyaena). However, further DNA analysis based on the 12s rRNA and cytochrome b genes proved that the species was indeed a viverrid, i.e. Viverricula indica (small Indian civet). The highest homology shown with a hyaenid species by the 16s rRNA sequence from the case sample was due to the lack of a 16s rRNA sequence for Viverricula indica in the NCBI database. The case highlights the importance of morphological and microscopic examinations in wildlife offence cases. With respect to DNA extraction technology, we found that the automated extraction method of the Biorobot EZ1 (Qiagen) is quite useful with small amounts of sample (well below the recommended amount). Copyright 2009 Forensic Science Society. Published by Elsevier Ireland Ltd. All rights reserved.
Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods have proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described by rate constants. These problems are isomorphic with chemical kinetics problems. Recently, several efficient techniques for this purpose have been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
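A compact parallel-tempering sketch for a one-dimensional bimodal pdf, to make the replica-exchange idea concrete; the target density, temperature ladder and step sizes are illustrative choices, not taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(x):
    # Bimodal target: mixture of two well-separated unit-variance Gaussians.
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

temps = np.array([1.0, 2.0, 4.0, 8.0])      # temperature ladder; replica k samples p^(1/T_k)
x = np.zeros(len(temps))                     # current state of each replica
chain = []                                   # samples from the T = 1 replica

for step in range(20000):
    # Metropolis update within each replica.
    for k, T in enumerate(temps):
        prop = x[k] + rng.normal(scale=1.0)
        if np.log(rng.random()) < (log_p(prop) - log_p(x[k])) / T:
            x[k] = prop
    # Attempt a swap between a random pair of neighbouring temperatures.
    k = rng.integers(len(temps) - 1)
    log_a = (log_p(x[k + 1]) - log_p(x[k])) * (1.0 / temps[k] - 1.0 / temps[k + 1])
    if np.log(rng.random()) < log_a:
        x[k], x[k + 1] = x[k + 1], x[k]
    chain.append(x[0])

chain = np.array(chain[2000:])               # discard burn-in
print("fraction of cold-chain samples in each mode:",
      np.mean(chain > 0), np.mean(chain < 0))
```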
Optimized optical clearing method for imaging central nervous system
NASA Astrophysics Data System (ADS)
Yu, Tingting; Qi, Yisong; Gong, Hui; Luo, Qingming; Zhu, Dan
2015-03-01
The development of various optical clearing methods provides great potential for imaging the entire central nervous system in combination with multiple-labelling and microscopic imaging techniques. These methods have achieved a certain degree of clearing, but each has weaknesses, including tissue deformation, fluorescence quenching, execution complexity and limited antibody penetration, which makes immunostaining of tissue blocks difficult. The passive clarity technique (PACT) bypasses those problems and clears the samples with simple implementation and excellent transparency with fine fluorescence retention, but passive tissue clearing requires a very long time. In this study, we not only accelerate the clearing speed of brain blocks but also preserve GFP fluorescence well by screening for an optimal clearing temperature. The selection of a proper temperature makes PACT more applicable, which evidently broadens the application range of this method.
The UKIDSS-2MASS proper motion survey - I. Ultracool dwarfs from UKIDSS DR4
NASA Astrophysics Data System (ADS)
Deacon, N. R.; Hambly, N. C.; King, R. R.; McCaughrean, M. J.
2009-04-01
The UK Infrared Telescope Infrared Deep Sky Survey (UKIDSS) is the first of a new generation of infrared surveys. Here, we combine the data from two UKIDSS components, the Large Area Survey (LAS) and the Galactic Cluster Survey (GCS), with Two-Micron All-Sky Survey (2MASS) data to produce an infrared proper motion survey for low-mass stars and brown dwarfs. In total, we detect 267 low-mass stars and brown dwarfs with significant proper motions. We recover all 10 known single L dwarfs and the one known T dwarf above the 2MASS detection limit in our LAS survey area and identify eight additional new candidate L dwarfs. We also find one new candidate L dwarf in our GCS sample. Our sample also contains objects from 11 potential common proper motion binaries. Finally, we test our proper motions and find that while the LAS objects have proper motions consistent with absolute proper motions, the GCS stars may have proper motions which are significantly underestimated. This is possibly due to the bulk motion of some of the local astrometric reference stars used in the proper motion determination.
Svarcová, Silvie; Kocí, Eva; Bezdicka, Petr; Hradil, David; Hradilová, Janka
2010-09-01
The uniqueness and limited amounts of forensic samples and samples from objects of cultural heritage, together with the complexity of their composition, require the application of a wide range of micro-analytical methods which are non-destructive to the samples, because these must be preserved for potential later revision. Laboratory powder X-ray micro-diffraction (micro-XRD) is a very effective non-destructive technique for direct phase analysis of samples smaller than 1 mm containing crystalline constituents. It complements optical and electron microscopy with elemental micro-analysis, especially in cases of complicated mixtures containing phases with similar chemical composition. However, modification of X-ray diffraction to the micro-scale, together with its application to very heterogeneous real samples, leads to deviations from the standard procedure. Knowledge of both the limits and the phenomena which can arise during the analysis is crucial for the meaningful and proper application of the method. We evaluated basic limits of micro-XRD equipped with a mono-capillary with an exit diameter of 0.1 mm, for example the size of the irradiated area, appropriate grain size, and detection limits allowing identification of given phases. We tested the reliability and accuracy of quantitative phase analysis based on micro-XRD data in comparison with conventional XRD (reflection and transmission), carrying out experiments with two-phase model mixtures simulating historic colour layers. Furthermore, we demonstrate the wide use of micro-XRD for investigation of various types of micro-samples (contact traces, powder traps, colour layers) and we show how to enhance data quality by proper choice of experiment geometry and conditions.
Kim, Il Kwang; Lee, Soo Il
2016-05-01
The modal decomposition of tapping mode atomic force microscopy microcantilevers in liquid environments was studied experimentally. Microcantilevers with different lengths and stiffnesses and two sample surfaces with different elastic moduli were used in the experiment. The response modes of the microcantilevers were extracted as proper orthogonal modes through proper orthogonal decomposition. Smooth orthogonal decomposition was used to estimate the resonance frequency directly. The effects of the tapping setpoint and the elastic modulus of the sample under test were examined in terms of their multi-mode responses with proper orthogonal modes, proper orthogonal values, smooth orthogonal modes and smooth orthogonal values. Regardless of the stiffness of the microcantilever under test, the first mode was dominant in tapping mode atomic force microscopy under normal operating conditions. However, at lower tapping setpoints, the flexible microcantilever showed modal distortion and noise near the tip when tapping on a hard sample. The stiff microcantilever had a higher mode effect on a soft sample at lower tapping setpoints. Modal decomposition for tapping mode atomic force microscopy can thus be used to estimate the characteristics of samples in liquid environments.
NASA Astrophysics Data System (ADS)
Mikado, S.; Yanagie, H.; Yasuda, N.; Higashi, S.; Ikushima, I.; Mizumachi, R.; Murata, Y.; Morishita, Y.; Nishimura, R.; Shinohara, A.; Ogura, K.; Sugiyama, H.; Iikura, H.; Ando, H.; Ishimoto, M.; Takamoto, S.; Eriguchi, M.; Takahashi, H.; Kimura, M.
2009-06-01
It is necessary to accumulate the 10B atoms selectively to the tumor cells for effective Boron Neutron Capture Therapy (BNCT). In order to achieve an accurate measurement of 10B accumulations in the biological samples, we employed a technique of neutron capture autoradiography (NCAR) of sliced samples of tumor tissues using CR-39 plastic track detectors. The CR-39 track detectors attached with the biological samples were exposed to thermal neutrons in the thermal column of the JRR3 of Japan Atomic Energy Agency (JAEA). We obtained quantitative NCAR images of the samples for VX-2 tumor in rabbit liver after injection of 10BSH entrapped water-in-oil-in-water (WOW) emulsion by intra-arterial injection via proper hepatic artery. The 10B accumulations and distributions in VX-2 tumor and normal liver of rabbit were investigated by means of alpha-track density measurements. In this study, we showed the selective accumulation of 10B atoms in the VX-2 tumor by intra-arterial injection of 10B entrapped WOW emulsion until 3 days after injection by using digitized NCAR images (i.e. alpha-track mapping).
Analysis of soil samples from Gebeng area using NAA technique
NASA Astrophysics Data System (ADS)
Elias, Md Suhaimi; Wo, Yii Mei; Hamzah, Mohd Suhaimi; Shukor, Shakirah Abd; Rahman, Shamsiah Ab; Salim, Nazaratul Ashifa Abdullah; Azman, Muhamad Azfar; Hashim, Azian
2017-01-01
Rapid development and urbanization will increase the number of residential and industrial areas. Without proper management and control of pollution, these will have an adverse effect on the environment and human life. The objective of this study is to identify and quantify key contaminants entering the environment of the Gebeng area as a result of industrial and human activities. The Gebeng area was gazetted as one of the industrial estates in Pahang state. Assessment of elemental pollution in the soil of the Gebeng area is based on the level of concentration, the enrichment factor and the geo-accumulation index. The enrichment factors (EFs) were determined by the element ratioing method, whilst the geo-accumulation index (Igeo) was obtained by comparing the current concentration of each element to its continental crustal average. Twenty-seven soil samples were collected from the Gebeng area and analysed using the Neutron Activation Analysis (NAA) technique. The data showed a higher concentration of iron (Fe), due to its abundance in soil, compared to other elements. The enrichment factor results showed that the Gebeng area is enriched in the elements As, Br, Hf, Sb, Th and U. Based on the geo-accumulation index (Igeo) classification, the soil quality of the Gebeng area can be classified as Class 0 (uncontaminated) to Class 3 (moderately to heavily contaminated).
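As a pointer to how such indices are typically computed (using the standard textbook definitions rather than any values from this study), the snippet below evaluates an enrichment factor normalised to a reference element and the geo-accumulation index; the reference element and the crustal-average numbers used here are placeholders.

```python
import numpy as np

def enrichment_factor(c_sample, c_ref_sample, c_crust, c_ref_crust):
    """EF = (C_x / C_ref)_sample / (C_x / C_ref)_crust, with C_ref a conservative
    reference element such as Fe or Al (the choice of reference is an assumption here)."""
    return (c_sample / c_ref_sample) / (c_crust / c_ref_crust)

def geoaccumulation_index(c_sample, c_background):
    """Igeo = log2(C_n / (1.5 * B_n)); the factor 1.5 allows for natural variability."""
    return np.log2(c_sample / (1.5 * c_background))

# Placeholder concentrations (mg/kg) for a single soil sample.
arsenic, iron = 12.0, 35000.0
arsenic_crust, iron_crust = 1.8, 50000.0      # illustrative crustal averages
print("EF(As):", round(enrichment_factor(arsenic, iron, arsenic_crust, iron_crust), 2))
print("Igeo(As):", round(geoaccumulation_index(arsenic, arsenic_crust), 2))
```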
Chapter 9: Planting hardwood tree seedlings on reclaimed mine land in the Appalachian region
V. Davis; J. Franklin; C. Zipper; P. Angel
2017-01-01
The Forestry Reclamation Approach (FRA) is a method of reclaiming surface coal mines to forested postmining land use (Chapter 2, this volume). "Use proper tree planting techniques" is Step 5 of the FRA; when used with the other FRA steps, proper tree planting can help to ensure successful reforestation. Proper care and planting of tree seedlings is essential...
NASA Astrophysics Data System (ADS)
Pasyanos, Michael E.; Franz, Gregory A.; Ramirez, Abelardo L.
2006-03-01
In an effort to build seismic models that are the most consistent with multiple data sets we have applied a new probabilistic inverse technique. This method uses a Markov chain Monte Carlo (MCMC) algorithm to sample models from a prior distribution and test them against multiple data types to generate a posterior distribution. While computationally expensive, this approach has several advantages over deterministic models, notably the seamless reconciliation of different data types that constrain the model, the proper handling of both data and model uncertainties, and the ability to easily incorporate a variety of prior information, all in a straightforward, natural fashion. A real advantage of the technique is that it provides a more complete picture of the solution space. By mapping out the posterior probability density function, we can avoid simplistic assumptions about the model space and allow alternative solutions to be identified, compared, and ranked. Here we use this method to determine the crust and upper mantle structure of the Yellow Sea and Korean Peninsula region. The model is parameterized as a series of seven layers in a regular latitude-longitude grid, each of which is characterized by thickness and seismic parameters (Vp, Vs, and density). We use surface wave dispersion and body wave traveltime data to drive the model. We find that when properly tuned (i.e., the Markov chains have had adequate time to fully sample the model space and the inversion has converged), the technique behaves as expected. The posterior model reflects the prior information at the edge of the model where there is little or no data to constrain adjustments, but the range of acceptable models is significantly reduced in data-rich regions, producing values of sediment thickness, crustal thickness, and upper mantle velocities consistent with expectations based on knowledge of the regional tectonic setting.
Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swafford, A.M.; Keller, J.M.
1993-03-17
Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). These samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method studied was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method investigated was based on more recently developed techniques using extraction chromatography, specifically the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly extracted or not extracted at all. Advantages and disadvantages of each method were evaluated and found to be comparable.
Data-driven sensor placement from coherent fluid structures
NASA Astrophysics Data System (ADS)
Manohar, Krithika; Kaiser, Eurika; Brunton, Bingni W.; Kutz, J. Nathan; Brunton, Steven L.
2017-11-01
Optimal sensor placement is a central challenge in the prediction, estimation and control of fluid flows. We reinterpret sensor placement as optimizing discrete samples of coherent fluid structures for full state reconstruction. This permits a drastic reduction in the number of sensors required for faithful reconstruction, since complex fluid interactions can often be described by a small number of coherent structures. Our work optimizes point sensors using the pivoted matrix QR factorization to sample coherent structures directly computed from flow data. We apply this sampling technique in conjunction with various data-driven modal identification methods, including the proper orthogonal decomposition (POD) and dynamic mode decomposition (DMD). In contrast to POD-based sensors, DMD demonstrably enables the optimization of sensors for prediction in systems exhibiting multiple scales of dynamics. Finally, reconstruction accuracy from pivot sensors is shown to be competitive with sensors obtained using traditional computationally prohibitive optimization methods.
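A compact sketch of the sampling step described above: POD modes are computed from synthetic flow-like snapshot data and sensor locations are chosen from the column pivots of a QR factorization of the mode matrix. The snapshot data are synthetic and the retained-mode and sensor counts are arbitrary choices, not the paper's cases.

```python
import numpy as np
from scipy.linalg import qr

# Synthetic snapshot matrix: each column is a "flow field" built from a few
# coherent spatial structures (placeholders for real flow data).
rng = np.random.default_rng(0)
n_space, n_snap, n_struct = 500, 200, 5
spatial = rng.normal(size=(n_space, n_struct))
temporal = rng.normal(size=(n_struct, n_snap))
X = spatial @ temporal + 0.01 * rng.normal(size=(n_space, n_snap))

# POD modes from the SVD of the snapshot matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = 5                       # number of retained modes / sensors (illustrative)
Psi = U[:, :r]              # (n_space, r) basis of coherent structures

# Pivoted QR on Psi^T: the first r column pivots give near-optimal point sensors.
_, _, piv = qr(Psi.T, pivoting=True)
sensors = piv[:r]
print("selected sensor indices:", sensors)

# Full-state reconstruction of one snapshot from its sparse measurements.
y = X[sensors, 0]
a = np.linalg.lstsq(Psi[sensors, :], y, rcond=None)[0]
x_rec = Psi @ a
print("relative reconstruction error:",
      np.linalg.norm(x_rec - X[:, 0]) / np.linalg.norm(X[:, 0]))
```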
NASA Astrophysics Data System (ADS)
Cardoso, S. L.; Dias, C. M. F.; Lima, J. A. P.; Massunaga, M. S. O.; da Silva, M. G.; Vargas, H.
2003-01-01
This work reports on the use of the optothermal window and a well-proven phenanthroline colorimetry method for determination of the iron (II) content in a commercial fortified milk. Initially, iron (II) in distilled water was determined using a series of calibration samples with ferrous sulfate acting as the source of iron (II). In the following phase, this calibration methodology was applied to commercial milk as the sample matrix. Phenanthroline colorimetry [American Public Health Association, Washington, DC (1998)] was chosen in an attempt to achieve proper selectivity (i.e., to obtain an absorption band whose wavelength is centered near the radiation wavelength available for our experiments: the 514-nm excitation line of a 20-mW tunable Ar ion laser). Finally, samples of commercially available fortified milk were analyzed in an attempt to assess the Fe (II) content.
Radiometry in medicine and biology
NASA Astrophysics Data System (ADS)
Nahm, Kie-Bong; Choi, Eui Y.
2012-10-01
Diagnostics in medicine plays a critical role in helping medical professionals deliver proper diagnostic decisions. Most samples in this field are of human origin, and a great portion of the methodologies practiced in biology labs is shared by clinical diagnostic laboratories as well. Most clinical tests are quantitative in nature, and the recent increase in interest in preventive medicine requires the determination of minimal concentrations of target analytes, which exist in small quantities at the early stages of various diseases. Radiometry, or the use of optical radiation, is the most trusted and reliable means of converting biologic concentrations into quantitative physical quantities. Since optical energy is readily available at varying energies (or wavelengths), the appropriate combination of light and the sample's absorption properties provides reliable information about the sample concentration, through the Beer-Lambert law, to a decent precision. In this article, the commonly practiced techniques in clinical and biology labs are reviewed from the standpoint of radiometry.
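As a reminder of the relation invoked above, the Beer-Lambert law links absorbance to concentration via A = ε·l·c; the tiny sketch below inverts it for concentration, with made-up molar absorptivity and path-length values.

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm):
    """Beer-Lambert law: A = epsilon * l * c, so c = A / (epsilon * l)."""
    return absorbance / (molar_absorptivity * path_length_cm)

# Illustrative numbers only: A = 0.42, epsilon = 11000 L mol^-1 cm^-1, l = 1 cm.
c = concentration_from_absorbance(0.42, 11000.0, 1.0)
print(f"analyte concentration ≈ {c:.2e} mol/L")
```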
Methodological issues in microdialysis sampling for pharmacokinetic studies.
de Lange, E C; de Boer, A G; Breimer, D D
2000-12-15
Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics which make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of this technique. Today, it has outgrown its teething problems, and its potentials and limitations have become more or less well defined. As microdialysis is a delicate technique for which experimental factors appear to be critical with respect to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. An example of one of the added values of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This gives the opportunity to assess information on drug transport equilibration across membranes such as the blood-brain barrier, which has already provided new insights. With the progress of analytical methodology, especially with respect to low volume/low concentration measurements and the simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.
In situ AFM investigation of slow crack propagation mechanisms in a glassy polymer
NASA Astrophysics Data System (ADS)
George, M.; Nziakou, Y.; Goerke, S.; Genix, A.-C.; Bresson, B.; Roux, S.; Delacroix, H.; Halary, J.-L.; Ciccotti, M.
2018-03-01
A novel experimental technique based on in situ AFM monitoring of the mechanisms of damage and the strain fields associated with the slow steady-state propagation of a fracture in glassy polymers is presented. This micron-scale investigation is complemented by optical measurements of the sample deformation up to the millimetric macroscopic scale of the sample in order to assess the proper crack driving conditions. These multi-scale observations provide important insights towards the modeling of the fracture toughness of glassy polymers and its relationship with the macromolecular structure and non-linear rheological properties. This novel technique is first tested on a standard PMMA thermoplastic in order to evaluate both its performance and the richness of this new kind of observation. Although fracture propagation in PMMA is well known to proceed through crazing in the bulk of the samples, our observations provide a clear description and quantitative evaluation of a change of fracture mechanism towards shear-yielding fracture accompanied by local necking close to the free surface of the sample, which can be explained by the local change of stress triaxiality. Moreover, this primary surface necking mechanism is shown to be accompanied by a network of secondary grooves that can be related to surface crazes propagating towards the interior of the sample. This overall scenario is validated by post-mortem fractographic investigations by scanning electron microscopy.
Classification of urine sediment based on convolution neural network
NASA Astrophysics Data System (ADS)
Pan, Jingjing; Jiang, Cunbo; Zhu, Tiantian
2018-04-01
By designing a new convolution neural network framework, this paper overcomes the constraints of the original convolution neural network framework, which requires large training samples of the same size. The input images are shifted and cropped to generate sub-images of equal size. The generated sub-images are then subjected to dropout, increasing the diversity of samples and preventing overfitting. Several proper subsets of the sub-image set are randomly selected, ensuring that each subset contains the same number of elements and that no two subsets are identical. These proper subsets are used as input layers for the convolution neural network. Through the convolution layers, pooling, the fully connected layer and the output layer, the classification loss rates of the test set and training set are obtained. In a classification experiment on red blood cells, white blood cells and calcium oxalate crystals in urine sediment, the classification accuracy rate was 97% or more.
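A minimal convolutional classifier in PyTorch for three classes (red blood cells, white blood cells, calcium oxalate crystals), to make the layer sequence concrete; the architecture, image size and random inputs are placeholders rather than the network described in the paper.

```python
import torch
import torch.nn as nn

class SedimentCNN(nn.Module):
    """Small CNN: two conv/pool stages, dropout, a fully connected layer, 3-way output."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.5),                    # dropout, as mentioned in the abstract
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One training step on a random batch of 64x64 grayscale sub-images (synthetic).
model = SedimentCNN()
images = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 3, (8,))
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()
print("loss:", float(loss))
```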
NASA Astrophysics Data System (ADS)
Thomsen, Helge Abildhauge; Ikävalko, Johanna
1997-01-01
The sea ice biota of polar regions contains numerous heterotrophic flagellates very few of which have been properly identified. The whole mount technique for transmission electron microscopy enables the identification of loricate and scaly forms. A survey of Arctic ice samples (North-East Water Polynya, NE Greenland) revealed the presence of ca. 12 taxa belonging to the phagotrophic genus Thaumatomastix (Protista incertae sedis). Species of Thaumatomastix possess siliceous body scales and one naked and one scale-covered flagellum. The presence in both Arctic samples and sea ice material previously examined from the Antarctic indicates that this genus is most likely ubiquitous in polar sea ice and may be an important component in sea ice biota microbial activities.
Update of membership and mean proper motion of open clusters from UCAC5 catalog
NASA Astrophysics Data System (ADS)
Dias, W. S.; Monteiro, H.; Assafin, M.
2018-06-01
We present mean proper motions and membership probabilities of individual stars for optically visible open clusters, which have been determined using data from the UCAC5 catalog. This follows our previous studies with the UCAC2 and UCAC4 catalogs, but now using improved proper motions in the GAIA reference frame. In the present study results were obtained for a sample of 1108 open clusters. For five clusters, this is the first determination of mean proper motion, and for the whole sample, we present results with a much larger number of identified astrometric member stars than on previous studies. It is the last update of our Open cluster Catalog based on proper motion data only. Future updates will count on astrometric, photometric and spectroscopic GAIA data as input for analyses.
Quantitation of heat-shock proteins in clinical samples using mass spectrometry.
Kaur, Punit; Asea, Alexzander
2011-01-01
Mass spectrometry (MS) is a powerful analytical tool for proteomics research and drug and biomarker discovery. MS enables identification and quantification of known and unknown compounds by revealing their structural and chemical properties. Proper sample preparation for MS-based analysis is a critical step in the proteomics workflow because the quality and reproducibility of sample extraction and preparation for downstream analysis significantly impact the separation and identification capabilities of mass spectrometers. The highly expressed proteins represent potential biomarkers that could aid in diagnosis, therapy, or drug development. Because the proteome is so complex, there is no one standard method for preparing protein samples for MS analysis. Protocols differ depending on the type of sample, source, experiment, and method of analysis. Molecular chaperones play significant roles in almost all biological functions due to their capacity for detecting intracellular denatured/unfolded proteins, initiating refolding or denaturation of such malfolded protein sequences and more recently for their role in the extracellular milieu as chaperokines. In this chapter, we describe the latest techniques for quantitating the expression of molecular chaperones in human clinical samples.
Detection of Tetracycline in Milk using NIR Spectroscopy and Partial Least Squares
NASA Astrophysics Data System (ADS)
Wu, Nan; Xu, Chenshan; Yang, Renjie; Ji, Xinning; Liu, Xinyuan; Yang, Fan; Zeng, Ming
2018-02-01
The feasibility of measuring tetracycline in milk was investigated by near-infrared (NIR) spectroscopy combined with the partial least squares (PLS) method. The NIR transmittance spectra of 40 pure milk samples and 40 tetracycline-adulterated milk samples with different concentrations (from 0.005 to 40 mg/L) were obtained. The pure milk and tetracycline-adulterated milk samples were assigned to their categories with 100% accuracy in the calibration set, and a correct classification rate of 96.3% was obtained in the prediction set. For the quantitation of tetracycline in adulterated milk, the root mean square errors for the calibration and prediction models were 0.61 mg/L and 4.22 mg/L, respectively. The PLS model fitted the calibration set well; however, its predictive ability was limited, especially for samples with low tetracycline concentrations. Overall, this approach can be considered a promising tool for discrimination of tetracycline-adulterated milk, as a supplement to high-performance liquid chromatography.
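A brief scikit-learn sketch of PLS regression on synthetic spectra, to show the modelling step; the spectra, concentrations and number of latent variables here are invented, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic NIR-like spectra: absorbance at 200 "wavelengths" with a band whose
# depth scales with tetracycline concentration (purely illustrative).
rng = np.random.default_rng(0)
conc = rng.uniform(0.0, 40.0, size=80)                      # mg/L
wl = np.arange(200)
band = np.exp(-0.5 * ((wl - 120) / 8.0) ** 2)
spectra = 1.0 + 0.002 * conc[:, None] * band + 0.001 * rng.normal(size=(80, 200))

X_train, X_test, y_train, y_test = train_test_split(spectra, conc, random_state=1)

pls = PLSRegression(n_components=5)                          # latent variables: assumption
pls.fit(X_train, y_train)
pred = pls.predict(X_test).ravel()
rmsep = np.sqrt(np.mean((pred - y_test) ** 2))
print("RMSEP on synthetic data (mg/L):", round(rmsep, 3))
```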
Total Water Content Measurements with an Isokinetic Sampling Probe
NASA Technical Reports Server (NTRS)
Reehorst, Andrew L.; Miller, Dean R.; Bidwell, Colin S.
2010-01-01
The NASA Glenn Research Center has developed a Total Water Content (TWC) Isokinetic Sampling Probe. Since it is sensitive to neither cloud water particle phase nor size, it is particularly attractive for supporting super-cooled large droplet and high ice water content aircraft icing studies. The instrument is comprised of the Sampling Probe, Sample Flow Control, and Water Vapor Measurement subsystems. Analysis and testing have been conducted on the subsystems to ensure their proper function and accuracy. End-to-end bench testing has also been conducted to ensure the reliability of the entire instrument system. A Stokes-number-based collection efficiency correction was developed to correct for probe thickness effects. The authors further discuss the need to ensure that no condensation occurs within the instrument plumbing. Instrument measurements compared to facility calibrations from testing in the NASA Glenn Icing Research Tunnel are presented and discussed. There appear to be liquid water content and droplet size effects in the differences between the two measurement techniques.
Boeris, Valeria; Arancibia, Juan A; Olivieri, Alejandro C
2017-07-01
In this work, the combination of chemometric techniques with kinetic-spectroscopic data allowed the quantification of two dyes (tartrazine and carminic acid) in complex matrices such as mustard, ketchup, asparagus soup powder, pumpkin soup powder, plum jam and orange-strawberry juice. Quantitative analysis was performed without tedious sample pretreatment, owing to the achievement of the second-order advantage. The results showed an improvement in simplicity, speed and cost with respect to the usual separation techniques, allowing these dyes to be properly quantified with limits of detection below 0.6 mg L⁻¹. In addition, to the best of our knowledge, this is the first time that kinetic-spectroscopic data obtained from the action of laccase have been used for analytical purposes. Copyright © 2017 Elsevier B.V. All rights reserved.
Functional Wigner representation of quantum dynamics of Bose-Einstein condensate
NASA Astrophysics Data System (ADS)
Opanchuk, B.; Drummond, P. D.
2013-04-01
We develop a method of simulating the full quantum field dynamics of multi-mode multi-component Bose-Einstein condensates in a trap. We use the truncated Wigner representation to obtain a probabilistic theory that can be sampled. This method produces c-number stochastic equations which may be solved using conventional stochastic methods. The technique is valid for large mode occupation numbers. We give a detailed derivation of methods of functional Wigner representation appropriate for quantum fields. Our approach describes spatial evolution of spinor components and properly accounts for nonlinear losses. Such techniques are applicable to calculating the leading quantum corrections, including effects such as quantum squeezing, entanglement, EPR correlations, and interactions with engineered nonlinear reservoirs. By using a consistent expansion in the inverse density, we are able to explain an inconsistency in the nonlinear loss equations found by earlier authors.
Environmental Monitoring and the Gas Industry: Program Manager Handbook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory D. Gillispie
1997-12-01
This document has been developed for the nontechnical gas industry manager who has the responsibility for the development of waste or potentially contaminated soil and groundwater data or must make decisions based on such data for the management or remediation of these materials. It explores the use of common analytical chemistry instrumentation and associated techniques for identification of environmentally hazardous materials. Sufficient detail is given to familiarize the nontechnical reader with the principles behind the operation of each technique. The scope and realm of the techniques and their constituent variations are portrayed through a discussion of crucial details and, where appropriate, the depiction of real-life data. It is the author's intention to provide an easily understood handbook for gas industry management. Techniques which determine the presence, composition, and quantification of gas industry wastes are discussed. Greater focus is given to traditional techniques which have been the mainstay of modern analytical benchwork. However, with the continual advancement of instrumental principles and design, several techniques have been included which are likely to receive greater attention in future considerations for waste-related detection. Definitions and concepts inherent to a thorough understanding of the principles common to analytical chemistry are discussed. It is also crucial that gas industry managers understand the effects of the various actions which take place before, during, and after the actual sampling step. When a series of sample collection, storage, and transport activities occur, new or inexperienced project managers may overlook or misunderstand the importance of the sequence. Each step has an impact on the final results of the measurement process; errors in judgment or decision making can be costly. Specific techniques and methodologies for the collection, storage, and transport of environmental media samples are not described or discussed in detail in this handbook. However, the underlying philosophy regarding the importance of proper collection, storage, and transport practices, as well as pertinent references, are presented.
30 CFR 90.209 - Respirable dust samples; transmission by operator.
Code of Federal Regulations, 2010 CFR
2010-07-01
... designated by the District Manager. (b) The operator shall not open or tamper with the seal of any filter... properly complete the dust data card that is provided by the manufacturer for each filter cassette. The... include that person's certification number. Respirable dust samples with data cards not properly completed...
30 CFR 90.209 - Respirable dust samples; transmission by operator.
Code of Federal Regulations, 2011 CFR
2011-07-01
... designated by the District Manager. (b) The operator shall not open or tamper with the seal of any filter... properly complete the dust data card that is provided by the manufacturer for each filter cassette. The... include that person's certification number. Respirable dust samples with data cards not properly completed...
30 CFR 71.209 - Respirable dust samples; transmission by operator.
Code of Federal Regulations, 2010 CFR
2010-07-01
... designated by the District Manager. (b) The operator shall not open or tamper with the seal of any filter... properly complete the dust data card that is provided by the manufacturer for each filter cassette. The... include that person's certification number. Respirable dust samples with data cards not properly completed...
30 CFR 71.209 - Respirable dust samples; transmission by operator.
Code of Federal Regulations, 2011 CFR
2011-07-01
... designated by the District Manager. (b) The operator shall not open or tamper with the seal of any filter... properly complete the dust data card that is provided by the manufacturer for each filter cassette. The... include that person's certification number. Respirable dust samples with data cards not properly completed...
Giardiasis: an update review on sensitivity and specificity of methods for laboratorial diagnosis.
Soares, Renata; Tasca, Tiana
2016-10-01
Giardiasis is a major cause of diarrhoea, transmitted by the ingestion of water and food contaminated with cysts, and it has been spread among people with poor oral hygiene. The traditional diagnosis is performed by identifying trophozoites and cysts of Giardia duodenalis through microscopy of faecal samples. In addition to microscopy, different methods based on immunologic and molecular analyses have been validated for giardiasis diagnosis. The aim of this study was to review the main methods applied in the clinical laboratory for the diagnosis of giardiasis in the last 10 years, regarding specificity and sensitivity criteria. High variability was observed in the performance of the same methodology across studies; however, several techniques have been considered better than microscopy. The latter, although the gold standard, presents low sensitivity when few cysts are present in the sample, and the experience of the microscopist must also be considered. We conclude that microscopy should still be performed, and a complementary technique is recommended, in order to provide a reliable diagnosis and proper treatment of the patient. Copyright © 2016 Elsevier B.V. All rights reserved.
Current techniques in acid-chloride corrosion control and monitoring at The Geysers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirtz, Paul; Buck, Cliff; Kunzman, Russell
1991-01-01
Acid chloride corrosion of geothermal well casings, production piping and power plant equipment has resulted in costly corrosion damage, frequent curtailments of power plants and the permanent shut-in of wells in certain areas of The Geysers. Techniques have been developed to mitigate these corrosion problems, allowing continued production of steam from high chloride wells with minimal impact on production and power generation facilities. The optimization of water and caustic steam scrubbing, steam/liquid separation and process fluid chemistry has led to effective and reliable corrosion mitigation systems currently in routine use at The Geysers. When properly operated, these systems can yield steam purities equal to or greater than those encountered in areas of The Geysers where chloride corrosion is not a problem. Developments in corrosion monitoring techniques, steam sampling and analytical methodologies for trace impurities, and computer modeling of the fluid chemistry have been instrumental in the success of this technology.
Williams, M S; Ebel, E D; Cao, Y
2013-01-01
The fitting of statistical distributions to microbial sampling data is a common application in quantitative microbiology and risk assessment applications. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is often not the case. This study develops a weighted maximum likelihood estimation framework that is appropriate for microbiological samples that are collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, are provided to demonstrate the method and highlight the magnitude of biases in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to properly weight samples to account for how data are collected can introduce substantial biases into inferences drawn from the data. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications. © 2012 No claim to US Government works.
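As an illustration of the general idea of weighted maximum likelihood for unequal-probability samples (not the authors' exact estimator), the sketch below maximizes a design-weighted log-likelihood, with each observation weighted by the inverse of its assumed selection probability. The lognormal data, the weights, and the starting values are all hypothetical placeholders.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
x = stats.lognorm.rvs(s=0.8, scale=np.exp(1.0), size=300, random_state=rng)  # hypothetical concentrations
w = 1.0 / rng.uniform(0.2, 1.0, size=x.size)                                 # hypothetical design weights

def neg_weighted_loglik(theta):
    """Weighted negative log-likelihood for a lognormal fit.

    Fitting a normal to log(x) is equivalent to the lognormal MLE up to a
    theta-independent term, so the maximizer is unchanged.
    """
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                      # keep sigma positive
    return -np.sum(w * stats.norm.logpdf(np.log(x), loc=mu, scale=sigma))

res = optimize.minimize(neg_weighted_loglik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"weighted MLE: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```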
NASA Astrophysics Data System (ADS)
Guidi, Giovanni; Scannapieco, Cecilia; Walcher, C. Jakob
2015-12-01
We study the sources of biases and systematics in the derivation of galaxy properties from observational studies, focusing on stellar masses, star formation rates, gas and stellar metallicities, stellar ages, magnitudes and colours. We use hydrodynamical cosmological simulations of galaxy formation, for which the real quantities are known, and apply observational techniques to derive the observables. We also analyse biases that are relevant for a proper comparison between simulations and observations. For our study, we post-process the simulation outputs to calculate the galaxies' spectral energy distributions (SEDs) using stellar population synthesis models and also generate the fully consistent far-UV-submillimetre wavelength SEDs with the radiative transfer code SUNRISE. We compared the direct results of simulations with the observationally derived quantities obtained in various ways, and found that systematic differences in all studied galaxy properties appear, which are caused by: (1) purely observational biases, (2) the use of mass-weighted and luminosity-weighted quantities, with preferential sampling of more massive and luminous regions, (3) the different ways of constructing the template of models when a fit to the spectra is performed, and (4) variations due to different calibrations, most notably for gas metallicities and star formation rates. Our results show that large differences can appear depending on the technique used to derive galaxy properties. Understanding these differences is of primary importance both for simulators, to allow a better judgement of similarities and differences with observations, and for observers, to allow a proper interpretation of the data.
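Point (2) above, the difference between mass-weighted and luminosity-weighted quantities, can be illustrated with a toy calculation. The numbers below are invented: young populations are assigned a much higher light-to-mass ratio, so the luminosity-weighted mean age comes out biased low relative to the mass-weighted one.

```python
import numpy as np

# Toy illustration of mass-weighted vs luminosity-weighted mean stellar age (all values assumed).
age_gyr = np.array([0.5, 2.0, 5.0, 10.0])            # hypothetical stellar populations
mass    = np.array([0.05, 0.15, 0.30, 0.50])          # mass fractions
lum     = mass * np.array([20.0, 5.0, 2.0, 1.0])      # crude light-to-mass ratios

age_mw = np.sum(mass * age_gyr) / np.sum(mass)        # mass-weighted mean age
age_lw = np.sum(lum * age_gyr) / np.sum(lum)          # luminosity-weighted mean age
print(f"mass-weighted = {age_mw:.1f} Gyr, luminosity-weighted = {age_lw:.1f} Gyr")
```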
Pecoraro, Carlo; Babbucci, Massimiliano; Villamor, Adriana; Franch, Rafaella; Papetti, Chiara; Leroy, Bruno; Ortega-Garcia, Sofia; Muir, Jeff; Rooker, Jay; Arocha, Freddy; Murua, Hilario; Zudaire, Iker; Chassot, Emmanuel; Bodin, Nathalie; Tinti, Fausto; Bargelloni, Luca; Cariani, Alessia
2016-02-01
Global population genetic structure of yellowfin tuna (Thunnus albacares) is still poorly understood despite its relevance for the tuna fishery industry. Low levels of genetic differentiation among oceans speak in favour of the existence of a single worldwide panmictic population of this highly migratory fish. However, recent studies indicated genetic structuring at much smaller geographic scales than previously considered, pointing out that yellowfin tuna (YFT) population genetic structure has not been properly assessed so far. In this study, we demonstrated, for the first time, the utility of the 2b-RAD genotyping technique for investigating population genetic diversity and differentiation in high gene-flow species. Running the de novo pipeline in Stacks, a total of 6772 high-quality genome-wide SNPs were identified across Atlantic, Indian and Pacific population samples representing all major distribution areas. Preliminary analyses showed shallow but significant population structure among oceans (FST=0.0273; P-value<0.01). Discriminant Analysis of Principal Components endorsed the presence of genetically discrete yellowfin tuna populations among the three oceanic pools. Although such evidence needs to be corroborated by increasing the sample size, these results showed the efficiency of this genotyping technique in assessing genetic divergence in a marine fish with high dispersal potential. Copyright © 2015 Elsevier B.V. All rights reserved.
Molecular Modeling of Nucleic Acid Structure: Electrostatics and Solvation
Bergonzo, Christina; Galindo-Murillo, Rodrigo; Cheatham, Thomas E.
2014-01-01
This unit presents an overview of computer simulation techniques as applied to nucleic acid systems, ranging from simple in vacuo molecular modeling techniques to more complete all-atom molecular dynamics treatments that include an explicit representation of the environment. The third in a series of four units, this unit focuses on critical issues in solvation and the treatment of electrostatics. UNITS 7.5 & 7.8 introduced the modeling of nucleic acid structure at the molecular level. This included a discussion of how to generate an initial model, how to evaluate the utility or reliability of a given model, and ultimately how to manipulate this model to better understand the structure, dynamics, and interactions. Subject to an appropriate representation of the energy, such as a specifically parameterized empirical force field, the techniques of minimization and Monte Carlo simulation, as well as molecular dynamics (MD) methods, were introduced as means to sample conformational space for a better understanding of the relevance of a given model. From this discussion, the major limitations with modeling, in general, were highlighted. These are the difficult issues in sampling conformational space effectively—the multiple minima or conformational sampling problems—and accurately representing the underlying energy of interaction. In order to provide a realistic model of the underlying energetics for nucleic acids in their native environments, it is crucial to include some representation of solvation (by water) and also to properly treat the electrostatic interactions. These are discussed in detail in this unit. PMID:18428877
Molecular modeling of nucleic Acid structure: electrostatics and solvation.
Bergonzo, Christina; Galindo-Murillo, Rodrigo; Cheatham, Thomas E
2014-12-19
This unit presents an overview of computer simulation techniques as applied to nucleic acid systems, ranging from simple in vacuo molecular modeling techniques to more complete all-atom molecular dynamics treatments that include an explicit representation of the environment. The third in a series of four units, this unit focuses on critical issues in solvation and the treatment of electrostatics. UNITS 7.5 & 7.8 introduced the modeling of nucleic acid structure at the molecular level. This included a discussion of how to generate an initial model, how to evaluate the utility or reliability of a given model, and ultimately how to manipulate this model to better understand its structure, dynamics, and interactions. Subject to an appropriate representation of the energy, such as a specifically parameterized empirical force field, the techniques of minimization and Monte Carlo simulation, as well as molecular dynamics (MD) methods, were introduced as a way of sampling conformational space for a better understanding of the relevance of a given model. This discussion highlighted the major limitations with modeling in general: the difficulty of sampling conformational space effectively (the multiple minima or conformational sampling problem) and of accurately representing the underlying energy of interaction. In order to provide a realistic model of the underlying energetics for nucleic acids in their native environments, it is crucial to include some representation of solvation (by water) and also to properly treat the electrostatic interactions. These subjects are discussed in detail in this unit. Copyright © 2014 John Wiley & Sons, Inc.
Water quality determination by photographic analysis. [optical density and water turbidity
NASA Technical Reports Server (NTRS)
Klooster, S. A.; Scherz, J. P.
1973-01-01
Aerial reconnaissance techniques to extract water quality parameters from aerial photos are reported. The turbidity can be correlated with total suspended solids if the constituent parts of the effluent remain the same and the volumetric flow remains relatively constant. A monochromator is used for the selection of the bandwidths containing the most information. White reflectance panels are used to locate sampling points and eliminate inherent energy changes from lens flare, radial lens fall-off, and changing subject illumination. Misleading information resulting from bottom effects is avoided by the use of Secchi disc readings and proper choice of wavelength for analyzing the photos.
Statistical auditing of toxicology reports.
Deaton, R R; Obenchain, R L
1994-06-01
Statistical auditing is a new report review process used by the quality assurance unit at Eli Lilly and Co. Statistical auditing allows the auditor to review the process by which the report was generated, as opposed to the process by which the data were generated. We have the flexibility to use different sampling techniques and still obtain thorough coverage of the report data. By properly implementing our auditing process, we can work smarter rather than harder and continue to help our customers increase the quality of their products (reports). Statistical auditing is helping our quality assurance unit meet our customers' needs while maintaining or increasing the quality of our regulatory obligations.
Searching cause of death through different autopsy methods: A new initiative
Das, Abhishek; Chowdhury, Ranadip
2017-01-01
A lawful disposal of a human dead body is only possible after establishment of a proper and valid cause of death. If the cause is obscure, autopsy is the only means of search. Inadequacy and unavailability of health care facilities often make this situation more complicated in developing countries, where many deaths remain unexplained and proper mortality statistics are missing, especially for infants and children. Tissue sampling by needle autopsy and the use of various imaging techniques in virtopsy have been tried globally to find an easier alternative. An exclusive and unique initiative, by limited autopsy through tissue biopsy and body fluid analysis, has been taken to meet this dire need in African and some Asian developing countries, as worldwide accepted institutional data are missing or conflicting at times. Traditional autopsy has changed little in the last century, consisting of external examination and evisceration, dissection of organs with identification of macroscopic pathologies and injuries, followed by histopathology. As some population groups have religious objections to autopsy, demand for a minimally invasive alternative has increased of late. But assessment of the cause of death is most important for medico-legal, epidemiological and research purposes. Thus minimally invasive techniques are of high importance in primary care settings too. In this article, we have made a journey through different autopsy methods, their relevance and applicability in the modern day perspective, considering scientific research articles, textbooks and interviews. PMID:29302514
SSAGES: Software Suite for Advanced General Ensemble Simulations
NASA Astrophysics Data System (ADS)
Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.
2018-01-01
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
Minimum-Impact Camping in the Front Woods.
ERIC Educational Resources Information Center
Schatz, Curt
1994-01-01
Minimum-impact camping techniques that can be applied to resident camp programs include controlling group size and behavior, designing camp sites, moving groups frequently, proper use of fires, proper disposal of food and human wastes, use of biodegradable soaps, and encouraging staff and camper awareness of impacts on the environment. (LP)
Continuous Representation Learning via User Feedback
DOE Office of Scientific and Technical Information (OSTI.GOV)
Representation learning is a deep-learning based technique for extracting features from data for the purpose of machine learning. This normally requires a large amount of data, on the order of tens of thousands to millions of samples, to properly train the deep neural network. This is a system for continuous representation learning, in which the system may be improved with a small number of additional samples (on the order of 10-100). The unique characteristics of this invention include a human-computer feedback component, where a human assesses the quality of the current representation and then provides a better representation to the system. The system then mixes the new data with old training examples to avoid overfitting and improve overall performance of the system. The model can be exported and shared with other users, and it may be applied to additional images the system hasn't seen before.
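The "mix new feedback with old training examples" idea described above can be sketched as a small fine-tuning loop with a replay buffer. This is a hedged illustration of that general strategy, not the patented system: the model, dataset sizes, learning rate and number of epochs are all placeholder assumptions.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset, ConcatDataset

# Stand-in feature extractor / classifier and placeholder data.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
old_data = TensorDataset(torch.randn(1000, 128), torch.randint(0, 10, (1000,)))   # replay buffer
feedback = TensorDataset(torch.randn(32, 128), torch.randint(0, 10, (32,)))       # ~10-100 corrected samples

# Blend the small feedback set with earlier examples so the update does not overfit to it.
loader = DataLoader(ConcatDataset([old_data, feedback]), batch_size=64, shuffle=True)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)    # small learning rate for gentle refinement
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
```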
Calibration of fluorescence resonance energy transfer in microscopy
Youvan, Douglas C.; Silva, Christopher M.; Bylina, Edward J.; Coleman, William J.; Dilworth, Michael R.; Yang, Mary M.
2003-12-09
Imaging hardware, software, calibrants, and methods are provided to visualize and quantitate the amount of Fluorescence Resonance Energy Transfer (FRET) occurring between donor and acceptor molecules in epifluorescence microscopy. The MicroFRET system compensates for overlap among donor, acceptor, and FRET spectra using well characterized fluorescent beads as standards in conjunction with radiometrically calibrated image processing techniques. The MicroFRET system also provides precisely machined epifluorescence cubes to maintain proper image registration as the sample is illuminated at the donor and acceptor excitation wavelengths. Algorithms are described that pseudocolor the image to display pixels exhibiting radiometrically-corrected fluorescence emission from the donor (blue), the acceptor (green) and FRET (red). The method is demonstrated on samples exhibiting FRET between genetically engineered derivatives of the Green Fluorescent Protein (GFP) bound to the surface of Ni chelating beads by histidine-tags.
Calibration of fluorescence resonance energy transfer in microscopy
Youvan, Douglas C.; Silva, Christopher M.; Bylina, Edward J.; Coleman, William J.; Dilworth, Michael R.; Yang, Mary M.
2002-09-24
Imaging hardware, software, calibrants, and methods are provided to visualize and quantitate the amount of Fluorescence Resonance Energy Transfer (FRET) occurring between donor and acceptor molecules in epifluorescence microscopy. The MicroFRET system compensates for overlap among donor, acceptor, and FRET spectra using well characterized fluorescent beads as standards in conjunction with radiometrically calibrated image processing techniques. The MicroFRET system also provides precisely machined epifluorescence cubes to maintain proper image registration as the sample is illuminated at the donor and acceptor excitation wavelengths. Algorithms are described that pseudocolor the image to display pixels exhibiting radiometrically-corrected fluorescence emission from the donor (blue), the acceptor (green) and FRET (red). The method is demonstrated on samples exhibiting FRET between genetically engineered derivatives of the Green Fluorescent Protein (GFP) bound to the surface of Ni chelating beads by histidine-tags.
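The overlap compensation described in the two records above amounts to a per-pixel linear unmixing of the donor, acceptor and FRET channels. The sketch below illustrates that idea in a hedged way: the calibration matrix entries are placeholders that, in practice, would come from the fluorescent-bead standards, and the image is a random stand-in.

```python
import numpy as np

# Per-pixel measurement model m = M @ [donor, acceptor, fret]; invert M to recover abundances.
M = np.array([[1.00, 0.08, 0.15],    # donor channel response to (donor, acceptor, FRET) — assumed
              [0.12, 1.00, 0.25],    # acceptor channel — assumed
              [0.20, 0.10, 1.00]])   # FRET channel — assumed

measured = np.random.rand(256, 256, 3)            # stand-in 3-channel image
corrected = measured @ np.linalg.inv(M).T         # solve M x = m for every pixel

# Pseudocolour convention from the abstract: FRET -> red, acceptor -> green, donor -> blue.
rgb = np.clip(corrected[..., [2, 1, 0]], 0.0, 1.0)
```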
Aquino-Pérez, Dulce María; Peña-Cadena, Daniel; Trujillo-García, José Ubaldo; Jiménez-Sandoval, Jaime Omar; Machorro-Muñoz, Olga Stephanie
2013-01-01
The use of a metered dose inhaler (MDI) is key in the treatment of asthma; its effectiveness is related to proper technique. The purpose of this study was to evaluate the metered dose inhaler technique of the parents or guardians of school children with asthma. In this cross-sectional study, we used a sample of 221 individual caregivers (parent or guardian) of asthmatic children from 5 to 12 years old who use an MDI. We designed a validated questionnaire consisting of 27 items addressing the handling of the inhaler technique. Descriptive statistics were used. Caregivers rated as having a "good technique" comprised 41 fathers (18.6%), 77 mothers (34.8%) and 9 guardians (4.1%), and those with a "regular technique" comprised 32 fathers (14.5%), 48 mothers (21.2%) and 14 guardians (6.3%). Twenty-four asthmatic children aged 9 years (10.9%) were rated as having a "good technique". According to gender, we found a "good technique" in 80 boys (36.2%) and 47 girls (21.3%) and a "regular technique" in 59 boys (26.7%) and 35 girls (15.8%), P 0.0973, RP 0.9. A "regular technique" was found mainly in asthmatic children diagnosed between 1 and 3 years of age. Most of the participants had a good technical qualification; however, major mistakes were made at key points of its performance.
Pérez-Zárate, Pamela; Aragón-Piña, Antonio; Soria-Guerra, Ruth Elena; González-Amaro, Ana María; Pérez-Urizar, José; Pérez-González, Luis Fernando; Martinez-Gutierrez, Fidel
2015-11-01
To determine the association of risk factors with the presence of biofilm on catheters of patients attended at a tertiary care hospital. A total of 126 patients were included; data were collected by observing the handling of the central venous catheter (CVC) and reviewing the clinical history, and microbiological isolation methods for CVC tips (roll-plate, sonication and scanning electron microscopy) were evaluated. Certain factors, such as the lack of proper hand washing, the use of primary barriers and preparing medications in the same hospital service, showed an important relationship with biofilm formation on CVCs. With the sonication method, most of the samples yielded multispecies isolates (29 samples, 64%); in contrast, with the roll-plate method just one such sample (3%) was obtained. The importance of strict aseptic techniques for insertion and handling of the CVC was highlighted; failure of both was related to biofilm formation, as evidenced using scanning electron microscopy. Since this tool is not available in most hospitals, we present the correlation of that evidence with other standard microbiological methods and risk factors, which are necessary for the sensitive detection of the different steps of biofilm formation on CVCs and their correct interpretation alongside clinical evidence. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Alonso-Floriano, F. J.
2015-11-01
This thesis is focused on the study of low-mass objects that can be targets of exoplanet searches with near-infrared spectrographs in general and CARMENES (Calar Alto high-Resolution search for M dwarfs with Exo-earths with Near-infrared and optical Echelle Spectrographs; see Quirrenbach et al. 2014) in particular. The CARMENES consortium comprises 11 institutions from Germany and Spain that are building a high-resolution spectrograph (R=82,000) with two channels, visible (0.55 - 1.05 um) and infrared (0.95 - 1.7 um), for the 3.5 m Calar Alto telescope. It will observe a sample of 300 M dwarfs in 600 nights of guaranteed time during at least three years, starting in January 2016. The final sample will be chosen from the 2200 M dwarfs included in the CARMENCITA input catalogue. For these stars, we have obtained and collected a large amount of data: spectral types, radial and rotational velocities, photometry in several bands, etc. Part of the effort of the science preparation necessary for the final selection of targets for CARMENES and other near-infrared spectrographs has been collected in two publications, which are presented in this PhD thesis. In the first publication (Alonso-Floriano et al., 2015A&A...577A.128A), we obtained low-resolution spectra for 753 stars using the CAFOS spectrograph at the 2.2 m Calar Alto telescope. The main goal was to derive accurate spectral types, which are fundamental parameters for the sample selection. We used a grid of 49 standard stars, from spectral types K3V to M8V, together with a double least-squares minimisation technique and 31 spectral indices previously defined by other authors. In addition, we quantified the surface gravity, metallicity and chromospheric activity of the sample, in order to detect low-gravity stars (giants and very young stars), metal-poor and very metal-poor stars (subdwarfs), and very active stars. In the second publication (Alonso-Floriano et al., 2015A&A...583A..85A), we searched for common proper motion companions, especially of low mass, to members of the nearby, young beta Pictoris moving group. First, we compiled a list of 185 members and candidate members of beta Pictoris from 35 representative studies of this moving group. Next, we used the Aladin and STILTS virtual observatory tools, as well as the PPMXL proper motion and Washington double stars catalogues. The objects that showed proper motions similar to those of the stars in the sample were targets of an astro-photometric follow-up. The 36 common proper motion companions eventually obtained were the subject of a binding-energy study to determine whether the pairs are physically bound.
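The spectral typing step described above, fitting a target spectrum against a grid of standard-star templates by least squares, can be sketched as follows. This is a hedged illustration of the general idea rather than the thesis' double minimisation with spectral indices; the wavelength grid and the template and target spectra below are random placeholders (the real grid spans K3V to M8V).

```python
import numpy as np

wave = np.linspace(5500.0, 9000.0, 2000)                  # assumed wavelength grid (Angstrom)
standards = {f"M{i}V": np.random.rand(wave.size) for i in range(9)}  # placeholder templates
observed = np.random.rand(wave.size)                       # placeholder target spectrum

def chi2(obs, tpl):
    """Least-squares distance after fitting the best template scaling."""
    scale = np.dot(obs, tpl) / np.dot(tpl, tpl)
    return np.sum((obs - scale * tpl) ** 2)

best = min(standards, key=lambda name: chi2(observed, standards[name]))
print("best-matching spectral type:", best)
```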
Extending neutron autoradiography technique for boron concentration measurements in hard tissues.
Provenzano, Lucas; Olivera, María Silvina; Saint Martin, Gisela; Rodríguez, Luis Miguel; Fregenal, Daniel; Thorp, Silvia I; Pozzi, Emiliano C C; Curotto, Paula; Postuma, Ian; Altieri, Saverio; González, Sara J; Bortolussi, Silva; Portu, Agustina
2018-07-01
The neutron autoradiography technique using polycarbonate nuclear track detectors (NTD) has been extended to quantify the boron concentration in hard tissues, an application of special interest in Boron Neutron Capture Therapy (BNCT). Chemical and mechanical processing methods to prepare thin tissue sections as required by this technique have been explored. Four different decalcification methods governed by slow and fast kinetics were tested in boron-loaded bones. Due to the significant loss of the boron content, this technique was discarded. On the contrary, mechanical manipulation to obtain bone powder and tissue sections tens of microns thick proved reproducible and suitable, ensuring proper conservation of the boron content in the samples. A calibration curve that relates the ¹⁰B concentration of a bone sample and the track density in a Lexan NTD is presented. Bone powder embedded in boric acid solution with known boron concentrations between 0 and 100 ppm was used as a standard material. The samples, contained in slim Lexan cases, were exposed to a neutron fluence of 10¹² cm⁻² at the thermal column central facility of the RA-3 reactor (Argentina). The revealed tracks in the NTD were counted with an image processing software. The effect of track overlapping was studied and the corresponding corrections were implemented in the presented calibration curve. Stochastic simulations of the track densities produced by the products of the ¹⁰B thermal neutron capture reaction for different boron concentrations in bone were performed and compared with the experimental results. The remarkable agreement between the two curves suggested the suitability of the obtained experimental calibration curve. This neutron autoradiography technique was finally applied to determine the boron concentration in pulverized and compact bone samples coming from a sheep experimental model. The obtained results for both types of samples agreed with boron measurements carried out by ICP-OES within experimental uncertainties. The fact that the histological structure of bone sections remains preserved allows for future boron microdistribution analysis. Copyright © 2018 Elsevier Ltd. All rights reserved.
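The calibration step described above can be sketched as a straight-line fit of track density against the boron concentration of the standards, which is then inverted to estimate the concentration of an unknown sample. All numbers below are invented placeholders, and the published overlap corrections are not applied here.

```python
import numpy as np

# Assumed standards: boron concentration (ppm) vs measured track density (tracks / mm^2).
conc_ppm = np.array([0.0, 10.0, 25.0, 50.0, 75.0, 100.0])
track_density = np.array([30.0, 410.0, 1020.0, 2050.0, 2980.0, 4010.0])

slope, intercept = np.polyfit(conc_ppm, track_density, 1)   # linear calibration curve

def boron_from_density(rho):
    """Invert the linear calibration: boron concentration (ppm) for a measured track density."""
    return (rho - intercept) / slope

print(f"estimated concentration: {boron_from_density(1500.0):.1f} ppm")
```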
Vanderford, Brett J; Mawhinney, Douglas B; Trenholm, Rebecca A; Zeigler-Holady, Janie C; Snyder, Shane A
2011-02-01
Proper collection and preservation techniques are necessary to ensure sample integrity and maintain the stability of analytes until analysis. Data from improperly collected and preserved samples could lead to faulty conclusions and misinterpretation of the occurrence and fate of the compounds being studied. Because contaminants of emerging concern, such as pharmaceuticals and personal care products (PPCPs) and steroids, generally occur in surface and drinking water at ng/L levels, these compounds in particular require such protocols to accurately assess their concentrations. In this study, sample bottle types, residual oxidant quenching agents, preservation agents, and hold times were assessed for 21 PPCPs and steroids in surface water and finished drinking water. Amber glass bottles were found to have the least effect on target analyte concentrations, while high-density polyethylene bottles had the most impact. Ascorbic acid, sodium thiosulfate, and sodium sulfite were determined to be acceptable quenching agents, and preservation with sodium azide at 4 °C kept the largest number of target compounds stable. A combination of amber glass bottles, ascorbic acid, and sodium azide preserved analyte concentrations for 28 days in the tested matrices when held at 4 °C. Samples without a preservation agent were determined to be stable for all but two of the analytes when stored in amber glass bottles at 4 °C for 72 h. Results suggest that if improper protocols are utilized, reported concentrations of target PPCPs and steroids may be inaccurate.
When properly conducted, sediment removal is an effective lake management technique. This chapter describes: (1) purposes of sediment removal, (2) environmental concerns, (3) appropriate depth of sediment removal, (4) sediment removal techniques, (5) suitable lake conditions, (6)...
Functional Wigner representation of quantum dynamics of Bose-Einstein condensate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Opanchuk, B.; Drummond, P. D.
2013-04-15
We develop a method of simulating the full quantum field dynamics of multi-mode multi-component Bose-Einstein condensates in a trap. We use the truncated Wigner representation to obtain a probabilistic theory that can be sampled. This method produces c-number stochastic equations which may be solved using conventional stochastic methods. The technique is valid for large mode occupation numbers. We give a detailed derivation of methods of functional Wigner representation appropriate for quantum fields. Our approach describes spatial evolution of spinor components and properly accounts for nonlinear losses. Such techniques are applicable to calculating the leading quantum corrections, including effects such as quantum squeezing, entanglement, EPR correlations, and interactions with engineered nonlinear reservoirs. By using a consistent expansion in the inverse density, we are able to explain an inconsistency in the nonlinear loss equations found by earlier authors.
Minaire, P; Flores, J L; Cherpin, J; Weber, D
1987-01-01
A survey of the total population of a small rural village was undertaken to discern the amount of impairment and disability present, using both questionnaire and examination techniques. It was confirmed that the concept of disability was applicable to such populations and that it was related to age, at both extremes of life. There was a good correlation between self-reporting and examination of locomotor and sensory impairments but it was poor when applied to prehensile or dexterity skills. This survey showed that such techniques can be applied to populations. As yet the handicap code in the International Classification of Impairment, Disability and Handicap has not been related to this population but this is seen as an important next step in the proper understanding of chronic disability in the community.
Aerosol profiling during the large scale field campaign CINDI-2
NASA Astrophysics Data System (ADS)
Apituley, Arnoud; Roozendael, Michel Van; Richter, Andreas; Wagner, Thomas; Friess, Udo; Hendrick, Francois; Kreher, Karin; Tirpitz, Jan-Lukas
2018-04-01
For the validation of spaceborne observations of NO2 and other trace gases from hyperspectral imagers, ground-based instruments based on the MAXDOAS technique are an excellent choice, since they rely on retrieval techniques similar to those used for the observations from orbit. To ensure proper traceability of the MAXDOAS observations, a thorough validation and intercomparison is mandatory. Advanced MAXDOAS observation and retrieval techniques enable inferring the vertical structure of trace gases and aerosols. These techniques and their results need validation by, e.g., lidar techniques. For a proper understanding of the results from passive remote sensing techniques, independent observations of the parameters that determine the light paths are needed, i.e. in-situ aerosol observations of optical and microphysical properties; vertical profiles of aerosol optical properties from (Raman) lidar are particularly essential. The approach used in the CINDI-2 campaign held in Cabauw in 2016 is presented in this paper and the results will be discussed in the presentation at the conference.
Proper motion separation of Be stars in the Milky Way and the Magellanic Clouds
NASA Astrophysics Data System (ADS)
Vieira, K.; García, A.; Sabogal, B.
2018-01-01
We present a proper motion investigation of a sample of Be star candidates towards the Large Magellanic Cloud (LMC), which has resulted in the identification of two separate populations, in the Galactic foreground and in the Magellanic background. OGLE BVI and 2MASS JHK photometry were used with the SPM4 proper motions to discriminate the different populations located towards the LMC. Two populations with distinctive infrared colours and noticeably different kinematics were found; the bluer sample is consistent with being in the LMC and the redder one with belonging to the Milky Way (MW) disk. This settles the nature of the redder sample, which had been described in previous publications as a possible unknown subclass of stars among the Be candidates in the LMC.
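Separating a foreground and a background population on the basis of proper motions and colours, as described above, can be illustrated with a simple two-component Gaussian mixture. This is a hedged sketch of the general idea, not the authors' procedure; the catalogue values below are random placeholders with deliberately distinct kinematics and colours.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
n = 500
pm = np.vstack([rng.normal([0.0, 0.0], 0.6, size=(n, 2)),       # "background-like" proper motions (mas/yr)
                rng.normal([4.0, -3.0], 2.0, size=(n, 2))])      # "foreground disk-like" proper motions
jk = np.concatenate([rng.normal(0.2, 0.10, n),                   # bluer J-K colours
                     rng.normal(0.8, 0.15, n)])                  # redder J-K colours
X = np.column_stack([pm, jk])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
print("stars per component:", np.bincount(labels))
```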
Andrade, Mariane A; Lanças, Fernando M
2017-04-14
Ochratoxin A (OTA), a widely studied mycotoxin, can be found in a variety of food matrices. As its concentration in food is generally low (of the order of μg kg⁻¹), sample preparation techniques are necessary for analyte purification and pre-concentration in order to achieve the required low detection limits. The separation and detection methods used for OTA analysis should also offer proper sensitivity in order to allow adequate quantification of the analyte. This manuscript addresses the development of a methodology for the analysis of OTA in wine samples by packed in-tube SPME in flow-through extraction mode coupled to HPLC-MS/MS. The in-tube SPME set-up utilized a PEEK tube packed with C18 particles as the extraction column. The method was optimized by a central composite design (2² plus 3 extra central points), having as factors the percentage of ACN and the time of the sample load step. The performance of the method was verified and its analytical conditions enhanced by using 22% ACN and a 6 min sample load step. Validation of the method was also accomplished prior to the analysis of both dry red wine and dry white wine samples. The method demonstrated proper sensitivity, with detection and quantification limits equal to 0.02 and 0.05 μg L⁻¹, respectively. Linearity and precision exhibited a 0.996 correlation coefficient and RSD under 6%, respectively. The method proved to be accurate at medium and higher concentration levels, with a maximum recovery of 73% at the higher concentration levels. OTA was not detected in either the dry red or the dry white wine samples evaluated in this work. If present, it would be at concentrations lower than the detection and quantification limits established for the proposed method, and not considered a potential danger to human health according to our present knowledge. Copyright © 2017 Elsevier B.V. All rights reserved.
Study of consumer fireworks post-blast residues by ATR-FTIR.
Martín-Alberca, Carlos; Zapata, Félix; Carrascosa, Héctor; Ortega-Ojeda, Fernando E; García-Ruiz, Carmen
2016-03-01
Specific analytical procedures are required for the forensic analysis of pre- and post-blast consumer firework samples, which present significant challenges. To date, vibrational spectroscopic techniques such as Fourier transform infrared spectroscopy (FTIR) have not been tested for the analysis of post-blast residues, in spite of their interesting strengths for the forensic field. Therefore, this work proposes a simple and fast procedure for the sampling and analysis of consumer firework post-blast residues by a portable FTIR instrument with an Attenuated Total Reflection (ATR) accessory. In addition, the post-blast residue spectra of several consumer fireworks were studied in order to identify their original chemical compositions. Hence, this work analysed 22 standard reagents that are either commonly employed to make consumer fireworks or related to their combustion products. Then, 5 different consumer fireworks were exploded, and their residues were sampled with dry cotton swabs and directly analysed by ATR-FTIR. In addition, their pre-blast fuses and charges were also analysed in order to establish a proper comparison. As a result, the original chemical compositions of the post-blast samples were identified. Some of the compounds found were potassium chlorate, barium nitrate, potassium nitrate, potassium perchlorate and charcoal. An additional study involving chemometric tools found that the results might greatly depend on the swab head type used for the sampling and its sampling efficiency. The proposed procedure could be used as a complementary technique for the analysis of consumer firework post-blast residues. Copyright © 2015 Elsevier B.V. All rights reserved.
Multivariate localization methods for ensemble Kalman filtering
NASA Astrophysics Data System (ADS)
Roh, S.; Jun, M.; Szunyogh, I.; Genton, M. G.
2015-05-01
In ensemble Kalman filtering (EnKF), the small number of ensemble members that is feasible to use in a practical data assimilation application leads to sampling variability of the estimates of the background error covariances. The standard approach to reducing the effects of this sampling variability, which has also been found to be highly efficient in improving the performance of EnKF, is the localization of the estimates of the covariances. One family of localization techniques is based on taking the Schur (entry-wise) product of the ensemble-based sample covariance matrix and a correlation matrix whose entries are obtained by the discretization of a distance-dependent correlation function. While the proper definition of the localization function for a single state variable has been extensively investigated, a rigorous definition of the localization function for multiple state variables has seldom been considered. This paper introduces two strategies for the construction of localization functions for multiple state variables. The proposed localization functions are tested in experiments that assimilate simulated observations into the bivariate Lorenz 95 model.
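The single-variable Schur-product localization referred to above can be sketched as follows: a compactly supported Gaspari-Cohn correlation function is evaluated on the model grid and multiplied entry-wise with the ensemble sample covariance. The grid size, ensemble size and localization half-width below are illustrative assumptions, and the ensemble itself is a random stand-in for model states.

```python
import numpy as np

def gaspari_cohn(d, c):
    """Gaspari-Cohn fifth-order piecewise rational correlation function; zero beyond 2c."""
    z = np.abs(d) / c
    f = np.zeros_like(z)
    inner = z <= 1.0
    outer = (z > 1.0) & (z <= 2.0)
    zi, zo = z[inner], z[outer]
    f[inner] = -0.25 * zi**5 + 0.5 * zi**4 + 0.625 * zi**3 - 5.0 / 3.0 * zi**2 + 1.0
    f[outer] = (zo**5 / 12.0 - 0.5 * zo**4 + 0.625 * zo**3 + 5.0 / 3.0 * zo**2
                - 5.0 * zo + 4.0 - 2.0 / (3.0 * zo))
    return f

n_grid, n_ens, c = 40, 20, 4.0                       # assumed grid size, ensemble size, half-width
ensemble = np.random.randn(n_grid, n_ens)            # stand-in ensemble of model states
P_sample = np.cov(ensemble)                          # ensemble-based sample covariance

dist = np.abs(np.subtract.outer(np.arange(n_grid), np.arange(n_grid))).astype(float)
P_localized = P_sample * gaspari_cohn(dist, c)       # Schur (entry-wise) product localization
```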
Korkut, Süleyman; Kök, M Samil; Korkut, Derya Sevim; Gürleyen, Tuğba
2008-04-01
Heat treatment is often used to improve the dimensional stability of wood. In this study, the effects of heat treatment on the technological properties of Red-bud maple (Acer trautvetteri Medw.) wood were examined. Samples obtained from Düzce Forest Enterprises, Turkey, were subjected to heat treatment at varying temperatures (120 degrees C, 150 degrees C and 180 degrees C) and for varying durations (2 h, 6 h and 10 h). The technological properties of heat-treated wood samples and control samples were tested. Compression strength parallel to grain, bending strength, modulus of elasticity in bending, Janka hardness, impact bending strength, and tension strength perpendicular to grain were determined. The results showed that technological strength values decreased with increasing treatment temperature and treatment time. Red-bud maple wood could be utilized, using proper heat treatment techniques with minimal losses in strength values, in applications such as window frames where working properties and stability are important factors.
NASA Astrophysics Data System (ADS)
Chen, Bo; Li, Yi; Sun, Zhen-Ya
2018-06-01
In this study, PbSe bulk samples were prepared by a high-pressure high-temperature (HPHT) sintering technique, and the phase compositions, band gaps and thermoelectric properties of the samples were systematically investigated. The sintering pressure exerts a significant influence on the preferential orientation, band gap and thermoelectric properties of PbSe. With increasing pressure, the preferential orientation decreases, mainly due to the decreased crystallinity, while the band gap first decreases and then increases. The electrical conductivity and power factor decrease gradually with increasing pressure, mainly attributed to the decreased carrier concentration and mobility. Consequently, the sample prepared at 2 GPa shows the highest thermoelectric figure-of-merit, ZT, of 0.55 at ~475 K. The ZT of the HPHT-sintered PbSe could be further improved by proper doping or by optimizing the HPHT parameters. This study further demonstrates that the sintering pressure could be another degree of freedom to manipulate the band structure and thermoelectric properties of materials.
Optimal space communications techniques. [all digital phase locked loop for FM demodulation
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1973-01-01
The design, development, and analysis of a digital phase-locked loop (DPLL) for FM demodulation and threshold extension are reported. One of the features of the developed DPLL is its synchronous, real-time operation. The sampling frequency is constant and all the required arithmetic and logic operations are performed within one sampling period, generating an output sequence which is converted to analog form and filtered. An equation relating the sampling frequency to the carrier frequency must be satisfied to guarantee proper DPLL operation. The synchronous operation enables time-shared operation of one DPLL to demodulate several FM signals simultaneously. In order to obtain information about the DPLL performance at low input signal-to-noise ratios, a model of an input noise spike was introduced, and the DPLL equation was solved using a digital computer. The spike model was successful in finding a second-order DPLL which yielded a 5 dB threshold extension beyond that of a first-order DPLL.
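For orientation, the sketch below shows a very simple first-order digital phase-locked loop recovering the message from a synthetic FM signal. It only illustrates the principle, not the report's specific synchronous design; the sampling rate, carrier frequency, deviation and loop gain are assumed values, and the recovered signal is only proportional to the message after low-pass filtering.

```python
import numpy as np

fs = 48_000.0                                     # assumed sampling frequency (Hz)
fc = 6_000.0                                      # assumed carrier frequency (Hz)
t = np.arange(0, 0.05, 1.0 / fs)
message = np.sin(2 * np.pi * 100.0 * t)           # test modulating signal
kf = 2 * np.pi * 1_000.0                          # assumed frequency deviation (rad/s per unit message)
rx = np.cos(2 * np.pi * fc * t + kf * np.cumsum(message) / fs)   # synthetic FM signal

kp = 0.5                                          # loop gain (assumed)
theta = 0.0
raw = np.zeros_like(rx)
for n in range(rx.size):
    err = rx[n] * -np.sin(theta)                  # phase detector: ~0.5*sin(phase error) plus ripple
    theta += 2 * np.pi * fc / fs + kp * err       # advance local oscillator, corrected by the error
    raw[n] = kp * err * fs / kf                   # scaled error ~ instantaneous frequency offset

demod = np.convolve(raw, np.ones(64) / 64, mode="same")   # crude low-pass to remove carrier ripple
```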
NASA Astrophysics Data System (ADS)
Mehdikhani, Mehdi; Ghaziof, Sharareh
2018-01-01
In this research, poly-ɛ-caprolactone (PCL), polyethylene glycol (PEG), multi-wall carbon nanotubes (MWCNTs), and nanocomposite scaffolds containing 0.5 and 1% (w/w) MWCNTs coated with fibrin glue (FG) were prepared via a solvent casting and freeze-drying technique for cardiac tissue engineering. Scanning electron microscopy, transmission electron microscopy, Fourier transform-infrared spectroscopy, and X-ray diffraction were used to characterize the samples. Furthermore, the mechanical properties, electrical conductivity, degradation, contact angle, and cytotoxicity of the samples were evaluated. Results showed uniform distribution of the MWCNTs, with some aggregates, in the prepared nanocomposite scaffolds. The scaffolds containing 1% (w/w) MWCNTs, with and without FG coating, showed optimum modulus of elasticity, high electrical conductivity, and wettability compared with the PCL/PEG and PCL/PEG/0.5% (w/w) MWCNT scaffolds. FG coating enhanced electrical conductivity and cell response, and increased wettability of the constructs. The prepared scaffolds were degraded significantly after 60 days of immersion in PBS. Meanwhile, the nanocomposite containing 1% (w/w) MWCNTs with FG coating (S3) showed proper spreading and viability of the myoblasts seeded on it after 1, 4, and 7 days of culture. The scaffold containing 1% (w/w) MWCNTs with FG coating demonstrated optimal properties, including acceptable mechanical properties, proper wettability, high electrical conductivity, satisfactory degradation, and an excellent myoblast response.
Efficacy of a sperm-selection chamber in terms of morphology, aneuploidy and DNA packaging.
Seiringer, M; Maurer, M; Shebl, O; Dreier, K; Tews, G; Ziehr, S; Schappacher-Tilp, G; Petek, E; Ebner, T
2013-07-01
Since most current techniques for analysing spermatozoa inevitably exclude these gametes from further use, attempts have been made to enrich semen samples with physiological spermatozoa of good prognosis using special sperm-processing methods. A particular sperm-selection chamber, called the Zech-selector, was found to be effective in completely eliminating spermatozoa with DNA strand breaks. The aim of this study was to further analyse the subgroup of spermatozoa accumulated using the Zech-selector. In detail, the potential of the chamber to select for proper sperm morphology, DNA status and chromatin condensation was tested. Two samples, native and processed semen, from 53 patients were analysed for sperm morphology (×1000, ×6300), DNA packaging (fragmentation, chromatin condensation) and chromosomal status (X, Y, 18). Migration time (the time needed for proper sperm accumulation) was significantly correlated with fast progressive motility (P=0.002). The present sperm-processing method was highly successful with respect to all parameters analysed (P<0.001). In particular, spermatozoa showing numeric (17.4% of patients without aneuploidy) or structural chromosomal abnormalities (90% of patients without strand breaks) were separated most effectively. To summarize, further evidence is provided that separating spermatozoa without exposure to centrifugation stress results in a population of highly physiological spermatozoa. Copyright © 2013 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
Vesga, Fidson-Juarismy; Moreno, Yolanda; Ferrús, María Antonia; Campos, Claudia; Trespalacios, Alba Alicia
2018-05-01
Helicobacter pylori is one of the most common causes of chronic bacterial infection in humans, and a predisposing factor for peptic ulcer and gastric cancer. The infection has been consistently associated with lack of access to clean water and proper sanitation. H. pylori has been detected in surface water, wastewater and drinking water. However, its ability to survive in an infectious state in the environment is hindered because it rapidly loses its cultivability. The aim of this study was to determine the presence of cultivable and therefore viable H. pylori in influent and effluent water from drinking water treatment plants (DWTP). A total of 310 influent and effluent water samples were collected from three drinking water treatment plants located in Bogotá, Colombia. Specific detection of H. pylori was achieved by culture, qPCR and FISH techniques. Fifty-six positive H. pylori cultures were obtained from the water samples. Characteristic colonies were covered by the growth of a large number of other bacteria present in the water samples, making isolation difficult to perform. Thus, the mixed cultures were submitted to fluorescence in situ hybridization (FISH) and qPCR analysis, followed by sequencing of the amplicons for confirmation. By qPCR, 77 water samples, both from the influent and the effluent, were positive for the presence of H. pylori. The results of our study demonstrate that viable H. pylori cells were present in both influent and effluent water samples obtained from drinking water treatment plants in Bogotá and provide further evidence that contaminated water may act as a transmission vehicle for H. pylori. Moreover, FISH and qPCR prove to be rapid and specific techniques for identifying H. pylori in complex environmental samples such as influent water. Copyright © 2018 Elsevier GmbH. All rights reserved.
Wölfel, Roman; Pfeffer, Martin; Essbauer, Sandra; Nerkelun, Sylke; Dobler, Gerhard
2006-11-01
Human adenoviruses (HAdV) may cause pharyngoconjunctival fever, follicular conjunctivitis or epidemic keratoconjunctivitis (EKC). Outbreaks of the latter, especially, may lead to severe economic losses when preventive measures are implemented too late. Thus, a safe sampling method, proper specimen transport conditions and a fast and sensitive diagnostic technique are mandatory. Two commercially available virus transport systems (VTS) were compared with two NaCl-moisturised sampling devices, one of which comprises Dacron-tipped plastic-shafted swabs and the other a cotton-tipped wood-shafted swab, available in most ophthalmologists' offices. Downstream methods for specific detection of HAdV included direct immunofluorescence assay (IFA) of conjunctival swabs, virus isolation by cell culture and quantitative real-time polymerase chain reaction (qPCR). Furthermore, the influence of the application of local anaesthetics prior to swabbing on subsequent detection of HAdV was investigated. Application of local anaesthetics had a positive influence on the number of swabbed cells, thus increasing the chance of obtaining positive results by IFA. Neither isolation of HAdV by cell culture nor by qPCR was negatively influenced by this pretreatment. Surprisingly, both commercially available VTS performed significantly worse than the NaCl-moisturised swabs. This was shown with regard to virus recovery rates in cell culture as well as viral genome copy numbers in the qPCR. Based on our results, the following recommendations are provided to improve sampling, transport and diagnostic techniques regarding conjunctival swabs for the diagnosis of human adenovirus infection: (1) application of local anaesthetics, (2) NaCl-moisturised VTS for shipment of specimens, and (3) detection of HAdV by qPCR. The latter method proved to be superior to virus isolation by cell culture, including subsequent identification by IFA, because it is faster, more sensitive and allows simultaneous handling of a number of samples. Hence, countermeasures to prevent further virus spread in an outbreak situation can be implemented earlier, thus reducing the number of subsequent adenoviral infections.
Yawn, Barbara P; Colice, Gene L; Hodder, Rick
2012-01-01
Sustained bronchodilation using inhaled medications in moderate to severe chronic obstructive pulmonary disease (COPD) grades 2 and 3 (Global Initiative for Chronic Obstructive Lung Disease guidelines) has been shown to have clinical benefits on long-term symptom control and quality of life, with possible additional benefits on disease progression and longevity. Aggressive diagnosis and treatment of symptomatic COPD is an integral and pivotal part of COPD management, which usually begins with primary care physicians. The current standard of care involves the use of one or more inhaled bronchodilators, and depending on COPD severity and phenotype, inhaled corticosteroids. There is a wide range of inhaler devices available for delivery of inhaled medications, but suboptimal inhaler use is a common problem that can limit the clinical effectiveness of inhaled therapies in the real-world setting. Patients' comorbidities, other physical or mental limitations, and the level of inhaler technique instruction may limit proper inhaler use. This paper presents information that can overcome barriers to proper inhaler use, including issues in device selection, steps in correct technique for various inhaler devices, and suggestions for assessing and monitoring inhaler techniques. Ensuring proper inhaler technique can maximize drug effectiveness and aid clinical management at all grades of COPD.
Yawn, Barbara P; Colice, Gene L; Hodder, Rick
2012-01-01
Sustained bronchodilation using inhaled medications in moderate to severe chronic obstructive pulmonary disease (COPD) grades 2 and 3 (Global Initiative for Chronic Obstructive Lung Disease guidelines) has been shown to have clinical benefits on long-term symptom control and quality of life, with possible additional benefits on disease progression and longevity. Aggressive diagnosis and treatment of symptomatic COPD is an integral and pivotal part of COPD management, which usually begins with primary care physicians. The current standard of care involves the use of one or more inhaled bronchodilators, and depending on COPD severity and phenotype, inhaled corticosteroids. There is a wide range of inhaler devices available for delivery of inhaled medications, but suboptimal inhaler use is a common problem that can limit the clinical effectiveness of inhaled therapies in the real-world setting. Patients’ comorbidities, other physical or mental limitations, and the level of inhaler technique instruction may limit proper inhaler use. This paper presents information that can overcome barriers to proper inhaler use, including issues in device selection, steps in correct technique for various inhaler devices, and suggestions for assessing and monitoring inhaler techniques. Ensuring proper inhaler technique can maximize drug effectiveness and aid clinical management at all grades of COPD. PMID:22888221
Gas analysis system for the Eight Foot High Temperature Tunnel
NASA Technical Reports Server (NTRS)
Leighty, Bradley D.; Davis, Patricia P.; Upchurch, Billy T.; Puster, Richard L.
1992-01-01
This paper describes the development of a gas collection and analysis system that is to be installed in the Eight-Foot High Temperature Tunnel (8' HTT) at NASA's Langley Research Center. This system will be used to analyze the test gas medium that results after burning a methane-air mixture to achieve the proper tunnel test parameters. The system consists of a sampling rake, a gas sample storage array, and a gas chromatographic system. Gas samples will be analyzed after each run to assure that proper combustion takes place in the tunnel resulting in a correctly balanced composition of the test gas medium. The proper ratio of gas species is critically necessary in order for the proper operation and testing of scramjet engines in the tunnel. After a variety of methane-air burn conditions have been analyzed, additional oxygen will be introduced into the combusted gas and the enriched test gas medium analyzed. The pre/post enrichment sets of data will be compared to verify that the gas species of the test gas medium is correctly balanced for testing of air-breathing engines.
[Effect of near infrared spectrum on the precision of PLS model for oil yield from oil shale].
Wang, Zhi-Hong; Liu, Jie; Chen, Xiao-Chao; Sun, Yu-Yang; Yu, Yang; Lin, Jun
2012-10-01
Present methods for measuring the oil yield of oil shale cannot be used for in-situ detection and are unable to meet the requirements of oil shale resource exploration and exploitation. In-situ oil yield analysis of oil shale can, however, be achieved with the portable near infrared (NIR) spectroscopy technique. Different NIR spectral data formats correlate differently with the contents of sample components, and the absorption characteristics of sample components differ across NIR spectral regions. Therefore, using proportioned samples, PLS modeling experiments were carried out with three data formats (reflectance, absorbance and the Kubelka-Munk function) and four modeling spectral regions, and the effect of NIR spectral data format and region on the precision of the PLS model for oil yield from oil shale was studied. The results show that, with the PLS modeling method and proportioned samples, the best data format is reflectance and the best modeling region is the combined spectral range. Therefore, choosing the appropriate data format and the proper characteristic spectral region can increase the precision of the PLS model for oil yield from oil shale.
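For readers unfamiliar with the workflow the abstract describes, the following sketch shows how the same spectra can be fed to a PLS model as reflectance, absorbance, or the Kubelka-Munk function and compared by cross-validation. The spectra, band shape, and oil yields are synthetic; only the format conversions and the model-comparison pattern are meant to be illustrative (requires numpy and scikit-learn).

    # Minimal sketch (not the paper's data or model): effect of the spectral data
    # format fed to a PLS model, on synthetic "reflectance" spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 60, 200
    oil_yield = rng.uniform(2, 12, n_samples)        # hypothetical oil yield, %

    # Fake reflectance spectra whose band depth scales with oil yield.
    base = np.linspace(0.6, 0.9, n_wavelengths)
    band = np.exp(-0.5 * ((np.arange(n_wavelengths) - 120) / 10) ** 2)
    reflectance = (base - 0.02 * oil_yield[:, None] * band
                   + 0.005 * rng.standard_normal((n_samples, n_wavelengths)))

    formats = {
        "reflectance": reflectance,
        "absorbance": -np.log10(reflectance),
        "K-M": (1 - reflectance) ** 2 / (2 * reflectance),   # Kubelka-Munk function
    }
    for name, X in formats.items():
        r2 = cross_val_score(PLSRegression(n_components=5), X, oil_yield, cv=5, scoring="r2")
        print(f"{name:12s} mean CV R^2 = {r2.mean():.3f}")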
Daemi, Hamed; Barikani, Mehdi; Barmar, Mohammad
2014-05-01
A number of different ionic aqueous polyurethane dispersions (PUDs) were synthesized based on NCO-terminated prepolymers. Two different anionic and cationic polyurethane samples were synthesized using dimethylol propionic acid and N-methyldiethanolamine emulsifiers, respectively. Then, proper amounts of PUDs and sodium alginate were mixed to obtain a number of aqueous polyurethane dispersion-sodium alginate (PUD/SA) elastomers. The chemical structure, thermal, morphological, thermo-mechanical and mechanical properties, and hydrophilicity of the prepared samples were studied by FTIR, EDX, DSC, TGA, SEM, DMTA, tensile testing and contact angle techniques. The cationic polyurethanes and their blends with sodium alginate showed excellent miscibility and highly stretchable properties, while the samples containing anionic polyurethanes and alginate showed poor compatibility and no significant miscibility. The morphology of the alginate particles shifted from nanoparticles to microparticles on changing the nature of the PUDs from cationic to anionic. The final cationic elastomers not only showed better mechanical properties but also were more easily formulated than the anionic samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Xu, Yong; Li, Dan; Yin, Zongqi; He, Aijuan; Lin, Miaomiao; Jiang, Gening; Song, Xiao; Hu, Xuefei; Liu, Yi; Wang, Jinpeng; Wang, Xiaoyun; Duan, Liang; Zhou, Guangdong
2017-08-01
Tissue-engineered trachea provides a promising approach for reconstruction of long segmental tracheal defects. However, a lack of ideal biodegradable scaffolds greatly restricts its clinical translation. Decellularized trachea matrix (DTM) is considered a proper scaffold for trachea cartilage regeneration owing to natural tubular structure, cartilage matrix components, and biodegradability. However, cell residual and low porosity of DTM easily result in immunogenicity and incomplete cartilage regeneration. To address these problems, a laser micropore technique (LMT) was applied in the current study to modify trachea sample porosity to facilitate decellular treatment and cell ingrowth. Decellularization processing demonstrated that cells in LMT treated samples were more easily removed compared with untreated native trachea. Furthermore, after optimizing the protocols of LMT and decellular treatments, the LMT-treated DTM (LDTM) could retain their original tubular shape with only mild extracellular matrix damage. After seeding with chondrocytes and culture in vitro for 8 weeks, the cell-LDTM constructs formed tubular cartilage with relatively homogenous cell distribution in both micropores and bilateral surfaces. In vivo results further confirmed that the constructs could form mature tubular cartilage with increased DNA and cartilage matrix contents, as well as enhanced mechanical strength, compared with native trachea. Collectively, these results indicate that LDTM is an ideal scaffold for tubular cartilage regeneration and, thus, provides a promising strategy for functional reconstruction of trachea cartilage. Lacking ideal biodegradable scaffolds greatly restricts development of tissue-engineered trachea. Decellularized trachea matrix (DTM) is considered a proper scaffold for trachea cartilage regeneration. However, cell residual and low porosity of DTM easily result in immunogenicity and incomplete cartilage regeneration. By laser micropore technique (LMT), the current study efficiently enhanced the porosity and decellularized efficacy of DTM. The LMT-treated DTM basically retained the original tubular shape with mild matrix damage. After chondrocyte seeding followed by in vitro culture and in vivo implantation, the constructs formed mature tubular cartilage with matrix content and mechanical strength similar to native trachea. The current study provides an ideal scaffold and a promising strategy for cartilage regeneration and functional reconstruction of trachea. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
MANAGEMENT OF LAKES THROUGH SEDIMENT REMOVAL
When properly conducted, sediment removal is an effective lake management technique. This paper describes: (1) the purpose of sediment removal, (2) environmental concerns, (3) depth of sediment removal, (4) sediment removal techniques, (5) suitable lake conditions, (6) exemplary ...
A comparative physical evaluation of four X-ray films.
Egyed, M; Shearer, D R
1981-09-01
In this study, four general-purpose radiographic films (Agfa Gevaert Curix RP-1, duPont Cronex 4, Fuji RX, and Kodak XRP-1) were compared using three independent techniques. By examining the characteristic curves for the four films, film speed and contrast were compared over the diagnostically useful density range. These curves were generated using three methods: (1) irradiation of a standard film cassette lined with high-speed screens, covered by a twelve-step aluminum wedge; (2) direct exposure of film strips to an electro-luminescent sensitometer; and (3) direct irradiation of a standard film cassette lined with high-speed screens. The latter technique provided quantitative values for film speed and relative contrast. All three techniques provided virtually identical results and indicated that, under properly controlled conditions, simplified methods of film testing can give results equivalent to those obtained by more sophisticated techniques.
Optimizing the diagnostic testing of Clostridium difficile infection.
Bouza, Emilio; Alcalá, Luis; Reigadas, Elena
2016-09-01
Clostridium difficile infection (CDI) is the leading cause of hospital-acquired diarrhea and is associated with a considerable health and cost burden. However, there is still no clear consensus on the best laboratory diagnostic approach, and a wide variation of testing methods and strategies can be encountered. We aim to review the most practical aspects of CDI diagnosis, providing our own view on how to optimize it. Expert commentary: Laboratory diagnosis in search of C. difficile toxins should be applied to all diarrheic fecal samples reaching the microbiology laboratory from patients > 2 years old, with or without classic risk factors for CDI. Detection of toxins, either directly in the fecal sample or in the bacteria isolated in culture, confirms CDI in the proper clinical setting. Nucleic acid amplification techniques (NAAT) allow the process to be sped up, with epidemiological and therapeutic consequences.
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision-support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
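A Hasse diagram is built from a simple dominance relation, which the snippet below illustrates on an invented set of procedures and criteria (all oriented so that larger values are better). It is a sketch of the partial-order idea, not the authors' 11-variable dataset.

    # Minimal sketch of the partial order behind a Hasse diagram ranking:
    # procedure a dominates b if it is at least as good on every criterion
    # and strictly better on at least one. All values here are hypothetical.
    import numpy as np

    procedures = ["proc_A", "proc_B", "proc_C", "proc_D"]
    criteria = np.array([
        [0.9, 0.7, 0.8],   # e.g. recovery, 1/LOD, greenness score (invented)
        [0.8, 0.9, 0.6],
        [0.7, 0.6, 0.5],
        [0.9, 0.9, 0.9],
    ])

    def dominates(a, b):
        return np.all(a >= b) and np.any(a > b)

    n = len(procedures)
    for i in range(n):
        for j in range(n):
            if i != j and dominates(criteria[i], criteria[j]):
                print(f"{procedures[i]} > {procedures[j]}")

    # Maximal (non-dominated) elements sit at the top of the Hasse diagram.
    maximal = [p for i, p in enumerate(procedures)
               if not any(dominates(criteria[j], criteria[i]) for j in range(n) if j != i)]
    print("non-dominated:", maximal)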
A robust approach to optimal matched filter design in ultrasonic non-destructive evaluation (NDE)
NASA Astrophysics Data System (ADS)
Li, Minghui; Hayward, Gordon
2017-02-01
The matched filter was demonstrated to be a powerful yet efficient technique to enhance defect detection and imaging in ultrasonic non-destructive evaluation (NDE) of coarse-grain materials, provided that the filter was properly designed and optimized. In the literature, in order to accurately approximate the defect echoes, the design utilized the real excitation signals, which made it time consuming and less straightforward to implement in practice. In this paper, we present a more robust and flexible approach to optimal matched filter design using simulated excitation signals; the control parameters are chosen and optimized based on the real scenario of the array transducer, the transmitter-receiver system response, and the test sample. As a result, the filter response is optimized and depends on the material characteristics. Experiments on industrial samples are conducted, and the results confirm the great benefits of the method.
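The core of the approach, correlating the received signal with a simulated excitation template, can be sketched in a few lines. The sampling rate, centre frequency, and noise level below are assumptions, not the authors' transducer parameters.

    # Minimal sketch of matched filtering with a simulated excitation signal
    # (not the authors' transducer model): the template is applied by
    # correlation to a noisy A-scan. Requires numpy and scipy.
    import numpy as np
    from scipy.signal import correlate

    fs = 100e6                                   # assumed 100 MHz sampling rate
    t = np.arange(0, 2e-6, 1 / fs)
    f0 = 5e6                                     # assumed 5 MHz centre frequency
    template = np.sin(2 * np.pi * f0 * t) * np.hanning(t.size)   # simulated excitation

    rng = np.random.default_rng(1)
    ascan = 0.5 * rng.standard_normal(4000)      # grain-like noise
    delay = 1500
    ascan[delay:delay + t.size] += 0.5 * template  # weak defect echo buried in noise

    # Matched filter output: cross-correlation of the A-scan with the template.
    mf_output = correlate(ascan, template, mode="same")
    print("estimated echo position:", np.argmax(np.abs(mf_output)))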
Core-shifts and proper-motion constraints in the S5 polar cap sample at the 15 and 43 GHz bands
NASA Astrophysics Data System (ADS)
Abellán, F. J.; Martí-Vidal, I.; Marcaide, J. M.; Guirado, J. C.
2018-06-01
We have studied a complete radio sample of active galactic nuclei with the very-long-baseline-interferometry (VLBI) technique and for the first time successfully obtained high-precision phase-delay astrometry at Q band (43 GHz) from observations acquired in 2010. We have compared our astrometric results with those obtained with the same technique at U band (15 GHz) from data collected in 2000. The differences in source separations among all the source pairs observed in common at the two epochs are compatible at the 1σ level between U and Q bands. With the benefit of quasi-simultaneous U and Q band observations in 2010, we have studied chromatic effects (core-shift) at the radio source cores with three different methods. The magnitudes of the core-shifts are of the same order (about 0.1 mas) for all methods. However, some discrepancies arise in the orientation of the core-shifts determined through the different methods. In some cases these discrepancies are due to insufficient signal for the method used. In others, the discrepancies reflect assumptions of the methods and could be explained by curvatures in the jets and departures from conical jets.
Data re-arranging techniques leading to proper variable selections in high energy physics
NASA Astrophysics Data System (ADS)
Kůs, Václav; Bouř, Petr
2017-12-01
We introduce a new data-based approach to homogeneity testing and variable selection carried out in high energy physics experiments, where one of the basic tasks is to test the homogeneity of weighted samples, mainly Monte Carlo simulations (weighted) against real data measurements (unweighted). This technique is called ’data re-arranging’ and it enables variable selection performed by means of classical statistical homogeneity tests such as Kolmogorov-Smirnov, Anderson-Darling, or Pearson’s chi-square divergence test. P-values of our variants of the homogeneity tests are investigated, and empirical verification on 46-dimensional high energy particle physics data sets is accomplished under the newly proposed (equiprobable) quantile binning. In particular, the procedure of homogeneity testing is applied to re-arranged Monte Carlo samples and real data sets measured at the DØ experiment at the Tevatron accelerator at Fermilab, originating from top-antitop quark pair production in two decay channels (electron, muon) with 2, 3, or 4+ jets detected. Finally, the variable selections in the electron and muon channels induced by the re-arranging procedure for homogeneity testing are provided for the Tevatron top-antitop quark data sets.
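The essential recipe, equiprobable quantile binning followed by a classical homogeneity test between a weighted Monte Carlo sample and unweighted data, might look like the sketch below. The samples and weights are synthetic, and the two-sample chi-square statistic is a simplified form that does not propagate the Monte Carlo weight variance.

    # Minimal sketch (synthetic numbers, not the DØ data): equiprobable quantile
    # binning of one variable, then a Pearson chi-square homogeneity test between
    # weighted Monte Carlo and unweighted data.
    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(2)
    data = rng.normal(0.0, 1.0, 5000)                # unweighted "measured" events
    mc = rng.normal(0.05, 1.0, 20000)                # weighted Monte Carlo events
    w_mc = np.full(mc.size, data.size / mc.size)     # toy weights normalising MC to data

    k = 20                                           # number of equiprobable bins
    edges = np.quantile(np.concatenate([data, mc]), np.linspace(0, 1, k + 1))
    edges[0], edges[-1] = -np.inf, np.inf

    n_data, _ = np.histogram(data, bins=edges)
    n_mc, _ = np.histogram(mc, bins=edges, weights=w_mc)

    # Simplified two-sample statistic; a full treatment includes the weight variance.
    stat = np.sum((n_data - n_mc) ** 2 / (n_data + n_mc))
    p_value = chi2.sf(stat, df=k - 1)
    print(f"chi-square = {stat:.1f},  p-value = {p_value:.3f}")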
NASA Astrophysics Data System (ADS)
Wan, Gengping; Peng, Xiange; Zeng, Min; Yu, Lei; Wang, Kan; Li, Xinyue; Wang, Guizhen
2017-09-01
This paper reports the synthesis of a new type of Au@TiO2 yolk-shell nanostructure by integrating an ion sputtering method with the atomic layer deposition (ALD) technique, and its applications as a visible light-driven photocatalyst and a surface-enhanced Raman spectroscopy (SERS) substrate. Both the size and amount of the gold nanoparticles confined in the TiO2 nanotubes could be facilely controlled by properly adjusting the sputtering time. The unique structure and morphology of the resulting Au@TiO2 samples were investigated in detail using various spectroscopic and microscopic techniques. It is found that all tested samples can absorb visible light with a maximum absorption at localized surface plasmon resonance (LSPR) wavelengths (550-590 nm), which are determined by the size of the gold nanoparticles. The Au@TiO2 yolk-shell composites were used as the photocatalyst for the degradation of methylene blue (MB). Compared with pure TiO2 nanotubes, the Au@TiO2 composites exhibit improved photocatalytic properties towards the degradation of MB. The SERS performance of the Au@TiO2 yolk-shell composites was also examined to investigate the detection sensitivity for MB.
Development of the symmetrical laser shock test for weak bond inspection.
NASA Astrophysics Data System (ADS)
Sagnard, Maxime; Berthe, Laurent; Ecault, Romain; Touchard, Fabienne; Boustie, Michel
2017-06-01
This paper presents the LAser Shock Adhesion Test (LASAT) using symmetrical laser shocks. The study is part of the ComBoNDT European project, which develops new Non-Destructive Tests (NDT) to assess the adherence properties of bonded composite structures. This NDT technique relies on the creation of a plasma on both sides of the sample using two lasers. The plasma expands and generates shockwaves inside the material. When combined, the shockwaves create a local tensile stress. Properly set, this stress can be used to test interface adherence. Numerous experiments have shown that this adaptive technique can discriminate a good bond from a weak one without damaging the composite structure. Weak bonds are usually created by contaminated surfaces (residues of release agent, finger prints, ...) and were artificially recreated for the ComBoNDT test samples. Numerical simulations are being developed as well, to improve the comprehension of the physical phenomenon. Ultimately, using these numerical results, one should be able to find the correct laser parameters (intensity, laser spot diameter) to generate the right tensile stress at the desired location. This project has received funding from the European Union's Horizon 2020 research and innovation program under Grant agreement N 63649.
Perchlorate as an emerging contaminant in soil, water and food.
Kumarathilaka, Prasanna; Oze, Christopher; Indraratne, S P; Vithanage, Meththika
2016-05-01
Perchlorate (ClO₄⁻) is a strong oxidizer and has gained significant attention due to its reactivity, occurrence, and persistence in surface water, groundwater, soil and food. Stable isotope techniques (i.e., ¹⁸O/¹⁶O, ¹⁷O/¹⁶O, and ³⁷Cl/³⁵Cl) facilitate the differentiation of naturally occurring perchlorate from anthropogenic perchlorate. At high enough concentrations, perchlorate can inhibit proper function of the thyroid gland. The dietary reference dose (RfD) for perchlorate exposure from both food and water is set at 0.7 μg kg⁻¹ body weight per day, which translates to a drinking water level of 24.5 μg L⁻¹. Chromatographic techniques (i.e., ion chromatography and liquid chromatography-mass spectrometry) can be used successfully to detect trace levels of perchlorate in environmental samples. Perchlorate can be effectively removed by a wide variety of remediation techniques such as bio-reduction, chemical reduction, adsorption, membrane filtration, ion exchange and electro-reduction. Bio-reduction is appropriate for large-scale treatment plants, whereas ion exchange is suitable for removing trace levels of perchlorate in aqueous media. The environmental occurrence of perchlorate, its toxicity, analytical techniques, and removal technologies are presented. Copyright © 2016 Elsevier Ltd. All rights reserved.
The effect of sample hydration on 13C CPMAS NMR spectra of fulvic acids
Hatcher, P.G.; Wilson, M.A.
1991-01-01
Three fulvic acids, two of which (the Armadale and Suwannee River fulvic acids) have been well studied by a number of other groups, have been examined by high-resolution solid-state 13C-NMR techniques to delineate the effect of absorbed water. Two main effects of absorbed water were observed: (1) changes in the spin-lattice relaxation times in the rotating frame and in the cross-polarization times, and (2) total loss of signal, so that some fulvic acid is effectively in solution. These results suggest that discrepancies in the literature concerning the observed relative signal intensities from different structural groups are due to absorbed water, and they emphasize the necessity for proper precautionary drying before spectroscopic analysis. © 1991.
APIC position paper: safe injection, infusion, and medication vial practices in health care.
Dolan, Susan A; Felizardo, Gwenda; Barnes, Sue; Cox, Tracy R; Patrick, Marcia; Ward, Katherine S; Arias, Kathleen Meehan
2010-04-01
Outbreaks involving the transmission of bloodborne pathogens or other microbial pathogens to patients in various types of health care settings due to unsafe injection, infusion, and medication vial practices are unacceptable. Each of the outbreaks could have been prevented by the use of proper aseptic technique in conjunction with basic infection prevention practices for handling parenteral medications, administration of injections, and procurement and sampling of blood. This document provides practice guidance for health care facilities on essential safe injection, infusion, and vial practices that should be consistently implemented in such settings. 2010 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
An Optimal Strategy for Accurate Bulge-to-disk Decomposition of Disk Galaxies
NASA Astrophysics Data System (ADS)
Gao, Hua; Ho, Luis C.
2017-08-01
The development of two-dimensional (2D) bulge-to-disk decomposition techniques has shown their advantages over traditional one-dimensional (1D) techniques, especially for galaxies with non-axisymmetric features. However, the full potential of 2D techniques has yet to be fully exploited. Secondary morphological features in nearby disk galaxies, such as bars, lenses, rings, disk breaks, and spiral arms, are seldom accounted for in 2D image decompositions, even though some image-fitting codes, such as GALFIT, are capable of handling them. We present detailed, 2D multi-model and multi-component decomposition of high-quality R-band images of a representative sample of nearby disk galaxies selected from the Carnegie-Irvine Galaxy Survey, using the latest version of GALFIT. The sample consists of five barred and five unbarred galaxies, spanning Hubble types from S0 to Sc. Traditional 1D decomposition is also presented for comparison. In detailed case studies of the 10 galaxies, we successfully model the secondary morphological features. Through a comparison of best-fit parameters obtained from different input surface brightness models, we identify morphological features that significantly impact bulge measurements. We show that nuclear and inner lenses/rings and disk breaks must be properly taken into account to obtain accurate bulge parameters, whereas outer lenses/rings and spiral arms have a negligible effect. We provide an optimal strategy to measure bulge parameters of typical disk galaxies, as well as prescriptions to estimate realistic uncertainties of them, which will benefit subsequent decomposition of a larger galaxy sample.
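As a much-simplified illustration of what a bulge-to-disk decomposition does (1D, two components only, no bars or rings, and not GALFIT), the following fits a Sersic bulge plus an exponential disk to a synthetic surface-brightness profile; all parameter values are invented.

    # Minimal 1D illustration (not GALFIT, not the survey data): Sersic bulge +
    # exponential disk fit to a synthetic profile. Requires numpy and scipy.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import gammaincinv

    def sersic(r, Ie, Re, n):
        bn = gammaincinv(2 * n, 0.5)             # standard b_n definition
        return Ie * np.exp(-bn * ((r / Re) ** (1 / n) - 1))

    def bulge_disk(r, Ie, Re, n, I0, h):
        return sersic(r, Ie, Re, n) + I0 * np.exp(-r / h)

    r = np.linspace(0.5, 60, 120)                # radius in arcsec (arbitrary)
    truth = (50.0, 2.0, 3.0, 20.0, 12.0)         # hypothetical parameters
    rng = np.random.default_rng(3)
    profile = bulge_disk(r, *truth) * (1 + 0.02 * rng.standard_normal(r.size))

    popt, _ = curve_fit(bulge_disk, r, profile, p0=(30, 3, 2, 10, 10),
                        bounds=([1, 0.1, 0.5, 1, 1], [500, 20, 8, 200, 50]))
    print("fitted (Ie, Re, n, I0, h):", np.round(popt, 2))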
An Optimal Strategy for Accurate Bulge-to-disk Decomposition of Disk Galaxies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao Hua; Ho, Luis C.
The development of two-dimensional (2D) bulge-to-disk decomposition techniques has shown their advantages over traditional one-dimensional (1D) techniques, especially for galaxies with non-axisymmetric features. However, the full potential of 2D techniques has yet to be fully exploited. Secondary morphological features in nearby disk galaxies, such as bars, lenses, rings, disk breaks, and spiral arms, are seldom accounted for in 2D image decompositions, even though some image-fitting codes, such as GALFIT, are capable of handling them. We present detailed, 2D multi-model and multi-component decomposition of high-quality R-band images of a representative sample of nearby disk galaxies selected from the Carnegie-Irvine Galaxy Survey, using the latest version of GALFIT. The sample consists of five barred and five unbarred galaxies, spanning Hubble types from S0 to Sc. Traditional 1D decomposition is also presented for comparison. In detailed case studies of the 10 galaxies, we successfully model the secondary morphological features. Through a comparison of best-fit parameters obtained from different input surface brightness models, we identify morphological features that significantly impact bulge measurements. We show that nuclear and inner lenses/rings and disk breaks must be properly taken into account to obtain accurate bulge parameters, whereas outer lenses/rings and spiral arms have a negligible effect. We provide an optimal strategy to measure bulge parameters of typical disk galaxies, as well as prescriptions to estimate realistic uncertainties of them, which will benefit subsequent decomposition of a larger galaxy sample.
NASA Astrophysics Data System (ADS)
Clenet, A.; Ravera, L.; Bertrand, B.; den Hartog, R.; Jackson, B.; van Leeuwen, B.-J.; van Loon, D.; Parot, Y.; Pointecouteau, E.; Sournac, A.
2014-11-01
IRAP is developing the readout electronics of the SPICA-SAFARI TES bolometer arrays. Based on the frequency-domain multiplexing technique, the readout electronics provides the AC signals to voltage-bias the detectors, demodulates the data, and computes a feedback to linearize the detection chain. The feedback is computed with a specific technique, the so-called baseband feedback (BBFB), which ensures that the loop is stable even with long propagation and processing delays (i.e., several μs) and with fast signals (i.e., frequency carriers of the order of 5 MHz). To optimize the power consumption we took advantage of the reduced science signal bandwidth to decouple the signal sampling frequency from the data processing rate. This technique allowed a reduction of the power consumption of the circuit by a factor of 10. Beyond the firmware architecture, the optimization of the instrument concerns the characterization routines and the definition of the optimal parameters. Indeed, to operate a TES array one has to properly define about 21,000 parameters. We defined a set of procedures to automatically characterize these parameters and find the optimal settings.
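The idea of decoupling the ADC sampling rate from the feedback processing rate can be illustrated with a toy demodulate-and-decimate chain. The rates, carrier frequency, and decimation factor below are illustrative assumptions and have nothing to do with the actual SAFARI firmware.

    # Illustrative only (not the SAFARI firmware): a ~5 MHz carrier is demodulated
    # and the slow science band is decimated, so the feedback computation could
    # run at a fraction of the ADC rate. Requires numpy and scipy.
    import numpy as np
    from scipy.signal import decimate

    fs = 40e6                                     # assumed ADC sampling rate
    t = np.arange(0, 2e-3, 1 / fs)
    f_carrier = 5e6                               # assumed carrier frequency
    science = 0.1 * np.sin(2 * np.pi * 200 * t)   # slow science signal (200 Hz)
    adc = (1 + science) * np.cos(2 * np.pi * f_carrier * t)

    # Demodulate with the local carrier, then decimate by 10: the feedback
    # processing now runs at fs/10 while the carrier synthesis still runs at fs.
    i_component = adc * np.cos(2 * np.pi * f_carrier * t)
    baseband = decimate(i_component, 10, ftype="fir")
    print("samples in:", adc.size, " samples per feedback update stream:", baseband.size)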
Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul
2017-01-01
The analytical bias introduced by most of the commonly used techniques in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. To overcome this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms in order to obtain a deeper metabolome characterization, and we provide a revision of the existing literature in this field. This review is not intended to be exhaustive but, rather, to give readers not familiar with analytical chemistry a practical and concise guide to the considerations involved in the proper selection of the technique to be used in a metabolomic experiment in biomedical research. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Creation of the first Hartford Consensus compliant elementary school in the USA.
Ramly, Elie; Bohnen, Jordan D; Fagenholz, Peter; Yeh, Dante; Velmahos, George; DeMoya, Marc; Kaafarani, Haytham; Butler, Katheryn; Lee, Jarone; King, David R
2016-01-01
The Hartford Consensus established a framework for minimizing deaths due to mass shootings, specifically eliminating preventable deaths due to limb exsanguination. Two major principles defined within this framework are (1) redefining the first responder role and (2) the ubiquitous availability of proper training in application of hemorrhage control techniques, including tourniquets. We hypothesized that this hemorrhage control posture could be fully translated into an elementary school. Following institutional review board approval, all teachers at a prekindergarten through 8th grade elementary school underwent short, intensive instruction on their role as a first responder, as well as indications and proper technique for hemorrhage control and tourniquet application for limb exsanguination. All teachers self-reported their confidence in their role as a first responder as well as tourniquet application indications and technique before and after instruction. Following instruction, teachers were evaluated on proper tourniquet application technique on a simulated limb to assess competence. 26 elementary school teachers and 2 administrative staff underwent training. All reported low confidence in their role as a first responder and in tourniquet application indication and technique before training. Following training, all teachers reported high confidence. Testing demonstrated all teachers were competent in the tourniquet application technique. Following training, each classroom was equipped with a purpose-made commercial tourniquet, and a dedicated hemorrhage control bag was placed in the school's central administrative office. All teachers were successfully trained to act as first responders and in correct hemorrhage control techniques, which was verified by testing. This is the first elementary school to universally adopt a hemorrhage control posture to eliminate preventable deaths from limb exsanguination advocated by the Hartford Consensus.
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at the kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition (PGD) techniques, a generalization of PODs. Proper generalized decomposition techniques can be considered as a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase, in which the results are obtained in real time. Results are provided that show the potential of the proposed technique, together with some benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
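The offline/online split the abstract describes can be illustrated with a plain POD built from an SVD of snapshots. This is a simplified stand-in for the paper's PGD construction, using synthetic snapshot data, not the authors' implementation.

    # Minimal sketch of the offline/online split behind POD-type model reduction
    # (a simplified stand-in for PGD, with synthetic data). Requires numpy.
    import numpy as np

    rng = np.random.default_rng(4)
    n_dof, n_snapshots = 3000, 200
    modes_true = rng.standard_normal((n_dof, 5))
    snapshots = modes_true @ rng.standard_normal((5, n_snapshots))   # synthetic solutions

    # Offline phase: build the reduced basis once from precomputed snapshots.
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    r = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.999)) + 1
    basis = U[:, :r]

    # Online phase: a new state is represented by r coefficients and reconstructed cheaply.
    new_state = modes_true @ rng.standard_normal(5)
    coeffs = basis.T @ new_state                                     # r numbers instead of n_dof
    error = np.linalg.norm(new_state - basis @ coeffs) / np.linalg.norm(new_state)
    print(f"kept {r} modes out of {n_dof} dofs, reconstruction error = {error:.2e}")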
Ramia, S; Sattar, S A
1980-03-01
There is mounting evidence for the waterborne transmission of diarrhea caused by rotaviruses. As a result, proper techniques are required for their recovery from samples of incriminated water. The combined efficiency of the talc-Celite technique and polyethylene glycol 6000 hydroextraction was, therefore, tested for this purpose, using Simian rotavirus SA-11 and MA-104 cells. Conditioning of the dechlorinated tap water samples was carried out by pH adjustment to 6.0 and the addition of Earle balanced salt solution to a final concentration of 1:100. Passage of a 1-liter volume of such a conditioned sample through a layer containing a mixture of talc (300 mg) and Celite 503 (100 mg) led to the adsorption of nearly 93% of the added SA-11 plaque-forming units. For the recovery of the layer-adsorbed virus, 3% beef extract and 1× tryptose phosphate broth were found to be superior to a variety of other eluents tested. When we tested 100-liter sample volumes, layers containing 1.2 g of talc and 0.4 g of Celite were employed. Virus elution was carried out with 100 ml of tryptose phosphate broth. The eluate was concentrated 10-fold by overnight (4 °C) hydroextraction with polyethylene glycol. With a total input virus of 7.0 × 10⁵ and 1.4 × 10² plaque-forming units, the recoveries were about 71 and 59%, respectively.
Extraction of high-quality DNA from ethanol-preserved tropical plant tissues.
Bressan, Eduardo A; Rossi, Mônica L; Gerald, Lee T S; Figueira, Antonio
2014-04-24
Proper conservation of plant samples, especially during remote field collection, is essential to assure quality of extracted DNA. Tropical plant species contain considerable amounts of secondary compounds, such as polysaccharides, phenols, and latex, which affect DNA quality during extraction. The suitability of ethanol (96% v/v) as a preservative solution prior to DNA extraction was evaluated using leaves of Jatropha curcas and other tropical species. Total DNA extracted from leaf samples stored in liquid nitrogen or ethanol from J. curcas and other tropical species (Theobroma cacao, Coffea arabica, Ricinus communis, Saccharum spp., and Solanum lycopersicon) was similar in quality, with high-molecular-weight DNA visualized by gel electrophoresis. DNA quality was confirmed by digestion with EcoRI or HindIII and by amplification of the ribosomal gene internal transcribed spacer region. Leaf tissue of J. curcas was analyzed by light and transmission electron microscopy before and after exposure to ethanol. Our results indicate that leaf samples can be successfully preserved in ethanol for long periods (30 days) as a viable method for fixation and conservation of DNA from leaves. The success of this technique is likely due to reduction or inactivation of secondary metabolites that could contaminate or degrade genomic DNA. Tissue conservation in 96% ethanol represents an attractive low-cost alternative to commonly used methods for preservation of samples for DNA extraction. This technique yields DNA of equivalent quality to that obtained from fresh or frozen tissue.
Extraction of high-quality DNA from ethanol-preserved tropical plant tissues
2014-01-01
Background: Proper conservation of plant samples, especially during remote field collection, is essential to assure quality of extracted DNA. Tropical plant species contain considerable amounts of secondary compounds, such as polysaccharides, phenols, and latex, which affect DNA quality during extraction. The suitability of ethanol (96% v/v) as a preservative solution prior to DNA extraction was evaluated using leaves of Jatropha curcas and other tropical species. Results: Total DNA extracted from leaf samples stored in liquid nitrogen or ethanol from J. curcas and other tropical species (Theobroma cacao, Coffea arabica, Ricinus communis, Saccharum spp., and Solanum lycopersicon) was similar in quality, with high-molecular-weight DNA visualized by gel electrophoresis. DNA quality was confirmed by digestion with EcoRI or HindIII and by amplification of the ribosomal gene internal transcribed spacer region. Leaf tissue of J. curcas was analyzed by light and transmission electron microscopy before and after exposure to ethanol. Our results indicate that leaf samples can be successfully preserved in ethanol for long periods (30 days) as a viable method for fixation and conservation of DNA from leaves. The success of this technique is likely due to reduction or inactivation of secondary metabolites that could contaminate or degrade genomic DNA. Conclusions: Tissue conservation in 96% ethanol represents an attractive low-cost alternative to commonly used methods for preservation of samples for DNA extraction. This technique yields DNA of equivalent quality to that obtained from fresh or frozen tissue. PMID:24761774
Conservation and Preservation of Archives.
ERIC Educational Resources Information Center
Kathpalia, Y. P.
1982-01-01
Presents concept of preventive conservation of archival records as a new science resulting from the use of modern techniques and chemicals. Various techniques for storage, proper environment, preventive de-acidification, fire prevention, restoration, and staff considerations are described. References are provided. (EJS)
Investigation of high-strength bolt-tightening verification techniques.
DOT National Transportation Integrated Search
2016-03-01
The current means and methods of verifying that high-strength bolts have been properly tightened are very laborious and time : consuming. In some cases, the techniques require special equipment and, in other cases, the verification itself may be some...
An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique
Tipper, J.C.
1979-01-01
Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60 and is determined to be computationally competitive with most other methods. © 1979.
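For readers who do not want to dig out the ALGOL 60 listing, the splinter-group step at the heart of dissimilarity analysis (in the spirit of the classic Macnaughton-Smith procedure) can be sketched in Python as below; the six test points are invented.

    # One divisive step of a Macnaughton-Smith-style dissimilarity analysis:
    # seed a splinter group with the most isolated object, then move objects
    # that are, on average, closer to the splinter than to the remainder.
    import numpy as np

    def split_cluster(D, members):
        members = list(members)
        avg = [np.mean([D[i, j] for j in members if j != i]) for i in members]
        splinter = [members.pop(int(np.argmax(avg)))]
        moved = True
        while moved and len(members) > 1:
            moved = False
            gains = []
            for i in members:
                d_rest = np.mean([D[i, j] for j in members if j != i])
                d_spl = np.mean([D[i, j] for j in splinter])
                gains.append(d_rest - d_spl)
            k = int(np.argmax(gains))
            if gains[k] > 0:
                splinter.append(members.pop(k))
                moved = True
        return splinter, members

    # Hypothetical dissimilarity matrix for six objects forming two loose groups.
    pts = np.array([[0, 0], [0.2, 0.1], [0.1, 0.3], [3, 3], [3.2, 2.9], [2.9, 3.3]])
    D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    print(split_cluster(D, range(6)))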
Authentication techniques for smart cards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, R.A.
1994-02-01
Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.
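Of the techniques listed, challenge/response is the easiest to show compactly. The sketch below uses an HMAC over a random challenge purely as an illustration, not as a description of any particular smart card standard, and the key handling is deliberately simplified.

    # Illustrative challenge/response exchange (not a specific smart card scheme):
    # the terminal issues a fresh challenge, the card answers with a keyed MAC.
    import hmac, hashlib, secrets

    card_key = secrets.token_bytes(16)        # hypothetical shared personalization secret

    def card_response(challenge: bytes, key: bytes) -> bytes:
        return hmac.new(key, challenge, hashlib.sha256).digest()

    # Terminal side: issue a random challenge, verify the response with the same key.
    challenge = secrets.token_bytes(8)
    response = card_response(challenge, card_key)
    expected = hmac.new(card_key, challenge, hashlib.sha256).digest()
    assert hmac.compare_digest(response, expected)
    print("card authenticated for challenge", challenge.hex())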
Secular Extragalactic Parallax and Geometric Distances with Gaia Proper Motions
NASA Astrophysics Data System (ADS)
Paine, Jennie; Darling, Jeremiah K.
2018-06-01
The motion of the Solar System with respect to the cosmic microwave background (CMB) rest frame creates a well measured dipole in the CMB, which corresponds to a linear solar velocity of about 78 AU/yr. This motion causes relatively nearby extragalactic objects to appear to move compared to more distant objects, an effect that can be measured in the proper motions of nearby galaxies. An object at 1 Mpc and perpendicular to the CMB apex will exhibit a secular parallax, observed as a proper motion, of 78 µas/yr. The relatively large peculiar motions of galaxies make the detection of secular parallax challenging for individual objects. Instead, a statistical parallax measurement can be made for a sample of objects with proper motions, where the global parallax signal is modeled as an E-mode dipole that diminishes linearly with distance. We present preliminary results of applying this model to a sample of nearby galaxies with Gaia proper motions to detect the statistical secular parallax signal. The statistical measurement can be used to calibrate the canonical cosmological “distance ladder.”
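The quoted numbers are easy to check: a solar velocity of about 78 AU/yr corresponds to a secular parallax of 78/d[pc] arcsec/yr for a source perpendicular to the apex, which the short script below evaluates at a few example distances. The sin-of-apex-angle scaling follows the dipole model described in the abstract; the distances are arbitrary.

    # Back-of-the-envelope check of the quoted secular parallax amplitude.
    import numpy as np

    v_sun_au_per_yr = 78.0          # solar motion w.r.t. the CMB frame, AU/yr

    def secular_parallax_uas_per_yr(d_mpc, angle_from_apex_deg=90.0):
        d_pc = d_mpc * 1e6
        # 1 AU baseline at d pc subtends 1/d arcsec; convert arcsec -> microarcsec.
        return v_sun_au_per_yr / d_pc * np.sin(np.radians(angle_from_apex_deg)) * 1e6

    for d in (1.0, 5.0, 20.0):
        print(f"d = {d:4.1f} Mpc  ->  {secular_parallax_uas_per_yr(d):5.1f} uas/yr")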
Zheng, Cao; Zhao, Jing; Bao, Peng; Gao, Jin; He, Jin
2011-06-24
A novel, simple and efficient dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO) technique, coupled with high-performance liquid chromatography with ultraviolet detection (HPLC-UV) and liquid chromatography-tandem mass spectrometry (LC-MS/MS), was developed for the determination of triclosan and its degradation product 2,4-dichlorophenol in real water samples. The extraction solvent used in this work is of low density, low volatility and low toxicity, and has a proper melting point around room temperature. The extractant droplets can be collected easily by solidifying them at a lower temperature. Parameters that affect the extraction efficiency, including type and volume of extraction solvent and dispersive solvent, salt effect, pH and extraction time, were investigated and optimized in a 5 mL sample system by HPLC-UV. Under the optimum conditions (extraction solvent: 12 μL of 1-dodecanol; dispersive solvent: 300 μL of acetonitrile; sample pH: 6.0; extraction time: 1 min), the limits of detection (LODs) of the pretreatment method combined with LC-MS/MS were in the range of 0.002-0.02 μg L⁻¹, which are lower than or comparable with other reported approaches applied to the determination of the same compounds. Wide linearities, good precisions and satisfactory relative recoveries were also obtained. The proposed technique was successfully applied to determine triclosan and 2,4-dichlorophenol in real water samples. Copyright © 2011 Elsevier B.V. All rights reserved.
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques allows the content and distribution of trace elements on the surface of the examined sample to be determined. In order to obtain reliable results, the developed procedure should be based not only on a properly prepared sample and properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gould, Andrew; Yee, Jennifer C., E-mail: gould@astronomy.ohio-state.edu, E-mail: jyee@astronomy.ohio-state.edu
While of order of a million asteroids have been discovered, the number in rigorously controlled samples that have precise orbits and rotation periods, as well as well-measured colors, is relatively small. In particular, less than a dozen main-belt asteroids with estimated diameters D < 3 km have excellent rotation periods. We show how existing and soon-to-be-acquired microlensing data can yield a large asteroid sample with precise orbits and rotation periods, which will include roughly 6% of all asteroids with maximum brightness I < 18.1 and lying within 10° of the ecliptic. This sample will be dominated by small and very small asteroids, down to D ≈ 1 km. We also show how asteroid astrometry could turn current narrow-angle OGLE proper motions of bulge stars into wide-angle proper motions. This would enable one to measure the proper-motion gradient across the Galactic bar.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walz-Flannigan, A; Lucas, J; Buchanan, K
Purpose: Manual technique selection in radiography is needed for imaging situations where there is difficulty in proper positioning for AEC, for prostheses, for non-bucky imaging, or for guiding image repeats. Basic information about how to provide consistent image signal and contrast for various kV and tissue thicknesses is needed to create manual technique charts, and is relevant for physicists involved in technique chart optimization. Guidance on technique combinations and the rules of thumb still in use today for providing consistent image signal are based on optical-density measurements of screen-film combinations on older-generation x-ray systems. Tools such as a kV-scale chart can be useful for knowing how to modify mAs when kV is changed in order to maintain a consistent image receptor signal level. We evaluate these tools for modern equipment for use in optimizing properly size-scaled techniques. Methods: We used a water phantom to measure calibrated signal change for CR and DR (with grid) for various beam energies. Tube current values were calculated that would yield a consistent image signal response. Data were fit to provide sufficient granularity of detail to compose a technique-scale chart. Tissue thickness was approximated as equivalent to 80% of the water depth. Results: We created updated technique-scale charts, providing mAs and kV combinations that achieve consistent signal for CR and DR for various tissue-equivalent thicknesses. We show how this information can be used to create properly scaled size-based manual technique charts. Conclusion: The relative scaling of mAs and kV for constant signal (i.e., the shape of the curve) appears substantially similar between film-screen and CR/DR. This supports the notion that image receptor related differences are minor factors for relative (not absolute) changes in mAs with varying kV. However, as demonstrated, the creation of these difficult-to-find detailed technique scales is a useful tool for manual chart optimization.
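The abstract's kV-mAs scale comes from the authors' water-phantom measurements, which are not reproduced here. As a stand-in, the sketch below uses the classic radiographic "15% rule" (receptor exposure roughly proportional to kV to the fifth power, since 1.15^5 is about 2), so the numbers are a textbook rule of thumb rather than the paper's data; the baseline technique and exponent are assumptions.

    # Hedged illustration of the kind of kV-mAs scale such charts encode, using the
    # classic 15% rule of thumb rather than the paper's measured water-phantom data.
    ref_kv, ref_mas = 80.0, 10.0          # hypothetical baseline technique
    exponent = 5.0                        # 1.15**5 ~ 2, i.e. the 15% rule

    def mas_for_constant_signal(kv, ref_kv=ref_kv, ref_mas=ref_mas, n=exponent):
        """mAs giving roughly the same receptor signal as the baseline at a new kV."""
        return ref_mas * (ref_kv / kv) ** n

    for kv in (60, 70, 80, 90, 100):
        print(f"{kv:4d} kV  ->  {mas_for_constant_signal(kv):6.1f} mAs")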
Burton, Casey; Shi, Honglan; Ma, Yinfa
2013-11-19
Recent preliminary studies have implicated urinary pteridines as candidate biomarkers in a growing number of malignancies, including breast cancer. While the development of capillary electrophoresis-laser induced fluorescence (CE-LIF), high performance liquid chromatography (HPLC), and liquid chromatography-mass spectrometry (LC-MS) pteridine urinalyses, among others, has helped to enable these findings, limitations including poor pteridine specificity, asynchronous or nonexistent renal dilution normalization, and a lack of information regarding adduct formation in mass spectrometry techniques utilizing electrospray ionization (ESI) have prevented application of these techniques to a larger clinical setting. In this study, a simple, rapid, specific, and sensitive high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) method has been developed and optimized for simultaneous detection of six pteridines previously implicated in breast cancer and of creatinine as a renal dilution factor in urine. In addition, this study reports cationic adduct formation of urinary pteridines under ESI-positive ionization for the first time. This newly developed technique separates and detects the following six urinary pteridines: 6-biopterin, 6-hydroxymethylpterin, D-neopterin, pterin, isoxanthopterin, and xanthopterin, as well as creatinine. The method detection limit for the pteridines is between 0.025 and 0.5 μg/L, and for creatinine it is 0.15 μg/L. The method was also validated by spiked recoveries (81-105%), reproducibility (RSD: 1-6%), and application to 25 real urine samples from breast cancer-positive and -negative subjects in a double-blind study. The proposed technique was finally compared directly with a previously reported CE-LIF technique, concluding that additional or alternative renal dilution factors are needed for proper investigation of urinary pteridines as breast cancer biomarkers.
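Because the renal-dilution correction may be unfamiliar, here is a tiny illustration of the arithmetic of creatinine normalization. The concentrations are invented, and the calculation is the generic ratio, not the paper's comparison of alternative dilution factors.

    # Generic creatinine normalization (made-up numbers): express each pteridine
    # per gram of creatinine so dilute and concentrated urine voids are comparable.
    pteridine_ug_per_L = {"pterin": 1.8, "isoxanthopterin": 3.2}   # hypothetical results
    creatinine_g_per_L = 1.1                                       # hypothetical creatinine

    normalized = {name: round(c / creatinine_g_per_L, 2)           # ug pteridine / g creatinine
                  for name, c in pteridine_ug_per_L.items()}
    print(normalized)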
The application of machine learning techniques in the clinical drug therapy.
Meng, Huan-Yu; Jin, Wan-Lin; Yan, Cheng-Kai; Yang, Huan
2018-05-25
The development of a novel drug is an extremely complicated process that includes target identification, design and manufacture, and proper therapy with the novel drug, as well as drug dose selection, drug efficacy evaluation, and adverse drug reaction control. Because drug development faces limited resources, high costs, long durations, and low hit-to-lead ratios, and with the development of pharmacogenetics and computer technology, machine learning techniques have come to assist novel drug development and have gradually received more attention from researchers. According to current research, machine learning techniques are widely applied in the discovery of new drugs and novel drug targets, decisions surrounding proper therapy and drug dose, and the prediction of drug efficacy and adverse drug reactions. In this article, we discuss the history, workflow, and advantages and disadvantages of machine learning techniques in the processes mentioned above. Although the advantages of machine learning techniques are fairly obvious, their application is currently limited. With further research, the application of machine learning techniques in drug development could become much more widespread and could potentially be one of the major methods used in drug development. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Abe, Hitoshi; Niwa, Yasuhiro; Kimura, Masao; Murakami, Youichi; Yokoyama, Toshiharu; Hosono, Hideo
2016-04-05
A gritty-surface sample holder has been invented to obtain correct XAFS spectra for concentrated samples measured by fluorescence yield (FY). Materials are usually mixed with boron nitride (BN) to prepare proper concentrations for measuring XAFS spectra. Some materials, however, cannot be mixed with BN and would otherwise be measured in too concentrated a condition to obtain correct XAFS spectra. Consequently, the XAFS spectra will be incorrect, typically with decreased peak intensities. We have invented the gritty-surface sample holders to obtain correct XAFS spectra in FY measurements even for concentrated materials. Pure Cu and CuO powders were measured mounted on the sample holders, and the same spectra were obtained as the transmission spectra of properly prepared samples. This sample holder is useful for measuring XAFS of any concentrated material.
Active Learning through Online Instruction
ERIC Educational Resources Information Center
Gulbahar, Yasemin; Kalelioglu, Filiz
2010-01-01
This article explores the use of proper instructional techniques in online discussions that lead to meaningful learning. The research study looks at the effective use of two instructional techniques within online environments, based on qualitative measures. "Brainstorming" and "Six Thinking Hats" were selected and implemented…
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.
Common procedures in reptiles and amphibians.
de la Navarre, Byron J S
2006-05-01
Reptiles and amphibians continue to be popular as pets in the United States and throughout the world. It therefore behooves veterinarians interested in caring for these exotic species to continually gather knowledge concerning both their proper husbandry and the conditions that require medical and/or surgical intervention. This article covers husbandry, physical examination, and clinical and diagnostic techniques in an effort to present guidelines for the evaluation of the reptile or amphibian patient. Gathering clinical data will aid veterinarians in arriving at the proper diagnosis, increasing the chances of success with treatment protocols, and educating the clients in proper nutrition and husbandry for their pets.
Chemically Dissected Rotation Curves of the Galactic Bulge from Main-sequence Proper Motions
NASA Astrophysics Data System (ADS)
Clarkson, William I.; Calamida, Annalisa; Sahu, Kailash C.; Brown, Thomas M.; Gennaro, Mario; Avila, Roberto J.; Valenti, Jeff; Debattista, Victor P.; Rich, R. Michael; Minniti, Dante; Zoccali, Manuela; Aufdemberge, Emily R.
2018-05-01
We report results from an exploratory study implementing a new probe of Galactic evolution using archival Hubble Space Telescope imaging observations. Precise proper motions are combined with photometric relative metallicity and temperature indices, to produce the proper-motion rotation curves of the Galactic bulge separately for metal-poor and metal-rich main-sequence samples. This provides a “pencil-beam” complement to large-scale wide-field surveys, which to date have focused on the more traditional bright giant branch tracers. We find strong evidence that the Galactic bulge rotation curves drawn from “metal-rich” and “metal-poor” samples are indeed discrepant. The “metal-rich” sample shows greater rotation amplitude and a steeper gradient against line-of-sight distance, as well as possibly a stronger central concentration along the line of sight. This may represent a new detection of differing orbital anisotropy between metal-rich and metal-poor bulge objects. We also investigate selection effects that would be implied for the longitudinal proper-motion cut often used to isolate a “pure-bulge” sample. Extensive investigation of synthetic stellar populations suggests that instrumental and observational artifacts are unlikely to account for the observed rotation curve differences. Thus, proper-motion-based rotation curves can be used to probe chemodynamical correlations for main-sequence tracer stars, which are orders of magnitude more numerous in the Galactic bulge than the bright giant branch tracers. We discuss briefly the prospect of using this new tool to constrain detailed models of Galactic formation and evolution. Based on observations made with the NASA/ESA Hubble Space Telescope and obtained from the data archive at the Space Telescope Science Institute. STScI is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS 5-26555.
The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet
ERIC Educational Resources Information Center
Hill, Paul; Rader, Heidi B.; Hino, Jeff
2012-01-01
For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…
Innocent Bystanders: Carbon Stars from the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Green, Paul
2013-03-01
Among stars showing carbon molecular bands (C stars), the main-sequence dwarfs, likely in post-mass transfer binaries, are numerically dominant in the Galaxy. Via spectroscopic selection from the Sloan Digital Sky Survey, we retrieve 1220 high galactic latitude C stars, ~5 times more than previously known, including a wider variety than past techniques such as color or grism selection have netted, and additionally yielding 167 DQ white dwarfs. Of the C stars with proper motion measurements, we identify 69% clearly as dwarfs (dCs), while ~7% are giants. The dCs likely span absolute magnitudes Mi from ~6.5 to 10.5. "G-type" dC stars with weak CN and relatively blue colors are probably the most massive dCs still cool enough to show C2 bands. We report Balmer emission in 22 dCs, none of which are G-types. We find 8 new DA/dC stars in composite spectrum binaries, quadrupling the total sample of these "smoking guns" for AGB binary mass transfer. Eleven very red C stars with strong red CN bands appear to be "N"-type AGB stars at large Galactocentric distances, one likely a new discovery in the dIrr galaxy Leo A. Two such stars within 30' of each other may trace a previously unidentified dwarf galaxy or tidal stream at ~40 kpc. We explore the multiwavelength properties of the sample and report the first X-ray detection of a dC star, which shows strong Balmer emission. Our own spectroscopic survey additionally provides the dC surface density from a complete sample of dwarfs limited by magnitude, color, and proper motion.
Asghari, Alireza; Fahimi, Ebrahim; Bazregar, Mohammad; Rajabi, Maryam; Boutorabi, Leila
2017-05-01
Simple and rapid determinations of several psychotropic drugs in pharmaceutical wastewater and human plasma samples were successfully accomplished via tandem dispersive liquid-liquid microextraction combined with high-performance liquid chromatography-ultraviolet detection (TDLLME-HPLC-UV). TDLLME of the three psychotropic drugs clozapine, chlorpromazine, and thioridazine was easily performed through two consecutive dispersive liquid-liquid microextractions. With this convenient method, proper sample preconcentration and clean-up were achieved in about 7 min. To achieve the best extraction efficiency, the relevant experimental parameters were optimized. The optimal conditions consisted of 100 μL of CCl4 (as the extraction organic solvent) and pH values of 13 and 2 for the donor and acceptor phases, respectively. Under these optimum conditions, the proposed TDLLME-HPLC-UV technique provided good linearity in the range of 5-3000 ng mL⁻¹ for the three psychotropic drugs, with coefficients of determination (R²) higher than 0.996. The limits of quantification (LOQs) and limits of detection (LODs) obtained were 5.0 ng mL⁻¹ and 1.0-1.5 ng mL⁻¹, respectively. Proper enrichment factors (EFs) of 96, 99, and 88 were obtained for clozapine, chlorpromazine, and thioridazine, respectively, together with good extraction repeatability (relative standard deviations below 9.3%, n = 5).
Wind Farm Flow Modeling using an Input-Output Reduced-Order Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Annoni, Jennifer; Gebraad, Pieter; Seiler, Peter
Wind turbines in a wind farm operate individually to maximize their own power regardless of the impact of aerodynamic interactions on neighboring turbines. There is the potential to increase power and reduce overall structural loads by properly coordinating turbines. To perform control design and analysis, a model needs to be of low computational cost while retaining the necessary dynamics seen in high-fidelity models. The objective of this work is to obtain a reduced-order model that represents the full-order flow computed using a high-fidelity model. A variety of methods, including proper orthogonal decomposition and dynamic mode decomposition, can be used to extract the dominant flow structures and obtain a reduced-order model. In this paper, we combine proper orthogonal decomposition with a system identification technique to produce an input-output reduced-order model. This technique is used to construct a reduced-order model of the flow within a two-turbine array computed using a large-eddy simulation.
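As a hedged sketch of the proper orthogonal decomposition step mentioned above (the system-identification stage is only hinted at), the snippet below builds a reduced basis from synthetic flow snapshots via an SVD; the snapshot data, energy threshold, and array shapes are assumptions, not the paper's configuration.

```python
# Sketch of the POD step: collect flow snapshots as columns, subtract the
# mean, take a thin SVD, and keep the leading modes. The retained modal
# coefficients are what a system-identification step would then fit an
# input-output model to. Snapshot data here is synthetic.
import numpy as np

n_grid, n_snap = 2000, 200                          # grid points, snapshots
rng = np.random.default_rng(1)
snapshots = rng.standard_normal((n_grid, n_snap))   # placeholder flow fields

mean_flow = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_flow

# Columns of U are the POD modes; s**2 is proportional to modal energy.
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1          # modes capturing 99% energy

modes = U[:, :r]                        # reduced basis
coeffs = modes.T @ fluct                # time histories of modal coefficients
print(f"kept {r} of {n_snap} modes; coeffs shape = {coeffs.shape}")
```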
Upper-limb biomechanical analysis of wheelchair transfer techniques in two toilet configurations.
Tsai, Chung-Ying; Boninger, Michael L; Bass, Sarah R; Koontz, Alicia M
2018-06-01
Using proper technique is important for minimizing upper limb kinetics during wheelchair transfers. The objective of the study was to 1) evaluate the transfer techniques used during toilet transfers and 2) determine the impact of technique on upper limb joint loading for two different toilet configurations. Twenty-six manual wheelchair users (23 men and 3 women) performed transfers in a side and front wheelchair-toilet orientation while their habitual transfer techniques were evaluated using the Transfer Assessment Instrument. A motion analysis system and force sensors were used to record biomechanical data during the transfers. More than 20% of the participants failed to complete five transfer skills in the side setup compared to three skills in the front setup. Higher quality skills overall were associated with lower peak forces and moments in both toilet configurations (-0.68 < r < -0.40, p < 0.05). In the side setup, participants who properly placed their hands in a stable position and used proper leading handgrips had lower shoulder resultant joint forces and moments than participants who did not perform these skills correctly (p ≤ 0.04). In the front setup, positioning the wheelchair within three inches of the transfer target was associated with reduced peak trailing forces and moments across all three upper limb joints (p = 0.02). Transfer skills training, making toilet seats level with the wheelchair seat, positioning the wheelchair closer to the toilet and mounting grab bars in a more ideal location for persons who do sitting pivot transfers may facilitate better quality toilet transfers. Published by Elsevier Ltd.
Proper motion separation of Be star candidates in the Magellanic Clouds and the Milky Way
NASA Astrophysics Data System (ADS)
Vieira, Katherine; García-Varela, Alejandro; Sabogal, Beatriz
2017-08-01
We present a proper motion investigation of a sample of Be star candidates towards the Magellanic Clouds, which has resulted in the identification of separate populations, in the Galactic foreground and in the Magellanic background. Be stars are, broadly speaking, B-type stars that have shown emission lines in their spectra. In this work, we studied a sample of 2446 and 1019 Be star candidates towards the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC), respectively, taken from the literature and proposed as possible Be stars due to their variability behaviour in the OGLE-II I band. JHKs magnitudes from the InfraRed Survey Facility catalogue and proper motions from the Southern Proper Motion 4 catalogue were obtained for 1188 and 619 LMC and SMC Be star candidates, respectively. Colour-colour and vector-point diagrams were used to identify different populations amongst the Be star candidates. In the LMC sample, two populations with distinctive infrared colours and kinematics were found: the bluer sample is consistent with being in the LMC and the redder one with belonging to the Milky Way disc. This settles the nature of the redder sample that had been described in previous publications as a possible unknown subclass of stars amongst the Be candidates in the LMC. In the SMC sample, a similar but less evident result was obtained, since this apparent unknown subclass was not seen in this galaxy. We confirm that in the selection of Be stars by their variability, although generally successful, there is a higher risk of contamination by Milky Way objects towards redder B - V and V - I colours.
duVerle, David A; Yotsukura, Sohiya; Nomura, Seitaro; Aburatani, Hiroyuki; Tsuda, Koji
2016-09-13
Single-cell RNA sequencing is fast becoming one of the standard methods for gene expression measurement, providing unique insights into cellular processes. A number of methods, based on general dimensionality reduction techniques, have been suggested to help infer and visualise the underlying structure of cell populations from single-cell expression levels, yet their models generally lack proper biological grounding and struggle at identifying complex differentiation paths. Here we introduce cellTree: an R/Bioconductor package that uses a novel statistical approach, based on document analysis techniques, to produce tree structures outlining the hierarchical relationship between single-cell samples, while identifying latent groups of genes that can provide biological insights. With cellTree, we provide experimentalists with an easy-to-use tool, based on statistically and biologically sound algorithms, to efficiently explore and visualise single-cell RNA data. The cellTree package is publicly available in the online Bioconductor repository at: http://bioconductor.org/packages/cellTree/ .
CFD Analysis of Hypersonic Flowfields With Surface Thermochemistry and Ablation
NASA Technical Reports Server (NTRS)
Henline, W. D.
1997-01-01
In the past forty years, much progress has been made in computational methods applied to the solution of problems in spacecraft hypervelocity flow and heat transfer. Although the basic thermochemical and physical modeling techniques have changed little in this time, a several-orders-of-magnitude increase in the speed of numerically solving the Navier-Stokes and associated energy equations has been achieved. The extent to which this computational power can be applied to the design of spacecraft heat shields is dependent on the proper coupling of the external flow equations to the boundary conditions and governing equations representing the thermal protection system in-depth conduction, pyrolysis and surface ablation phenomena. A discussion of the techniques used to do this in past problems as well as the current state of the art is provided. Specific examples, including past missions such as Galileo, together with the more recent case studies of ESA/Rosetta Sample Comet Return, Mars Pathfinder and X-33, will be discussed. Modeling assumptions, design approach and computational methods and results are presented.
Vedelago, J; Mattea, F; Valente, M
2018-03-01
The use and implementation of nanoparticles in medicine have grown exponentially in the last twenty years. Their main applications include drug delivery, theranostics, tissue engineering and magneto function. Dosimetry techniques can take advantage of the properties of inorganic nanoparticles, and their combination with gel dosimetry techniques could be a first step toward their later inclusion in radio-diagnostics or radiotherapy treatments. The present study presents preliminary results on the integration of properly synthesized and purified silver nanoparticles with Fricke gel dosimeters. The nanoparticles used had mean sizes ranging from 2 to 20 nm, following a lognormal distribution. The xylenol orange concentration in the Fricke gel dosimeter was adjusted to allow optical readout of the samples, accounting for the nanoparticle plasmon. Dose enhancement was assessed by irradiating dosimeters with X-ray beam energies set below and above the silver K-edge. Monte Carlo simulations were used to estimate the dose enhancement in the experiments and to compare with the trend observed in the experimental results.
Recommendations for fluorescence instrument qualification: the new ASTM Standard Guide.
DeRose, Paul C; Resch-Genger, Ute
2010-03-01
Aimed at improving quality assurance and quantitation for modern fluorescence techniques, ASTM International (ASTM) is about to release a Standard Guide for Fluorescence, reviewed here. The guide's main focus is on steady state fluorometry, for which available standards and instrument characterization procedures are discussed along with their purpose, suitability, and general instructions for use. These include the most relevant instrument properties needing qualification, such as linearity and spectral responsivity of the detection system, spectral irradiance reaching the sample, wavelength accuracy, sensitivity or limit of detection for an analyte, and day-to-day performance verification. With proper consideration of method-inherent requirements and limitations, many of these procedures and standards can be adapted to other fluorescence techniques. In addition, procedures for the determination of other relevant fluorometric quantities including fluorescence quantum yields and fluorescence lifetimes are briefly introduced. The guide is a clear and concise reference geared for users of fluorescence instrumentation at all levels of experience and is intended to aid in the ongoing standardization of fluorescence measurements.
Growth of single wall carbon nanotubes using PECVD technique: An efficient chemiresistor gas sensor
NASA Astrophysics Data System (ADS)
Lone, Mohd Yaseen; Kumar, Avshish; Husain, Samina; Zulfequar, M.; Harsh; Husain, Mushahid
2017-03-01
In this work, uniform and vertically aligned single-wall carbon nanotubes (SWCNTs) were grown on an iron (Fe)-deposited silicon (Si) substrate by the plasma-enhanced chemical vapor deposition (PECVD) technique at the low temperature of 550 °C. The as-grown SWCNT samples were characterized by field emission scanning electron microscopy (FESEM), high resolution transmission electron microscopy (HRTEM) and Raman spectrometry. An SWCNT-based chemiresistor gas-sensing device was fabricated by making proper gold contacts on the as-grown SWCNTs. The electrical conductance and sensor response of the grown SWCNTs were investigated. The fabricated SWCNT sensor was exposed to ammonia (NH3) gas at 200 ppm in a self-assembled apparatus. The sensor response was measured at room temperature and is discussed in terms of the adsorption of NH3 gas molecules on the surface of the SWCNTs. These results can be used to develop a miniaturized gas sensor device for monitoring and controlling environmental pollutants.
Gaussian process regression for sensor networks under localization uncertainty
Jadaliha, M.; Xu, Yunfei; Choi, Jongeun; Johnson, N.S.; Li, Weiming
2013-01-01
In this paper, we formulate Gaussian process regression with observations under the localization uncertainty due to the resource-constrained sensor networks. In our formulation, effects of observations, measurement noise, localization uncertainty, and prior distributions are all correctly incorporated in the posterior predictive statistics. The analytically intractable posterior predictive statistics are proposed to be approximated by two techniques, viz., Monte Carlo sampling and Laplace's method. Such approximation techniques have been carefully tailored to our problems and their approximation error and complexity are analyzed. Simulation study demonstrates that the proposed approaches perform much better than approaches without considering the localization uncertainty properly. Finally, we have applied the proposed approaches on the experimentally collected real data from a dye concentration field over a section of a river and a temperature field of an outdoor swimming pool to provide proof of concept tests and evaluate the proposed schemes in real situations. In both simulation and experimental results, the proposed methods outperform the quick-and-dirty solutions often used in practice.
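A minimal sketch of the Monte Carlo approximation idea follows, not the authors' exact formulation: the posterior predictive mean of a simple RBF-kernel GP is averaged over sampled sensor locations. The kernel, noise levels, and localization-error model are assumptions.

```python
# Hedged sketch: approximate the posterior predictive of a GP when the
# training inputs (sensor locations) are uncertain, by Monte Carlo sampling
# candidate locations and averaging the resulting GP predictions.
import numpy as np

def rbf(a, b, ell=0.5, sf=1.0):
    d2 = (a[:, None] - b[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(2)
x_true = np.sort(rng.uniform(0, 5, 15))               # true sensor positions
y = np.sin(x_true) + 0.1 * rng.standard_normal(15)    # noisy field readings
x_reported = x_true + 0.1 * rng.standard_normal(15)   # uncertain localization
x_star = np.linspace(0, 5, 100)                       # prediction points
sigma_n, sigma_loc, n_mc = 0.1, 0.1, 200

preds = []
for _ in range(n_mc):
    xs = x_reported + sigma_loc * rng.standard_normal(15)  # sampled locations
    K = rbf(xs, xs) + sigma_n**2 * np.eye(15)
    alpha = np.linalg.solve(K, y)
    preds.append(rbf(x_star, xs) @ alpha)          # posterior mean, this draw

mc_mean = np.mean(preds, axis=0)                   # MC-averaged prediction
print(mc_mean[:5])
```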
Cellular imaging using temporally flickering nanoparticles.
Ilovitsh, Tali; Danan, Yossef; Meir, Rinat; Meiri, Amihai; Zalevsky, Zeev
2015-02-04
Utilizing the surface plasmon resonance effect in gold nanoparticles enables their use as contrast agents in a variety of applications for compound cellular imaging. However, most techniques suffer from poor signal-to-noise ratio (SNR) statistics due to high shot noise that is associated with low photon count in addition to high background noise. We demonstrate an effective way to improve the SNR, in particular when the inspected signal is indistinguishable in the given noisy environment. We excite the temporal flickering of the scattered light from gold nanoparticles that label a biological sample. By performing temporal spectral analysis of the received spatial image and by inspecting the proper spectral component corresponding to the modulation frequency, we separate the signal from the widespread spectral noise (lock-in amplification).
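The temporal lock-in idea can be sketched as follows, assuming a stack of frames, a known modulation frequency, and synthetic data (all assumptions, not the authors' setup): each pixel's time series is Fourier transformed and the magnitude at the modulation frequency is kept.

```python
# Sketch of temporal lock-in: take the FFT along the time axis of each pixel
# in a frame stack and keep the magnitude at the known flicker frequency.
import numpy as np

fps, n_frames, f_mod = 100.0, 512, 7.0           # frame rate (Hz), frames, Hz
t = np.arange(n_frames) / fps
rng = np.random.default_rng(3)

frames = rng.normal(100.0, 5.0, (n_frames, 64, 64))      # background + noise
# A small labelled region flickers at f_mod.
frames[:, 30:33, 40:43] += 2.0 * (1 + np.sin(2 * np.pi * f_mod * t))[:, None, None]

spectrum = np.fft.rfft(frames, axis=0)
freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)
k = np.argmin(np.abs(freqs - f_mod))             # bin of the modulation tone

lockin_image = np.abs(spectrum[k]) / n_frames    # demodulated image
print("peak at pixel:", np.unravel_index(lockin_image.argmax(), lockin_image.shape))
```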
Multivariate localization methods for ensemble Kalman filtering
NASA Astrophysics Data System (ADS)
Roh, S.; Jun, M.; Szunyogh, I.; Genton, M. G.
2015-12-01
In ensemble Kalman filtering (EnKF), the small number of ensemble members that is feasible to use in a practical data assimilation application leads to sampling variability of the estimates of the background error covariances. The standard approach to reducing the effects of this sampling variability, which has also been found to be highly efficient in improving the performance of EnKF, is the localization of the estimates of the covariances. One family of localization techniques is based on taking the Schur (element-wise) product of the ensemble-based sample covariance matrix and a correlation matrix whose entries are obtained by the discretization of a distance-dependent correlation function. While the proper definition of the localization function for a single state variable has been extensively investigated, a rigorous definition of the localization function for multiple state variables that exist at the same locations has seldom been considered. This paper introduces two strategies for the construction of localization functions for multiple state variables. The proposed localization functions are tested in experiments in which simulated observations are assimilated into the bivariate Lorenz 95 model.
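For the single-variable case that the paper builds on, the Schur-product localization can be sketched as below; the Gaspari-Cohn fifth-order function, grid size, ensemble size, and localization radius are illustrative choices, not those of the paper.

```python
# Sketch of single-variable covariance localization: the ensemble sample
# covariance is Schur (element-wise) multiplied by a compactly supported
# distance-dependent correlation (Gaspari-Cohn 5th-order function here).
import numpy as np

def gaspari_cohn(r):
    """Gaspari & Cohn (1999) correlation; r = distance / half-width."""
    r = np.abs(r)
    c = np.zeros_like(r)
    m1 = r <= 1.0
    m2 = (r > 1.0) & (r <= 2.0)
    x = r[m1]
    c[m1] = -0.25*x**5 + 0.5*x**4 + 0.625*x**3 - (5/3)*x**2 + 1.0
    x = r[m2]
    c[m2] = (1/12)*x**5 - 0.5*x**4 + 0.625*x**3 + (5/3)*x**2 - 5*x + 4 - 2/(3*x)
    return c

n, n_ens, half_width = 40, 20, 4.0               # state size, members, radius
rng = np.random.default_rng(4)
ensemble = rng.standard_normal((n, n_ens))       # placeholder ensemble

anom = ensemble - ensemble.mean(axis=1, keepdims=True)
P_sample = anom @ anom.T / (n_ens - 1)           # noisy sample covariance

dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
P_loc = P_sample * gaspari_cohn(dist / half_width)   # Schur product
print("far-field entry before/after:", P_sample[0, 20], P_loc[0, 20])
```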
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crockett, C.S.; Haas, C.N.
1996-11-01
Due to current proposed regulations requiring monitoring for protozoans and demonstration of adequate protozoan removal depending on source water concentrations detected, many utilities are considering or are engaged in protozoan monitoring activities within their watershed so that proper watershed management and treatment modifications can reduce their impact on drinking water safety and quality. However, due to the difficulties associated with the current analytical methods and sample collection, many sampling efforts collect data that cannot be interpreted or lack the tools to interpret the information obtained. Therefore, it is necessary to determine how to develop an effective sampling program tailored to a utility's specific needs to provide interpretable data and develop tools for evaluating such data. The following case study describes the process in which a utility learned how to collect and interpret monitoring data for their specific needs and provides concepts and tools which other utilities can use to aid in their own macro and microwatershed management efforts.
Classification and treatment of periprosthetic supracondylar femur fractures.
Ricci, William
2013-02-01
Locked plating and retrograde nailing are two accepted methods for treatment of periprosthetic distal femur fractures. Each has relative benefits and potential pitfalls. Appropriate patient selection and knowledge of the specific femoral component geometry are required to optimally choose between these two methods. Locked plating may be applied to most periprosthetic distal femur fractures. The fracture pattern, simple or comminuted, will dictate the specific plating technique, compression plating or bridge plating. Nailing requires an open intercondylar box and a distal fragment of enough size to allow interlocking. With proper patient selection and proper techniques, good results can be obtained with either method.
Experimental determination of airplane mass and inertial characteristics
NASA Technical Reports Server (NTRS)
Wolowicz, C. H.; Yancey, R. B.
1974-01-01
Current practices are evaluated for experimentally determining airplane center of gravity, moments of inertia, and products of inertia. The techniques discussed are applicable to bodies other than airplanes. In pitching- and rolling-moment-of-inertia investigations with the airplane mounted on and pivoted about knife edges, the nonlinear spring moments that occur at large amplitudes of oscillation can be eliminated by using the proper spring configuration. The single-point suspension double-pendulum technique for obtaining yawing moments of inertia, products of inertia, and the inclination of the principal axis provides accurate results from yaw-mode oscillation data, provided that the sway-mode effects are minimized by proper suspension rig design. Rocking-mode effects in the data can be isolated.
Holmes, Robert R.; Singh, Vijay P.
2016-01-01
The importance of streamflow data to the world’s economy, environmental health, and public safety continues to grow as the population increases. The collection of streamflow data is often an involved and complicated process. The quality of streamflow data hinges on such things as site selection, instrumentation selection, streamgage maintenance and quality assurance, proper discharge measurement techniques, and the development and continued verification of the streamflow rating. This chapter serves only as an overview of the streamflow data collection process as proper treatment of considerations, techniques, and quality assurance cannot be addressed adequately in the space limitations of this chapter. Readers with the need for the detailed information on the streamflow data collection process are referred to the many references noted in this chapter.
NASA Astrophysics Data System (ADS)
Miller, C. J.; Yoder, T. S.
2010-06-01
Explosive trace detection equipment has been deployed to airports for more than a decade. During this time, the need for standardized procedures and calibrated trace amounts for ensuring that the systems are operating properly and detecting the correct explosive has been apparent but a standard representative of a fingerprint has been elusive. Standards are also necessary to evaluate instrumentation in the laboratories during development and prior to deployment to determine sample throughput, probability of detection, false positive/negative rates, ease of use by operator, mechanical and/or software problems that may be encountered, and other pertinent parameters that would result in the equipment being unusable during field operations. Since many laboratories do not have access to nor are allowed to handle explosives, the equipment is tested using techniques aimed at simulating the actual explosives fingerprint. This laboratory study focused on examining the similarities and differences in three different surface contamination techniques that are used to performance test explosive trace detection equipment in an attempt to determine how effective the techniques are at replicating actual field samples and to offer scenarios where each contamination technique is applicable. The three techniques used were dry transfer deposition of standard solutions using the Transportation Security Laboratory’s (TSL) patented dry transfer techniques (US patent 6470730), direct deposition of explosive standards onto substrates, and fingerprinting of actual explosives onto substrates. RDX was deposited on the surface of one of five substrates using one of the three different deposition techniques. The process was repeated for each substrate type using each contamination technique. The substrate types used were: 50% cotton/50% polyester as found in T-shirts, 100% cotton with a smooth surface such as that found in a cotton dress shirt, 100% cotton on a rough surface such as that found on canvas or denim, suede leather such as might be found on jackets, purses, or shoes, and painted metal obtained from a car hood at a junk yard. The samples were not pre-cleaned prior to testing and contained sizing agents, and in the case of the metal, oil and dirt. The substrates were photographed using a Zeiss Discover V12 stereoscope with Axiocam ICc1 3 megapixel digital camera to determine the difference in the crystalline structure and surface contamination in an attempt to determine differences and similarities associated with current contamination deposition techniques. Some samples were analyzed using scanning electron microscopy (SEM) and some were extracted and analyzed with high performance liquid chromatography (HPLC) or gas chromatography with an electron capture detector (GC-ECD) to quantify the data.
NASA Technical Reports Server (NTRS)
Stysley, Paul
2016-01-01
Applicability to Early Stage Innovation (NIAC): Cutting-edge and innovative technologies are needed to achieve the demanding requirements for NASA origin missions that require sample collection as laid out in the NRC Decadal Survey. This proposal focused on fully understanding the state of remote laser optical trapping techniques for capturing particles and returning them to a target site. In future missions, a laser-based optical trapping system could be deployed on a lander that would then target particles in the lower atmosphere and deliver them to the main instrument for analysis, providing remote access to otherwise inaccessible samples. Alternatively, for a planetary mission the laser could combine ablation and trapping capabilities on targets typically too far away or too hard for traditional drilling sampling systems. For an interstellar mission, a remote laser system could gather particles continuously at a safe distance; this would avoid the necessity of having a spacecraft fly through a target cloud such as a comet tail. If properly designed and implemented, a laser-based optical trapping system could fundamentally change the way scientists design and implement NASA missions that require mass spectroscopy and particle collection.
Low frequency noise elimination technique for 24-bit Σ-Δ data acquisition systems.
Qu, Shao-Bo; Robert, Olivier; Lognonné, Philippe; Zhou, Ze-Bing; Yang, Shan-Qing
2015-03-01
Low frequency 1/f noise is one of the key limiting factors of high precision measurement instruments. In this paper, digital correlated double sampling is implemented to reduce the offset and low frequency 1/f noise of a data acquisition system with a 24-bit sigma-delta (Σ-Δ) analog-to-digital converter (ADC). The input voltage is modulated by cross-coupled switches, which are synchronized to the sampling clock, and converted into a digital signal by the ADC. By using a proper switch frequency, the unwanted parasitic signal frequencies generated by the switches are avoided. The noise elimination is performed using the principle of digital correlated double sampling, which is equivalent to a time-shifted subtraction of the sampled voltage. The low frequency 1/f noise spectral density of the data acquisition system is reduced to be flat down to the measurement frequency lower limit, which is about 0.0001 Hz in this paper. The noise spectral density is suppressed by more than 60 dB at 0.0001 Hz, with a residual noise floor of (9 ± 2) nV/Hz^(1/2), which is limited by the intrinsic white noise floor of the ADC above its corner frequency.
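A hedged illustration of the time-shifted subtraction underlying digital correlated double sampling follows; the chopping scheme, rates, and noise model here are assumptions, not the instrument's actual parameters.

```python
# Sketch: the input is sign-modulated (chopped) in blocks, and subtracting
# the averages of adjacent half-periods cancels offset and slow drift while
# preserving the modulated signal.
import numpy as np

fs, n = 1000.0, 2**16                            # sample rate (Hz), samples
rng = np.random.default_rng(5)

v_in = 1e-6                                      # 1 uV DC input to measure
drift = 5e-6 * np.cumsum(rng.standard_normal(n)) / np.sqrt(n)  # slow drift
offset = 20e-6                                   # static offset

chop = np.where((np.arange(n) // 64) % 2 == 0, 1.0, -1.0)   # chopper state
adc = chop * v_in + offset + drift + 1e-7 * rng.standard_normal(n)

# Correlated double sampling: subtract averages of adjacent half-periods.
block = adc.reshape(-1, 64).mean(axis=1)         # one value per half-period
cds = 0.5 * (block[0::2] - block[1::2])          # offset and drift cancel
print(f"recovered input = {cds.mean()*1e6:.3f} uV (true 1.000 uV)")
```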
Exploring Uncertainty with Projectile Launchers
ERIC Educational Resources Information Center
Orzel, Chad; Reich, Gary; Marr, Jonathan
2012-01-01
The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…
Computational Fluid Dynamics (CFD) techniques are increasingly being applied to air quality modeling of short-range dispersion, especially the flow and dispersion around buildings and other geometrically complex structures. The proper application and accuracy of such CFD techniqu...
FINDING THE BALANCE - QUALITY ASSURANCE REQUIREMENTS VS. RESEARCH NEEDS
Investigators often misapply quality assurance (QA) procedures and may consider QA as a hindrance to developing test plans for sampling and analysis. If used properly, however, QA is the driving force for collecting the right kind and proper amount of data. Researchers must use Q...
Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian
2014-06-25
Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFP) simultaneously. One of the key points for the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We intend to introduce data processing approaches for action potentials and LFP based on the original data collected through multi-channel recording system. Action potential signals are high-frequency signals, hence high sampling rate of 40 kHz is normally chosen for recording. Based on waveforms of extracellularly recorded action potentials, tetrode technology combining principal component analysis can be used to discriminate neuronal spiking signals from differently spatially distributed neurons, in order to obtain accurate single neuron spiking activity. LFPs are low-frequency signals (lower than 300 Hz), hence the sampling rate of 1 kHz is used for LFPs. Digital filtering is required for LFP analysis to isolate different frequency oscillations including theta oscillation (4-12 Hz), which is dominant in active exploration and rapid-eye-movement (REM) sleep, gamma oscillation (30-80 Hz), which is accompanied by theta oscillation during cognitive processing, and high frequency ripple oscillation (100-250 Hz) in awake immobility and slow wave sleep (SWS) state in rodent hippocampus. For the obtained signals, common data post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
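The band isolation described above can be sketched with zero-phase Butterworth filters (a common choice; the filter order and the synthetic LFP trace below are assumptions, not the authors' exact settings).

```python
# Sketch of LFP band isolation: zero-phase Butterworth band-pass filters for
# theta (4-12 Hz), gamma (30-80 Hz) and ripple (100-250 Hz) on a 1 kHz trace.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                      # LFP sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(6)
lfp = (np.sin(2 * np.pi * 8 * t)                 # theta-band component
       + 0.3 * np.sin(2 * np.pi * 60 * t)        # gamma-band component
       + 0.2 * rng.standard_normal(t.size))      # broadband noise

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)                     # zero-phase filtering

theta = bandpass(lfp, 4, 12, fs)
gamma = bandpass(lfp, 30, 80, fs)
ripple = bandpass(lfp, 100, 250, fs)
print([np.std(x).round(3) for x in (theta, gamma, ripple)])
```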
1987-01-01
exercise practices (Veninga 1962). Positive Coping Mechanisms: Coping mechanisms of a positive sort that favor eustressful outcomes are direct actions to deal...resistance through proper diet, exercise, sleep, and relaxation. Cognitive restructuring is another direct strategy used to consciously change the...developing social support, implementing relaxation techniques, meditating, along with proper diet, sleep, and exercise are all positive ways to cope with
Scintillation-based Search for Off-pulse Radio Emission from Pulsars
NASA Astrophysics Data System (ADS)
Ravi, Kumar; Deshpande, Avinash A.
2018-05-01
We propose a new method to detect off-pulse (unpulsed and/or continuous) emission from pulsars using the intensity modulations associated with interstellar scintillation. Our technique involves obtaining the dynamic spectra, separately for on-pulse window and off-pulse region, with time and frequency resolutions to properly sample the intensity variations due to diffractive scintillation and then estimating their mutual correlation as a measure of off-pulse emission, if any. We describe and illustrate the essential details of this technique with the help of simulations, as well as real data. We also discuss the advantages of this method over earlier approaches to detect off-pulse emission. In particular, we point out how certain nonidealities inherent to measurement setups could potentially affect estimations in earlier approaches and argue that the present technique is immune to such nonidealities. We verify both of the above situations with relevant simulations. We apply this method to the observation of PSR B0329+54 at frequencies of 730 and 810 MHz made with the Green Bank Telescope and present upper limits for the off-pulse intensity at the two frequencies. We expect this technique to pave the way for extensive investigations of off-pulse emission with the help of existing dynamic spectral data on pulsars and, of course, with more sensitive long-duration data from new observations.
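A minimal sketch of the correlation step, with placeholder dynamic spectra rather than real telescope data: the on-pulse and off-pulse time-frequency arrays are mean-subtracted and their zero-lag correlation coefficient is computed.

```python
# Sketch: correlate dynamic spectra (time x frequency intensity arrays) built
# from the on-pulse window and the off-pulse region; a shared scintillation
# pattern produces a nonzero correlation if off-pulse emission is present.
import numpy as np

rng = np.random.default_rng(7)
n_t, n_f = 120, 256                              # time and frequency bins
scint = rng.gamma(2.0, 1.0, (n_t, n_f))          # shared scintillation pattern

on_pulse = scint * 50.0 + rng.standard_normal((n_t, n_f))   # pulsed flux
off_pulse = scint * 0.5 + rng.standard_normal((n_t, n_f))   # weak off-pulse

a = (on_pulse - on_pulse.mean()).ravel()
b = (off_pulse - off_pulse.mean()).ravel()
rho = (a @ b) / np.sqrt((a @ a) * (b @ b))       # correlation coefficient
print(f"on/off dynamic-spectrum correlation = {rho:.3f}")
```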
A genetic algorithm-based framework for wavelength selection on sample categorization.
Anzanello, Michel J; Yamashita, Gabrielli; Marcelo, Marcelo; Fogliatto, Flávio S; Ortiz, Rafael S; Mariotti, Kristiane; Ferrão, Marco F
2017-08-01
In forensic and pharmaceutical scenarios, the application of chemometrics and optimization techniques has unveiled common and peculiar features of seized medicine and drug samples, helping investigative forces to track illegal operations. This paper proposes a novel framework aimed at identifying relevant subsets of attenuated total reflectance Fourier transform infrared (ATR-FTIR) wavelengths for classifying samples into two classes, for example authentic or forged categories in case of medicines, or salt or base form in cocaine analysis. In the first step of the framework, the ATR-FTIR spectra were partitioned into equidistant intervals and the k-nearest neighbour (KNN) classification technique was applied to each interval to insert samples into proper classes. In the next step, selected intervals were refined through the genetic algorithm (GA) by identifying a limited number of wavelengths from the intervals previously selected aimed at maximizing classification accuracy. When applied to Cialis®, Viagra®, and cocaine ATR-FTIR datasets, the proposed method substantially decreased the number of wavelengths needed to categorize, and increased the classification accuracy. From a practical perspective, the proposed method provides investigative forces with valuable information towards monitoring illegal production of drugs and medicines. In addition, focusing on a reduced subset of wavelengths allows the development of portable devices capable of testing the authenticity of samples during police checking events, avoiding the need for later laboratorial analyses and reducing equipment expenses. Theoretically, the proposed GA-based approach yields more refined solutions than the current methods relying on interval approaches, which tend to insert irrelevant wavelengths in the retained intervals.
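The first framework step (interval-wise KNN scoring) might look roughly like the sketch below; the dataset, number of intervals, k, and cross-validation scheme are assumptions, and the GA refinement stage is not reproduced.

```python
# Sketch: split the spectrum into equidistant intervals, score each interval
# with a k-nearest-neighbour classifier, keep the best intervals for later
# GA-based wavelength refinement. Data here are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(8)
n_samples, n_wavelengths, n_intervals = 120, 600, 20
X = rng.standard_normal((n_samples, n_wavelengths))
y = rng.integers(0, 2, n_samples)                # e.g. authentic vs forged
X[y == 1, 300:330] += 1.0                        # one informative region

scores = []
for i, block in enumerate(np.array_split(np.arange(n_wavelengths), n_intervals)):
    knn = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(knn, X[:, block], y, cv=5).mean()
    scores.append((acc, i))

best = sorted(scores, reverse=True)[:3]          # intervals passed to the GA
print("top intervals (accuracy, index):", best)
```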
Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip
2012-02-01
The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
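For readers unfamiliar with IDMS quantitation, a generic two-isotope mass balance (a sketch of the standard idea, not necessarily the exact molecular-IDMS formulation used by the authors) relates the measured blend ratio to the unknown amount of analyte. With isotopes a and b, natural sample x, enriched spike s, isotopic abundances A and B, amounts N (mol), and measured blend ratio R_m:

```latex
% Two-isotope mass balance for isotope dilution (a sketch, not the paper's
% exact molecular-IDMS equation). A, B = abundances of isotopes a and b;
% N = amount of substance; x = natural sample, s = enriched spike.
\[
  R_{m} \;=\; \frac{A_{x} N_{x} + A_{s} N_{s}}{B_{x} N_{x} + B_{s} N_{s}}
  \qquad\Longrightarrow\qquad
  N_{x} \;=\; N_{s}\,\frac{A_{s} - R_{m} B_{s}}{R_{m} B_{x} - A_{x}}
\]
```

so the analyte amount N_x follows directly from the known spike amount and the single measured blend ratio, which is why incomplete elution and instrument drift largely cancel.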
Ahmad Khan, Hayat; Kamal, Younis; Lone, Ansar Ul Haq
2014-04-01
Fishing is a leisure activity for some people around the world. Accidentally, the fish hook can get caught in the hand. If the hook is barbed, removal becomes difficult. We report a case of such an injury in the hand and discuss the technique for its removal with a brief review of the literature. A thirty-two-year-old male accidentally suffered a fishhook injury to his hand. He came to the orthopaedic ward two hours after the incident with pain; the fish hook was hanging from the hand. Unsuccessful attempts to remove it were made by his relatives. A push-through and cut-off technique was used for removal of the barbed hook. Barbed hooks are to be removed atraumatically with a controlled incision over properly anaesthetised skin. Proper wound management and prophylactic antibiotics suitable for treatment of Aeromonas species should be initiated to prevent complications.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for the discovery, monitoring, and improvement of processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
NASA Technical Reports Server (NTRS)
Daunton, N.; Damelio, F.; Krasnov, I.
1990-01-01
Frontal lobe samples of rat brains flown aboard Cosmos 1887 were processed for the study of muscarinic (cholinergic) and GABA (benzodiazepine) receptors and for immunocytochemical localization of the neurotransmitter gamma-aminobutyric acid (GABA) and glial fibrillary acidic protein (GFAP). Although radioactive labeling of both muscarinic cholinergic and GABA (benzodiazepine) receptors proved to be successful with the techniques employed, distinct receptor localization of individual laminae of the frontal neocortex was not possible since the sampling of the area was different in the various groups of animals. In spite of efforts made for proper orientation and regional identification of laminae, it was found that a densitometric (quantitation of autoradiograms) analysis of the tissue did not contribute to the final interpretation of the effects of weightlessness on these receptors. As to the immunocytochemical studies the use of both markers, GFAP and GABA antiserum, confirmed the suitability of the techniques for use in frozen material. However, similar problems to those encountered in the receptor studies prevented an adequate interpretation of the effects of micro-G exposure on the localization and distribution of GABA and GFAP. This study did, however, confirm the feasibility of investigating neurotransmitters and their receptors in future space flight experiments.
Color image analysis technique for measuring of fat in meat: an application for the meat industry
NASA Astrophysics Data System (ADS)
Ballerini, Lucia; Hogberg, Anders; Lundstrom, Kerstin; Borgefors, Gunilla
2001-04-01
Intramuscular fat content in meat influences some important meat quality characteristics. The aim of the present study was to develop and apply image processing techniques to quantify intramuscular fat content in beef together with the visual appearance of fat in meat (marbling). Color images of M. longissimus dorsi meat samples with varying intramuscular fat content and marbling were captured. Image analysis software was specially developed for the interpretation of these images. In particular, a segmentation algorithm (i.e. classification of different substances: fat, muscle and connective tissue) was optimized in order to obtain a proper classification and perform subsequent analysis. Segmentation of muscle from fat was achieved based on their characteristics in the 3D color space, and on the intrinsic fuzzy nature of these structures. The method is fully automatic and it combines a fuzzy clustering algorithm, the Fuzzy c-Means Algorithm, with a Genetic Algorithm. The percentages of various colors (i.e. substances) within the sample are then determined; the number, size distribution, and spatial distributions of the extracted fat flecks are measured. Measurements are correlated with chemical and sensory properties. Results so far show that advanced image analysis is useful for quantifying the visual appearance of meat.
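A bare-bones fuzzy c-means sketch in colour space is given below; the genetic-algorithm-assisted initialisation used in the study is not reproduced, and the pixel clusters, fuzzifier m, and iteration count are assumptions.

```python
# Minimal fuzzy c-means sketch for colour-based segmentation: pixels are
# points in 3-D colour space, and membership degrees to c classes (e.g. fat,
# muscle, connective tissue) are updated iteratively.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((X.shape[0], c))
    u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships
    for _ in range(n_iter):
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

rng = np.random.default_rng(9)
pixels = np.vstack([rng.normal([200, 180, 170], 10, (500, 3)),   # "fat"
                    rng.normal([150, 60, 70], 10, (500, 3)),     # "muscle"
                    rng.normal([230, 225, 220], 10, (500, 3))])  # "connective"
u, centers = fuzzy_c_means(pixels, c=3)
labels = u.argmax(axis=1)                        # hard labels from memberships
print("class fractions:", np.bincount(labels) / labels.size)
```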
The state of the art in raptor electrocution research: A global review
Lehman, Robert N.; Kennedy, P.L.; Savidge, J.A.
2007-01-01
We systematically reviewed the raptor electrocution literature to evaluate study designs and methods used in raptor electrocution research, mitigation, and monitoring, emphasizing original research published in English. Specifically, we wondered if three decades of effort to reduce raptor electrocutions has had positive effects. The majority of literature examined came from North America, western Europe, and South Africa. In spite of intensive and often sustained effort by industry and governments across three continents for 30 years, reductions in the incidence of electrocution have been demonstrated in only a few studies. Reliable rate estimates of electrocution mortality generally are unavailable, with some exceptions. Nearly half of 110 studies we analyzed in detail were retrospective reviews of historical mortality records, banding data, or results of necropsies on dead birds received at pathology and veterinary facilities. Among prospective studies, less than half used unbiased approaches to sampling and many did not provide enough detail to assess the sampling design used. At this time, few researchers can demonstrate the reliability of standardized retrofitting procedures or the effectiveness of monitoring techniques. Future progress in reducing raptor mortalities on power lines will benefit from properly designed studies that generate rate estimates of mortality, address biasing factors, and include predictions concerning risk and techniques to reduce risk that can be tested in the field or laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wunschel, David S.; Hill, Eric A.; Mclean, Jeffrey S.
Rapid identification of microorganisms using matrix assisted laser desorption/ionization (MALDI) is a rapidly growing area of research due to the minimal sample preparation, speed of analysis and broad applicability of the technique. This approach relies on protein markers to identify microorganisms. Therefore, variations in culture conditions that affect protein expression may limit the ability of MALDI-MS to correctly identify an organism. We have expanded our efforts to investigate the effects of culture conditions on MALDI-MS protein signatures to examine the effects of pH, growth rate and temperature. Continuous cultures maintained in bioreactors were used to maintain specific growth rates and pH for E. coli HB 101. Despite measurable morphological differences between growth conditions, the MALDI-MS data associated each culture with the appropriate library entry (E. coli HB 101 generated using batch culture on LB media), independent of pH or growth rate. The lone exception was for a biofilm sample collected from one of the reactors, which had no appreciable degree of association with the correct library entry. Within the data set for planktonic organisms, variations in growth rate created the largest variation between fingerprints. The effect of varying growth temperature on Y. enterocolitica was also examined. While the anticipated effects on phenotype were observed, the MALDI-MS technique provided the proper identification.
El-Saeid, Mohamed H.; Kanu, Ijeoma; Anyanwu, Ebere C.; Saleh, Mahmoud A.
2005-01-01
It is an accepted fact that many food products that we eat today have the possibility of being contaminated by various chemicals used from planting to processing. These chemicals have been shown to cause illnesses, for which some concerned government agencies have instituted regulatory mechanisms to minimize the risks and the effects on humans. It is for these concerns that reliable and accurate rapid determination techniques are needed to effect proper regulatory standards for the protection of people's nutritional health. This paper, therefore, reports the comparative evaluation of extraction methods in the determination of atrazine (commonly used in agriculture as a herbicide) residues in foods using supercritical fluid chromatography (SFC) and enzyme-linked immunosorbent assay (ELISA) techniques. Supercritical fluid extraction (SFE) and microwave solvent extraction (MSE) methods were used to test samples of frozen vegetables, fruit juice, and jam from local food markets in Houston. Results showed a higher recovery percentage of atrazine residues using supercritical fluid extraction coupled with ELISA and SFC than with MSE. Comparatively, however, atrazine was detected at 90.9 and 54.5% using the SFC and ELISA techniques, respectively. The ELISA technique was, however, less time consuming, lower in cost, and more sensitive, with a lower detection limit for atrazine residues, than the SFC technique. PMID:15674445
NASA Astrophysics Data System (ADS)
Stremtan, Ciprian; Ashkanani, Hasan; Tykot, Robert H.
2013-04-01
The study of bi-phase (i.e. matrix and clasts) geochemical composition of ceramic artifacts is a very powerful tool in fingerprinting the raw materials used by ancient manufacturers (clay sources, tempering materials, coloring agents, etc.), as well as in understanding the physical parameters of the manufacturing techniques. Reliable datasets often require the deployment of destructive techniques that will irremediably damage the artifact. Recent advances in portable X-ray fluorescence instrumentation (pXRF) allow for quick measurements of a range of chemical elements that not too long ago were available only through complicated and often destructive means of analytical chemistry (instrumental neutron activation analysis - INAA, inductively coupled plasma mass spectrometry - ICP-MS, direct coupled plasma-optical emission spectroscopy - DCP-OES, etc.). In this contribution we present a comparison of datasets acquired by means of pXRF, DCP-OES, and ICP-MS on Bronze Age ceramics from Failaka Island (Kuwait) and Bahrain. The samples chosen for this study are fine grained, with very well sorted mineral components, and lack any visible organic material fragments. The sample preparation for ICP-MS and DCP-OES analyses was carried out on powdered samples using LiBO2 flux fusion; Ge (for DCP-OES) and In (for ICP-MS) were used as internal standards. The measurements were calibrated against certified reference materials ranging from shales to rhyolites (SGR-1, SDo-1, JA-2, and JR-1) and performed at the University of South Florida's Center for Geochemical Analyses. The analytical errors for major elements were smaller than 5%, while for selected trace elements the error was usually smaller than 3%. The same set of elements was measured on the same samples at the University of South Florida's Anthropology Department using a pXRF device equipped with an obsidian filter. Each sample was measured three times and the values were averaged. Two certified reference materials (NIST-612 glass and MACS-3 pressed powder) were also measured to check for accuracy and precision. Our preliminary data show that most of the major and trace elemental data acquired by both methods are consistent. Some transition metals (e.g. Y, Fe, and Mn) yielded overall lower values when measured with the pXRF device (ranging from 27 to 60% difference), while Ni and Ba showed systematically higher values (20 to 53%). If samples are chosen properly for pXRF measurements (i.e. thoroughly cleaned, fine grained, well sorted) and the device is properly calibrated, the results are comparable with DCP-OES and ICP-MS data, and thus suitable for use in geochemical fingerprinting.
Fixing the reference frame for PPMXL proper motions using extragalactic sources
Grabowski, Kathleen; Carlin, Jeffrey L.; Newberg, Heidi Jo; ...
2015-05-27
In this study, we quantify and correct systematic errors in PPMXL proper motions using extragalactic sources from the first two LAMOST data releases and the Vèron-Cetty & Vèron Catalog of Quasars. Although the majority of the sources are from the Vèron catalog, LAMOST makes important contributions in regions that are not well-sampled by previous catalogs, particularly at low Galactic latitudes and in the south Galactic cap. We show that quasars in PPMXL have measurable and significant proper motions, which reflect the systematic zero-point offsets present in the catalog. We confirm the global proper motion shifts seen by Wu et al.,more » and additionally find smaller-scale fluctuations of the QSO-derived corrections to an absolute frame. Finally, we average the proper motions of 158 106 extragalactic objects in bins of 3° × 3° and present a table of proper motion corrections.« less
Exploring the Role of Receptor Flexibility in Structure-Based Drug Discovery
Feixas, Ferran; Lindert, Steffen; Sinko, William; McCammon, J. Andrew
2015-01-01
The proper understanding of biomolecular recognition mechanisms that take place in a drug target is of paramount importance to improve the efficiency of drug discovery and development. The intrinsic dynamic character of proteins has a strong influence on biomolecular recognition mechanisms and models such as conformational selection have been widely used to account for this dynamic association process. However, conformational changes occurring in the receptor prior and upon association with other molecules are diverse and not obvious to predict when only a few structures of the receptor are available. In view of the prominent role of protein flexibility in ligand binding and its implications for drug discovery, it is of great interest to identify receptor conformations that play a major role in biomolecular recognition before starting rational drug design efforts. In this review, we discuss a number of recent advances in computer-aided drug discovery techniques that have been proposed to incorporate receptor flexibility into structure-based drug design. The allowance for receptor flexibility provided by computational techniques such as molecular dynamics simulations or enhanced sampling techniques helps to improve the accuracy of methods used to estimate binding affinities and, thus, such methods can contribute to the discovery of novel drug leads. PMID:24332165
Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura
2018-06-01
There are several techniques used to analyze microplastics. These are often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low and high density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastics mass based on DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size was shown to have a particular influence on the qualitative and quantitative performance of DSC signals. Both identification (based on the characteristic onset temperature) and mass quantitation (based on heat flow) were shown to be affected by particle size. As a result, a proper sample treatment, which includes sieving of suspended particles, is particularly required for this analytical approach.
Meditation and mindfulness in clinical practice.
Simkin, Deborah R; Black, Nancy B
2014-07-01
This article describes the various forms of meditation and provides an overview of research using these techniques for children, adolescents, and their families. The most researched techniques in children and adolescents are mindfulness-based stress reduction, mindfulness-based cognitive therapy, yoga meditation, transcendental meditation, mind-body techniques (meditation, relaxation), and body-mind techniques (yoga poses, tai chi movements). Current data are suggestive of a possible value of meditation and mindfulness techniques for treating symptomatic anxiety, depression, and pain in youth. Clinicians must be properly trained before using these techniques. Copyright © 2014 Elsevier Inc. All rights reserved.
Live Cell Imaging and Measurements of Molecular Dynamics
Frigault, M.; Lacoste, J.; Swift, J.; Brown, C.
2010-01-01
Live cell microscopy is becoming widespread across all fields of the life sciences, as well as many areas of the physical sciences. In order to accurately obtain live cell microscopy data, the live specimens must be properly maintained on the imaging platform. In addition, the fluorescence light path must be optimized for efficient light transmission in order to reduce the intensity of excitation light impacting the living sample. With low incident light intensities, the processes under study should not be altered by phototoxic effects, allowing for the long-term visualization of viable living samples. Aspects of maintaining a suitable environment for the living sample, minimizing incident light and maximizing detection efficiency will be presented for various fluorescence-based live cell instruments. Raster Image Correlation Spectroscopy (RICS) is a technique that uses the intensity fluctuations within laser scanning confocal images, as well as the well characterized scanning dynamics of the laser beam, to extract the dynamics, concentrations and clustering of fluorescent molecules within the cell. In addition, two color cross-correlation RICS can be used to determine protein-protein interactions in living cells without the many technical difficulties encountered in FRET based measurements. RICS is an ideal live cell technique for measuring cellular dynamics because the potentially damaging high intensity laser bursts required for photobleaching recovery measurements are not needed; rather, low laser powers suitable for imaging can be used. The RICS theory will be presented along with examples of live cell applications.
Stabilization of Joule Heating in the Electropyroelectric Method
NASA Astrophysics Data System (ADS)
Ivanov, R.; Hernández, M.; Marín, E.; Araujo, C.; Alaniz, D.; Araiza, M.; Martínez-Ordoñez, E. I.
2012-11-01
Recently, the so-called electropyroelectric technique for thermal characterization of liquids has been proposed (Ivanov et al., J. Phys. D: Appl. Phys. 43, 225501 (2010)). In this method a pyroelectric sensor, in good thermal contact with the investigated sample, is heated by passing an amplitude-modulated electrical current through the electrical contacts. As a result of the heat dissipated to the sample, the pyroelectric signal measured as a voltage drop across the electrical contacts changes periodically. The amplitude and phase of this signal can be measured by lock-in detection as a function of the electrical current modulation frequency. Because the signal amplitude and phase depend on the thermal properties of the sample, these can be determined straightforwardly by fitting the experimental data to a theoretical model based on the solution of the heat diffusion equation with proper boundary conditions. In general, the experimental conditions are selected so that the thermal effusivity becomes the measured magnitude. The technique has the following drawback: as a result of heating and wear, the electrical resistance of the metal coating layers (previously etched into a serpentine form) changes with time, so that the heat power dissipated by the Joule effect can vary and the thermal effusivity measurement can become inaccurate. To avoid this problem, this study proposes a method that keeps the Joule-dissipated power stable. An electronic circuit is designed whose stability and characteristics are investigated and discussed.
50 CFR 260.58 - Accessibility for sampling.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false Accessibility for sampling. 260.58 Section... Fishery Products for Human Consumption Sampling § 260.58 Accessibility for sampling. Each applicant shall cause the processed products for which inspection is requested to be made accessible for proper sampling...
50 CFR 260.58 - Accessibility for sampling.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 50 Wildlife and Fisheries 9 2011-10-01 2011-10-01 false Accessibility for sampling. 260.58 Section... Fishery Products for Human Consumption Sampling § 260.58 Accessibility for sampling. Each applicant shall cause the processed products for which inspection is requested to be made accessible for proper sampling...
Understanding the Sampling Distribution and the Central Limit Theorem.
ERIC Educational Resources Information Center
Lewis, Charla P.
The sampling distribution is a common source of misuse and misunderstanding in the study of statistics. The sampling distribution, underlying distribution, and the Central Limit Theorem are all interconnected in defining and explaining the proper use of the sampling distribution of various statistics. The sampling distribution of a statistic is…
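A short simulation makes the idea concrete (an illustrative sketch only, not material from the ERIC record): even for a skewed underlying distribution, the sampling distribution of the mean tightens as 1/sqrt(n) and its standard error approaches the theoretical value.

import numpy as np

rng = np.random.default_rng(0)
population = rng.exponential(scale=2.0, size=100_000)   # skewed underlying distribution

for n in (5, 30, 200):
    # draw 10,000 samples of size n and record each sample mean
    means = rng.choice(population, size=(10_000, n)).mean(axis=1)
    print(f"n={n:3d}  mean of sample means={means.mean():.3f}  "
          f"SE={means.std(ddof=1):.3f}  theory={population.std()/np.sqrt(n):.3f}")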
Improving Short Term Instability for Quantitative Analyses with Portable Electronic Noses
Macías, Miguel Macías; Agudo, J. Enrique; Manso, Antonio García; Orellana, Carlos Javier García; Velasco, Horacio Manuel González; Caballero, Ramón Gallardo
2014-01-01
One of the main problems when working with electronic noses is the lack of reproducibility or repeatability of the sensor response, so that, if this problem is not properly considered, electronic noses can be useless, especially for quantitative analyses. Moreover, irreproducibility is increased with portable and low-cost electronic noses, where laboratory equipment like gas zero generators cannot be used. In this work, we study the reproducibility of two portable electronic noses, the PEN3 (commercial) and the CAPINose (a proprietary design), by using synthetic wine samples. We show that in both cases short-term instability associated with the sensors' response to the same sample under the same conditions represents a major problem, and we propose an internal normalization technique that, in both cases, reduces the variability of the sensors' response. Finally, we show that the proposed normalization seems to be more effective in the CAPINose case, reducing, for example, the variability associated with the TGS2602 sensor from 12.19% to 2.2%. PMID:24932869
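The paper's exact normalization is not reproduced here; the sketch below shows one common form of internal normalization for e-nose data (dividing each sensor's response by the sum of all sensor responses within the same measurement), which suppresses measurement-to-measurement scale drift. The sensor readings are hypothetical.

import numpy as np

def internal_normalization(responses):
    # Each row is one measurement, each column one sensor; divide every
    # measurement by the sum of its sensor responses to remove scale drift.
    responses = np.asarray(responses, dtype=float)
    return responses / responses.sum(axis=1, keepdims=True)

raw = np.array([[12.1, 3.4, 7.8],      # hypothetical repeated measurements
                [13.0, 3.1, 8.2],      # of the same sample
                [11.5, 3.6, 7.5]])
norm = internal_normalization(raw)
print(raw.std(axis=0) / raw.mean(axis=0))     # coefficient of variation, raw
print(norm.std(axis=0) / norm.mean(axis=0))   # coefficient of variation, normalized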
Virus characterization and discovery in formalin-fixed paraffin-embedded tissues.
Bodewes, Rogier; van Run, Peter R W A; Schürch, Anita C; Koopmans, Marion P G; Osterhaus, Albert D M E; Baumgärtner, Wolfgang; Kuiken, Thijs; Smits, Saskia L
2015-03-01
Detection and characterization of novel viruses is hampered frequently by the lack of properly stored materials. Especially for the retrospective identification of viruses responsible for past disease outbreaks, often only formalin-fixed paraffin-embedded (FFPE) tissue samples are available. Although FFPE tissues can be used to detect known viral sequences, the suitability of FFPE tissues for detection of novel viruses is currently unclear. In the present study it was shown that sequence-independent amplification in combination with next-generation sequencing can be used to detect sequences of known and unknown viruses, although with relatively low sensitivity. These findings indicate that this technique could be useful for detecting novel viral sequences in FFPE tissues collected from humans and animals with disease of unknown origin, when other samples are not available. In addition, applying this method to FFPE tissues allows detected sequences to be correlated with the presence of histopathological changes in the corresponding tissue sections. Copyright © 2015 Elsevier B.V. All rights reserved.
Hydrologic-information needs for oil-shale development, northwestern Colorado
Taylor, O.J.
1982-01-01
Hydrologic information is not adequate for proper development of the large oil-shale reserves of Piceance basin in northwestern Colorado. Exploratory drilling and aquifer testing are needed to define the hydrologic system, to provide wells for aquifer testing, to design mine-drainage techniques, and to explore for additional water supplies. Sampling networks are needed to supply hydrologic data on the quantity and quality of surface water, ground water, and springs. A detailed sampling network is proposed for the White River basin because of expected impacts related to water supplies and waste disposal. Emissions from oil-shale retorts to the atmosphere need additional study because of possible resulting corrosion problems and the destruction of fisheries. Studies of the leachate materials and the stability of disposed retorted-shale piles are needed to ensure that these materials will not cause problems. Hazards related to in-situ retorts and the wastes related to oil-shale development in general also need further investigation. (USGS)
Lock-in thermography using a cellphone attachment infrared camera
NASA Astrophysics Data System (ADS)
Razani, Marjan; Parkhimchyk, Artur; Tabatabaei, Nima
2018-03-01
Lock-in thermography (LIT) is a thermal-wave-based non-destructive testing technique which has been widely utilized in research settings for characterization and evaluation of biological and industrial materials. However, despite promising research outcomes, the widespread adoption of LIT in industry, and its commercialization, is hindered by the high cost of the infrared cameras used in LIT setups. In this paper, we report on the feasibility of using inexpensive cellphone-attachment infrared cameras for performing LIT. While the cost of such cameras is over two orders of magnitude less than their research-grade counterparts, our experimental results on a block sample with subsurface defects and a tooth with early dental caries suggest that acceptable performance can be achieved through careful instrumentation and implementation of proper data acquisition and image processing steps. We anticipate this study will pave the way for the development of low-cost thermography systems and their commercialization as inexpensive tools for non-destructive testing of industrial samples as well as affordable clinical devices for diagnostic imaging of biological tissues.
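A minimal sketch of the core LIT computation, assuming only that a stack of thermal frames and the modulation frequency are available (the frame rate, modulation frequency, and array sizes below are made-up values): each pixel's time trace is projected onto in-phase and quadrature references to produce amplitude and phase images.

import numpy as np

def lockin_demodulate(frames, fps, f_mod):
    # frames: (n_frames, H, W) thermal image stack; fps: frame rate; f_mod: modulation frequency
    n = frames.shape[0]
    t = np.arange(n) / fps
    ref_i = np.cos(2 * np.pi * f_mod * t)
    ref_q = np.sin(2 * np.pi * f_mod * t)
    # project each pixel's time trace onto the in-phase and quadrature references
    I = np.tensordot(ref_i, frames, axes=(0, 0)) * 2.0 / n
    Q = np.tensordot(ref_q, frames, axes=(0, 0)) * 2.0 / n
    return np.hypot(I, Q), np.arctan2(Q, I)          # amplitude image, phase image

# hypothetical 10 s acquisition at a 9 Hz camera frame rate with 0.5 Hz modulation
frames = np.random.rand(90, 120, 160)
amp, phase = lockin_demodulate(frames, fps=9.0, f_mod=0.5)
print(amp.shape, phase.shape)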
Half-life determination for ¹⁰⁸Ag and ¹¹⁰Ag
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zahn, Guilherme S.; Genezini, Frederico A.
2014-11-11
In this work, the half-lives of the short-lived silver radionuclides ¹⁰⁸Ag and ¹¹⁰Ag were measured by following the activity of samples after they were irradiated in the IEA-R1 reactor. The results were then fitted using a non-paralyzable dead-time correction to the regular exponential decay, and the individual half-life values obtained were then analyzed using both the Normalized Residuals and the Rajeval techniques, in order to reach the most exact and precise final values. To check the validity of the dead-time correction, a second correction method was also employed by counting a long-lived ⁶⁰Co radioactive source together with the samples as a live-time chronometer. The final half-life values obtained using both dead-time correction methods were in good agreement, showing that the correction was properly assessed. The results obtained are partially compatible with the literature values, but with a lower uncertainty, and allow a discussion of the latest ENSDF compilations' values.
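As a sketch of the kind of fit described (illustrative only: the dead time, count rates, and half-life below are made-up numbers, and the paper's exact fitting and averaging procedures are not reproduced), an exponential decay distorted by a non-paralyzable dead time, m = n/(1 + n·τ), can be fitted directly to the observed count rates.

import numpy as np
from scipy.optimize import curve_fit

TAU = 5e-6   # assumed non-paralyzable dead time per count, in seconds

def observed_rate(t, a0, half_life):
    # true exponential decay distorted by non-paralyzable dead time: m = n / (1 + n*tau)
    n = a0 * np.exp(-np.log(2) * t / half_life)
    return n / (1.0 + n * TAU)

# made-up count-rate record of an irradiated sample (counts per second vs. time)
t = np.linspace(0.0, 600.0, 40)
rng = np.random.default_rng(3)
m = observed_rate(t, a0=5.0e4, half_life=143.0) * (1.0 + 0.01 * rng.standard_normal(t.size))

popt, pcov = curve_fit(observed_rate, t, m, p0=[4.0e4, 120.0])
print("fitted half-life: %.1f +/- %.1f s" % (popt[1], np.sqrt(pcov[1, 1])))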
Wood, Jessica L; Steiner, Robert R
2011-06-01
Forensic analysis of pharmaceutical preparations requires a comparative analysis with a standard of the suspected drug in order to identify the active ingredient. Purchasing analytical standards can be expensive or unattainable from the drug manufacturers. Direct Analysis in Real Time (DART™) is a novel, ambient ionization technique, typically coupled with a JEOL AccuTOF™ (accurate mass) mass spectrometer. While a fast and easy technique to perform, a drawback of using DART™ is the lack of component separation of mixtures prior to ionization. Various in-house pharmaceutical preparations were purified using thin-layer chromatography (TLC) and mass spectra were subsequently obtained using the AccuTOF™-DART™ technique. Utilizing TLC prior to sample introduction provides a simple, low-cost solution to acquiring mass spectra of the purified preparation. Each spectrum was compared against an in-house molecular formula list to confirm the accurate mass elemental compositions. Spectra of purified ingredients of known pharmaceuticals were added to an in-house library for use as comparators for casework samples. Resolving isomers from one another can be accomplished using collision-induced dissociation after ionization. Challenges arose when the pharmaceutical preparation required an optimized TLC solvent to achieve proper separation and purity of the standard. Purified spectra were obtained for 91 preparations and included in an in-house drug standard library. Primary standards would only need to be purchased when pharmaceutical preparations not previously encountered are submitted for comparative analysis. TLC prior to DART™ analysis demonstrates a time efficient and cost saving technique for the forensic drug analysis community. Copyright © 2011 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Cook, Neil J.; Scholz, Aleks; Jayawardhana, Ray
2017-12-01
Our understanding of the brown dwarf population in star-forming regions is dependent on knowing distances and proper motions and therefore will be improved through the Gaia space mission. In this paper, we select new samples of very low-mass objects (VLMOs) in Upper Scorpius using UKIDSS colors and optimized proper motions calculated using Gaia DR1. The scatter in proper motions from VLMOs in Upper Scorpius is now (for the first time) dominated by the kinematic spread of the region itself, not by the positional uncertainties. With age and mass estimates updated using Gaia parallaxes for early-type stars in the same region, we determine masses for all VLMOs. Our final, most complete sample includes 453 VLMOs of which ˜125 are expected to be brown dwarfs. The cleanest sample is comprised of 131 VLMOs, with ˜105 brown dwarfs. We also compile a joint sample from the literature that includes 415 VLMOs, out of which 152 are likely brown dwarfs. The disk fraction among low-mass brown dwarfs (M < 0.05 M⊙) is substantially higher than in more massive objects, indicating that disks around low-mass brown dwarfs survive longer than in low-mass stars overall. The mass function for 0.01 < M < 0.1 M⊙ is consistent with the Kroupa Initial Mass Function. We investigate the possibility that some “proper motion outliers” have undergone a dynamical ejection early in their evolution. Our analysis shows that the color-magnitude cuts used when selecting samples introduce strong bias into the population statistics due to varying levels of contamination and completeness.
Blumthaler, Ingrid; Oberst, Ulrich
2012-03-01
Control design belongs to the most important and difficult tasks of control engineering and has therefore been treated by many prominent researchers and in many textbooks, the systems being generally described by their transfer matrices or by Rosenbrock equations and more recently also as behaviors. Our approach to controller design uses, in addition to the ideas of our predecessors on coprime factorizations of transfer matrices and on the parametrization of stabilizing compensators, a new mathematical technique which enables simpler design and also new theorems in spite of the many outstanding results of the literature: (1) We use an injective cogenerator signal module ℱ over the polynomial algebra [Formula: see text] (F an infinite field), a saturated multiplicatively closed set T of stable polynomials and its quotient ring [Formula: see text] of stable rational functions. This enables the simultaneous treatment of continuous and discrete systems and of all notions of stability, called T-stability. We investigate stabilizing control design by output feedback of input/output (IO) behaviors and study the full feedback IO behavior, especially its autonomous part and not only its transfer matrix. (2) The new technique is characterized by the permanent application of the injective cogenerator quotient signal module [Formula: see text] and of quotient behaviors [Formula: see text] of [Formula: see text]-behaviors B. (3) For the control tasks of tracking, disturbance rejection, model matching, and decoupling and not necessarily proper plants we derive necessary and sufficient conditions for the existence of proper stabilizing compensators with proper and stable closed loop behaviors, parametrize all such compensators as IO behaviors and not only their transfer matrices and give new algorithms for their construction. Moreover we solve the problem of pole placement or spectral assignability for the complete feedback behavior. The properness of the full feedback behavior ensures the absence of impulsive solutions in the continuous case, and that of the compensator enables its realization by Kalman state space equations or elementary building blocks. We note that every behavior admits an IO decomposition with proper transfer matrix, but that most of these decompositions do not have this property, and therefore we do not assume the properness of the plant. (4) The new technique can also be applied to more general control interconnections according to Willems, in particular to two-parameter feedback compensators and to the recent tracking framework of Fiaz/Takaba/Trentelman. In contrast to these authors, however, we pay special attention to the properness of all constructed transfer matrices which requires more subtle algorithms.
New Astrometric Limits on the Stochastic Gravitational Wave Background
NASA Astrophysics Data System (ADS)
Darling, Jeremiah K.; Truebenbach, Alexandra; Paine, Jennie
2018-06-01
We present new limits on the low frequency (f < 10⁻⁸ Hz) stochastic gravitational wave background using correlated extragalactic proper motions. The familiar methods for gravitational wave detection are ground- and space-based laser interferometry, pulsar timing, and polarization of the cosmic microwave background. Astrometry offers an additional path to gravitational wave detection because gravitational waves deflect the light rays of extragalactic objects, creating apparent proper motions in a quadrupolar (and higher order modes) pattern. Astrometry is sensitive to gravitational waves with frequencies between roughly 10⁻¹⁸ Hz and 10⁻⁸ Hz (between H0 and 1/3 yr⁻¹), which overlaps and bridges the pulsar timing and CMB polarization regimes. We present the methods and results of two complementary approaches to astrometric gravitational wave detection: (1) a small ~500-object radio interferometric sample with low per-source proper motion uncertainty but large intrinsic proper motions caused by radio jets, and (2) a thousand-fold larger sample with large per-source uncertainties that has small intrinsic proper motions (Gaia active galactic nuclei). Both approaches produce limits on ΩGW, the energy density of gravitational waves as a fraction of the cosmological critical energy density. The authors acknowledge support from the NSF grant AST-1411605 and the NASA grant 14-ATP14-0086.
45 CFR 153.350 - Risk adjustment data validation standards.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Risk adjustment data validation standards. 153.350... validation standards. (a) General requirement. The State, or HHS on behalf of the State, must ensure proper implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of...
45 CFR 153.350 - Risk adjustment data validation standards.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Risk adjustment data validation standards. 153.350... validation standards. (a) General requirement. The State, or HHS on behalf of the State, must ensure proper implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of...
Mattingly, G. E.
1992-01-01
Critical measurement performance of fluid flowmeters requires proper and quantified verification data. These data should be generated using calibration and traceability techniques established for these verification purposes. In these calibration techniques, the calibration facility should be well-characterized and its components and performance properly traced to pertinent higher standards. The use of this calibrator to calibrate flowmeters should be appropriately established and the manner in which the calibrated flowmeter is used should be specified in accord with the conditions of the calibration. These three steps: 1) characterizing the calibration facility itself, 2) using the characterized facility to calibrate a flowmeter, and 3) using the calibrated flowmeter to make a measurement are described and the pertinent equations are given for an encoded-stroke, piston displacement-type calibrator and a pulsed output flowmeter. It is concluded that, given these equations and proper instrumentation of this type of calibrator, very high levels of performance can be attained and, in turn, these can be used to achieve high fluid flow rate measurement accuracy with pulsed output flowmeters. PMID:28053444
ERIC Educational Resources Information Center
Goldgehn, Leslie A.
1990-01-01
A survey of 791 college admissions officers investigated the use and perceived effectiveness of 15 marketing techniques: publicity; target marketing; market segmentation; advertising; program development; market positioning; market research; access; marketing plan; pricing; marketing committee; advertising research; consultants; marketing audit;…
Using Kitchen Appliance Analogies to Improve Students' Reasoning about Neurological Results
ERIC Educational Resources Information Center
Vishton, Peter M.
2005-01-01
This article describes and evaluates a new technique for teaching students to interpret studies of patients with brain injuries. This technique asks students to consider how knives and blenders lose specific functionality when they are damaged. This approach better prepares students to make proper inferences from behavioral deficits observed after…
Bridging the Gap between Basic and Clinical Sciences: A Description of a Radiological Anatomy Course
ERIC Educational Resources Information Center
Torres, Anna; Staskiewicz, Grzegorz J.; Lisiecka, Justyna; Pietrzyk, Lukasz; Czekajlo, Michael; Arancibia, Carlos U.; Maciejewski, Ryszard; Torres, Kamil
2016-01-01
A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images…
7 CFR 58.244 - Number of samples.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Number of samples. 58.244 Section 58.244 Agriculture... Procedures § 58.244 Number of samples. As many samples shall be taken from each dryer production lot as is necessary to assure proper composition and quality control. A sufficient number of representative samples...
7 CFR 58.244 - Number of samples.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Number of samples. 58.244 Section 58.244 Agriculture... Procedures § 58.244 Number of samples. As many samples shall be taken from each dryer production lot as is necessary to assure proper composition and quality control. A sufficient number of representative samples...
A comparison between families obtained from different proper elements
NASA Technical Reports Server (NTRS)
Zappala, Vincenzo; Cellino, Alberto; Farinella, Paolo
1992-01-01
Using the hierarchical method of family identification developed by Zappala et al., the results coming from the data set of proper elements computed by Williams (about 2100 numbered + about 1200 PLS 2 asteroids) and by Milani and Knezevic (version 5.7, about 4200 asteroids) are compared. Apart from some expected discrepancies due to the different data sets and/or the low accuracy of proper elements computed in peculiar dynamical zones, good agreement was found in several cases. It follows that these high-reliability families represent a sample which can be considered independent of the methods used to compute their proper elements. Therefore, they should be considered the best candidates for detailed physical studies.
Kodama, Nao; Kose, Katsumi
2016-10-11
Echo-planar imaging (EPI) sequences were developed for a 9.4 Tesla vertical standard bore (~54 mm) superconducting magnet using an unshielded gradient coil optimized for live mice imaging and a data correction technique with reference scans. Because EPI requires fast switching of intense magnetic field gradients, eddy currents were induced in the surrounding metallic materials, e.g., the room temperature bore, and this produced serious artifacts on the EPI images. We solved the problem using an unshielded gradient coil set of proper size (outer diameter = 39 mm, inner diameter = 32 mm) with time control of the current rise and reference scans. The obtained EPI images of a phantom and a plant sample were almost artifact-free and demonstrated the promise of our approach.
NASA Technical Reports Server (NTRS)
Hilsenrath, E.; Kirschner, P. T.
1980-01-01
The chemiluminescent rocket ozonesonde utilizing rhodamine-B as a detector and self-pumping for air sampling has been improved. The instrument employs standard meteorological sounding systems and is the only technique available for routine nighttime ozone measurements above balloon altitudes. The chemiluminescent detector, when properly calibrated, is shown to be specific to ozone, stable, and of sufficient sensitivity for accurate measurements of ozone from about 65 to 20 km. An error analysis indicates that the measured ozone profiles have an absolute accuracy of about ±12% and a precision of about ±6%. Approximately 20 flights have been conducted for geophysical investigations, while additional flights were conducted with other rocket and satellite ozone soundings for comparisons. In general, these comparisons showed good agreement.
The Genome Austria Tissue Bank (GATiB).
Asslaber, M; Abuja, P M; Stark, K; Eder, J; Gottweis, H; Trauner, M; Samonigg, H; Mischinger, H J; Schippinger, W; Berghold, A; Denk, H; Zatloukal, K
2007-01-01
In the context of the Austrian Genome Program, a tissue bank is being established (Genome Austria Tissue Bank, GATiB) which is based on a collection of diseased and corresponding normal tissues representing a great variety of diseases at their natural frequency of occurrence from a non-selected Central European population of more than 700,000 patients. Major emphasis is put on annotation of archival tissue with comprehensive clinical data, including follow-up data. A specific IT infrastructure supports sample annotation, tracking of sample usage as well as sample and data storage. Innovative data protection tools were developed which prevent sample donor re-identification, particularly if detailed medical and genetic data are combined. For quality control of old archival tissues, new techniques were established to check RNA quality and antigen stability. Since 2003, GATiB has changed from a population-based tissue bank to a disease-focused biobank comprising major cancers such as colon, breast, liver, as well as metabolic liver diseases and organs affected by the metabolic syndrome. Prospectively collected tissues are associated with blood samples and detailed data on the sample donor's disease, lifestyle and environmental exposure, following standard operating procedures. Major emphasis is also placed on ethical, legal and social issues (ELSI) related to biobanks. A specific research project and an international advisory board ensure the proper embedding of GATiB in society and facilitate international networking. (c) 2007 S. Karger AG, Basel.
Overview of hybridization and detection techniques.
Hilario, Elena
2007-01-01
A misconception regarding the sensitivity of nonradioactive methods for screening genomic DNA libraries often hinders the establishment of these environmentally friendly techniques in molecular biology laboratories. Nonradioactive probes, properly prepared and quantified, can detect DNA target molecules to the femtomole range. However, appropriate hybridization techniques and detection methods should also be adopted for an efficient use of nonradioactive techniques. Detailed descriptions of genomic library handling before and during the nonradioactive hybridization and detection are often omitted from publications. This chapter aims to fill this void by providing a collection of technical tips on hybridization and detection techniques.
SAGITTARIUS STREAM THREE-DIMENSIONAL KINEMATICS FROM SLOAN DIGITAL SKY SURVEY STRIPE 82
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koposov, Sergey E.; Belokurov, Vasily; Evans, N. Wyn
2013-04-01
Using multi-epoch observations of the Stripe 82 region from the Sloan Digital Sky Survey (SDSS), we measure precise statistical proper motions of the stars in the Sagittarius (Sgr) stellar stream. The multi-band photometry and SDSS radial velocities allow us to efficiently select Sgr members and thus enhance the proper-motion precision to ~0.1 mas yr⁻¹. We measure separately the proper motion of a photometrically selected sample of the main-sequence turn-off stars, as well as spectroscopically selected Sgr giants. The data allow us to determine the proper motion separately for the two Sgr streams in the south found in Koposov et al. Together with the precise velocities from SDSS, our proper motions provide exquisite constraints of the three-dimensional motions of the stars in the Sgr streams.
Is the Milky Way still breathing? RAVE-Gaia streaming motions
NASA Astrophysics Data System (ADS)
Carrillo, I.; Minchev, I.; Kordopatis, G.; Steinmetz, M.; Binney, J.; Anders, F.; Bienaymé, O.; Bland-Hawthorn, J.; Famaey, B.; Freeman, K. C.; Gilmore, G.; Gibson, B. K.; Grebel, E. K.; Helmi, A.; Just, A.; Kunder, A.; McMillan, P.; Monari, G.; Munari, U.; Navarro, J.; Parker, Q. A.; Reid, W.; Seabroke, G.; Sharma, S.; Siebert, A.; Watson, F.; Wojno, J.; Wyse, R. F. G.; Zwitter, T.
2018-04-01
We use data from the Radial Velocity Experiment (RAVE) and the Tycho-Gaia astrometric solution (TGAS) catalogue to compute the velocity fields yielded by the radial (VR), azimuthal (Vϕ), and vertical (Vz) components of the associated Galactocentric velocity. We search in particular for variation in all three velocity components with distance above and below the disc mid-plane, as well as how each component of Vz (line-of-sight and tangential velocity projections) modifies the obtained vertical structure. To study the dependence of velocity on proper motion and distance, we use two main samples: a RAVE sample including proper motions from the Tycho-2, PPMXL, and UCAC4 catalogues, and a RAVE-TGAS sample with inferred distances and proper motions from the TGAS and UCAC5 catalogues. In both samples, we identify asymmetries in VR and Vz. Below the plane, we find the largest radial gradient to be ∂VR/∂R = -7.01 ± 0.61 km s⁻¹ kpc⁻¹, in agreement with recent studies. Above the plane, we find a similar gradient with ∂VR/∂R = -9.42 ± 1.77 km s⁻¹ kpc⁻¹. By comparing our results with previous studies, we find that the structure in Vz is strongly dependent on the adopted proper motions. Using the Galaxia Milky Way model, we demonstrate that distance uncertainties can create artificial wave-like patterns. In contrast to previous suggestions of a breathing mode seen in RAVE data, our results support a combination of bending and breathing modes, likely generated by internal, external, or a combination of internal and external mechanisms.
Evaluation of heavy metals content in dietary supplements in Lebanon.
Korfali, Samira Ibrahim; Hawi, Tamer; Mroueh, Mohamad
2013-01-18
The consumption of dietary supplements is widespread and on the rise. These dietary supplements are generally used without prescriptions, proper counseling or any awareness of their health risk. The current study aimed at analyzing the metals in 33 samples of imported dietary supplements highly consumed by the Lebanese population, using 3 different techniques, to ensure their safety and to increase consumer awareness of the benefits and risks of these dietary supplements. Some samples had levels of metals above their maximum allowable levels (Fe: 24%, Zn: 33%, Mn: 27%, Se: 15%, Mo: 12% of samples), but did not pose any health risk because they were below the permitted daily exposure limit and recommended daily allowance, except for Fe in 6% of the samples. On the other hand, 34% of the samples had Cu levels above the allowable limit, and 18% of these were above their permitted daily exposure and recommended daily allowance. In contrast, all samples had concentrations of Cr, Hg, and Pb below allowable limits and daily exposure limits, whereas 30% of the analyzed samples had Cd levels above allowable levels, which were statistically correlated with the essential minerals Ca and Zn. Similarly, 62% of the samples had As levels above allowable limits, and As levels were associated with the essential minerals Fe and Mn. Dietary supplements consumed as essential nutrients for their Ca, Zn, Fe and Mn content should be monitored for toxic metal levels due to their natural geochemical association with these essential metals, to provide citizens the safe allowable amounts.
Habitation Module Technology for Mars Sample Preservation and Return
NASA Astrophysics Data System (ADS)
Humphries, Peter; Barez, Fred; Brant, Tom; Gutti Shashidhar Gowda, Aishwarya
2018-04-01
Lunar and Mars sample return is of interest to the space community, including NASA, ESA, and private industry. Collected Mars samples need to be preserved and properly treated in a returnable cache, packaged to prevent back-contamination prior to the return mission.
Viet, Hung Nguyen; Frontasyeva, Marina Vladimirovna; Thi, Thu My Trinh; Gilbert, Daniel; Bernard, Nadine
2010-06-01
The moss technique is widely used to monitor atmospheric deposition of heavy metals in many countries in Europe, whereas this technique is scarcely used in Asia. To implement this reliable and inexpensive international methodology in Asian countries, it is necessary to find proper moss types typical of the Asian environment and suitable for biomonitoring purposes. Such a case study was undertaken in Vietnam to assess the environmental situation in strongly contaminated areas using the local moss species Barbula indica. The study is focused on two areas characterized by different pollution sources: the Hanoi urban area and the Thainguyen metallurgical zone. Fifty-four moss samples were collected there according to the standard sampling procedure adopted in Europe. Two complementary analytical techniques, atomic absorption spectrometry (AAS) and instrumental neutron activation analysis (INAA), were used for determination of elemental concentrations in the moss samples. To characterize the pollution sources, multivariate statistical analysis was applied. A total of 38 metal elements were determined in the moss by the two analytical techniques. The results of descriptive statistics of metal concentrations in moss from the city center and periphery of Hanoi determined by AAS are presented. Similar results for moss from Thainguyen province determined by INAA and AAS are also given. A comparison of mean elemental concentrations in moss from this work with those obtained by other authors under different environmental conditions provides reasonable information on heavy metal atmospheric deposition levels. Factor loadings and factor scores were used to identify and apportion contamination sources at the sampling sites. The percentage contributions of the factors show two highly different types of pollution in the two examined areas: in Hanoi, a pollution composition with a high portion of urban-traffic activity and soil dust (62%), and in Thainguyen, factors related to industrial activities (75%). Besides, the scatter of factors in the factor planes reflects the greater diversity of activities in Hanoi than in Thainguyen. The good relationship between the results of the factor analysis and the known pollution sources indicates that the moss technique is a potential method to assess air quality in Vietnam. Moss B. indica, widely distributed in Vietnam and Indo-China, is shown to be a reliable bryophyte for biomonitoring purposes in sub-tropical and tropical climates. However, the necessity of moss interspecies calibration is obvious for further studies in the area to provide results compatible with those for other Asian countries and Europe.
Machine-learned Identification of RR Lyrae Stars from Sparse, Multi-band Data: The PS1 Sample
NASA Astrophysics Data System (ADS)
Sesar, Branimir; Hernitschek, Nina; Mitrović, Sandra; Ivezić, Željko; Rix, Hans-Walter; Cohen, Judith G.; Bernard, Edouard J.; Grebel, Eva K.; Martin, Nicolas F.; Schlafly, Edward F.; Burgett, William S.; Draper, Peter W.; Flewelling, Heather; Kaiser, Nick; Kudritzki, Rolf P.; Magnier, Eugene A.; Metcalfe, Nigel; Tonry, John L.; Waters, Christopher
2017-05-01
RR Lyrae stars may be the best practical tracers of Galactic halo (sub-)structure and kinematics. The PanSTARRS1 (PS1) 3π survey offers multi-band, multi-epoch, precise photometry across much of the sky, but a robust identification of RR Lyrae stars in this data set poses a challenge, given PS1's sparse, asynchronous multi-band light curves (≲ 12 epochs in each of five bands, taken over a 4.5 year period). We present a novel template fitting technique that uses well-defined and physically motivated multi-band light curves of RR Lyrae stars, and demonstrate that we get accurate period estimates, precise to 2 s in > 80 % of cases. We augment these light-curve fits with other features from photometric time-series and provide them to progressively more detailed machine-learned classification models. From these models, we are able to select the widest (three-fourths of the sky) and deepest (reaching 120 kpc) sample of RR Lyrae stars to date. The PS1 sample of ˜45,000 RRab stars is pure (90%) and complete (80% at 80 kpc) at high galactic latitudes. It also provides distances that are precise to 3%, measured with newly derived period-luminosity relations for optical/near-infrared PS1 bands. With the addition of proper motions from Gaia and radial velocity measurements from multi-object spectroscopic surveys, we expect the PS1 sample of RR Lyrae stars to become the premier source for studying the structure, kinematics, and the gravitational potential of the Galactic halo. The techniques presented in this study should translate well to other sparse, multi-band data sets, such as those produced by the Dark Energy Survey and the upcoming Large Synoptic Survey Telescope Galactic plane sub-survey.
Bounds on the sample complexity for private learning and private data release
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasiviswanathan, Shiva; Beime, Amos; Nissim, Kobbi
2009-01-01
Learning is a task that generalizes many of the analyses that are applied to collections of data, and in particular, collections of sensitive individual information. Hence, it is natural to ask what can be learned while preserving individual privacy. [Kasiviswanathan, Lee, Nissim, Raskhodnikova, and Smith; FOCS 2008] initiated such a discussion. They formalized the notion of private learning, as a combination of PAC learning and differential privacy, and investigated what concept classes can be learned privately. Somewhat surprisingly, they showed that, ignoring time complexity, every PAC learning task could be performed privately with polynomially many samples, and in many natural cases this could even be done in polynomial time. While these results seem to equate non-private and private learning, there is still a significant gap: the sample complexity of (non-private) PAC learning is crisply characterized in terms of the VC-dimension of the concept class, whereas this relationship is lost in the constructions of private learners, which exhibit, generally, a higher sample complexity. Looking into this gap, we examine several private learning tasks and give tight bounds on their sample complexity. In particular, we show strong separations between sample complexities of proper and improper private learners (such separation does not exist for non-private learners), and between sample complexities of efficient and inefficient proper private learners. Our results show that VC-dimension is not the right measure for characterizing the sample complexity of proper private learning. We also examine the task of private data release (as initiated by [Blum, Ligett, and Roth; STOC 2008]), and give new lower bounds on the sample complexity. Our results show that the logarithmic dependence on size of the instance space is essential for private data release.
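For reference, the crisp non-private characterization alluded to above is the classical VC bound (a standard result assumed here, not a statement from this record): for realizable PAC learning of a concept class of VC-dimension \(d\) to accuracy \(\varepsilon\) with failure probability \(\delta\),

\[
m(\varepsilon,\delta) \;=\; O\!\left(\frac{d\log(1/\varepsilon)+\log(1/\delta)}{\varepsilon}\right)
\qquad\text{and}\qquad
m(\varepsilon,\delta) \;=\; \Omega\!\left(\frac{d+\log(1/\delta)}{\varepsilon}\right).
\]

The separations described above show that no analogous VC-dimension characterization holds for the sample complexity of proper private learning.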
The B-Lynch uterine brace suture, and a bit of this and a bit of that...
Karoshi, Mahantesh
2010-03-01
The widespread application of the B-Lynch brace suture to control postpartum hemorrhage has sparked interest in a variety of adjunctive methods, used alone or in combination, to control uterine bleeding. Although the B-Lynch brace suture has been used with good results throughout the world, failures can and do occur in rare instances, especially when the suture is incorrectly placed or used for an inappropriate indication. Four reports of additional methods to control postpartum hemorrhage are published in this issue of IJGO. Three use the B-Lynch brace suture combined with other techniques. The need for additional techniques reminds the reader of the importance of proper suture application for the proper indication. Potential reasons for failure of the B-Lynch suture are provided.
Ball, P A; Benzel, E C; Baldwin, N G
1994-04-01
The use of bone plate instrumentation with screw fixation has proved to be a useful adjunctive measure in anterior cervical spine fusion surgery. Proper fitting, positioning, and attachment of this instrumentation have been shown to be frequently suboptimal if done without radiographic guidance. The most commonly used method of radiographic assistance for placement of this instrumentation is fluoroscopy. While this gives satisfactory technical results, it is expensive and time-consuming, and exposes the patient and the operating room personnel to ionizing radiation. The authors present a simple technique to ensure screw placement and plate fitting using Kirschner wires and a single lateral radiograph. This technique saves time, reduces exposure to radiation, and has led to satisfactory results in over 20 operative cases.
Designing a holistic end-to-end intelligent network analysis and security platform
NASA Astrophysics Data System (ADS)
Alzahrani, M.
2018-03-01
A firewall protects a network from outside attacks; however, once an attack enters the network, it is difficult to detect. Significant incidents have occurred recently: millions of Yahoo email accounts were stolen, and crucial institutional data have been held for ransom. For two years, Yahoo's system administrators were unaware that there were intruders inside the network. This happened due to the lack of intelligent tools to monitor user behaviour on the internal network. This paper discusses the design of an intelligent anomaly/malware detection system with proper proactive actions. The aim is to equip the system administrator with a proper tool to battle insider attackers. The proposed system adopts machine learning to analyse users' behaviour through the runtime behaviour of each node in the network. The machine learning techniques include deep learning, an evolving machine learning perceptron, a hybrid of neural networks and fuzzy logic, as well as predictive memory techniques. The proposed system is expanded to deal with larger networks using agent techniques.
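The paper's proposed system combines deep learning, evolving perceptrons, neuro-fuzzy models and predictive memory; as a much simpler stand-in, the sketch below (scikit-learn, with made-up per-node behaviour features) shows the general idea of learning a baseline of node behaviour and flagging departures from it.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# hypothetical per-node behaviour features: [logins/hour, MB sent, failed auths, distinct destinations]
baseline = rng.normal(loc=[5, 50, 1, 10], scale=[2, 15, 1, 3], size=(500, 4))
suspect = rng.normal(loc=[40, 900, 15, 80], scale=[5, 50, 3, 10], size=(5, 4))

model = IsolationForest(n_estimators=200, contamination=0.01, random_state=0)
model.fit(baseline)                                        # learn normal node behaviour
print(model.predict(np.vstack([baseline[:3], suspect])))   # 1 = normal, -1 = flagged as anomalous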
Dating the Tidal Disruption of Globular Clusters with GAIA Data on Their Stellar Streams
NASA Astrophysics Data System (ADS)
Bose, Sownak; Ginsburg, Idan; Loeb, Abraham
2018-05-01
The Gaia mission promises to deliver precision astrometry at an unprecedented level, heralding a new era for discerning the kinematic and spatial coordinates of stars in our Galaxy. Here, we present a new technique for estimating the age of tidally disrupted globular cluster streams using the proper motions and parallaxes of tracer stars. We evolve the collisional dynamics of globular clusters within the evolving potential of a Milky Way-like halo extracted from a cosmological ΛCDM simulation and analyze the resultant streams as they would be observed by Gaia. The simulations sample a variety of globular cluster orbits, and account for stellar evolution and the gravitational influence of the disk of the Milky Way. We show that a characteristic timescale, obtained from the dispersion of the proper motions and parallaxes of stars within the stream, is a good indicator for the time elapsed since the stream has been freely expanding away due to the tidal disruption of the globular cluster. This timescale, in turn, places a lower limit on the age of the cluster. The age can be deduced from astrometry using a modest number of stars, with the error on this estimate depending on the proximity of the stream and the number of tracer stars used.
Investigation of soils affected by burnt hospital wastes in Nigeria using PIXE.
Ephraim P, Inyang; Ita, Akpan; Eusebius I, Obiajunwa
2013-12-01
Improper management of hospital waste has been reported to be responsible for several acute outbreaks, such as severe acute respiratory syndrome (SARS). In spite of these challenges, hospital wastes are sometimes not properly handled in Nigeria. To date, there has not been an adequate study on the effect and fate of burnt hospital waste on agricultural soil. The effect of burnt hospital wastes on agricultural soil was investigated on soils sampled around a farm settlement near Obafemi Awolowo University Teaching Hospital Complex, Ile-Ife, South West Nigeria. The PIXE technique was employed with a 1.7 MV 5SDH Tandem Pelletron accelerator available at the Centre for Energy Research and Development, O.A.U. Ile-Ife, Nigeria. Eleven elements (Si, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Zr and Pb) were detected and their concentrations and enrichment factors determined. The presence of Pb and Cl at elevated concentration ranges of 77.8 ± 3.5 to 279.6 ± 97.6 ppm and 102.2 ± 37.4 to 167.2 ± 17.43 ppm, respectively, is of serious health concern because of the agricultural practices in the neighborhoods of the study sites. There is a need for proper handling of hospital and other related hazardous wastes because of the possibility of such wastes posing serious environmental pollution problems.
Field Immune Assessment during Simulated Planetary Exploration in the Canadian Arctic
NASA Technical Reports Server (NTRS)
Crucian, Brian; Lee, Pascal; Stowe, Raymond; Jones, Jeff; Effenhauser, Rainer; Widen, Raymond; Sams, Clarence
2006-01-01
Dysregulation of the immune system has been shown to occur during space flight, although the detailed nature of the phenomenon and the clinical risks for exploration class missions have yet to be established. In addition, the growing clinical significance of immune system evaluation combined with epidemic infectious disease rates in third world countries provides a strong rationale for the development of field-compatible clinical immunology techniques and equipment. In July 2002 NASA performed a comprehensive field immunology assessment on crewmembers participating in the Haughton-Mars Project (HMP) on Devon Island in the high Canadian Arctic. The purpose of the study was to evaluate mission-associated effects on the human immune system, as well as to evaluate techniques developed for processing immune samples in remote field locations. Ten HMP-2002 participants volunteered for the study. A field protocol was developed at NASA-JSC for performing sample collection, blood staining/processing for immunophenotype analysis, whole-blood mitogenic culture for functional assessments, and cell-sample preservation on location at Devon Island. Specific assays included peripheral leukocyte distribution, constitutively activated T cells, intracellular cytokine profiles, and plasma EBV viral antibody levels. Study timepoints were L-30, mid-mission and R+60. The protocol developed for immune sample processing in remote field locations functioned properly. Samples were processed in the field location and stabilized for subsequent analysis at the Johnson Space Center in Houston. The data indicated that some phenotype, immune function and stress hormone changes occurred in the HMP field participants that were largely distinct from pre-mission baseline and post-mission recovery data. These immune changes appear similar to those observed in astronauts following spaceflight. The sample processing protocol developed for this study may have applications for immune assessment during exploration-class space missions or in remote terrestrial field locations. The data validate the use of the HMP as a ground-based spaceflight/planetary exploration analog for some aspects of human physiology.
Veterinary Forensic Toxicology.
Gwaltney-Brant, S M
2016-09-01
Veterinary pathologists working in diagnostic laboratories are sometimes presented with cases involving animal poisonings that become the object of criminal or civil litigation. Forensic veterinary toxicology cases can include cases involving animal cruelty (malicious poisoning), regulatory issues (eg, contamination of the food supply), insurance litigation, or poisoning of wildlife. An understanding of the appropriate approach to these types of cases, including proper sample collection, handling, and transport, is essential so that chain of custody rules are followed and proper samples are obtained for toxicological analysis. Consultation with veterinary toxicologists at the diagnostic laboratory that will be processing the samples before, during, and after the forensic necropsy can help to ensure that the analytical tests performed are appropriate for the circumstances and findings surrounding the individual case. © The Author(s) 2016.
Take an Artistic Spin with Pinwheels
ERIC Educational Resources Information Center
Speelman, Melissa
2012-01-01
A great start for the semester, this pinwheel project provides a good dose of art history, and a variety of media and techniques. It also teaches students how to clean up and store things properly. Five artists are introduced, each with a different art medium and technique. In this activity, students are expected to: (1) study works by five famous…
Constructing bald eagle nests with natural materials
T. G. Grubb
1995-01-01
A technique for using natural materials to build artificial nests for bald eagles (Haliaeetus leucocephalus) and other raptors is detailed. Properly constructed nests are as permanently secured to the nest tree or cliff substrate as any eagle-built nest or human-made platform. Construction normally requires about three hours and at least two people. This technique is...
ERIC Educational Resources Information Center
Ratican, Kathleen L.
1996-01-01
The kinesthetic track back technique accesses the origins of current symptoms and may uncover previously repressed/dissociated material, if such material exists in the client's unconscious mind, is relevant to the symptoms, and is ready to be processed consciously. Case examples are given to illustrate proper use of this technique. (LSR)
Glaus, M A; Aertsens, M; Maes, N; Van Laer, L; Van Loon, L R
2015-01-01
Valuable techniques to measure effective diffusion coefficients in porous media are an indispensable prerequisite for a proper understanding of the migration of chemical-toxic and radioactive micropollutants in the subsurface and geosphere. The present article discusses possible pitfalls and difficulties in the classical through-diffusion technique applied to situations where large diffusive fluxes of cations in compacted clay minerals or clay rocks occur. The results obtained from a benchmark study, in which the diffusion of ⁸⁵Sr²⁺ tracer in compacted illite has been studied using different experimental techniques, are presented. It is shown that these techniques may yield valuable results provided that an appropriate model is used for numerical simulations. It is further shown that effective diffusion coefficients may be systematically underestimated when the concentration at the downstream boundary is not taken adequately into account in modelling, even for very low concentrations. A criterion is derived for quasi-steady-state situations, by which it can be decided whether the simplifying assumption of a zero concentration at the downstream boundary in through-diffusion is justified or not. The application of the criterion requires, however, knowledge of the effective diffusion coefficient of the clay sample. Such knowledge is often absent or only approximately available during the planning phase of a diffusion experiment. Copyright © 2015 Elsevier B.V. All rights reserved.
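For orientation, the quantity at stake can be written with the textbook steady-state through-diffusion relation (this is the standard evaluation under the zero-downstream-concentration assumption discussed above, not the criterion derived in the paper):

\[
J_{ss} \;=\; \frac{D_e\,(C_{\mathrm{up}} - C_{\mathrm{down}})}{L} \;\approx\; \frac{D_e\,C_{\mathrm{up}}}{L} \quad (C_{\mathrm{down}} \ll C_{\mathrm{up}}),
\qquad\text{so that}\qquad
D_e \;\approx\; \frac{J_{ss}\,L}{C_{\mathrm{up}}},
\]

where \(J_{ss}\) is the steady-state flux per unit area through a sample of thickness \(L\), and \(C_{\mathrm{up}}\), \(C_{\mathrm{down}}\) are the upstream and downstream tracer concentrations. If \(C_{\mathrm{down}}\) is assumed to be zero when it is not negligible, the extracted \(D_e\) is biased, consistent with the underestimation described above.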
Jiménez-Sotelo, Paola; Hernández-Martínez, Maylet; Osorio-Revilla, Guillermo; Meza-Márquez, Ofelia Gabriela; García-Ochoa, Felipe; Gallardo-Velázquez, Tzayhrí
2016-07-01
Avocado oil is a high-value nutraceutical oil whose authentication is very important, since the addition of low-cost oils could lower its beneficial properties. Mid-FTIR spectroscopy combined with chemometrics was used to detect and quantify adulteration of avocado oil with sunflower and soybean oils in a ternary mixture. Thirty-seven laboratory-prepared adulterated samples and 20 pure avocado oil samples were evaluated. The adulterant oil amount ranged from 2% to 50% (w/w) in avocado oil. A soft independent modelling of class analogy (SIMCA) model was developed to discriminate between pure and adulterated samples. The model showed recognition and rejection rates of 100% and proper classification in external validation. A partial least squares (PLS) algorithm was used to estimate the percentage of adulteration. The PLS model showed values of R² > 0.9961, standard errors of calibration (SEC) in the range of 0.3963-0.7881, standard errors of prediction (SEP, estimated) between 0.6483 and 0.9707, and good prediction performance in external validation. The results show that mid-FTIR spectroscopy could be an accurate and reliable technique for qualitative and quantitative analysis of avocado oil in ternary mixtures.
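A minimal sketch of the PLS regression step (scikit-learn; the spectra and adulteration levels below are random stand-ins, and the number of latent variables is a placeholder that would be chosen by cross-validation in practice, as is usual in this kind of chemometric study):

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# stand-in data: 57 "spectra" of 900 points and an adulteration level (0 for pure oils)
X = rng.normal(size=(57, 900))
y = np.concatenate([np.zeros(20), rng.uniform(2.0, 50.0, size=37)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=8)            # latent variables; tune by cross-validation
pls.fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
print("SEP = %.2f %% adulteration" % np.sqrt(np.mean((y_hat - y_te) ** 2)))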
Superhydrophobic Analyte Concentration Utilizing Colloid-Pillar Array SERS Substrates
Wallace, Ryan A.; Charlton, Jennifer J.; Kirchner, Teresa B.; ...
2014-11-04
Detecting a few molecules present in a large sample is important for identifying trace components in medicinal and environmental samples. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The work described here involves superhydrophobic surfaces that include silicon pillar arrays formed by lithographic and dewetting protocols. In order to generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. The pillars are used native and with hydrophobic modification. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate an ability to control droplet size and placement for scaled-up use in real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.
Identification of proteinaceous binders used in artworks by MALDI-TOF mass spectrometry.
Kuckova, Stepanka; Hynek, Radovan; Kodicek, Milan
2007-05-01
Proper identification of proteinaceous binders in artworks is essential for specification of the painting technique and thus also for selection of the restoration method; moreover, it might be helpful for the authentication of the artwork. This paper is concerned with the optimisation of analysis of the proteinaceous binders contained in the colour layers of artworks. Within this study, we worked out a method for the preparation and analysis of solid samples from artworks using tryptic cleavage and subsequent analysis of the acquired peptide mixture by matrix-assisted laser desorption/ionisation time of flight mass spectrometry. To make this approach rational and efficient, we created a database of commonly used binders (egg yolk, egg white, casein, milk, curd, whey, gelatine, and various types of animal glues); certain peaks in the mass spectra of these binders, formed by rich protein mixtures, were matched to amino acid sequences of the individual proteins that were found in the Internet database ExPASy; their cleavage was simulated by the program Mass-2.0-alpha4. The method developed was tested on model samples of ground layers prepared by an independent laboratory and then successfully applied to a real sample originating from a painting by Edvard Munch.
Imbalanced learning for pattern recognition: an empirical study
NASA Astrophysics Data System (ADS)
He, Haibo; Chen, Sheng; Man, Hong; Desai, Sachi; Quoraishee, Shafik
2010-10-01
The imbalanced learning problem (learning from imbalanced data) presents a significant new challenge to the pattern recognition and machine learning society because in most instances real-world data is imbalanced. When considering military applications, the imbalanced learning problem becomes much more critical because such skewed distributions normally carry the most interesting and critical information. This critical information is necessary to support the decision-making process in battlefield scenarios, such as anomaly or intrusion detection. The fundamental issue with imbalanced learning is the ability of imbalanced data to compromise the performance of standard learning algorithms, which assume balanced class distributions or equal misclassification penalty costs. Therefore, when presented with complex imbalanced data sets these algorithms may not be able to properly represent the distributive characteristics of the data. In this paper we present an empirical study of several popular imbalanced learning algorithms on an army relevant data set. Specifically we will conduct various experiments with SMOTE (Synthetic Minority Over-Sampling Technique), ADASYN (Adaptive Synthetic Sampling), SMOTEBoost (Synthetic Minority Over-Sampling in Boosting), and AdaCost (Misclassification Cost-Sensitive Boosting method) schemes. Detailed experimental settings and simulation results are presented in this work, and a brief discussion of future research opportunities/challenges is also presented.
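A minimal sketch of the first of these schemes, SMOTE, using the imbalanced-learn package on a synthetic data set (the class weights and feature counts below are arbitrary stand-ins for the army-relevant data, which is not reproduced here):

from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE

# synthetic two-class problem with a 1% minority class, standing in for skewed real-world data
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.99, 0.01], random_state=0)
print("before:", Counter(y))

X_res, y_res = SMOTE(k_neighbors=5, random_state=0).fit_resample(X, y)
print("after: ", Counter(y_res))   # minority class synthetically over-sampled to parity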
Scout-view Assisted Interior Micro-CT
Sen Sharma, Kriti; Holzner, Christian; Vasilescu, Dragoş M.; Jin, Xin; Narayanan, Shree; Agah, Masoud; Hoffman, Eric A.; Yu, Hengyong; Wang, Ge
2013-01-01
Micro computed tomography (micro-CT) is a widely-used imaging technique. A challenge of micro-CT is to quantitatively reconstruct a sample larger than the field-of-view (FOV) of the detector. This scenario is characterized by truncated projections and associated image artifacts. However, for such truncated scans, a low resolution scout scan with an increased FOV is frequently acquired so as to position the sample properly. This study shows that the otherwise discarded scout scans can provide sufficient additional information to uniquely and stably reconstruct the interior region of interest. Two interior reconstruction methods are designed to utilize the multi-resolution data without a significant computational overhead. While most previous studies used numerically truncated global projections as interior data, this study uses truly hybrid scans where global and interior scans were carried out at different resolutions. Additionally, owing to the lack of standard interior micro-CT phantoms, we designed and fabricated novel interior micro-CT phantoms for this study to provide means of validation for our algorithms. Finally, two characteristic samples from separate studies were scanned to show the effect of our reconstructions. The presented methods show significant improvements over existing reconstruction algorithms. PMID:23732478
NASA Astrophysics Data System (ADS)
Tang, Gao; Jiang, FanHuag; Li, JunFeng
2015-11-01
Near-Earth asteroids have attracted considerable interest, and developments in low-thrust propulsion technology make complex deep space exploration missions possible. A mission that departs from low-Earth orbit, uses a low-thrust electric propulsion system to rendezvous with a near-Earth asteroid, and brings a sample back is investigated. By dividing the mission into five segments, the complex problem is solved segment by segment, and different methods are used to find optimal trajectories for each segment. Multiple revolutions around the Earth and multiple Moon gravity assists are used to decrease the fuel consumption required to escape from the Earth. To avoid possible numerical difficulties of indirect methods, a direct method that parameterizes the switching moment and the direction of the thrust vector is proposed. To maximize the mass of the sample, optimal control theory and a homotopic approach are applied to find the optimal trajectory. Direct methods for finding the proper time to brake the spacecraft using a Moon gravity assist are also proposed. Practical techniques, including both direct and indirect methods, are investigated to optimize trajectories for the different segments, and they can easily be extended to other missions and more precise dynamic models.
Applying Online Monitoring for Nuclear Power Plant Instrumentation and Control
NASA Astrophysics Data System (ADS)
Hashemian, H. M.
2010-10-01
This paper presents a practical review of the state-of-the-art means for applying OLM data acquisition in nuclear power plant instrumentation and control, qualifying or validating the OLM data, and then analyzing it for static and dynamic performance monitoring applications. Whereas data acquisition for static or steady-state OLM applications may require sampling rates of only one sample every 1 to 10 seconds, or as low as one sample per minute, dynamic data acquisition requires higher sampling frequencies (e.g., 100 to 1000 Hz) using a dedicated data acquisition system capable of providing isolation, anti-aliasing and removal of extraneous noise, and analog-to-digital (A/D) conversion. Qualifying the data for use with OLM algorithms can involve removing data 'dead' spots (for static data) and calculating, examining, and trending amplitude probability density, variance, skewness, and kurtosis. For static OLM applications with redundant signals, trending and averaging qualification techniques are used, and for single or non-redundant signals physical and empirical modeling are used. Dynamic OLM analysis is performed in the frequency domain and/or time domain, and is based on the assumption that the sensors' or transmitters' dynamic characteristics are linear and that the input noise signal (i.e., the process fluctuations) has proper spectral characteristics.
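To make the static data-qualification step concrete, the following sketch computes the amplitude-distribution descriptors mentioned above (variance, skewness, excess kurtosis) and flags 'dead' spots as long runs of unchanging samples. It is a simplified illustration with assumed names and thresholds, not the plant-qualified OLM software described in the paper.

```python
import numpy as np

def qualify_static_signal(x, dead_run=50):
    """Hypothetical data-qualification step: compute amplitude-distribution
    descriptors and flag 'dead' spans where the signal does not change."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = (x - mu) / sigma
    skew = np.mean(z**3)
    kurt = np.mean(z**4) - 3.0          # excess kurtosis
    # flag any run of >= dead_run identical consecutive samples as a dead spot
    same = np.diff(x) == 0
    dead, run = False, 0
    for s in same:
        run = run + 1 if s else 0
        if run >= dead_run - 1:
            dead = True
            break
    return {"mean": mu, "variance": sigma**2, "skewness": skew,
            "excess_kurtosis": kurt, "dead_spot": dead}

# usage on a synthetic steady-state signal sampled once per second
sig = 500.0 + 0.5 * np.random.default_rng(0).standard_normal(3600)
print(qualify_static_signal(sig))
```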
NASA Astrophysics Data System (ADS)
Zhang, M.; Yoshikawa, M.; Takeuchi, M.; Komai, T.
2011-12-01
Chlorinated ethenes, such as perchloroethene (PCE) and trichloroethene (TCE), have been widely used by many industries, especially in developed countries like Japan. Because of their wide application, a historical lack of proper regulation, and poor handling, storage and disposal practices in the past, chlorinated ethenes have become some of the most prevalent soil and groundwater contaminants. Owing to their degradability, bioremediation has been considered a potentially cost-effective and environmentally friendly approach for in situ cleanup of chlorinated ethenes. In this presentation, we briefly overview the status of soil and groundwater pollution, the recent amendment of the Soil Contamination Countermeasures Act in Japan, the comparison between bioremediation and other techniques such as pump-and-treat, and the mechanisms of reductive dechlorination, direct oxidation and co-metabolism of chlorinated ethenes. We then introduce and discuss some recent challenges and advancements in in-situ bioremediation, including technologies for accelerating biodegradation of chlorinated ethenes, technologies for assessing the diffusive properties of dissolved hydrogen in hydraulically tight soil samples, and the combination of bioremediation with other techniques such as the electro-kinetic approach. Limiting factors that may cause incomplete remediation and/or ineffectiveness of bioremediation are examined from biochemical, geochemical and hydro-geological perspectives. This study reconfirmed and illustrated that: 1) the key factor for effective bioremediation is how to disperse a proper accelerating agent throughout the polluted strata; 2) the effective diffusion coefficient of dissolved hydrogen in geologic media is relatively large and almost independent of their permeability; and 3) to effectively design and perform accelerated bioremediation, a combination of natural migration with pressurized injection and/or other approaches, such as electro-migration, for stimulating mass transport may be necessary, depending on hydraulic properties such as the porosity and permeability of the stratum.
Flame retardant exposure assessment: findings from a behavioral intervention study.
Gibson, Elizabeth A; Stapleton, Heather M; Calero, Lehyla; Holmes, Darrell; Burke, Kimberly; Martinez, Rodney; Cortes, Boris; Nematollahi, Amy; Evans, David; Herbstman, Julie B
2018-06-28
Polybrominated diphenyl ethers (PBDEs) have been largely replaced by organophosphate flame retardants (OPFRs) and alternative brominated flame retardants (Alt-BFRs) to meet flammability requirements. Humans are ubiquitously exposed to some variety of flame retardants through contact with consumer products directly or through household dust. To evaluate the effectiveness of house cleaning and hand washing practices in reducing exposure to flame retardants, we measured concentrations in dermal hand wipes and urinary metabolites before and after assignment to two consecutive interventions. We selected 32 mother-child dyads from an existing cohort; this analysis focuses on the mothers. Participants provided baseline measurements (urine, hand wipes, and questionnaires) and were then assigned for 1 week to either a house cleaning (including instruction on proper technique and cleaning supplies) or hand washing (including instruction on proper technique and soaps) intervention arm. For the second week, participants were assigned to the second intervention in addition to their initial assignment, so all subjects both washed their hands and cleaned according to the intervention guidelines during week 2. We collected measurements at the end of weeks 1 and 2. We found reductions in urinary analytes after week 1 of house cleaning (BCIPHIPP and ip-DPHP), week 1 of hand washing (BCIPP, BCIPHIPP, and tbutyl-DPHP), and week 2 of combined interventions (BCIPHIPP and tbutyl-DPHP), compared with baseline. We found no significant decline in hand wipes in the entire sample but did find reductions after week 1 of house cleaning (BDE 209), week 1 of hand washing (TCEP), and week 2 of combined interventions (TDCIPP and BDE 209) in women with exposure above the median at baseline (verified through simulations). Exposure to individual flame retardants was reduced by about half, in some cases, by 1 week of increased hand washing, house cleaning to reduce dust, or the combined activities.
Suitability of the CellientTM cell block method for diagnosing soft tissue and bone tumors
Song, W.; van Hemel, B. M.
2018-01-01
BACKGROUND The diagnosis of tumors of soft tissue and bone (STB) heavily relies on histological biopsies, whereas cytology is not widely used. CellientTM cell blocks often contain small tissue fragments. In addition to Hematoxylin and Eosin (H&E) interpretation of histological features, immunohistochemistry (IHC) can be applied after optimization of protocols. The objective of this retrospective study was to see whether this cytological technique allowed us to make a precise diagnosis of STB tumors. METHODS Our study cohort consisted of 20 consecutive STB tumors, 9 fine‐needle aspiration (FNAC) samples, and 11 endoscopic ultrasonography (EUS) FNACs and included 8 primary tumors and 12 recurrences or metastases of known STB tumors. RESULTS In all 20 cases, H&E stained sections revealed that diagnostically relevant histological and cytological features could be examined properly. In the group of 8 primary tumors, IHC performed on CellientTM material provided clinically important information in all cases. For instance, gastrointestinal stromal tumor (GIST) was positive for CD117 and DOG‐1 and a PEComa showed positive IHC for actin, desmin, and HMB‐45. In the group of 12 secondary tumors, SATB2 was visualized in metastatic osteosarcoma, whereas expression of S‐100 was present in 2 secondary chondrosarcomas. Metastatic chordoma could be confirmed by brachyury expression. Two metastatic alveolar rhabdomyosarcomas were myf4 positive, a metastasis of a gynecologic leiomyosarcoma was positive for actin and estrogen receptor (ER) and a recurrent dermatofibrosarcoma protuberans expressed CD34. CONCLUSION In the proper clinical context, including clinical presentation with imaging studies, the CellientTM cell block technique has great potential for the diagnosis of STB tumors. PMID:29318761
Sammarco, G J
1983-11-01
Conditions that occur in the dancer's hip fall into the following categories: poor training; conditions that occur as the result of normal use; overuse syndromes, including tendinitis and myositis; and conditions referring pain to the hip. Dancers are highly motivated and goal oriented and often suppress symptoms for long periods, making diagnosis and treatment difficult. Observing the dancer at work and understanding his art are emphasized, and a practical guide to therapy is presented. Development of proper dance technique and a proper flexibility program can decrease the incidence of injuries.
31 CFR 205.14 - When does Federal interest liability accrue?
Code of Federal Regulations, 2010 CFR
2010-07-01
... EFFICIENT FEDERAL-STATE FUNDS TRANSFERS Rules Applicable to Federal Assistance Programs Included in a... funding technique properly, we may deny any resulting Federal interest liability, notwithstanding any...
Mohamed, Rayane; Guy, Philippe A
2011-01-01
During recent years, a rising interest from consumers and various governmental organizations in the quality of food has continuously been observed. Human intervention across the different stages of the food supply chain can lead to the presence of several types of chemical contaminants in food-based products. At normal daily consumption levels some of these chemicals are not harmful; however, for those that present a risk to consumers, legislation was established to specify tolerance levels or, in some cases, the complete prohibition of specific contaminants. Hence, the use of appropriate analytical tools is recommended to properly identify chemical contaminants. In that context, mass spectrometry (MS)-based techniques, whether or not coupled to chromatography, offer a vast panel of features such as sensitivity, selectivity, quantification at trace levels, and/or structural elucidation. Because of the complexity of food-based matrices, sample preparation is a crucial step before final detection. In the present manuscript, we review the contribution and potential of MS-based techniques for ensuring the absence of chemical contaminants in food-based products. Copyright © 2011 Wiley Periodicals, Inc.
A general way for quantitative magnetic measurement by transmitted electrons
NASA Astrophysics Data System (ADS)
Song, Dongsheng; Li, Gen; Cai, Jianwang; Zhu, Jing
2016-01-01
EMCD (electron magnetic circular dichroism) opens a new door to exploring magnetic properties with transmitted electrons. The recently developed site-specific EMCD technique makes it possible to obtain rich magnetic information from Fe atoms sited on nonequivalent crystallographic planes in NiFe2O4; however, it places strict demands on the crystallographic structure of the sample under test. Here, we have further improved and tested the method for quantitative site-specific magnetic measurement so that it is applicable to more complex crystallographic structures by exploiting dynamical diffraction effects (a general routine for selecting proper diffraction conditions, use of the asymmetry of dynamical diffraction in the design of the experimental geometry and the quantitative measurement, etc.), and we take yttrium iron garnet (Y3Fe5O12, YIG), with its more complex crystallographic structure, as an example to demonstrate its applicability. As a result, the intrinsic magnetic circular dichroism signals and the site-specific spin and orbital magnetic moments of iron are quantitatively determined. The method will further promote the development of quantitative magnetic measurement with high spatial resolution by transmitted electrons.
Attitudes and behaviors of Hispanic smokers: implications for cessation interventions.
Marin, B V; Perez-Stable, E J; Marin, G; Sabogal, F; Otero-Sabogal, R
1990-01-01
The smoking behavior of Hispanics, especially Mexican Americans, has been reported to differ from that of non-Hispanic whites, both in large gender differences in prevalence and in a lower self-reported number of cigarettes smoked per day. This study compared the responses of a convenience sample of 263 Hispanic (44% Mexican American and 38% Central American) and 150 non-Hispanic white smokers in order to identify other ethnic, gender, and acculturation differences in smoking behaviors. Hispanic women smoked fewer cigarettes and initiated smoking at a comparatively later age than Hispanic men; they were also less likely to smoke during pregnancy than non-Hispanic white women. Hispanics smoked more cigarettes on Saturday than on other days, but this was not true for non-Hispanic whites. Will power (voluntad propia) and knowing the negative effects of smoking were considered the most helpful techniques for quitting by Hispanics. Considering that light smokers are able to quit with less intensive cessation techniques, these data suggest that a properly developed health education community intervention may have an impact on smoking rates among Hispanics.
Computation of Asteroid Proper Elements: Recent Advances
NASA Astrophysics Data System (ADS)
Knežević, Z.
2017-12-01
The recent advances in computation of asteroid proper elements are briefly reviewed. Although not representing real breakthroughs in computation and stability assessment of proper elements, these advances can still be considered as important improvements offering solutions to some practical problems encountered in the past. The problem of getting unrealistic values of perihelion frequency for very low eccentricity orbits is solved by computing frequencies using the frequency-modified Fourier transform. The synthetic resonant proper elements adjusted to a given secular resonance helped to prove the existence of Astraea asteroid family. The preliminary assessment of stability with time of proper elements computed by means of the analytical theory provides a good indication of their poorer performance with respect to their synthetic counterparts, and advocates in favor of ceasing their regular maintenance; the final decision should, however, be taken on the basis of more comprehensive and reliable direct estimate of their individual and sample average deviations from constancy.
γ-Aminobutyric acid ameliorates fluoride-induced hypothyroidism in male Kunming mice.
Yang, Haoyue; Xing, Ronge; Liu, Song; Yu, Huahua; Li, Pengcheng
2016-02-01
This study evaluated the protective effects of γ-aminobutyric acid (GABA), a non-protein amino acid and anti-oxidant, against fluoride-induced hypothyroidism in mice. Light microscope sample preparation technique and TEM sample preparation technique were used to assay thyroid microstructure and ultrastructure; enzyme immunoassay method was used to assay hormone and protein levels; immunohistochemical staining method was used to assay apoptosis of thyroid follicular epithelium cells. Subacute injection of sodium fluoride (NaF) decreased blood T4, T3 and thyroid hormone-binding globulin (TBG) levels to 33.98 μg/l, 3 2.8 ng/ml and 11.67 ng/ml, respectively. In addition, fluoride intoxication induced structural abnormalities in thyroid follicles. Our results showed that treatment of fluoride-exposed mice with GABA appreciably decreased metabolic toxicity induced by fluoride and restored the microstructural and ultrastructural organisation of the thyroid gland towards normalcy. Compared with the negative control group, GABA treatment groups showed significantly upregulated T4, T3 and TBG levels (42.34 μg/l, 6.54 ng/ml and 18.78 ng/ml, respectively; P<0.05), properly increased TSH level and apoptosis inhibition in thyroid follicular epithelial cells. To the best of our knowledge, this is the first study to establish the therapeutic efficacy of GABA as a natural antioxidant in inducing thyroprotection against fluoride-induced toxicity. Copyright © 2015 Elsevier Inc. All rights reserved.
Thermoluminescence of gamma-ray-irradiated CaSO4 nanorods doped with different elements
NASA Astrophysics Data System (ADS)
Salah, Numan
2015-01-01
Nanorods of calcium sulfate (CaSO4) activated with Ag, Cu, Dy, Eu and Tb were synthesized by the co-precipitation technique. They were irradiated by γ-rays over a wide range of exposures and studied for their thermoluminescence (TL) properties. The as-synthesized samples were characterized by scanning electron microscopy (SEM), X-ray diffraction (XRD) and photoluminescence (PL) emission spectra. SEM images show that the samples doped with rare-earth elements (i.e. Dy, Eu and Tb) have thinner nanorods than the other samples, while the XRD patterns show a fully crystalline structure in a monoclinic phase. The TL glow curves of these samples show two components: the first includes low-temperature glow peaks at around 125 °C, while the second shows high-temperature peaks in the range 230-270 °C. These glow peaks differ from sample to sample in their TL intensity. The TL results are promising, particularly those of Tb and Eu. The Tb-doped sample is found to be highly TL sensitive, with a prominent glow peak at around 270 °C, while Eu has created very active, high-density electron traps. The latter shows a quite linear response over the whole studied exposure range, i.e. 10 Gy-10 kGy. These results show that Eu- or Tb-doped CaSO4 nanorods might be proper candidates as dosimeters for the high doses of ionizing radiation used in the irradiation of foods and seeds.
Measures and Relative Motions of Some Mostly F. G. W. Struve Doubles
NASA Astrophysics Data System (ADS)
Wiley, E. O.
2012-04-01
Measures of 59 pairs of double stars with long observational histories using "lucky imaging" techniques are reported. Relative motions of 59 pairs are investigated using histories of observation, scatter plots of relative motion, ordinary least-squares (OLS) and total proper motion analyses performed in "R," an open source programming language. A scatter plot of the coefficient of determinations derived from the OLS y|epoch and OLS x|epoch clearly separates common proper motion pairs from optical pairs and what are termed "long-period binary candidates." Differences in proper motion separate optical pairs from long-term binary candidates. An Appendix is provided that details how to use known rectilinear pairs as calibration pairs for the program REDUC.
ERIC Educational Resources Information Center
Adachi, Kohei
2013-01-01
Rubin and Thayer ("Psychometrika," 47:69-76, 1982) proposed the EM algorithm for exploratory and confirmatory maximum likelihood factor analysis. In this paper, we prove the following fact: the EM algorithm always gives a proper solution with positive unique variances and factor correlations with absolute values that do not exceed one,…
Optimal CCD readout by digital correlated double sampling
NASA Astrophysics Data System (ADS)
Alessandri, C.; Abusleme, A.; Guzman, D.; Passalacqua, I.; Alvarez-Fontecilla, E.; Guarini, M.
2016-01-01
Digital correlated double sampling (DCDS), a readout technique for charge-coupled devices (CCD), is gaining popularity in astronomical applications. By using an oversampling ADC and a digital filter, a DCDS system can achieve a better performance than traditional analogue readout techniques at the expense of a more complex system analysis. Several attempts to analyse and optimize a DCDS system have been reported, but most of the work presented in the literature has been experimental. Some approximate analytical tools have been presented for independent parameters of the system, but the overall performance and trade-offs have not been yet modelled. Furthermore, there is disagreement among experimental results that cannot be explained by the analytical tools available. In this work, a theoretical analysis of a generic DCDS readout system is presented, including key aspects such as the signal conditioning stage, the ADC resolution, the sampling frequency and the digital filter implementation. By using a time-domain noise model, the effect of the digital filter is properly modelled as a discrete-time process, thus avoiding the imprecision of continuous-time approximations that have been used so far. As a result, an accurate, closed-form expression for the signal-to-noise ratio at the output of the readout system is reached. This expression can be easily optimized in order to meet a set of specifications for a given CCD, thus providing a systematic design methodology for an optimal readout system. Simulated results are presented to validate the theory, obtained with both time- and frequency-domain noise generation models for completeness.
Nording, Malin; Denison, Michael S.; Baston, David; Persson, Ylva; Spinnel, Erik; Haglund, Peter
2010-01-01
The chemically activated luciferase expression assay, the chemically activated fluorescence expression assay, and the enzyme-linked immunosorbent assay (ELISA) are all bioanalytical methods that have been used for the detection and quantification of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs). However, no comparisons of the results obtained by these three methods have been published analyzing identical replicates of purified sample extracts. Therefore, we have evaluated the performance of each of these methods for analyzing PCDD/Fs in aliquots of extracts from aged-contaminated soil samples and compared the results with those obtained by gas chromatography/high-resolution mass spectrometry (GC/HRMS). The quantitative performance was assessed and the effects of sample purification and data interpretation on the quality of the bioassay results were investigated. Results from the bioanalytical techniques were, in principle, not significantly different from each other or from the GC/HRMS data (p = 0.05). Furthermore, properly used, all of the bioanalytical techniques examined were found to be sufficiently sensitive, selective, and accurate to be used in connection with soil remediation activities when aiming at the remediation goal recommended by the U.S. Environmental Protection Agency (i.e., < 1,000 pg toxic equivalency/g). However, a site-specific correction factor should be applied with the use of the ELISA to account for differences between the toxic equivalency factors and the ELISA cross-reactivities of the various PCDD/F congeners, which otherwise might significantly underestimate the PCDD/F content. PMID:17571676
Sampling considerations for modal analysis with damping
NASA Astrophysics Data System (ADS)
Park, Jae Young; Wakin, Michael B.; Gilbert, Anna C.
2015-03-01
Structural health monitoring (SHM) systems are critical for monitoring aging infrastructure (such as buildings or bridges) in a cost-effective manner. Wireless sensor networks that sample vibration data over time are particularly appealing for SHM applications due to their flexibility and low cost. However, in order to extend the battery life of wireless sensor nodes, it is essential to minimize the amount of vibration data these sensors must collect and transmit. In recent work, we have studied the performance of the Singular Value Decomposition (SVD) applied to the collection of data and provided new finite sample analysis characterizing conditions under which this simple technique (also known as the Proper Orthogonal Decomposition, POD) can correctly estimate the mode shapes of the structure. Specifically, we provided theoretical guarantees on the number and duration of samples required in order to estimate a structure's mode shapes to a desired level of accuracy. In that previous work, however, we considered simplified Multiple-Degree-Of-Freedom (MDOF) systems with no damping. In this paper we consider MDOF systems with proportional damping and show that, with sufficiently light damping, the POD can continue to provide accurate estimates of a structure's mode shapes. We support our discussion with new analytical insight and experimental demonstrations. In particular, we study the tradeoffs between the level of damping, the sampling rate and duration, and the accuracy to which the structure's mode shapes can be estimated.
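As a toy illustration of the POD/SVD estimate discussed above, the sketch below synthesizes displacement data for a two-sensor system as a sum of two undamped modes and recovers the mode shapes from the left singular vectors of the sensor-by-time data matrix. The frequencies, amplitudes and noise level are arbitrary choices for the example; the proportionally damped case analysed in the paper is not simulated here.

```python
import numpy as np

# Simulate displacement data from a 2-DOF system as a sum of two modes,
# then recover the mode shapes from the left singular vectors of the data matrix.
rng = np.random.default_rng(0)
t = np.arange(0, 20, 0.01)                        # sampling times [s]
phi = np.array([[1.0, 1.0],
                [1.0, -1.0]]) / np.sqrt(2)        # true (orthogonal) mode shapes, as columns
freqs = np.array([1.3, 3.7])                      # modal frequencies [Hz] (assumed)
amps = np.array([1.0, 0.4])
q = amps[:, None] * np.cos(2 * np.pi * freqs[:, None] * t)   # modal coordinates
X = phi @ q + 0.01 * rng.standard_normal((2, t.size))        # sensors x time, light noise

U, s, Vt = np.linalg.svd(X, full_matrices=False)
est_shapes = U                                    # POD modes ≈ mode shapes for the undamped case
print(np.abs(est_shapes.T @ phi))                 # ≈ identity, up to sign and ordering
```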
ERIC Educational Resources Information Center
Vandewalle, Raymond
1976-01-01
A new nationwide program called Sail '76 has been launched to give more people the opportunity to try the sport of sailing and to teach people the proper sailing techniques before they invest in a sailboat. (SK)
Intraosseous infusion in elective and emergency pediatric anesthesia: when should we use it?
Neuhaus, Diego
2014-06-01
Difficulties in establishing venous access may also occur in routine pediatric anesthesia and can lead to hazardous situations. Intraosseous infusion is a well-tolerated and reliable, but rarely used, alternative technique in this setting. According to recent surveys, severe complications of intraosseous infusion remain rare. Minor complications and problems in getting an intraosseous infusion started, on the other hand, seem to be more common than generally reported. The EZ-IO intraosseous infusion system has received expanded EU CE mark approval for an extended dwell time of up to 72 h and for insertion in pediatric patients in the distal femur. Key blood sample values for laboratory analysis can be obtained with only 2 ml of blood/marrow waste and are also reliable when measured with an i-STAT point-of-care analyzer. Most problems in using an intraosseous infusion are provider-dependent. In pediatric anesthesia, the perioperative setting should further help to reduce these problems. Nevertheless, regular training, thorough anatomical knowledge and prompt availability, especially in the pediatric age group, are paramount for a seldom-used technique to work properly under pressure. More longitudinal data on large cohorts would be preferable to further support the safety of the intraosseous infusion technique in pediatric patients.
Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O
2014-01-01
Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which raises efficiency costs for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) together with proper image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart of the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from the NIR images of pork. Samples were grouped into calibration and validation sets, and wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with validation correlation coefficients of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results reveal the great potential of the Gabor filter for analyzing NIR images of pork for effective and efficient objective evaluation of pork marbling.
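To make the texture-extraction step concrete, a minimal Gabor filtering sketch is given below: a real Gabor kernel (Gaussian envelope times an oriented cosine) is convolved with a single spectral band and the mean absolute response per frequency/orientation pair is kept as a feature. The kernel parameters, band image and feature summary are illustrative assumptions and do not reproduce the filter bank, wide line detector or stepwise wavelength selection used in the study.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(frequency, theta, sigma=3.0, size=15):
    """Real part of a Gabor kernel: Gaussian envelope times an oriented cosine carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * frequency * xr)

def gabor_texture_features(band_image, frequencies=(0.1, 0.2), n_theta=4):
    """Mean absolute filter response per (frequency, orientation): a simple
    texture descriptor for one spectral band of a hyperspectral cube."""
    feats = []
    for f in frequencies:
        for k in range(n_theta):
            kern = gabor_kernel(f, theta=np.pi * k / n_theta)
            resp = fftconvolve(band_image, kern, mode='same')
            feats.append(np.mean(np.abs(resp)))
    return np.array(feats)

# usage on a synthetic 64x64 "band" image standing in for one NIR wavelength
img = np.random.rand(64, 64)
print(gabor_texture_features(img).shape)   # (8,) texture features
```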
de Souza, Marcela; Matsuzawa, Tetsuhiro; Sakai, Kanae; Muraosa, Yasunori; Lyra, Luzia; Busso-Lopes, Ariane Fidelis; Levin, Anna Sara Shafferman; Schreiber, Angélica Zaninelli; Mikami, Yuzuru; Gonoi, Tohoru; Kamei, Katsuhiko; Moretti, Maria Luiza; Trabasso, Plínio
2017-08-01
The performance of three molecular biology techniques, i.e., DNA microarray, loop-mediated isothermal amplification (LAMP), and real-time PCR, was compared with DNA sequencing for the proper identification of 20 isolates of Fusarium spp. obtained from the bloodstream as the etiologic agent of invasive infections in patients with hematologic malignancies. DNA microarray, LAMP and real-time PCR identified 16 (80%) of the 20 samples as Fusarium solani species complex (FSSC) and four (20%) as Fusarium spp. The agreement among the techniques was 100%. LAMP exhibited 100% specificity, while DNA microarray, LAMP and real-time PCR showed 100% sensitivity, and the three techniques had 100% agreement with DNA sequencing. Sixteen isolates were identified as FSSC by sequencing, namely five Fusarium keratoplasticum, nine Fusarium petroliphilum and two Fusarium solani. On the other hand, sequencing identified four isolates as Fusarium non-solani species complex (FNSSC): three isolates as Fusarium napiforme and one isolate as Fusarium oxysporum. Finally, LAMP proved to be faster and more accessible than DNA microarray and real-time PCR, since it does not require a thermocycler. LAMP therefore emerges as a promising methodology for the routine identification of Fusarium spp. in cases of invasive fungal infection.
Kumera, Gemechu; Tsedal, Endalkachew; Ayana, Mulatu
2018-01-01
Proper feeding practices during early childhood are fundamental for optimal child growth and development. However, scientific evidence on the determinants of dietary diversity is scarce; in particular, the impact of fasting on children's dietary diversity has not been explored in Ethiopia. The aim of this study was to assess dietary diversity and associated factors among children aged 6-23 months whose mothers/caregivers were Orthodox Christians during the fasting season (Lent) in Dejen District, North West Ethiopia, 2016. A community-based cross-sectional study was conducted during the fasting season from March to April 2016. The study population comprised children aged 6-23 months whose mothers/caregivers were Orthodox Christians. A systematic random sampling technique was used to select a sample of 967 children proportionally from all selected kebeles. Data were entered using Epi Data and statistical analyses were done using logistic regression; a p-value < 0.05 at a 95% confidence interval was taken as statistically significant. Only 13.6% of the children surveyed met the minimum requirement for dietary diversity. Unsatisfactory exposure to media [AOR = 5.22] and low household monthly income [AOR = 2.20] were negatively associated with dietary diversity. Compared with economic reasons, mothers/caregivers who did not feed their children food of animal origin for fear of contaminating the utensils used for family food preparation were 1.5 times [AOR = 1.5; 95% CI (1.05-2.53)] less likely to feed the recommended dietary diversity. The findings of this study revealed that the diet of children in the study area lacked diversity. Promoting mass media and the socioeconomic empowerment of women would contribute positively to optimal child feeding practice. Sustained nutrition education for mothers regarding proper infant and young child feeding practices, in collaboration with the respective religious leaders, is highly recommended.
Earth Observation System Flight Dynamics System Covariance Realism
NASA Technical Reports Server (NTRS)
Zaidi, Waqar H.; Tracewell, David
2016-01-01
This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: the collection and calculation of definitive state estimates through orbit determination, the calculation of covariance realism test statistics at each covariance propagation point, and the proper assessment of those test statistics.
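One common way to form a covariance realism test statistic at a propagation point is to compute the squared Mahalanobis distance of the propagated state error relative to the definitive (orbit-determined) state and check it against a chi-square law. The sketch below shows that generic construction with NumPy/SciPy; the specific statistics and assessment thresholds used for the EOS Aqua and Aura analysis are not given in the abstract, so the Kolmogorov-Smirnov check here is an assumed stand-in.

```python
import numpy as np
from scipy import stats

def covariance_realism_stat(est_states, true_states, covariances):
    """Squared Mahalanobis distances of the propagated-state errors.
    If the covariances are realistic, these follow a chi-square law with
    dim degrees of freedom (dim = state dimension)."""
    est_states = np.asarray(est_states, dtype=float)
    true_states = np.asarray(true_states, dtype=float)
    m2 = np.array([e @ np.linalg.solve(P, e)
                   for e, P in zip(est_states - true_states, covariances)])
    dim = est_states.shape[1]
    # assumed goodness-of-fit check of the empirical distances against chi2(dim)
    return m2, stats.kstest(m2, 'chi2', args=(dim,))

# usage with synthetic 3-D errors drawn consistently with their covariance
rng = np.random.default_rng(0)
P = np.diag([1.0, 4.0, 9.0])
errs = rng.multivariate_normal(np.zeros(3), P, size=200)
m2, ks = covariance_realism_stat(errs, np.zeros((200, 3)), [P] * 200)
print(ks)
```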
Guolei Li; Yan Zhu; Yong Liu; Jiaxi Wang; Jiajia Liu; R. Kasten Dumroese
2014-01-01
Maintaining proper seedling nitrogen status is important for outplanting success. Fall fertilization of evergreen conifer seedlings is a well-known technique for averting nitrogen (N) dilution caused by continued seedling growth during hardening. For deciduous seedlings, this technique is much less understood, and regardless of foliage type, the interaction of N status...
78 FR 11171 - Proposed Information Collection Request; Comment Request; RadNet (Renewal)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
...) Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the Agency, including whether the information will have practical utility; (ii) evaluate the... request descriptive information pertaining to sample location, e.g., sample type, sample location, length...
Big assumptions for small samples in crop insurance
Ashley Elaine Hungerford; Barry Goodwin
2014-01-01
The purpose of this paper is to investigate the effects of crop insurance premiums being determined by small samples of yields that are spatially correlated. If spatial autocorrelation and small sample size are not properly accounted for in premium ratings, the premium rates may inaccurately reflect the risk of a loss.
KODAMA, Nao; KOSE, Katsumi
2016-01-01
Echo-planar imaging (EPI) sequences were developed for a 9.4 Tesla vertical standard bore (∼54 mm) superconducting magnet using an unshielded gradient coil optimized for live mice imaging and a data correction technique with reference scans. Because EPI requires fast switching of intense magnetic field gradients, eddy currents were induced in the surrounding metallic materials, e.g., the room temperature bore, and this produced serious artifacts on the EPI images. We solved the problem using an unshielded gradient coil set of proper size (outer diameter = 39 mm, inner diameter = 32 mm) with time control of the current rise and reference scans. The obtained EPI images of a phantom and a plant sample were almost artifact-free and demonstrated the promise of our approach. PMID:27001398
Basic biostatistics for post-graduate students
Dakhale, Ganesh N.; Hiware, Sachin K.; Shinde, Abhijit T.; Mahatme, Mohini S.
2012-01-01
Statistical methods are important for drawing valid conclusions from the data obtained. This article provides background information on fundamental methods and techniques in biostatistics for the use of postgraduate students. The main focus is on types of data, measures of central tendency and variation, and basic tests that are useful for the analysis of different types of observations. A few topics, such as the normal distribution, calculation of sample size, level of significance, the null hypothesis, indices of variability, and the different tests, are explained in detail with suitable examples. Using these guidelines, we are confident that postgraduate students will be able to characterize the distribution of their data and apply the proper test. Information is also given on various free software programs and websites useful for statistical calculations. Thus, postgraduate students will benefit whether they opt for academia or industry. PMID:23087501
Optimum bus headway for preemption : a simulation approach
DOT National Transportation Integrated Search
1997-01-01
Preemption techniques are designed to provide preferential treatment for buses at signalized intersections. A preemption strategy, if properly designed, can provide continuous green phases for buses at successive intersections, thereby reducing trave...
Video markers tracking methods for bike fitting
NASA Astrophysics Data System (ADS)
Rajkiewicz, Piotr; Łepkowska, Katarzyna; Cygan, Szymon
2015-09-01
Sports cycling has become increasingly popular in recent years. Obtaining and maintaining a proper position on the bike has been shown to be crucial for performance, comfort and injury avoidance. Various bike fitting techniques are available, from rough settings based on body dimensions to professional services making use of sophisticated equipment and expert knowledge. Modern fitting techniques use mainly joint angles as the criterion of a proper position. In this work we examine the performance of two proposed methods for dynamic assessment of cyclist position based on video data recorded during stationary cycling. The proposed methods are intended for home use, to help amateur cyclists improve their position on the bike, and therefore no professional equipment is used. As a result of the data processing, ranges of angles in selected joints are provided. Finally, the strengths and weaknesses of both proposed methods are discussed.
NASA Astrophysics Data System (ADS)
Lee, Haenghwa; Choi, Sunghoon; Jo, Byungdu; Kim, Hyemi; Lee, Donghoon; Kim, Dohyeon; Choi, Seungyeon; Lee, Youngjin; Kim, Hee-Joung
2017-03-01
Chest digital tomosynthesis (CDT) is a new 3D imaging technique that can be expected to improve the detection of subtle lung disease over conventional chest radiography. Algorithm development for a CDT system is challenging in that a limited number of low-dose projections are acquired over a limited angular range. To confirm the feasibility of the algebraic reconstruction technique (ART) under variations in key imaging parameters, quality metrics were evaluated using a LUNGMAN phantom containing a ground-glass opacity (GGO) tumor. Reconstructed images were acquired from a total of 41 projection images over an angular range of +/-20°. We evaluated the contrast-to-noise ratio (CNR) and the artifact spread function (ASF) to investigate the effect of reconstruction parameters such as the number of iterations, the relaxation parameter and the initial guess on image quality. We found that a proper value of the ART relaxation parameter could improve image quality from the same projections. In this study, the proper values of the relaxation parameter for the zero-image (ZI) and back-projection (BP) initial guesses were 0.4 and 0.6, respectively. Also, the maximum CNR values and the minimum full width at half maximum (FWHM) of the ASF were obtained in the reconstructed images after 20 iterations and 3 iterations, respectively. According to these results, the BP initial guess for the ART method can provide better image quality than the ZI initial guess. In conclusion, the ART method with proper reconstruction parameters can improve image quality despite the limited angular range of the CDT system.
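For readers unfamiliar with ART, the reconstruction update is the row-action Kaczmarz step scaled by the relaxation parameter discussed above. The sketch below is a generic dense-matrix illustration on a toy linear system, with the relaxation value 0.4 borrowed from the zero-image case reported in the abstract; the actual CDT system matrix, projection geometry and initial guesses are not reproduced here.

```python
import numpy as np

def art_reconstruct(A, b, x0=None, relaxation=0.4, n_iter=20):
    """Row-action ART (Kaczmarz) updates:
    x <- x + lambda * (b_i - a_i . x) / ||a_i||^2 * a_i, sweeping over all rows."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()   # x0 plays the role of the initial guess
    row_norm2 = (A * A).sum(axis=1)
    for _ in range(n_iter):
        for i in range(m):
            if row_norm2[i] == 0:
                continue
            residual = b[i] - A[i] @ x
            x += relaxation * residual / row_norm2[i] * A[i]
    return x

# toy system standing in for (projection matrix, measured projections)
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
print(art_reconstruct(A, A @ x_true, relaxation=0.4, n_iter=50))  # ≈ [1, 2, 3]
```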
SPR based immunosensor for detection of Legionella pneumophila in water samples
NASA Astrophysics Data System (ADS)
Enrico, De Lorenzis; Manera, Maria G.; Montagna, Giovanni; Cimaglia, Fabio; Chiesa, Maurizio; Poltronieri, Palmiro; Santino, Angelo; Rella, Roberto
2013-05-01
Detection of legionellae by water sampling is an important factor in epidemiological investigations of Legionnaires' disease and its prevention. To avoid the labor-intensive problems of conventional methods, an alternative, highly sensitive and simple method is proposed for detecting L. pneumophila in aqueous samples. A compact Surface Plasmon Resonance (SPR) instrumentation prototype, provided with proper microfluidic tools, was built. The developed immunosensor is capable of dynamically following the binding between antigens and the corresponding antibody molecules immobilized on the SPR sensor surface. An immobilization strategy is used in this work that includes an efficient step aimed at orienting the antibodies on the sensor surface. The feasibility of integrating SPR-based biosensing setups with microfluidic technologies, resulting in a low-cost and portable biosensor, is demonstrated.
Probing the Galactic Potential with Next-generation Observations of Disk Stars
NASA Astrophysics Data System (ADS)
Sumi, T.; Johnston, K. V.; Tremaine, S.; Spergel, D. N.; Majewski, S. R.
2009-07-01
Our current knowledge of the rotation curve of the Milky Way is remarkably poor compared to other galaxies, limited by the combined effects of extinction and the lack of large samples of stars with good distance estimates and proper motions. Near-future surveys promise a dramatic improvement in the number and precision of astrometric, photometric, and spectroscopic measurements of stars in the Milky Way's disk. We examine the impact of such surveys on our understanding of the Galaxy by "observing" particle realizations of nonaxisymmetric disk distributions orbiting in an axisymmetric halo with appropriate errors and then attempting to recover the underlying potential using a Markov Chain Monte Carlo approach. We demonstrate that the azimuthally averaged gravitational force field in the Galactic plane—and hence, to a lesser extent, the Galactic mass distribution—can be tightly constrained over a large range of radii using a variety of types of surveys so long as the error distribution of the measurements of the parallax, proper motion, and radial velocity are well understood and the disk is surveyed globally. One advantage of our method is that the target stars can be selected nonrandomly in real or apparent-magnitude space to ensure just such a global sample without biasing the results. Assuming that we can always measure the line-of-sight velocity of a star with at least 1 km s⁻¹ precision, we demonstrate that the force field can be determined to better than ~1% for Galactocentric radii in the range R = 4-20 kpc using either: (1) small samples (a few hundred stars) with very accurate trigonometric parallaxes and good proper-motion measurements (uncertainties δp,tri ≲ 10 μas and δμ ≲ 100 μas yr⁻¹, respectively); (2) modest samples (~1000 stars) with good indirect parallax estimates (e.g., uncertainty in photometric parallax δp,phot ~ 10%-20%) and good proper-motion measurements (δμ ~ 100 μas yr⁻¹); or (3) large samples (~10⁴ stars) with good indirect parallax estimates and lower accuracy proper-motion measurements (δμ ~ 1 mas yr⁻¹). We conclude that near-future surveys, like Space Interferometry Mission Lite, Global Astrometric Interferometer for Astrophysics, and VERA, will provide the first precise mapping of the gravitational force field in the region of the Galactic disk.
ERIC Educational Resources Information Center
Van der Kooy-Hofland, Verna A. C.; Bus, Adriana G.; Roskos, Kathleen
2012-01-01
Living Letters is an adaptive game designed to promote children's combining of how the proper name sounds with their knowledge of how the name looks. A randomized controlled trial (RCT) was used to experimentally test whether priming for attending to the sound-symbol relationship in the proper name can reduce the risk for developing reading…
Identification of Particles in Parenteral Drug Raw Materials.
Lee, Kathryn; Lankers, Markus; Valet, Oliver
2018-04-18
Particles in drug products are undesirable and are therefore regulated. These particles can come from the very beginning of the manufacturing process, from the raw materials. To prevent particles, it is important to understand what they are and where they come from, so that raw material quality, processing, and shipping can be improved; thus, it is important to correctly identify particles seen in raw materials. Raw materials need to be of a certain quality with respect to physical and chemical composition, and need to be free of particulate contaminants that could contaminate the product or indicate that the raw materials are not pure enough to make a good quality product. Particles are often noticed when handling raw materials because their color, size, or shape differs from that of the raw materials themselves. Particles may appear to the eye to be very different things than they actually are, so microscopic, chemical, and elemental analyses are required for proper identification. This paper shows how three different spectroscopic tools, used correctly and together, can identify extrinsic, intrinsic, and inherent particles. Sources of such material can be humans and the environment (extrinsic), the process itself (intrinsic), and the formulation (inherent). Microscope versions of Raman spectroscopy, laser-induced breakdown spectroscopy (LIBS), and IR spectroscopy are excellent tools for identifying particles because they are fast and accurate techniques needing minimal sample preparation that can provide chemical composition as well as images for identification. Their microanalysis capabilities allow easy analysis of different portions of a sample, so multiple components can be identified and sample preparation can be reduced. Using just one of these techniques may not be sufficient to identify the source of contamination adequately; the complementarity of the techniques provides the advantage of identifying various chemical and molecular components, together with elemental and image analyses. Correct interpretation of the results from these techniques is also very important. Copyright © 2018, Parenteral Drug Association.
NASA Astrophysics Data System (ADS)
Kamal, Khaled Y.; Hemmersbach, Ruth; Medina, F. Javier; Herranz, Raúl
2015-04-01
Understanding the physical and biological effects of the absence of gravity is necessary for conducting operations in space environments. It has been previously shown that the microgravity environment induces the dissociation of cell proliferation from cell growth in young seedling root meristems, but this source material is limited to a few cells in each row of meristematic layers. Plant cell cultures, composed of a large and homogeneous population of proliferating cells, are an ideal model to study the effects of altered gravity on the cellular mechanisms regulating cell proliferation and associated cell growth. Cell suspension cultures of an Arabidopsis thaliana cell line (MM2d) were exposed to 2D clinorotation in a pipette clinostat for 3.5 or 14 h and were then processed either by quick freezing, for flow cytometry, or by chemical fixation, for microscopy techniques. After long-term clinorotation, the proportion of cells in G1 phase was increased and the nucleolar area, as revealed by immunofluorescence staining with anti-nucleolin, was decreased. Despite the compatibility of these results with those obtained in real microgravity on seedling meristems, we provide a technical discussion in the context of clinorotation and proper 1 g controls for suspension cultures. The standard 1 g procedure for sustaining a cell suspension is continuous shaking. Thus, we compare the mechanical forces acting on cells in clinorotated samples, in a static control sample and in the standard 1 g conditions of suspension cultures in order to define the conditions of a complete and reliable experiment in simulated microgravity with corresponding 1 g controls.
Luminescence properties of pure and doped CaSO4 nanorods irradiated by 15 MeV e-beam
NASA Astrophysics Data System (ADS)
Salah, Numan; Alharbi, Najlaa D.; Enani, Mohammad A.
2014-01-01
Calcium sulfate (CaSO4) doped with proper activators is a highly sensitive phosphor used in different fields, mainly for radiation dosimetry, lighting and display applications. In this work, pure and doped nanorods of CaSO4 were produced by the co-precipitation technique. Samples of this material doped with Ag, Cu, Dy, Eu and Tb were exposed to different doses of a 15 MeV e-beam and studied for their thermoluminescence (TL) and photoluminescence (PL) properties. Color center formation leading to PL emission was investigated before and after e-beam irradiation. The samples doped with rare-earth elements (i.e. Dy, Eu and Tb) were observed to have thinner nanorods than the other samples and to have higher absorption in the UV region. The Ag- and Tb-doped samples have a poor TL response to the e-beam, while those activated by Cu, Dy and Eu have strong glow peaks at around 123 °C. Quite linear response curves over the whole studied exposure range, i.e. 0.1-100 Gy, were also observed in the Cu- and Dy-doped samples. The PL results show that pure CaSO4 nanorods have active color centers without irradiation, which could be enriched/modified by these impurities, mainly the rare earths, and further enhanced by e-beam irradiation. Eu3+ → Eu2+ conversion is clearly observed in the Eu-doped sample after e-beam irradiation. These results show that these nanorods might be useful in the development of lighting and display devices.
The Use and Abuse of Limits of Detection in Environmental Analytical Chemistry
Brown, Richard J. C.
2008-01-01
The limit of detection (LoD) serves as an important method performance measure that is useful for the comparison of measurement techniques and the assessment of likely signal to noise performance, especially in environmental analytical chemistry. However, the LoD is only truly related to the precision characteristics of the analytical instrument employed for the analysis and the content of analyte in the blank sample. This article discusses how other criteria, such as sampling volume, can serve to distort the quoted LoD artificially and make comparison between various analytical methods inequitable. In order to compare LoDs between methods properly, it is necessary to state clearly all of the input parameters relating to the measurements that have been used in the calculation of the LoD. Additionally, the article discusses that the use of LoDs in contexts other than the comparison of the attributes of analytical methods, in particular when reporting analytical results, may be confusing, less informative than quoting the actual result with an accompanying statement of uncertainty, and may act to bias descriptive statistics. PMID:18690384
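Because the article stresses that a quoted LoD depends on the input parameters used to calculate it, a small worked example helps: the common recipe of three standard deviations of the blank divided by the calibration slope is sketched below with made-up blank readings and slope. Any subsequent scaling by an assumed sampling volume changes the headline number without changing the instrument's intrinsic precision, which is exactly the comparability problem discussed above.

```python
import math
import numpy as np

def limit_of_detection(blank_signals, slope, k=3.0):
    """Concentration-domain LoD from replicate blank measurements and the
    calibration slope: LoD = k * s_blank / slope (k = 3 is a common choice)."""
    s_blank = np.std(blank_signals, ddof=1)
    return k * s_blank / slope

blanks = [0.012, 0.015, 0.011, 0.014, 0.013, 0.012, 0.016]   # assumed blank responses
slope = 0.85                                                  # assumed response per ng mL-1
lod_conc = limit_of_detection(blanks, slope)                  # LoD in ng mL-1
print(lod_conc)
# Dividing by an assumed sampling volume would rescale the quoted LoD per sampled
# amount; that input must be stated for any fair comparison between methods.
```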
Low Mass Members in Nearby Young Moving Groups Revealed
NASA Astrophysics Data System (ADS)
Schlieder, Joshua; Simon, Michal; Rice, Emily; Lepine, Sebastien
2010-08-01
We are now ready to expand our program that identifies highly probable low-mass members of the nearby young moving groups (NYMGs) to stars of mass ~ 0.1 Msun. This is important: 1) to provide high priority targets for exoplanet searches by direct imaging, 2) to complete the census of the membership in the NYMGs, and 3) to provide a well-characterized sample of nearby young stars for detailed study of their physical properties and multiplicity (the median distances of the β Pic and AB Dor groups are ~ 35 pc with ages ~ 12 and 50 Myr respectively). Our proven technique starts with a proper motion selection algorithm, proceeds to vet the sample for indicators of youth, and requires as its last step the measurement of candidate member radial velocities (RVs). So far, we have obtained all RV measurements with the high resolution IR spectrometer at the NASA-IRTF and have reached the limits of its applicability. To identify probable new members in the south, and also those of the lowest mass, we need the sensitivity of PHOENIX at Gemini-S and NIRSPEC at Keck-II.
Heperkan, Dilek; Gökmen, Ece
2016-07-01
The aim of this study was to investigate the potential use of FTIR spectroscopy as a rapid screening method to detect fumonisin produced by Aspergillus niger. A. niger spore suspensions isolated from raisins were inoculated in Petri dishes prepared with sultana raisin or black raisin extract agar and malt extract agar (MEA). After 9 days of incubation at 25°C, fumonisin B2 (FB2) production on each agar plate was determined by subjecting agar plugs to IR spectroscopy. The presence of the amino group (at 1636-1639 cm⁻¹) was especially indicative of fumonisin production in MEA and the raisin extract agars. The results were confirmed by HPLC analysis of the agar sample extracts after immunoaffinity column cleanup. It was determined that A. niger produced more FB2 in sultana raisins than in MEA, with no FB2 being produced in black raisin extract agar. This study demonstrated that a proper sample preparation procedure followed by FTIR analysis is a useful technique for identifying toxigenic molds and their mycotoxin production in agricultural commodities.
Barrena, Raquel; Font, Xavier; Gabarrell, Xavier; Sánchez, Antoni
2014-07-01
Stability is one of the most important properties of compost obtained from the organic fraction of municipal solid wastes. This property is essential for the application of compost to land to avoid further field degradation and emissions of odors, among others. In this study, a massive characterization of compost samples from both home producers and industrial facilities is presented. Results are analyzed in terms of chemical and respiration characterizations, the latter representing the stability of the compost. Results are also analyzed in terms of statistical validation. The main conclusion from this work is that home composting, when properly conducted, can achieve excellent levels of stability, whereas industrial compost produced in the studied facilities can also present a high stability, although an important dispersion is found in these composts. The study also highlights the importance of respiration techniques to have a reliable characterization of compost quality, while the chemical characterization does not provide enough information to have a complete picture of a compost sample. Copyright © 2014 Elsevier Ltd. All rights reserved.
Hamiltonian Monte Carlo acceleration using surrogate functions with random bases.
Zhang, Cheng; Shahbaba, Babak; Zhao, Hongkai
2017-11-01
For big data analysis, the high computational cost of Bayesian methods often limits their application in practice. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo method, namely Hamiltonian Monte Carlo. The key idea is to explore and exploit the structure and regularity in parameter space for the underlying probabilistic model to construct an effective approximation of its geometric properties. To this end, we build a surrogate function to approximate the target distribution using properly chosen random bases and an efficient optimization process. The resulting method provides a flexible, scalable, and efficient sampling algorithm, which converges to the correct target distribution. We show that by choosing the basis functions and optimization process differently, our method can be related to other approaches for the construction of surrogate functions such as generalized additive models or Gaussian process models. Experiments based on simulated and real data show that our approach leads to substantially more efficient sampling algorithms compared to existing state-of-the-art methods.
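For context, the baseline sampler being accelerated is standard Hamiltonian Monte Carlo: a leapfrog trajectory driven by the gradient of the log-target, followed by a Metropolis correction. The sketch below implements only that baseline; in the surrogate-accelerated variant described above, the expensive gradient would be replaced by the gradient of the random-basis surrogate during the trajectory while the correction step keeps the exact target. Function names and tuning values are illustrative assumptions.

```python
import numpy as np

def hmc_step(x, logp, grad_logp, step=0.1, n_leapfrog=20, rng=None):
    """One Hamiltonian Monte Carlo transition with a leapfrog integrator.
    In a surrogate-accelerated variant, grad_logp would be the gradient of a
    cheap surrogate during the trajectory."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(x.shape)
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step * grad_logp(x_new)            # initial half momentum step
    for _ in range(n_leapfrog - 1):
        x_new += step * p_new
        p_new += step * grad_logp(x_new)
    x_new += step * p_new
    p_new += 0.5 * step * grad_logp(x_new)            # final half momentum step
    # Metropolis correction keeps the exact target distribution
    log_accept = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
    return x_new if np.log(rng.random()) < log_accept else x

# usage: sample a 2-D standard normal target
logp = lambda x: -0.5 * x @ x
grad = lambda x: -x
rng = np.random.default_rng(0)
x, samples = np.zeros(2), []
for _ in range(1000):
    x = hmc_step(x, logp, grad, rng=rng)
    samples.append(x.copy())
```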
Insausti, Matías; Gomes, Adriano A; Cruz, Fernanda V; Pistonesi, Marcelo F; Araujo, Mario C U; Galvão, Roberto K H; Pereira, Claudete F; Band, Beatriz S F
2012-08-15
This paper investigates the use of UV-vis, near infrared (NIR) and synchronous fluorescence (SF) spectrometries coupled with multivariate classification methods to discriminate biodiesel samples with respect to the base oil employed in their production. More specifically, the present work extends previous studies by investigating the discrimination of corn-based biodiesel from two other biodiesel types (sunflower and soybean). Two classification methods are compared, namely full-spectrum SIMCA (soft independent modelling of class analogies) and SPA-LDA (linear discriminant analysis with variables selected by the successive projections algorithm). Regardless of the spectrometric technique employed, full-spectrum SIMCA did not provide an appropriate discrimination of the three biodiesel types. In contrast, all samples were correctly classified on the basis of a reduced number of wavelengths selected by SPA-LDA. It can be concluded that UV-vis, NIR and SF spectrometries can be successfully employed to discriminate corn-based biodiesel from the two other biodiesel types, but wavelength selection by SPA-LDA is key to the proper separation of the classes. Copyright © 2012 Elsevier B.V. All rights reserved.
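The sketch below is a minimal, hypothetical illustration of the wavelength-selection-plus-LDA idea on synthetic spectra; the selection routine is a simplified successive-projections-style criterion, not the full SPA-LDA of the paper, and the band positions, class labels, and noise levels are made up.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
wavelengths = np.linspace(400, 800, 200)            # hypothetical UV-vis grid (nm)

def class_spectra(band_center, n):
    """Synthetic spectra for one class: a Gaussian band plus measurement noise."""
    band = np.exp(-0.5 * ((wavelengths - band_center) / 30.0) ** 2)
    return band + 0.05 * rng.normal(size=(n, wavelengths.size))

X = np.vstack([class_spectra(500, 40), class_spectra(550, 40), class_spectra(620, 40)])
y = np.repeat([0, 1, 2], 40)                        # corn / sunflower / soybean stand-ins

def select_variables(X, k):
    """Simplified successive-projections-style selection: repeatedly project out
    the last chosen column and pick the column with the largest remaining norm,
    so the selected wavelengths carry little mutual collinearity."""
    R = X - X.mean(axis=0)
    selected = [int(np.argmax(np.linalg.norm(R, axis=0)))]
    for _ in range(k - 1):
        v = R[:, selected[-1]].copy()
        R = R - np.outer(v, v @ R) / (v @ v)        # orthogonal-complement projection
        norms = np.linalg.norm(R, axis=0)
        norms[selected] = -1.0                      # never re-pick a chosen column
        selected.append(int(np.argmax(norms)))
    return selected

chosen = select_variables(X, k=4)
print("selected wavelengths (nm):", wavelengths[chosen].round(1))

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X[:, chosen], y, cv=5).mean()
print("cross-validated accuracy on the selected variables:", round(accuracy, 3))
```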
ERIC Educational Resources Information Center
Shieh, Gwowen
2013-01-01
The a priori determination of a proper sample size necessary to achieve some specified power is an important problem encountered frequently in practical studies. To establish the needed sample size for a two-sample "t" test, researchers may conduct the power analysis by specifying scientifically important values as the underlying population means…
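A minimal example of the a priori sample-size computation described above, using statsmodels' standard two-sample t-test power routines; the planning values (population means 10 and 12, common SD 4, hence Cohen's d = 0.5) are hypothetical.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning values: population means 10 and 12 with a common SD of 4,
# giving a standardized effect size (Cohen's d) of (12 - 10) / 4 = 0.5.
effect_size = (12 - 10) / 4

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=effect_size, alpha=0.05,
                                   power=0.80, ratio=1.0, alternative='two-sided')
print(f"required sample size per group: {n_per_group:.1f}")      # about 64 per group

# Power actually achieved at the rounded-up sample size.
achieved = analysis.power(effect_size=effect_size, nobs1=64, alpha=0.05,
                          ratio=1.0, alternative='two-sided')
print(f"power at n = 64 per group: {achieved:.3f}")
```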
Postprostatectomy Erectile Dysfunction: A Review
Salonia, Andrea; Briganti, Alberto; Montorsi, Francesco
2016-01-01
In the current era of the early diagnosis of prostate cancer (PCa) and the development of minimally invasive surgical techniques, erectile dysfunction (ED) represents an important issue, with up to 68% of patients who undergo radical prostatectomy (RP) complaining of postoperative erectile function (EF) impairment. In this context, it is crucial to comprehensively consider all factors possibly associated with the prevention of post-RP ED throughout the entire clinical management of PCa patients. A careful assessment of both oncological and functional baseline characteristics should be carried out for each patient preoperatively. Baseline EF, together with age and the overall burden of comorbidities, has been strongly associated with the chance of post-RP EF recovery. With this goal in mind, internationally validated psychometric instruments are preferable for ensuring proper baseline EF evaluations, and questionnaires should be administered at the proper time before surgery. Careful preoperative counselling is also required, both to respect the patient's wishes and to avoid false expectations regarding eventual recovery of baseline EF. The advent of robotic surgery has led to improvements in the knowledge of prostate surgical anatomy, as reflected by the formal redefinition of nerve-sparing techniques. Overall, comparative studies have shown significantly better EF outcomes for robotic RP than for open techniques, although data from prospective trials have not always been consistent. Preclinical data and several prospective randomized trials have demonstrated the value of treating patients with oral phosphodiesterase 5 inhibitors (PDE5is) after surgery, with the concomitant potential benefit of early re-oxygenation of the erectile tissue, which appears to be crucial for avoiding the eventual penile structural changes that are associated with postoperative neuropraxia and ultimately result in severe ED. For patients who do not properly respond to PDE5is, proper counselling regarding intracavernous treatment should be considered, along with the further possibility of surgical treatment for ED involving the implantation of a penile prosthesis. PMID:27574591
Enhanced Ligand Sampling for Relative Protein–Ligand Binding Free Energy Calculations
2016-01-01
Free energy calculations are used to study how strongly potential drug molecules interact with their target receptors. The accuracy of these calculations depends on the accuracy of the molecular dynamics (MD) force field as well as proper sampling of the major conformations of each molecule. However, proper sampling of ligand conformations can be difficult when there are large barriers separating the major ligand conformations. An example of this is for ligands with an asymmetrically substituted phenyl ring, where the presence of protein loops hinders the proper sampling of the different ring conformations. These ring conformations become more difficult to sample when the size of the functional groups attached to the ring increases. The Adaptive Integration Method (AIM) has been developed, which adaptively changes the alchemical coupling parameter λ during the MD simulation so that conformations sampled at one λ can aid sampling at the other λ values. The Accelerated Adaptive Integration Method (AcclAIM) builds on AIM by lowering potential barriers for specific degrees of freedom at intermediate λ values. However, these methods may not work when there are very large barriers separating the major ligand conformations. In this work, we describe a modification to AIM that improves sampling of the different ring conformations, even when there is a very large barrier between them. This method combines AIM with conformational Monte Carlo sampling, giving improved convergence of ring populations and the resulting free energy. This method, called AIM/MC, is applied to study the relative binding free energy for a pair of ligands that bind to thrombin and a different pair of ligands that bind to aspartyl protease β-APP cleaving enzyme 1 (BACE1). These protein–ligand binding free energy calculations illustrate the improvements in conformational sampling and the convergence of the free energy compared to both AIM and AcclAIM. PMID:25906170
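The toy sketch below is not AIM/MC itself; it only illustrates, on a one-dimensional double well standing in for two ring conformations, why local moves alone fail to cross a large barrier and how adding occasional symmetry-flip Monte Carlo moves restores sampling of both conformers. The barrier height, step size, and flip probability are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)
beta = 1.0

def U(x):
    """Toy double well standing in for two ring conformations separated by a
    large barrier (about 25 kT high at x = 0 for the parameters below)."""
    return 25.0 * (x ** 2 - 1.0) ** 2

def sample(n_steps, flip_prob):
    """Metropolis sampling with small local moves plus occasional x -> -x
    'ring flip' proposals; flip_prob = 0 recovers purely local sampling."""
    x, traj = 1.0, np.empty(n_steps)
    for i in range(n_steps):
        x_new = -x if rng.uniform() < flip_prob else x + 0.1 * rng.normal()
        if np.log(rng.uniform()) < -beta * (U(x_new) - U(x)):
            x = x_new
        traj[i] = x
    return traj

for p in (0.0, 0.05):
    frac_left = np.mean(sample(200_000, p) < 0.0)
    print(f"flip_prob={p}: fraction of time in the left well = {frac_left:.3f} (exact: 0.5)")
```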
Evaluation of heavy metals content in dietary supplements in Lebanon
2013-01-01
Background The consumption of dietary supplements is widespread and on the rise. These dietary supplements are generally used without prescriptions, proper counseling or any awareness of their health risks. The current study aimed at analyzing the metals in 33 samples of imported dietary supplements highly consumed by the Lebanese population, using 3 different techniques, to ensure their safety and to increase citizens' awareness so that they can benefit from these dietary supplements. Results Some samples had levels of metals above their maximum allowable levels (Fe: 24%, Zn: 33%, Mn: 27%, Se: 15%, Mo: 12% of samples), but did not pose any health risk because they were below the permitted daily exposure limit and recommended daily allowance, except for Fe in 6% of the samples. On the other hand, 34% of the samples had Cu levels above the allowable limit, of which 18% were above the permitted daily exposure and recommended daily allowance. In contrast, all samples had concentrations of Cr, Hg, and Pb below allowable limits and daily exposure. Meanwhile, 30% of analyzed samples had levels of Cd above allowable levels, and these were statistically correlated with the essential minerals Ca and Zn. Similarly, 62% of the samples had levels of As above allowable limits, and As levels were associated with the essential minerals Fe and Mn. Conclusion Dietary supplements consumed as essential nutrients for their Ca, Zn, Fe and Mn content should be monitored for toxic metal levels, owing to the natural geochemical association of toxic metals with these essential metals, so that citizens receive safe allowable amounts. PMID:23331553
Zarzycki, Paweł K; Portka, Joanna K
2015-09-01
Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids together with a battery of related non-polar and low-molecular mass compounds may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the differing composition of the analytical matrix and interfering compounds; therefore, proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that have recently been applied for the quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, based mainly on the experimental papers published within the last two years, in which a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning the determination of hopanoids and related low-molecular mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves a number of physicochemical parameters and hopanoid quantities or given biomarker mass ratios derived from high-throughput separation and detection systems, typically GC-MS and HPLC-MS. Based on quantitative data reported in recently published experimental work, it has been demonstrated that multivariate data analysis using e.g. principal components computations may significantly extend our knowledge concerning proper biomarker selection and sample classification by means of hopanoids and related non-polar compounds. Copyright © 2015 Elsevier Ltd. All rights reserved.
Pérez, Germán M; Salomón, Luis A; Montero-Cabrera, Luis A; de la Vega, José M García; Mascini, Marcello
2016-05-01
A novel heuristic using an iterative select-and-purge strategy is proposed. It combines statistical techniques for sampling and classification with rigid molecular docking through an inverse virtual screening scheme. This approach aims at the de novo discovery of short peptides that may act as docking receptors for small target molecules when no data are available about known association complexes between them. The algorithm performs an unbiased stochastic exploration of the sample space, acting as a binary classifier when analyzing the entire peptide population. It uses a novel and effective criterion for weighting the likelihood that a given peptide forms an association complex with a particular ligand molecule, based on amino acid sequences. The exploratory analysis relies on chemical information about peptide composition, sequence patterns, and association free energies (docking scores) in order to converge to those peptides forming the association complexes with the highest affinities. Statistical estimations support these results by providing an association probability, improving prediction accuracy even in cases where only a fraction of all possible combinations is sampled. The false positive/false negative ratio was also improved with this method. A simple rigid-body docking approach together with the proper information about amino acid sequences was used. The methodology was applied in a retrospective docking study to all 8000 possible tripeptide combinations of the 20 natural amino acids, screened against a training set of 77 different ligands with diverse functional groups. Afterward, all tripeptides were screened against a test set of 82 ligands, also containing different functional groups. Results show that our integrated methodology is capable of finding a representative group of the top-scoring tripeptides. The associated probability of identifying the best receptor or a group of the top-ranked receptors is more than double and about 10 times higher, respectively, when compared to classical random sampling methods.
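A minimal sketch of the iterative select-and-purge loop follows; the docking score is replaced by a hypothetical per-residue stand-in function, and the round counts and fractions are arbitrary, so this shows only the control flow, not the authors' statistical weighting criterion.

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)
AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")                 # 20 natural residues
tripeptides = [''.join(p) for p in itertools.product(AMINO_ACIDS, repeat=3)]  # 8000

# Hypothetical stand-in for a rigid-docking score (lower = better); in the real
# workflow each call would be a docking run of the peptide against the ligand.
residue_term = {aa: rng.normal() for aa in AMINO_ACIDS}
def docking_score(peptide):
    return sum(residue_term[aa] for aa in peptide)

def select_and_purge(candidates, n_rounds=4, sample_frac=0.25, keep_frac=0.5):
    """Each round: dock a random subset of the remaining pool, keep the
    better-scoring half of that subset, and purge the rest of it."""
    pool = list(candidates)
    for _ in range(n_rounds):
        subset = set(rng.choice(pool, size=int(sample_frac * len(pool)), replace=False))
        ranked = sorted(subset, key=docking_score)
        keep = set(ranked[: int(keep_frac * len(ranked))])
        pool = [p for p in pool if p not in subset or p in keep]
    return pool

survivors = select_and_purge(tripeptides)
top = sorted(survivors, key=docking_score)[:5]
print(f"{len(survivors)} candidates survive; best-scoring examples: {top}")
```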
Soot and Radiation Measurements in Microgravity Jet Diffusion Flames
NASA Technical Reports Server (NTRS)
Ku, Jerry C.
1996-01-01
The subject of soot formation and radiation heat transfer in microgravity jet diffusion flames is important not only for the understanding of fundamental transport processes involved but also for providing findings relevant to spacecraft fire safety and soot emissions and radiant heat loads of combustors used in air-breathing propulsion systems. Our objectives are to measure and model soot volume fraction, temperature, and radiative heat fluxes in microgravity jet diffusion flames. For this four-year project, we have successfully completed three tasks, which have resulted in new research methodologies and original results. First is the implementation of a thermophoretic soot sampling technique for measuring particle size and aggregate morphology in drop-tower and other reduced gravity experiments. In those laminar flames studied, we found that microgravity soot aggregates typically consist of more primary particles and primary particles are larger in size than those under normal gravity. Comparisons based on data obtained from limited samples show that the soot aggregate's fractal dimension varies within +/- 20% of its typical value of 1.75, with no clear trends between normal and reduced gravity conditions. Second is the development and implementation of a new imaging absorption technique. By properly expanding and spatially-filtering the laser beam to image the flame absorption on a CCD camera and applying numerical smoothing procedures, this technique is capable of measuring instantaneous full-field soot volume fractions. Results from this technique have shown the significant differences in local soot volume fraction, smoking point, and flame shape between normal and reduced gravity flames. We observed that some laminar flames become open-tipped and smoking under microgravity. The third task we completed is the development of a computer program which integrates and couples flame structure, soot formation, and flame radiation analyses together. We found good agreements between model predictions and experimental data for laminar and turbulent flames under both normal and reduced gravity. We have also tested in the laboratory the techniques of rapid-insertion fine-wire thermocouples and emission pyrometry for temperature measurements. These techniques as well as laser Doppler velocimetry and spectral radiative intensity measurement have been proposed to provide valuable data and improve the modeling analyses.
NASA Technical Reports Server (NTRS)
Newcomb, J. S.
1975-01-01
The present paper describes an automated system for measuring stellar proper motions on the basis of information contained in photographic plates. In this system, the images on a star plate are digitized by a scanning microdensitometer using light from a He-Ne gas laser, and a special-purpose computer arranges the measurements in computer-compatible form on magnetic tape. The scanning and image-reconstruction processes are briefly outlined, and the image-evaluation techniques are discussed. It is shown that the present system has been especially successful in measuring the proper motions of low-luminosity stars, including 119 stars with less than 1/10,000 of the solar bolometric luminosity. Plans for measurements of high-density Milky Way star plates are noted.
78 FR 52236 - Proposed Collection; Comment Request for Form 13560
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-22
... return of funds in order to ensure proper handling. This form serves as supporting documentation for any..., including through the use of automated collection techniques or other forms of information technology; and...
Caring for Your Videodiscs, CD-ROM Discs, and Players.
ERIC Educational Resources Information Center
Ekhaml, Leticia; Saygan, Bobby
1993-01-01
Presents guidelines for the proper care and handling of videodisc and CD-ROM hardware and software. Topics discussed include handling the equipment, moving, cleaning techniques, storage considerations, ventilation requirements, and climate control. (LRW)
ERIC Educational Resources Information Center
Fream, Ronald
1976-01-01
When the process of remodeling a golf course is undertaken with professional and thorough planning, creative design, and proper construction techniques, the finished product can provide many years of challenging and esthetically pleasing golf play. (JD)
Locating underwater objects. [technology transfer
NASA Technical Reports Server (NTRS)
Grice, C. F.
1974-01-01
Underwater search operations are considered to be engineering and operational problems. A process for proper definition of the problem and selection of instrumentation and operational procedures is described. An outline of underwater search instrumentation and techniques is given.
Tracking and Ability Grouping in Education.
ERIC Educational Resources Information Center
Drowatzky, John N.
1981-01-01
A look at identification, grouping, and instruction techniques, followed by an assessment of the legality of a properly administered tracking system and an enumeration of guidelines and due process procedures that must be followed. (Author/MLF)
Weight Training: Do's and Don'ts of Proper Technique
... may take up extra time and contribute to overload injury. However, the number of sets that you ... Sports Medicine. http://www.acsm.org/access-public-information/brochures-fact-sheets/brochures. Accessed June 24, 2015. ...
2007-06-01
and immovable with fingers. Body side rivet base and outside rivet burr should be flat against the material. Bent rivets will fail under stress ...such as using knots, tying around sharp edges, etc.) and maximum permitted free fall distance. Also, to be stressed are the importance of inspections...limitations; e. Application limits; f. Proper hook -up, anchoring and tie-off techniques, including the proper dee-ring or other attachment point to use on
A Protocol for Safe Lithiation Reactions Using Organolithium Reagents
Gau, Michael R.; Zdilla, Michael J.
2016-01-01
Organolithium reagents are powerful tools in the synthetic chemist's toolbox. However, the extreme pyrophoric nature of the most reactive reagents warrants proper technique, thorough training, and proper personal protective equipment. To aid in the training of researchers using organolithium reagents, a thorough, step-by-step protocol for the safe and effective use of tert-butyllithium on an inert gas line or within a glovebox is described. As a model reaction, preparation of lithium tert-butyl amide by the reaction of tert-butyl amine with one equivalent of tert-butyl lithium is presented. PMID:27911386
Management of fractures of the condyle, condylar neck, and coronoid process.
Kisnisci, Reha
2013-11-01
Proper anatomic reduction of the fracture and accelerated complete recovery are desirable goals after trauma reconstruction. Over the recent decades, significant headway in craniomaxillofacial trauma care has been achieved and advancements in the management for the injuries of the mandibular condyle have also proved to be no exception. A trend in operative and reconstructive options for proper anatomic reduction and internal fixation has become notable as a result of newly introduced technology, surgical techniques, and operative expertise. Copyright © 2013 Elsevier Inc. All rights reserved.
40 CFR 1065.595 - PM sample post-conditioning and total weighing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sample media (e.g., filters) to the weighing and PM-stabilization environments. (a) Make sure the...). If those specifications are not met, leave the test sample media (e.g., filters) covered until proper.... If you use filters, you may remove them from their cassettes before or after stabilization. We...
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...
Water isotopic ratios from a continuously melted ice core sample
NASA Astrophysics Data System (ADS)
Gkinis, V.; Popp, T. J.; Blunier, T.; Bigler, M.; Schüpbach, S.; Johnsen, S. J.
2011-06-01
A new technique for on-line high resolution isotopic analysis of liquid water, tailored for ice core studies, is presented. We build an interface between an Infra Red Cavity Ring Down Spectrometer (IR-CRDS) and a Continuous Flow Analysis (CFA) system. The system offers the possibility to perform simultaneous water isotopic analysis of δ18O and δD on a continuous stream of liquid water as generated from a continuously melted ice rod. Injection of sub-μl amounts of liquid water is achieved by pumping sample through a fused silica capillary and instantaneously vaporizing it with 100% efficiency in a home-made oven at a temperature of 170 °C. A calibration procedure allows for proper reporting of the data on the VSMOW scale. We apply the necessary corrections based on the assessed performance of the system regarding instrumental drifts and dependence on humidity levels. The melt rates are monitored in order to assign a depth scale to the measured isotopic profiles. Application of spectral methods yields the combined uncertainty of the system at below 0.1 ‰ and 0.5 ‰ for δ18O and δD, respectively. This performance is comparable to that achieved with mass spectrometry. Dispersion of the sample in the transfer lines limits the resolution of the technique. In this work we investigate and assess these dispersion effects. By using an optimal filtering method we show how the measured profiles can be corrected for the smoothing effects resulting from the sample dispersion. Considering the significant advantages the technique offers, i.e. simultaneous measurement of δ18O and δD, potentially in combination with chemical components that are traditionally measured on CFA systems, and a notable reduction in analysis time and power consumption, we consider it an alternative to traditional isotope ratio mass spectrometry with the possibility to be deployed for field ice core studies. We present data acquired in the framework of the NEEM deep ice core drilling project in Greenland, during the 2010 field season.
Water isotopic ratios from a continuously melted ice core sample
NASA Astrophysics Data System (ADS)
Gkinis, V.; Popp, T. J.; Blunier, T.; Bigler, M.; Schüpbach, S.; Kettner, E.; Johnsen, S. J.
2011-11-01
A new technique for on-line high resolution isotopic analysis of liquid water, tailored for ice core studies, is presented. We built an interface between a Wavelength Scanned Cavity Ring Down Spectrometer (WS-CRDS) purchased from Picarro Inc. and a Continuous Flow Analysis (CFA) system. The system offers the possibility to perform simultaneous water isotopic analysis of δ18O and δD on a continuous stream of liquid water as generated from a continuously melted ice rod. Injection of sub-μl amounts of liquid water is achieved by pumping sample through a fused silica capillary and instantaneously vaporizing it with 100% efficiency in a home-made oven at a temperature of 170 °C. A calibration procedure allows for proper reporting of the data on the VSMOW-SLAP scale. We apply the necessary corrections based on the assessed performance of the system regarding instrumental drifts and dependence on the water concentration in the optical cavity. The melt rates are monitored in order to assign a depth scale to the measured isotopic profiles. Application of spectral methods yields the combined uncertainty of the system at below 0.1‰ and 0.5‰ for δ18O and δD, respectively. This performance is comparable to that achieved with mass spectrometry. Dispersion of the sample in the transfer lines limits the temporal resolution of the technique. In this work we investigate and assess these dispersion effects. By using an optimal filtering method we show how the measured profiles can be corrected for the smoothing effects resulting from the sample dispersion. Considering the significant advantages the technique offers, i.e. simultaneous measurement of δ18O and δD, potentially in combination with chemical components that are traditionally measured on CFA systems, and a notable reduction in analysis time and power consumption, we consider it an alternative to traditional isotope ratio mass spectrometry with the possibility to be deployed for field ice core studies. We present data acquired in the field during the 2010 season as part of the NEEM deep ice core drilling project in North Greenland.
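As an illustration of the optimal-filtering step mentioned above, the sketch below applies a Wiener-type deconvolution to a synthetic isotope profile smoothed by an assumed exponential mixing kernel; the kernel length, noise level, and signal-to-noise ratio are made-up parameters, not those of the CFA system described in these records.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic isotope profile (per-mil units) on a 5 mm depth grid.
dz = 0.005                                       # m
depth = np.arange(0.0, 20.0, dz)
true = 3.0 * np.sin(2 * np.pi * depth / 0.20) + 1.0 * np.sin(2 * np.pi * depth / 0.05)

# Mixing in the tubing acts like convolution with a smoothing kernel; an
# exponential kernel with an assumed 3 cm e-folding length is used here.
tau = 0.03                                       # m
kernel = np.exp(-np.arange(0.0, 10 * tau, dz) / tau)
kernel /= kernel.sum()

n = true.size
K = np.fft.rfft(kernel, n)                       # transfer function of the smoothing
measured = np.fft.irfft(np.fft.rfft(true) * K, n) + 0.1 * rng.normal(size=n)

# Optimal (Wiener-type) restoration in the frequency domain.
snr = 100.0                                      # assumed signal-to-noise power ratio
W = np.conj(K) / (np.abs(K) ** 2 + 1.0 / snr)
restored = np.fft.irfft(W * np.fft.rfft(measured), n)

def rms(a, b): return np.sqrt(np.mean((a - b) ** 2))
print("RMS error before restoration:", round(rms(measured, true), 3), "per mil")
print("RMS error after restoration: ", round(rms(restored, true), 3), "per mil")
```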
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Phase computations and phase models for discrete molecular oscillators.
Suvak, Onder; Demir, Alper
2012-06-11
Biochemical oscillators perform crucial functions in cells, e.g., they set up circadian clocks. The dynamical behavior of oscillators is best described and analyzed in terms of a scalar quantity, the phase. A rigorous and useful definition of phase is based on the so-called isochrons of oscillators. Phase computation techniques for continuous oscillators that are based on isochrons have been used for characterizing the behavior of various types of oscillators under the influence of perturbations such as noise. In this article, we extend the applicability of these phase computation methods to biochemical oscillators as discrete molecular systems, based on information obtained from a continuous-state approximation of such oscillators. In particular, we describe techniques for computing the instantaneous phase of discrete, molecular oscillators for sample paths generated by the stochastic simulation algorithm. We comment on the accuracies and derive certain measures for assessing the feasibility of the proposed phase computation methods. Phase computation experiments on the sample paths of well-known biological oscillators validate our analyses. The impact of noise that arises from the discrete and random nature of the mechanisms that make up molecular oscillators can be characterized based on the phase computation techniques proposed in this article. The concept of isochrons is the natural choice upon which the phase notion of oscillators can be founded. The isochron-theoretic phase computation methods that we propose can be applied to discrete molecular oscillators of any dimension, provided that the oscillatory behavior observed in the discrete state does not vanish in a continuous-state approximation. Analysis of the full versatility of phase noise phenomena in molecular oscillators will be possible if a proper phase model theory is developed, without resorting to such approximations.
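The sketch below is a much-simplified stand-in for the isochron-based methods of the paper: it assigns to each point of a noisy sample path the phase of the nearest point on a reference limit cycle (a zeroth-order isochron approximation) and tracks the accumulated phase error. The circular oscillator, noise level, and projection to two variables are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Reference limit cycle: a unit-circle oscillator stands in for the
# deterministic (continuous-state) limit of the molecular oscillator.
n_ref = 1000
phases_ref = np.linspace(0.0, 2.0 * np.pi, n_ref, endpoint=False)
cycle = np.column_stack([np.cos(phases_ref), np.sin(phases_ref)])

def instantaneous_phase(trajectory):
    """Zeroth-order isochron approximation: assign each state the phase of the
    nearest point on the reference limit cycle."""
    d2 = ((trajectory[:, None, :] - cycle[None, :, :]) ** 2).sum(axis=2)
    return phases_ref[np.argmin(d2, axis=1)]

# A noisy sample path standing in for an SSA trajectory projected to two species.
t = np.linspace(0.0, 20.0, 2000)
true_phase = 1.0 * t                                  # angular frequency of 1 rad per unit time
path = np.column_stack([np.cos(true_phase), np.sin(true_phase)])
path += 0.05 * rng.normal(size=path.shape)

phi = np.unwrap(instantaneous_phase(path))
drift = (phi - phi[0]) - true_phase                   # accumulated phase error
print("final phase error:", round(drift[-1], 3), "rad")
print("growth rate of squared phase error:", round(np.polyfit(t, drift ** 2, 1)[0], 5))
```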
Pandiyan, Vimal Prabhu; John, Renu
2016-01-20
We propose a versatile 3D phase-imaging microscope platform for real-time imaging of optomicrofluidic devices based on the principle of digital holographic microscopy (DHM). Lab-on-chip microfluidic devices fabricated on transparent polydimethylsiloxane (PDMS) and glass substrates have attained wide popularity in biological sensing applications. However, monitoring, visualization, and characterization of microfluidic devices, microfluidic flows, and the biochemical kinetics happening in these devices are difficult due to the lack of proper techniques for real-time imaging and analysis. Traditional bright-field microscopic techniques fail in these imaging applications, as the microfluidic channels and the fluids carrying biological samples are transparent and not visible in bright light. Phase-based microscopy techniques that can image the phase of the microfluidic channel and the changes in refractive index due to the fluids and biological samples present in the channel are ideal for imaging the fluid flow dynamics in a microfluidic channel at high resolution. This paper demonstrates three-dimensional imaging of a microfluidic device with nanometric depth precision and high SNR. We demonstrate imaging of microelectrodes of nanometric thickness patterned on a glass substrate and of the microfluidic channel. Three-dimensional imaging of a transparent PDMS optomicrofluidic channel, fluid flow, and live yeast cell flow in this channel has been demonstrated using DHM. We also quantify the average velocity of fluid flow through the channel. Compared with any conventional bright-field microscope, the 3D depth information in the images illustrated in this work carries much more information about the biological system under observation. The results demonstrated in this paper prove the high potential of DHM in imaging optofluidic devices; in the detection of pathogens, cells, and bioanalytes on lab-on-chip devices; and in studying microfluidic dynamics in real time based on phase changes.
Phase computations and phase models for discrete molecular oscillators
2012-01-01
Background Biochemical oscillators perform crucial functions in cells, e.g., they set up circadian clocks. The dynamical behavior of oscillators is best described and analyzed in terms of a scalar quantity, the phase. A rigorous and useful definition of phase is based on the so-called isochrons of oscillators. Phase computation techniques for continuous oscillators that are based on isochrons have been used for characterizing the behavior of various types of oscillators under the influence of perturbations such as noise. Results In this article, we extend the applicability of these phase computation methods to biochemical oscillators as discrete molecular systems, based on information obtained from a continuous-state approximation of such oscillators. In particular, we describe techniques for computing the instantaneous phase of discrete, molecular oscillators for sample paths generated by the stochastic simulation algorithm. We comment on the accuracies and derive certain measures for assessing the feasibility of the proposed phase computation methods. Phase computation experiments on the sample paths of well-known biological oscillators validate our analyses. Conclusions The impact of noise that arises from the discrete and random nature of the mechanisms that make up molecular oscillators can be characterized based on the phase computation techniques proposed in this article. The concept of isochrons is the natural choice upon which the phase notion of oscillators can be founded. The isochron-theoretic phase computation methods that we propose can be applied to discrete molecular oscillators of any dimension, provided that the oscillatory behavior observed in the discrete state does not vanish in a continuous-state approximation. Analysis of the full versatility of phase noise phenomena in molecular oscillators will be possible if a proper phase model theory is developed, without resorting to such approximations. PMID:22687330
Levinson, Kimberly L.; Salmeron, Jorge; Sologuren, Carlos Vallejos; Fernandez, Maria Jose Vallejos; Belinson, Jerome L.
2014-01-01
Peru struggles to prevent cervical cancer (CC). In the jungle, prevention programs suffer from significant barriers although technology exists to detect CC precursors. This study used community based participatory research (CBPR) methods to overcome barriers. The objective was to evaluate the utility of CBPR techniques in a mother–child screen/treat and vaccinate program for CC prevention in the Peruvian jungle. The CC prevention program used self-sampling for human papillomavirus (HPV) for screening, cryotherapy for treatment and the HPV vaccine Gardasil for vaccination. Community health leaders (HL) from around Iquitos participated in a two half day educational course. The HLs then decided how to implement interventions in their villages or urban sectors. The success of the program was measured by: (1) ability of the HLs to determine an implementation plan, (2) proper use of research forms, (3) participation and retention rates, and (4) participants’ satisfaction. HLs successfully registered 320 women at soup kitchens, schools, and health posts. Screening, treatment, and vaccination were successfully carried out using forms for registration, consent, and results with minimum error. In the screen/treat intervention 100 % of participants gave an HPV sample and 99.7 % reported high satisfaction; 81 % of HPV + women were treated, and 57 % returned for 6-month followup. Vaccine intervention: 98 % of girls received the 1st vaccine, 88 % of those received the 2nd, and 65 % the 3rd. CBPR techniques successfully helped implement a screen/treat and vaccinate CC prevention program around Iquitos, Peru. These techniques may be appropriate for large-scale preventive health-care interventions. PMID:24276617
Abuelo, Carolina E; Levinson, Kimberly L; Salmeron, Jorge; Sologuren, Carlos Vallejos; Fernandez, Maria Jose Vallejos; Belinson, Jerome L
2014-06-01
Peru struggles to prevent cervical cancer (CC). In the jungle, prevention programs suffer from significant barriers although technology exists to detect CC precursors. This study used community based participatory research (CBPR) methods to overcome barriers. The objective was to evaluate the utility of CBPR techniques in a mother-child screen/treat and vaccinate program for CC prevention in the Peruvian jungle. The CC prevention program used self-sampling for human papillomavirus (HPV) for screening, cryotherapy for treatment and the HPV vaccine Gardasil for vaccination. Community health leaders (HL) from around Iquitos participated in a two half day educational course. The HLs then decided how to implement interventions in their villages or urban sectors. The success of the program was measured by: (1) ability of the HLs to determine an implementation plan, (2) proper use of research forms, (3) participation and retention rates, and (4) participants' satisfaction. HLs successfully registered 320 women at soup kitchens, schools, and health posts. Screening, treatment, and vaccination were successfully carried out using forms for registration, consent, and results with minimum error. In the screen/treat intervention 100% of participants gave an HPV sample and 99.7% reported high satisfaction; 81% of HPV + women were treated, and 57% returned for 6-month followup. Vaccine intervention: 98% of girls received the 1st vaccine, 88% of those received the 2nd, and 65% the 3rd. CBPR techniques successfully helped implement a screen/treat and vaccinate CC prevention program around Iquitos, Peru. These techniques may be appropriate for large-scale preventive health-care interventions.
Chen, Bo; Chen, Minhua; Paisley, John; Zaas, Aimee; Woods, Christopher; Ginsburg, Geoffrey S; Hero, Alfred; Lucas, Joseph; Dunson, David; Carin, Lawrence
2010-11-09
Nonparametric Bayesian techniques have been developed recently to extend the sophistication of factor models, allowing one to infer the number of appropriate factors from the observed data. We consider such techniques for sparse factor analysis, with application to gene-expression data from three virus challenge studies. Particular attention is placed on employing the Beta Process (BP), the Indian Buffet Process (IBP), and related sparseness-promoting techniques to infer a proper number of factors. The posterior density function on the model parameters is computed using Gibbs sampling and variational Bayesian (VB) analysis. Time-evolving gene-expression data are considered for respiratory syncytial virus (RSV), Rhino virus, and influenza, using blood samples from healthy human subjects. These data were acquired in three challenge studies, each executed after receiving institutional review board (IRB) approval from Duke University. Comparisons are made between several alternative means of performing nonparametric factor analysis on these data, with additional comparisons to sparse-PCA and Penalized Matrix Decomposition (PMD), closely related non-Bayesian approaches. Applying the Beta Process to the factor scores, or to the singular values of a pseudo-SVD construction, the proposed algorithms infer the number of factors in gene-expression data. For real data the "true" number of factors is unknown; in our simulations we consider a range of noise variances, and the proposed Bayesian models inferred the number of factors accurately relative to other methods in the literature, such as sparse-PCA and PMD. We have also identified a "pan-viral" factor of importance for each of the three viruses considered in this study. We have identified a set of genes associated with this pan-viral factor, of interest for early detection of such viruses based upon the host response, as quantified via gene-expression data.
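The Beta Process and IBP models themselves do not fit in a short snippet, but the sketch below illustrates the closely related sparse-PCA baseline mentioned in the abstract on a synthetic gene-expression-like matrix with three planted sparse factors; all dimensions, noise levels, and the sparsity penalty are assumptions made for illustration.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(5)

# Synthetic "gene-expression" matrix: 60 samples x 500 genes generated from
# three sparse factors, each loading on a distinct block of 40 genes.
n_samples, n_genes, n_factors = 60, 500, 3
loadings = np.zeros((n_factors, n_genes))
for k in range(n_factors):
    loadings[k, 40 * k: 40 * (k + 1)] = rng.normal(size=40)
scores = rng.normal(size=(n_samples, n_factors))
X = scores @ loadings + 0.2 * rng.normal(size=(n_samples, n_genes))

# Fit more components than the planted truth and let the sparsity penalty
# decide how many of them actually carry support.
spca = SparsePCA(n_components=6, alpha=1.0, random_state=0)
spca.fit(X - X.mean(axis=0))

active_genes = (np.abs(spca.components_) > 1e-8).sum(axis=1)
print("non-zero genes per component:", active_genes)
# Components whose support is (nearly) empty indicate surplus factors; the
# Bayesian models in the paper make this read-off automatic via the Beta Process.
```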
Determining Reduced Order Models for Optimal Stochastic Reduced Order Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonney, Matthew S.; Brake, Matthew R.W.
2015-08-01
The use of parameterized reduced order models (PROMs) within the stochastic reduced order model (SROM) framework is a logical progression for both methods. In this report, five different parameterized reduced order models are selected and critiqued against each other and against the truth model for the example of the Brake-Reuss beam. The models are: a Taylor series using finite differences, a proper orthogonal decomposition of the output, a Craig-Bampton representation of the model, a method that uses Hyper-Dual numbers to determine the sensitivities, and a Meta-Model method that uses the Hyper-Dual results and constructs a polynomial curve to better represent the output data. The methods are compared against a parameter sweep and a distribution propagation where the first four statistical moments are used as a comparison. Each method produces very accurate results, with the Craig-Bampton reduction being the least accurate. The models are also compared based on the time required for the evaluation of each model, where the Meta-Model requires the least computation time by a significant margin. Each of the five models provided accurate results in a reasonable time frame. The determination of which model to use depends on the availability of the high-fidelity model and how many evaluations can be performed. The output distribution is examined by using a large Monte-Carlo simulation along with a reduced simulation using Latin Hypercube sampling and the stochastic reduced order model sampling technique. Both techniques produced accurate results. The stochastic reduced order modeling technique produced less error when compared to an exhaustive sampling for the majority of methods.
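A minimal sketch of the sampling comparison described above, contrasting plain Monte Carlo with a Latin Hypercube design for estimating the first four statistical moments of a cheap stand-in response; the response function and parameter ranges are invented, not the Brake-Reuss beam model.

```python
import numpy as np
from scipy.stats import kurtosis, qmc, skew

def response(params):
    """Cheap stand-in for an expensive reduced-order-model evaluation:
    a nonlinear function of two uncertain parameters in [0, 1]^2."""
    a, b = params[..., 0], params[..., 1]
    return np.sin(3.0 * a) + a * b ** 2

def first_four_moments(y):
    return np.array([y.mean(), y.var(), skew(y), kurtosis(y)])

rng = np.random.default_rng(6)
n = 256
mc_points = rng.uniform(size=(n, 2))                        # plain Monte Carlo
lhs_points = qmc.LatinHypercube(d=2, seed=6).random(n)       # Latin Hypercube design

reference = first_four_moments(response(rng.uniform(size=(1_000_000, 2))))
for name, pts in (("Monte Carlo", mc_points), ("Latin Hypercube", lhs_points)):
    err = np.abs(first_four_moments(response(pts)) - reference)
    print(f"{name:16s} |error| in [mean, var, skew, kurt]:", err.round(4))
```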
Identification of biogeochemical hot spots using time-lapse hydrogeophysics
NASA Astrophysics Data System (ADS)
Franz, T. E.; Loecke, T.; Burgin, A.
2016-12-01
The identification and monitoring of biogeochemical hot spots and hot moments is difficult using point-based sampling techniques and sensors. Without proper monitoring and accounting of water, energy, and trace gas fluxes it is difficult to assess the environmental footprint of land management practices. One key limitation is the optimal placement of sensors/chambers so that they adequately capture the point-scale fluxes and thus allow a reasonable integration to landscape-scale flux. In this work we present time-lapse hydrogeophysical imaging at an old agricultural field converted into a wetland mitigation bank near Dayton, Ohio. While the wetland was previously instrumented with a network of soil sensors and surface chambers to capture a suite of state variables and fluxes, we hypothesize that time-lapse hydrogeophysical imaging is an underutilized and critical reconnaissance tool for effective network design and landscape scaling. Here we combine the time-lapse hydrogeophysical imagery with the multivariate statistical technique of Empirical Orthogonal Functions (EOF) in order to isolate the spatial and temporal components of the imagery. Comparisons of soil core information (e.g. soil texture, soil carbon) collected from around the study site and grouped by similar spatial zones reveal statistically different mean values of soil properties. Moreover, the spatial zones can be used to identify a finite number of future sampling locations, to evaluate the placement of existing sensors/chambers, and to upscale or downscale observations, all of which are desirable capabilities for commercial use in precision agriculture. Finally, we note that combining the EOF analysis with continuous monitoring from point sensors or remote sensing products may provide a robust statistical framework for scaling observations through time as well as provide appropriate datasets for use in landscape biogeochemical models.
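The EOF step can be sketched as an SVD of the time-by-space anomaly matrix; the example below uses a synthetic image stack with one planted wet zone and a seasonal amplitude, purely as an illustration of the decomposition, not of the field data from the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "time-lapse imagery": n_t surveys of an n_y x n_x grid containing a
# persistent wet zone whose amplitude varies seasonally, plus measurement noise.
n_t, n_y, n_x = 24, 30, 40
yy, xx = np.mgrid[0:n_y, 0:n_x]
wet_zone = np.exp(-(((yy - 10) / 5.0) ** 2 + ((xx - 25) / 8.0) ** 2))
seasonal = np.sin(2 * np.pi * np.arange(n_t) / 12.0)
stack = seasonal[:, None, None] * wet_zone + 0.2 * rng.normal(size=(n_t, n_y, n_x))

# EOF analysis: SVD of the (time x space) anomaly matrix.
A = stack.reshape(n_t, -1)
A = A - A.mean(axis=0)                        # remove the temporal mean at each pixel
U, S, Vt = np.linalg.svd(A, full_matrices=False)

explained = S ** 2 / np.sum(S ** 2)
eof1 = Vt[0].reshape(n_y, n_x)                # leading spatial pattern (EOF 1)
pc1 = U[:, 0] * S[0]                          # its temporal amplitude (PC 1)

print("variance explained by the first three EOFs:", explained[:3].round(3))
print("correlation of PC 1 with the planted seasonal signal:",
      round(abs(float(np.corrcoef(pc1, seasonal)[0, 1])), 3))
# Thresholding |EOF 1| delineates the persistent zone, i.e. candidate locations
# for point sensors or soil coring.
```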
Outside-out "sniffer-patch" clamp technique for in situ measures of neurotransmitter release.
Muller-Chrétien, Emilie
2014-01-01
The mechanism underlying neurotransmitter release is a critical research domain for the understanding of neuronal network function; however, few techniques are available for the direct detection and measurement of neurotransmitter release. To date, the sniffer-patch clamp technique is mainly used to investigate these mechanisms from individual cultured cells. In this study, we propose to adapt the sniffer-patch clamp technique to in situ detection of neurosecretion. Using outside-out patches from donor cells as specific biosensors plunged in acute cerebral slices, this technique allows for proper detection and quantification of neurotransmitter release at the level of the neuronal network.
Garcia, Elisângela Zacanti; Yamashita, Hélio Kiitiro; Garcia, Davi Sousa; Padovani, Marina Martins Pereira; Azevedo, Renata Rangel; Chiari, Brasília Maria
2016-01-01
Cone beam computed tomography (CBCT), which represents an alternative to traditional computed tomography and magnetic resonance imaging, may be a useful instrument to study vocal tract physiology related to vocal exercises. This study aims to evaluate the applicability of CBCT to the assessment of variations in the vocal tract of healthy individuals before and after vocal exercises. Voice recordings and CBCT images before and after vocal exercises performed by 3 speech-language pathologists without vocal complaints were collected and compared. Each participant performed 1 type of exercise, i.e., Finnish resonance tube technique, prolonged consonant "b" technique, or chewing technique. The analysis consisted of an acoustic analysis and tomographic imaging. Modifications of the vocal tract settings following vocal exercises were properly detected by CBCT, and changes in the acoustic parameters were, for the most part, compatible with the variations detected in image measurements. CBCT was shown to be capable of properly assessing the changes in vocal tract settings promoted by vocal exercises. © 2017 S. Karger AG, Basel.
Surgical Removal of Neglected Soft Tissue Foreign Bodies by Needle-Guided Technique
Ebrahimi, Ali; Radmanesh, Mohammad; Rabiei, Sohrab; kavoussi, Hossein
2013-01-01
Introduction: The phenomenon of neglected foreign bodies is a significant cause of morbidity in soft tissue injuries and may present to dermatologists as delayed wound healing, localized cellulitis and inflammation, abscess formation, or foreign body sensation. Localization and removal of neglected soft tissue foreign bodies (STFBs) is complex due to possible inflammation, indurations, granulated tissue, and fibrotic scar. This paper describes a simple method for the quick localization and (surgical) removal of neglected STFBs using two 23-gauge needles without ultrasonographic or fluoroscopic guidance. Materials and Methods: A technique based on the use of two 23-gauge needles was used in 41 neglected STFBs in order to achieve proper localization and fixation of foreign bodies during surgery. Results: Surgical removal was successful in 38 of 41 neglected STFBs (ranging from 2–13mm in diameter). Conclusion: The cross-needle-guided technique is an office-based procedure that allows the successful surgical removal of STFBs using minimal soft tissue exploration and dissection via proper localization, fixation, and propulsion of the foreign body toward the surface of the skin. PMID:24303416
Hand hygiene technique quality evaluation in nursing and medicine students of two academic courses 1
Škodová, Manuela; Gimeno-Benítez, Alfredo; Martínez-Redondo, Elena; Morán-Cortés, Juan Francisco; Jiménez-Romano, Ramona; Gimeno-Ortiz, Alfredo
2015-01-01
Abstract Objective: as future health professionals, nursing and medical students' hands can act during internships as a transmission vehicle for hospital-acquired infections. Method: a descriptive study with nursing and medical degree students on the quality of their hand hygiene technique, assessed via a visual test using a fluorescence-marked hydroalcoholic solution and an ultraviolet lamp. Results: 546 students were assessed, 73.8% from medicine and 26.2% from nursing. The area of the hand with proper antiseptic distribution was the palm (92.9%); the areas most often not properly scrubbed were the thumbs (55.1%). Technique in both hands was very good in 24.7% of students, good in 29.8%, fair in 25.1%, and poor in 20.3%. The worst results were observed among male, nursing, and first-year students. There were no significant differences between age groups. Conclusions: the hand hygiene technique is not applied efficiently. Education plays a key role in establishing a good practice base in hand hygiene, theoretical knowledge, and skill development, as well as in reinforcing good practice. PMID:26444174
Photometric detection of high proper motions in dense stellar fields using difference image analysis
NASA Astrophysics Data System (ADS)
Eyer, L.; Woźniak, P. R.
2001-10-01
The difference image analysis (DIA) of the images obtained by the Optical Gravitational Lensing Experiment (OGLE-II) revealed a peculiar artefact in the sample of stars proposed as variable by Woźniak in one of the Galactic bulge fields: the occurrence of pairs of candidate variables showing anti-correlated light curves monotonic over a period of 3yr. This effect can be understood, quantified and related to the stellar proper motions. DIA photometry supplemented with a simple model offers an effective and easy way to detect high proper motion stars in very dense stellar fields, where conventional astrometric searches are extremely inefficient.
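A toy version of that detection criterion is sketched below: a pair of nearby difference light curves is flagged when the two fluxes are strongly anti-correlated and each is monotonic in time. The light curves and thresholds are invented for illustration and are not drawn from the OGLE-II data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(8)
t = np.linspace(0.0, 3.0, 120)                 # 3 yr of difference-image photometry

# A moving star leaves its old position (flux fades) and arrives at a nearby
# one (flux rises): two monotonic, anti-correlated difference light curves.
fading   = 1.00 - 0.25 * t + 0.02 * rng.normal(size=t.size)
rising   = 0.75 + 0.25 * t + 0.02 * rng.normal(size=t.size)
periodic = 1.00 + 0.30 * np.sin(2 * np.pi * t / 0.4) + 0.02 * rng.normal(size=t.size)

def proper_motion_pair(lc1, lc2, rho_cut=-0.9, mono_cut=0.9):
    """Flag a candidate pair: the two fluxes are strongly anti-correlated and
    each light curve is monotonic in time (|Spearman rho vs. time| near 1)."""
    rho_pair = np.corrcoef(lc1, lc2)[0, 1]
    mono1 = abs(spearmanr(t, lc1)[0])
    mono2 = abs(spearmanr(t, lc2)[0])
    return rho_pair < rho_cut and mono1 > mono_cut and mono2 > mono_cut

print("fading/rising pair flagged:  ", proper_motion_pair(fading, rising))
print("fading/periodic pair flagged:", proper_motion_pair(fading, periodic))
```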
2016-03-18
Contract ... PCO and ACO Did Not Ensure Appointment of COR ... NAVSEA Did Not Properly...sample was selected, NAVSEA had issued work for 36 availabilities on the contract—obligating $102.5M. The procuring contracting officer (PCO) at...Responsibilities,” Section 1.602-2, “Responsibilities,” May 29, 2014. The contracting officers for the contract reviewed are the PCO at NAVSEA
Multivariate analysis of elemental chemistry as a robust biosignature
NASA Astrophysics Data System (ADS)
Storrie-Lombardi, M.; Nealson, K.
2003-04-01
The robotic detection of life in extraterrestrial settings (i.e., Mars, Europa, etc.) would be greatly simplified if analysis could be accomplished in the absence of direct mechanical manipulation of a sample. It would also be preferable to employ a fundamental physico-chemical phenomenon as a biosignature and depend less on the particular manifestations of life on Earth (i.e. to employ non-earthcentric methods). One such approach, which we put forward here, is that of elemental composition, a reflection of the use of specific chemical elements for the construction of living systems. Using appropriate analyses (over the proper spatial scales), it should be possible to see deviations from the geological background (mineral and geochemical composition of the crust), and identify anomalies that deviate sufficiently from the norm to indicate a possible living system. To this end, over the past four decades elemental distributions have been determined for the sun, the interstellar medium, seawater, the crust of the Earth, carbonaceous chondrite meteorites, bacteria, plants, animals, and human beings. Such data can be relatively easily obtained for samples of a variety of types using a technique known as laser-induced breakdown spectroscopy (LIBS), which employs a high energy laser to ablate a portion of a sample, and then determine elemental composition using remote optical spectroscopy. However, the elements commonly associated with living systems (H, C, O, and N), while useful for detecting extant life, are relatively volatile and are not easily constrained across geological time scales. This minimizes their utility as fossil markers of ancient life. We have investigated the possibility of distinguishing the distributions of less volatile elements in a variety of biological materials from the distributions found in carbonaceous chondrites and the Earth's crust using principal component analysis (PCA), a classical multivariate analysis technique capable of optimizing classification using spectral or multiple variable inputs. We present initial results indicating that 21 elements are of particular utility and can produce clear classification with no errors when used in minimum sets of four (4), e.g. [V-23, Ti-22, Cr-24, I-53] or [Al-13, Si-14, P-15, Fe-26]. The detection limits and ease of approach suggest that these methods should be valuable for detection of biological residual signatures against specific Mars mineral backgrounds. Clearly, measurements must be made at the proper spatial scales in order to see these anomalies, and data must be analyzed with no prior prejudice about what the elemental composition of life should be - both of these potential problems are easily dealt with. Of particular interest is the observation that many non-volatile elements can be effectively used for life detection, suggesting that fossilized (e.g., dead or even extinct) samples may retain these inorganic signatures of past life.
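The multivariate step can be illustrated with a small, entirely hypothetical example: PCA applied to made-up log-abundance vectors for one of the four-element sets named above, showing how "biological" and "crustal" compositions would separate in the score space. The numbers are illustrative assumptions, not measured distributions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(9)
elements = ["Al", "Si", "P", "Fe"]            # one of the four-element sets named above

# Hypothetical log-abundance patterns (illustrative numbers, not measured data):
# "biological" material enriched in P; "crustal" material rich in Al, Si and Fe.
bio_mean   = np.array([-2.0, -1.0,  0.5, -1.5])
crust_mean = np.array([ 1.0,  1.5, -1.0,  1.0])
X = np.vstack([bio_mean   + 0.3 * rng.normal(size=(20, 4)),
               crust_mean + 0.3 * rng.normal(size=(20, 4))])
labels = np.array(["bio"] * 20 + ["crust"] * 20)

pca = PCA(n_components=2)
scores = pca.fit_transform(X - X.mean(axis=0))

print("elements used:", elements)
print("variance explained:", pca.explained_variance_ratio_.round(3))
for cls in ("bio", "crust"):
    print(f"{cls:5s} mean PC1 score: {scores[labels == cls, 0].mean():+.2f}")
# Well-separated PC1 scores show that these four elements alone distinguish the
# two (synthetic) populations, mirroring the classification described above.
```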
VizieR Online Data Catalog: 1876 open clusters multimembership catalog (Sampedro+, 2017)
NASA Astrophysics Data System (ADS)
Sampedro, L.; Dias, W. S.; Alfaro, E. J.; Monteiro, H.; Molino, A.
2017-10-01
We use version 3.5 of the New Optically Visible Open Clusters and Candidates catalogue (hereafter DAML02; Dias et al., 2002, Cat. B/ocl), to select a sample of 2167 open clusters to be analysed. The stellar positions and the proper motions are taken from the UCAC4 (Zacharias et al., 2013, Cat. I/322). The catalogue contains data for over 113 million stars (105 million of them with proper-motion data), and is complete down to magnitude R=16. The positional accuracy of the listed objects is about 15-100mas per coordinate, depending on the magnitude. Formal errors in proper motions range from about 1 to 10mas/yr, depending on the magnitude and the observational history. Systematic errors in the proper motions are estimated to be about 1-4mas/yr. (2 data files).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Xingchen; Peng, Xuan; Sumption, Michael
The internal oxidation technique can generate ZrO2 nano particles in Nb3Sn strands, which markedly refine the Nb3Sn grain size and boost the high-field critical current density (Jc). This article summarizes recent efforts on implementing this technique in practical Nb3Sn wires and adding Ti as a dopant. It is demonstrated that this technique can be readily incorporated into the present Nb3Sn conductor manufacturing technology. Powder-in-tube (PIT) strands with fine subelements (~25 µm) based on this technique were successfully fabricated, and proper heat treatments for oxygen transfer were explored. Future work for producing strands ready for applications is proposed.
NASA Astrophysics Data System (ADS)
Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie
2017-12-01
In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats that compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based techniques.
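A minimal sketch of the kind of supervised classifier the abstract describes, trained on synthetic connection-request features such as bandwidth and route length; the feature distributions and the random-forest choice are assumptions for illustration, not the paper's simulation setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(10)
n = 4000

# Synthetic connection requests: [bandwidth (Gb/s), route length (hops), holding time (s)].
normal = np.column_stack([rng.normal(40, 10, n),
                          rng.integers(1, 6, n),
                          rng.normal(600, 100, n)])
attack = np.column_stack([rng.normal(80, 25, n // 4),
                          rng.integers(4, 12, n // 4),
                          rng.normal(120, 60, n // 4)])
X = np.vstack([normal, attack])
y = np.concatenate([np.zeros(n), np.ones(n // 4)])      # 1 = intrusion attempt

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("intrusion-detection accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```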
Fibla, Juan J; Molins, Laureano; Moradiellos, Javier; Rodríguez, Pedro; Heras, Félix; Canalis, Emili; Bolufer, Sergio; Martínez, Pablo; Aragón, Javier; Arroyo, Andrés; Pérez, Javier; León, Pablo; Canela, Mercedes
2016-01-01
Although the Nuss technique revolutionized the surgical treatment of pectus excavatum, its use has not become widespread in our country. The aim of this study was to analyze the current use of this technique in a sample of Thoracic Surgery Departments in Spain. This was an observational retrospective multicentre study analyzing the main epidemiological aspects and clinical results of ten years' experience with the Nuss technique. Between 2001 and 2010 a total of 149 patients were operated on (mean age 21.2 years), 74% male. Initial aesthetic results were excellent or good in 93.2%, mild in 4.1% and bad in 2.7%. After initial surgery there were complications in 45 patients (30.6%). The most frequent were wound seroma, bar displacement, stabilizer break, pneumothorax, haemothorax, wound infection, pneumonia, pericarditis and cardiac tamponade that required urgent bar removal. Postoperative pain appeared in all patients; in 3 cases (2%) it was so intense that it required bar removal. After a mean follow-up of 39.2 months, bar removal had been performed in 72 patients (49%) and was difficult in 5 cases (7%). After a 1.6 year follow-up period good results persisted in 145 patients (98.7%). The Nuss technique in adults has had good results in Spanish Thoracic Surgery Departments; however, its use has not become widespread. The risk of complications must be taken into account and its indication must be properly evaluated. The possibility of previous conservative treatment is being analyzed in several departments at present. Copyright © 2015 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
Nondestructive Measurements Using Mechanical Waves in Reinforced Concrete Structures.
DOT National Transportation Integrated Search
2014-02-01
"This study evaluated various techniques that use mechanical waves for the evaluation of critical concrete properties, : such as proper consolidation of the concrete during placement and strength development; changes in modulus; and the detection : o...
7 CFR 1902.2 - Policies concerning disbursement of funds.
Code of Federal Regulations, 2013 CFR
2013-01-01
... deposited in a supervised bank account. This supervisory technique will be used for a temporary period to help the borrower learn to properly manage his/her finances. Such a period will not exceed 1 year...
7 CFR 1902.2 - Policies concerning disbursement of funds.
Code of Federal Regulations, 2014 CFR
2014-01-01
... deposited in a supervised bank account. This supervisory technique will be used for a temporary period to help the borrower learn to properly manage his/her finances. Such a period will not exceed 1 year...
7 CFR 1902.2 - Policies concerning disbursement of funds.
Code of Federal Regulations, 2012 CFR
2012-01-01
... deposited in a supervised bank account. This supervisory technique will be used for a temporary period to help the borrower learn to properly manage his/her finances. Such a period will not exceed 1 year...
Notes on the technique of landing airplanes equipped with wing flaps
NASA Technical Reports Server (NTRS)
Gough, Melvin N
1936-01-01
The proper landing of airplanes equipped with flaps, although probably no more difficult than landing without them, requires a different technique. The effects of flaps on the aerodynamic characteristics of a wing are given and, with the aid of figures and diagrams, a detailed comparison of the glide and landing of an airplane with and without flaps is made. The dangers attending improper execution and the importance of such factors as air speed, fuselage attitude, glide-path angle, and control manipulation, upon all of which a pilot bases his judgement, are emphasized. Of most importance in connection with the use of flaps are: the maintenance of a sufficient margin of speed above the stall; a decisive use of the controls at the proper time; more cautious use of power during the approach glide; and, above all, the willingness to accept the steep nose-down attitude necessary in the glide resulting from the use of flaps.
Wrist Pain in Gymnasts: A Review of Common Overuse Wrist Pathology in the Gymnastics Athlete.
Benjamin, Holly J; Engel, Sean C; Chudzik, Debra
Injury rates among gymnasts are some of the highest of any sport at the high school and collegiate level per athletic exposure. The wrist has increased injury risk due to repetitive physical stresses predisposing it to acute injury, overuse, and degenerative damage. This article reviews the most common overuse wrist injuries seen in gymnasts. Prompt evaluation and management are necessary to avoid the negative sequelae that can often accompany these injuries. Little is known about effective sport-specific injury prevention strategies, but general guidelines for overuse injury prevention, including limiting excessive loading of the wrist, maintaining wrist joint flexibility, an emphasis on proper technique, and incorporating wrist and general core strengthening, seem beneficial. General return-to-play principles are similar for all gymnastics-related wrist injuries, including resolution of pain, restoration of normal wrist joint function, completion of a progressive rehabilitation program, and use of proper technique.
Numerical Schemes for the Hamilton-Jacobi and Level Set Equations on Triangulated Domains
NASA Technical Reports Server (NTRS)
Barth, Timothy J.; Sethian, James A.
1997-01-01
Borrowing from techniques developed for conservation law equations, numerical schemes which discretize the Hamilton-Jacobi (H-J), level set, and Eikonal equations on triangulated domains are presented. The first scheme is a provably monotone discretization for certain forms of the H-J equations. Unfortunately, the basic scheme lacks proper Lipschitz continuity of the numerical Hamiltonian. By employing a virtual edge flipping technique, Lipschitz continuity of the numerical flux is restored on acute triangulations. Next, schemes are introduced and developed based on the weaker concept of positive-coefficient approximations for homogeneous Hamiltonians. These schemes possess a discrete maximum principle on arbitrary triangulations and naturally exhibit proper Lipschitz continuity of the numerical Hamiltonian. Finally, a class of Petrov-Galerkin approximations is considered. These schemes are stabilized via a least-squares bilinear form. The Petrov-Galerkin schemes do not possess a discrete maximum principle but generalize to high-order accuracy.
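To make the flavour of a monotone upwind discretization concrete, the sketch below solves the Eikonal equation |∇u| = 1 with a point source by fast sweeping on a uniform Cartesian grid. This is a simplified analogue only; it does not reproduce the triangulated-domain schemes, the edge-flipping device, or the Petrov-Galerkin formulation described above, and the grid size, source location and sweep count are arbitrary choices.

```python
# Simplified Cartesian analogue of a monotone upwind Eikonal discretization
# (fast sweeping for |grad u| = 1 with a point source).
import numpy as np

def fast_sweep_eikonal(n=101, h=1.0, src=(50, 50), n_sweeps=4):
    u = np.full((n, n), np.inf)
    u[src] = 0.0
    orders = [(range(n), range(n)),
              (range(n - 1, -1, -1), range(n)),
              (range(n), range(n - 1, -1, -1)),
              (range(n - 1, -1, -1), range(n - 1, -1, -1))]
    for _ in range(n_sweeps):
        for rows, cols in orders:
            for i in rows:
                for j in cols:
                    if (i, j) == src:
                        continue
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < n - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < n - 1 else np.inf)
                    if np.isinf(a) and np.isinf(b):
                        continue                      # no usable upwind neighbour yet
                    if abs(a - b) >= h:               # one-sided (monotone) update
                        cand = min(a, b) + h
                    else:                             # two-sided quadratic update
                        cand = 0.5 * (a + b + np.sqrt(2 * h * h - (a - b) ** 2))
                    u[i, j] = min(u[i, j], cand)
    return u

u = fast_sweep_eikonal()
print(u[50, 80])   # approximately 30, the distance from the source in grid units
```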
NASA Astrophysics Data System (ADS)
GonzáLez, Pablo J.; FernáNdez, José
2011-10-01
Interferometric Synthetic Aperture Radar (InSAR) is a reliable technique for measuring crustal deformation. However, despite its long application to geophysical problems, its error estimation has been largely overlooked. Currently, the largest problem with InSAR is still atmospheric propagation error, which is why multitemporal interferometric techniques using series of interferograms have been successfully developed. However, none of the standard multitemporal interferometric techniques, namely PS or SB (Persistent Scatterers and Small Baselines, respectively), provides an estimate of its precision. Here, we present a method to compute reliable estimates of the precision of the deformation time series. We implement it for the SB multitemporal interferometric technique (a favorable technique for natural terrains, the most usual target of geophysical applications). We describe the method, which uses a properly weighted scheme that allows us to compute estimates for all interferogram pixels, enhanced by a Monte Carlo resampling technique that properly propagates the interferogram errors (variance-covariances) into the unknown parameters (estimated errors for the displacements). We apply the multitemporal error estimation method to Lanzarote Island (Canary Islands), where no active magmatic activity has been reported in recent decades. We detect deformation around Timanfaya volcano (lengthening of the line of sight, i.e. subsidence), where the last eruption occurred in 1730-1736. The deformation closely follows the surface temperature anomalies, indicating that magma crystallization (cooling and contraction) of the 300-year-old shallow magmatic body under Timanfaya volcano is still ongoing.
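A stripped-down sketch of the weighted least-squares plus Monte Carlo idea for a single pixel is given below. The network geometry, noise levels and deformation history are invented for illustration, and the sketch omits the atmospheric modelling and the full variance-covariance structure used by the authors.

```python
# Toy single-pixel small-baseline inversion with Monte Carlo error propagation.
import numpy as np

rng = np.random.default_rng(1)
n_dates = 6
true_disp = 0.5 * np.arange(n_dates)                 # mm, toy linear deformation
pairs = [(i, j) for i in range(n_dates) for j in range(i + 1, min(i + 3, n_dates))]

# Each interferogram measures disp[j] - disp[i], referenced to the first date.
A = np.zeros((len(pairs), n_dates - 1))
for k, (i, j) in enumerate(pairs):
    if i > 0:
        A[k, i - 1] = -1.0
    A[k, j - 1] = 1.0

sigma = rng.uniform(0.2, 1.0, len(pairs))            # per-interferogram noise (mm)
W = np.diag(1.0 / sigma ** 2)                        # inverse-variance weights
obs = A @ true_disp[1:] + rng.normal(0, sigma)

N = A.T @ W @ A                                      # weighted normal equations
d_hat = np.linalg.solve(N, A.T @ W @ obs)

# Monte Carlo resampling: re-invert observations perturbed by their own sigmas.
mc = np.array([np.linalg.solve(N, A.T @ W @ (obs + rng.normal(0, sigma)))
               for _ in range(500)])
print('displacements (mm):', np.round(d_hat, 2))
print('MC standard errors :', np.round(mc.std(axis=0), 2))
```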
Employing lighting techniques during on-orbit operations
NASA Technical Reports Server (NTRS)
Wheelwright, Charles D.; Toole, Jennifer R.
1991-01-01
As a result of past space missions and evaluations, many procedures have been established and shown to be prudent applications for use in present and future space environment scenarios. However, recent procedures to employ the use of robotics to assist crewmembers in performing tasks which require viewing remote and obstructed locations have led to a need to pursue alternative methods to assist in these operations. One of those techniques which is under development entails incorporating the use of suitable lighting aids/techniques with a closed circuit television (CCTV) camera/monitor system to supervise the robotics operations. The capability to provide adequate lighting during grappling, deploying, docking and berthing operations under all on-orbit illumination conditions is essential to a successful mission. Using automated devices such as the Remote Manipulator System (RMS) to dock and berth a vehicle during payload retrieval, under nighttime, earthshine, solar, or artificial illumination conditions can become a cumbersome task without first incorporating lighting techniques that provide the proper target illumination, orientation, and alignment cues. Studies indicate that the use of visual aids such as the CCTV with a pretested and properly oriented lighting system can decrease the time necessary to accomplish grappling tasks. Evaluations have been and continue to be performed to assess the various on-orbit conditions in order to predict and determine the appropriate lighting techniques and viewing angles necessary to assist crewmembers in payload operations.
Lead and Copper Tap Sample Site Plan Instructions
This document may be used by public water systems in Wyoming and on EPA R8 Tribal Lands as a guide for how to properly complete their lead and copper tap sample site plans to comply with the Lead and Copper Rule.
Baumrind, S
1998-11-01
A number of clinical trials sponsored by the National Institutes of Health (NIH) use rigorous methods of data acquisition and analysis previously developed in fundamental biology and the physical sciences. The naive expectation that these trials would lead relatively rapidly to definitive answers concerning the therapeutic strategies and techniques under study is dispelled. This presentation focuses on delineating differences between the study of central tendencies and individual variation, more specifically on the strategy to study this variation: measure additional sources of variance within each patient at more timepoints and perhaps with greater precision. As rigorous orthodontic research is still in its infancy, the problem of defining the proper mix between prospective and retrospective trials is discussed. In view of the high costs of prospective clinical trials, many of the questions germane to orthodontics can be answered by well-conducted retrospective trials, assuming that properly randomized sampling procedures are employed. Definitive clinical trials are likely to require better theoretical constructs, better instrumentation, and better measures than now available. Reasons for concern are the restricted resources available and the fact that current mensurational approaches may not detect many of the individual differences. The task of constructing sharable databases and record bases stored in digital form and available either remotely from servers, or locally from CD-ROMs or optical disks, is crucial to the optimization of future investigations.
De, Anuradha
2013-01-01
Diarrhea is a major cause of morbidity and mortality in human immunodeficiency virus (HIV)-infected individuals. Opportunistic enteric parasitic infections are encountered in 30-60% of HIV-seropositive patients in developed countries and in 90% of patients in developing countries. Once the CD4+ cell count drops below 200 cells/μl, patients are considered to have developed acquired immunodeficiency syndrome (AIDS), with the risk of an AIDS-defining illness or opportunistic infection significantly increasing. Opportunistic enteric parasites encountered in these patients are Cryptosporidium, Isospora, Cyclospora, and microsporidia, as well as those more commonly associated with gastrointestinal disease, for example, Giardia intestinalis, Entamoeba histolytica, Strongyloides stercoralis, and also, rarely, Balantidium coli. In view of the AIDS explosion in India, opportunistic enteric parasites are becoming increasingly important and have to be identified properly. Apart from wet mounts, concentration methods for stool samples and special staining techniques for identification of these parasites, commercially available fecal immunoassays are widely available for the majority of enteric protozoa. Molecular methods such as polymerase chain reaction (PCR), PCR-restriction fragment length polymorphism, flow cytometry, and sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) are also emerging for early diagnosis of these infections. Proper disposal of feces to prevent contamination of soil and water, and boiling/filtering drinking water along with improved personal hygiene, might go a long way in preventing these enteric parasitic infections. PMID:23961436
Impact of solid discharges from coal usage in the Southwest.
Jones, D G; Straughan, I R
1978-12-01
The Southwestern region of the United States is extremely rich in low-sulfur coal resources which must eventually be utilized in response to national energy balance priorities. Fly ash and scrubber sludge can be safely disposed of using properly managed techniques to ensure that any potential impact from elements such as boron, molybdenum, or selenium is rendered insignificant. Alternative methods of solids utilization are presently being developed. Fly ash is presently being marketed commercially as an additive for concrete manufacture. Successful experiments have been completed to demonstrate the manufacture of commercial-grade wallboard from scrubber sludge. Also, greenhouse studies and field experiments have been conducted to demonstrate increased yields of selected crops grown on typical soils amended with fly ash in amounts ranging from 2% to 8% by weight. These studies also indicate that barium and strontium may be good monitoring indices for determining atmospheric deposition of fly ash, owing to their concentration ratios in soil and vegetation samples. Further studies are being conducted to confirm encouraging irrigation and crop-yield data obtained with fly-ash-amended soils. Finally, the compositions of many fly ashes and soils in the Southwest are similar, and there are no anticipated solid discharges from coal usage which cannot be rendered insignificant with proper management of existing and emerging methods of treatment. Compared with the water availability impact of coal usage in the Southwest, the impact of solid waste discharges is insignificant.
SSAGES: Software Suite for Advanced General Ensemble Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques, including adaptive biasing force, string methods, and forward flux sampling, that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.
NASA Technical Reports Server (NTRS)
Oswalt, Terry D.; Hintzen, Paul M.; Luyten, Willem J.
1988-01-01
Identifications are provided for 103 bright Luyten common proper motion (CPM) stellar systems with m(pg) less than 17.0 mag containing likely white dwarf (WD) components. New spectral types are presented for 55 components, and spectral types for 51 more are available in the literature. With the CPM systems previously published by Giclas et al. (1978), the Luyten stars provide a uniform sample of nearly 200 pairs or multiples brighter than 17th magnitude. Selection effects biasing the combined samples are discussed; in particular, evidence is presented that fewer than 1 percent of wide WD binaries have been detected.
Dias, Philipe A; Dunkel, Thiemo; Fajado, Diego A S; Gallegos, Erika de León; Denecke, Martin; Wiedemann, Philipp; Schneider, Fabio K; Suhr, Hajo
2016-06-11
In the activated sludge process, problems of filamentous bulking and foaming can occur due to overgrowth of certain filamentous bacteria. Nowadays, these microorganisms are typically monitored by means of light microscopy, commonly combined with staining techniques. As drawbacks, these methods are susceptible to human error and subjectivity, and are limited by the use of discontinuous microscopy. The in situ microscope appears to be a suitable tool for continuous monitoring of filamentous bacteria, providing real-time examination and automated analysis and eliminating sampling, preparation and transport of samples. In this context, a proper image processing algorithm is proposed for automated recognition and measurement of filamentous objects. This work introduces a method for real-time evaluation of images without any staining, phase-contrast or dilution techniques, differently from studies present in the literature. Moreover, we introduce an algorithm which estimates the total extended filament length based on geodesic distance calculation. For a period of twelve months, samples from an industrial activated sludge plant were collected weekly and imaged without any prior conditioning, replicating real environment conditions. Trends in filament growth rate, the most important parameter for decision making, are correctly identified. For reference images whose filaments were marked by specialists, the algorithm correctly recognized 72% of the filament pixels, with a false positive rate of at most 14%. An average execution time of 0.7 s per image was achieved. Experiments have shown that the designed algorithm provided a suitable quantification of filaments when compared with human perception and standard methods. The algorithm's average execution time proved its suitability for being optimally mapped onto a computational architecture to provide real-time monitoring.
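For intuition only, the fragment below estimates total filament length from a grayscale image by thresholding, skeletonising and counting skeleton pixels. This is a coarse stand-in for, not a reproduction of, the geodesic-distance algorithm described above; the Otsu threshold, 8-connectivity and synthetic test image are assumptions.

```python
# Coarse stand-in for filament quantification: threshold, skeletonise and count
# skeleton pixels as a length estimate in pixel units.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def filament_length_estimate(image):
    """image: 2-D grayscale array with bright filamentous objects."""
    binary = image > threshold_otsu(image)
    skeleton = skeletonize(binary)
    labels, n_objects = ndimage.label(skeleton, structure=np.ones((3, 3)))
    total_length_px = int(skeleton.sum())    # 1 skeleton pixel ~ 1 pixel of length
    return n_objects, total_length_px

# Synthetic test: one diagonal "filament" on a noisy background.
img = np.random.default_rng(0).normal(0.1, 0.02, (200, 200))
rr = np.arange(20, 180)
img[rr, rr] = 1.0
print(filament_length_estimate(img))         # expect 1 object, ~160 px of skeleton
```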
Planetary mass function and planetary systems
NASA Astrophysics Data System (ADS)
Dominik, M.
2011-02-01
With planets orbiting stars, a planetary mass function should not be seen as a low-mass extension of the stellar mass function; rather, a proper formalism needs to take care of the fact that the statistical properties of planet populations are linked to the properties of their respective host stars. This can be accounted for by describing planet populations by means of a differential planetary mass-radius-orbit function, which together with the fraction of stars with given properties that are orbited by planets and the stellar mass function allows the derivation of all statistics for any considered sample. These fundamental functions provide a framework for comparing statistics that result from different observing techniques and campaigns, which all have their very specific selection procedures and detection efficiencies. Moreover, recent results both from gravitational microlensing campaigns and radial-velocity surveys of stars indicate that planets tend to cluster in systems rather than being the lonely child of their respective parent star. While planetary multiplicity in an observed system becomes obvious with the detection of several planets, its quantitative assessment, however, comes with the challenge of excluding the presence of further planets. Current exoplanet samples begin to give us first hints at the population statistics, whereas pictures of planet parameter space in its full complexity call for samples that are 2-4 orders of magnitude larger. In order to derive meaningful statistics, however, planet detection campaigns need to be designed in such a way that well-defined, fully deterministic target selection, monitoring and detection criteria are applied. The probabilistic nature of gravitational microlensing makes this technique an illustrative example of all the encountered challenges and uncertainties.
Majumdar, Subhabrata; Basak, Subhash C
2018-04-26
Proper validation is an important aspect of QSAR modelling. External validation is one of the widely used validation methods in QSAR where the model is built on a subset of the data and validated on the rest of the samples. However, its effectiveness for datasets with a small number of samples but large number of predictors remains suspect. Calculating hundreds or thousands of molecular descriptors using currently available software has become the norm in QSAR research, owing to computational advances in the past few decades. Thus, for n chemical compounds and p descriptors calculated for each molecule, the typical chemometric dataset today has high value of p but small n (i.e. n < p). Motivated by the evidence of inadequacies of external validation in estimating the true predictive capability of a statistical model in recent literature, this paper performs an extensive and comparative study of this method with several other validation techniques. We compared four validation methods: leave-one-out, K-fold, external and multi-split validation, using statistical models built using the LASSO regression, which simultaneously performs variable selection and modelling. We used 300 simulated datasets and one real dataset of 95 congeneric amine mutagens for this evaluation. External validation metrics have high variation among different random splits of the data, hence are not recommended for predictive QSAR models. LOO has the overall best performance among all validation methods applied in our scenario. Results from external validation are too unstable for the datasets we analyzed. Based on our findings, we recommend using the LOO procedure for validating QSAR predictive models built on high-dimensional small-sample data. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
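The comparison below sketches the kind of experiment reported: a LASSO model evaluated by leave-one-out, K-fold and a single external split on a synthetic n < p data set. The penalty value, split sizes and data-generating model are arbitrary assumptions and the numbers are not the paper's results; the point is only that the three protocols can disagree noticeably when n is small.

```python
# LASSO on synthetic n < p data, scored by LOO, 5-fold and one external split.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import (KFold, LeaveOneOut, cross_val_score,
                                     train_test_split)

rng = np.random.default_rng(0)
n, p = 60, 300                                    # small n, large p, as in QSAR
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.8, -0.5]            # a few "active" descriptors
y = X @ beta + rng.normal(0, 0.5, n)

model = Lasso(alpha=0.1, max_iter=10000)

loo = -cross_val_score(model, X, y, cv=LeaveOneOut(),
                       scoring='neg_mean_squared_error').mean()
kcv = -cross_val_score(model, X, y, cv=KFold(5, shuffle=True, random_state=0),
                       scoring='neg_mean_squared_error').mean()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ext = np.mean((model.fit(X_tr, y_tr).predict(X_te) - y_te) ** 2)

print(f'LOO MSE {loo:.2f} | 5-fold MSE {kcv:.2f} | external-split MSE {ext:.2f}')
```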
ERIC Educational Resources Information Center
Noll, Jennifer; Hancock, Stacey
2015-01-01
This research investigates what students' use of statistical language can tell us about their conceptions of distribution and sampling in relation to informal inference. Prior research documents students' challenges in understanding ideas of distribution and sampling as tools for making informal statistical inferences. We know that these…
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...
ERIC Educational Resources Information Center
Erdmann, Mitzy A.; March, Joe L.
2016-01-01
Sample handling and laboratory notebook maintenance are necessary skills but can seem abstract if not presented to students in context. An introductory exercise focusing on proper sample handling, data collection and laboratory notebook keeping for the general chemistry laboratory was developed to emphasize the importance of keeping an accurate…
30 CFR 90.204 - Approved sampling devices; maintenance and calibration.
Code of Federal Regulations, 2013 CFR
2013-07-01
... performed to assure that the sampling devices are clean and in proper working condition by a person... voltage per cell value; (2) Examination of all components of the cyclone to assure that they are clean and... sampling device to assure that it is clean and free of leaks; and (5) Examination of the clamping and...
30 CFR 90.204 - Approved sampling devices; maintenance and calibration.
Code of Federal Regulations, 2014 CFR
2014-07-01
... performed to assure that the sampling devices are clean and in proper working condition by a person... voltage per cell value; (2) Examination of all components of the cyclone to assure that they are clean and... sampling device to assure that it is clean and free of leaks; and (5) Examination of the clamping and...
30 CFR 90.204 - Approved sampling devices; maintenance and calibration.
Code of Federal Regulations, 2012 CFR
2012-07-01
... performed to assure that the sampling devices are clean and in proper working condition by a person... voltage per cell value; (2) Examination of all components of the cyclone to assure that they are clean and... sampling device to assure that it is clean and free of leaks; and (5) Examination of the clamping and...
Some Considerations on 242mAm Production in Thermal Reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesana, Alessandra; Mongelli, Sara Tania; Terrani, Mario
2004-10-15
Recently, it has been suggested to consider 242mAm as a potential nuclear fuel. This artificial nuclide can be produced through 241Am neutron capture carried out in a neutron field typical of a thermal reactor. In order to suppress the thermal neutron flux, which would cause 242mAm depletion mainly through fission, proper neutron filters should be adopted. In a very intense neutron field, the 242mAm enrichment depends mainly on the energy distribution of the neutrons, the sample thickness, and the cutoff energy of the neutron filter. An investigation of different geometries of the sample to be irradiated, using Cd, B, Sm, and Gd as neutron filters, has been carried out by means of Monte Carlo simulation. The most favorable results have been obtained by irradiating thin 241Am samples (11 µg/cm²) covered with a Gd (0.2-mm-thick) or Sm (1-mm-thick) filter. In these cases the theoretical 242mAm enrichment can reach 20%. The preparation of significant quantities of this unconventional nuclear fuel implies isotopic separation techniques operating in highly radioactive environments and hopefully characterized by very high recovery factors, which are in no way trivial problems.
Raman spectroscopy method for subsurface detection of food powders through plastic layers
NASA Astrophysics Data System (ADS)
Dhakal, Sagar; Chao, Kuanglin; Qin, Jianwei; Schmidt, Walter F.; Kim, Moon S.; Chan, Diane E.; Bae, Abigail
2017-05-01
Proper chemical analysis of materials in sealed containers is important for quality control purposes. Although it is feasible to detect chemicals at the top surface layer, it is relatively challenging to detect objects beneath an obscuring surface. This study used the spatially offset Raman spectroscopy (SORS) method to detect urea, ibuprofen and acetaminophen powders contained within one or more (up to eight) layers of gelatin capsules, to demonstrate subsurface chemical detection and identification. A 785 nm point-scan Raman spectroscopy system was used to acquire spatially offset Raman spectra over an offset range of 0 to 10 mm from the surfaces of 24 encapsulated samples, using a step size of 0.1 mm to obtain 101 spectral measurements per sample. With increasing offset distance, the fraction of information from the deeper subsurface material increased compared to that from the top surface material. The series of measurements was analyzed to differentiate and identify the top surface and subsurface materials. Containing mixed contributions from the powder and capsule, the SORS of each sample was decomposed using self-modeling mixture analysis (SMA) to obtain pure component spectra, and the corresponding components were identified using spectral information divergence values. Results show that the SORS technique together with the SMA method has potential for non-invasive detection of chemicals at deep subsurface layers.
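As a loose analogue of the decomposition step (the authors use SMA, not the method shown here), the sketch below unmixes a stack of synthetic offset spectra into two non-negative "pure component" spectra with non-negative matrix factorisation. The peak shapes, mixing fractions and noise level are all invented.

```python
# NMF as a stand-in for the SMA decomposition: unmix synthetic offset spectra
# into two non-negative "pure component" spectra plus per-spectrum weights.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 400)                                  # pseudo Raman shift axis
peak = lambda c, w: np.exp(-0.5 * ((x - c) / w) ** 2)
pure = np.vstack([peak(0.3, 0.02) + 0.5 * peak(0.7, 0.03),  # "powder" spectrum
                  peak(0.5, 0.04)])                         # "capsule" spectrum

fractions = np.linspace(0.1, 0.9, 101)                      # deeper-layer fraction
mix = np.column_stack([fractions, 1.0 - fractions])
spectra = np.clip(mix @ pure + rng.normal(0, 0.005, (101, 400)), 0, None)

nmf = NMF(n_components=2, init='nndsvd', max_iter=1000)
weights = nmf.fit_transform(spectra)      # contribution of each component per offset
components = nmf.components_              # recovered "pure" spectra
print(components.shape, weights.shape)    # (2, 400) (101, 2)
```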
Superhydrophobic analyte concentration utilizing colloid-pillar array SERS substrates.
Wallace, Ryan A; Charlton, Jennifer J; Kirchner, Teresa B; Lavrik, Nickolay V; Datskos, Panos G; Sepaniak, Michael J
2014-12-02
The ability to detect a few molecules present in a large sample is of great interest for the detection of trace components in both medicinal and environmental samples. Surface-enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers. However, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The following work involves superhydrophobic surfaces that have as a framework deterministic or stochastic silicon pillar arrays formed by lithographic or metal dewetting protocols, respectively. In order to generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. Native pillars and pillars with hydrophobic modification are used. The pillars provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A ≥ 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach in order to demonstrate an ability to control droplet size and placement for scaled-up uses in real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.
Mapping The Temporal and Spatial Variability of Soil Moisture Content Using Proximal Soil Sensing
NASA Astrophysics Data System (ADS)
Virgawati, S.; Mawardi, M.; Sutiarso, L.; Shibusawa, S.; Segah, H.; Kodaira, M.
2018-05-01
In studies related to soil optical properties, it has been proven that visible and NIR soil spectral responses can predict soil moisture content (SMC) using proper data analysis techniques. SMC is one of the most important soil properties, influencing most physical, chemical, and biological soil processes. The problem is how to provide reliable, fast and inexpensive information on SMC in the subsurface from numerous soil samples and repeated measurements. The use of spectroscopy technology has emerged as a rapid and low-cost tool for extensive investigation of soil properties. The objective of this research was to develop calibration models based on laboratory Vis-NIR spectroscopy to estimate the SMC at four different growth stages of the soybean crop in Yogyakarta Province. An ASD field spectroradiometer was used to measure the reflectance of soil samples. Partial least squares regression (PLSR) was performed to establish the relationship between the SMC and the Vis-NIR soil reflectance spectra. The selected calibration model was used to predict the SMC of new samples. The temporal and spatial variability of SMC was presented in digital maps. The results revealed that the calibration model was excellent for SMC prediction. Vis-NIR spectroscopy was a reliable tool for the prediction of SMC.
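A minimal PLSR calibration loop of the kind described is sketched below on synthetic reflectance spectra; the number of latent variables, band range, water-absorption feature and noise level are assumptions, not the study's settings.

```python
# Minimal PLSR calibration on synthetic Vis-NIR reflectance spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_bands = 80, 500
smc = rng.uniform(5, 45, n_samples)                       # soil moisture content, %
bands = np.linspace(400, 2500, n_bands)                   # wavelength, nm
water = np.exp(-0.5 * ((bands - 1940) / 60) ** 2)         # toy water absorption band
X = 0.4 + 0.01 * smc[:, None] * water + rng.normal(0, 0.005, (n_samples, n_bands))

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, X, smc, cv=10).ravel()
rmse = mean_squared_error(smc, pred) ** 0.5
print(f'cross-validated R2 = {r2_score(smc, pred):.3f}, RMSE = {rmse:.2f} %')
```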
Osman, Kamelia M; Samir, Ahmed; Orabi, Ahmed; Zolnikov, Tara Rava
2014-02-01
She-camel milk is an alternative solution for people allergic to milk; unfortunately, potentially harmful bacteria have not been tested for in she-camel milk. Listeria monocytogenes is one harmful bacterium that causes adverse health effects if chronically or acutely ingested by humans. The purpose of this study was to estimate the prevalence of Listeria isolated from the milk of she-camels and to characterize its phenotypic and genetic traits, virulence factors, and antibiotic resistance profile. Udder milk samples were collected from 100 she-camels and screened for mastitis using the California mastitis test (46 healthy female camels, 24 subclinical mastitic animals and 30 clinical mastitic animals). Samples were then examined for the presence of pathogenic Listeria spp.; if present, the isolation of Listeria was completed using the International Organization for Standardization technique to test for pathogenicity. The isolates were subjected to PCR assay for virulence-associated genes. Listeria spp. were isolated from 4% of samples and only 1.0% was confirmed as L. monocytogenes. The results of this study provide evidence for the low prevalence of intramammary Listeria infection; additionally, this study concludes that milk from healthy she-camels milked and harvested under proper hygienic conditions may be used as alternative milk for human consumption. Copyright © 2013 Elsevier B.V. All rights reserved.
Preparation of a bonelike apatite-polymer fiber composite using a simple biomimetic process.
Yokoyama, Yoshiro; Oyane, Ayako; Ito, Atsuo
2008-08-01
A bonelike apatite-polymer fiber composite may be useful as an implant material to replace bone, the enthesis of a tendon, and the joint part of a ligament. We treated an ethylene-vinyl alcohol copolymer (EVOH) plate and knitted EVOH fibers with an oxygen plasma to produce oxygen-containing functional groups on their surfaces. The plasma-treated samples were alternately dipped in alcoholic calcium and phosphate ion solutions three times to deposit apatite precursors onto their surfaces. The surface-modified samples formed a dense and uniform bonelike surface apatite layer after immersion for 24 h in a simulated body fluid with ion concentrations approximately equal to those of human blood plasma. The adhesive strength between the apatite layer and the sample's surface increased with increasing power density of the oxygen plasma. The apatite-EVOH fiber composite obtained by our process has similarities to natural bone in that apatite crystals are deposited on organic polymer fibers. The resulting composite would possess osteoconductivity due to the apatite phase. With proper polymer selection and optimized synthesis techniques, a composite could be made that would have bonelike mechanical properties. Hence, the present surface modification and coating process would be a promising route to obtain new implant materials with bonelike mechanical properties and osteoconductivity. (c) 2007 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Li, Suxia; Wu, Haizhen; Zhao, Jian; Ou, Ling; Zhang, Yuanxing
2010-01-01
In an effort to achieve high success in knowledge and technique acquisition as a whole, a biochemistry and molecular biology experiment was established for high-grade biotechnology specialty students after they had studied essential theory and received proper technique training. The experiment was based on cloning and expression of alkaline…
Finite Element Analysis of Lamb Waves Acting within a Thin Aluminum Plate
2007-09-01
Simulation inputs included a signal chosen to avoid time aliasing, the Lamb wave mode to simulate (with its proper phase velocity curve), and the plate thickness. Analysis of the simulated signal response data demonstrated that elevated temperatures delay wave propagation, although the delays are minimal at the... Pulse-Echo Techniques: Ultrasonic NDE techniques are based on the propagation and reflection of elastic waves, with the assumption that damage in the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theissen, Christopher A.; West, Andrew A.; Dhital, Saurav, E-mail: ctheisse@bu.edu
2016-02-15
We present a photometric catalog of 8,735,004 proper motion selected low-mass stars (KML-spectral types) within the Sloan Digital Sky Survey (SDSS) footprint, from the combined SDSS Data Release 10 (DR10), Two Micron All-Sky Survey (2MASS) point-source catalog (PSC), and Wide-field Infrared Survey Explorer (WISE) AllWISE catalog. Stars were selected using r − i, i − z, r − z, z − J, and z − W1 colors, and SDSS, WISE, and 2MASS astrometry was combined to compute proper motions. The resulting 3,518,150 stars were augmented with proper motions for 5,216,854 earlier type stars from the combined SDSS and United States Naval Observatory B1.0 catalog (USNO-B). We used SDSS+USNO-B proper motions to determine the best criteria for selecting a clean sample of stars. Only stars whose proper motions were greater than their 2σ uncertainty were included. Our Motion Verified Red Stars catalog is available through SDSS CasJobs and VizieR.
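The 2σ selection can be illustrated with a toy calculation (a simplified multi-epoch linear fit on a tangent plane, not the catalog's multi-survey astrometric pipeline; the epochs, motions and noise below are invented):

```python
# Toy version of the selection rule: fit a linear proper motion to tangent-plane
# positions at several epochs and keep stars with total motion > 2 sigma.
import numpy as np

def proper_motion(epochs, x_mas, y_mas):
    """Linear fit of position (mas) versus epoch (yr); returns (pm, sigma_pm)."""
    A = np.column_stack([epochs - epochs.mean(), np.ones_like(epochs)])
    fits = []
    for y in (x_mas, y_mas):
        coef, res, *_ = np.linalg.lstsq(A, y, rcond=None)
        s2 = res[0] / max(len(y) - 2, 1) if res.size else 0.0
        cov = s2 * np.linalg.inv(A.T @ A)
        fits.append((coef[0], np.sqrt(cov[0, 0])))
    (pm_x, e_x), (pm_y, e_y) = fits
    pm = np.hypot(pm_x, pm_y)
    sigma = np.sqrt((pm_x * e_x) ** 2 + (pm_y * e_y) ** 2) / max(pm, 1e-12)
    return pm, sigma

rng = np.random.default_rng(0)
epochs = np.array([2000.0, 2003.5, 2006.1, 2010.4])
x = 35.0 * (epochs - 2000.0) + rng.normal(0, 20, 4)   # 35 mas/yr plus 20 mas noise
y = -12.0 * (epochs - 2000.0) + rng.normal(0, 20, 4)
pm, sig = proper_motion(epochs, x, y)
print(f'pm = {pm:.1f} +/- {sig:.1f} mas/yr; keep star: {pm > 2 * sig}')
```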
Anaesthesia for children with bronchial asthma and respiratory infections.
Rajesh, M C
2015-09-01
Asthma represents one of the most common chronic diseases in children, with an increasing incidence reported worldwide. The key to a successful anaesthetic outcome is thorough pre-operative assessment and optimisation of the child's pulmonary status. Judicious application of proper anti-inflammatory and bronchodilatory regimes should be instituted as part of pre-operative preparation. Bronchospasm-triggering agents should be carefully probed for and meticulously avoided. A calm and properly sedated child at the time of induction is ideal, as is extubation in a deep plane with an unobstructed airway. Wherever possible, regional anaesthesia should be employed. This will avoid airway manipulation, with the additional benefit of excellent peri-operative analgesia. Agents with a potential for histamine release and techniques that can increase airway resistance should be diligently avoided. Emphasis must be given to proper post-operative care, including respiratory monitoring, analgesia and breathing exercises.
Nanophotonics for Lab-on-Chip Applications
NASA Astrophysics Data System (ADS)
Seitz, Peter
Optical methods are the preferred measurement techniques for biosensors and lab-on-chip applications. Their key properties are sensitivity, selectivity and robustness. To simplify the systems and their operation, it is desirable to employ label-free optical methods, requiring the functionalization of interfaces. Evanescent electromagnetic waves probe the optical properties near the interfaces, a few hundred nanometres deep into the sample fluid. The sensitivity of these measurements can be improved with optical micro-resonators, in particular whispering-gallery-mode devices. Q factors as high as 2×10⁸ have been achieved in practice. The resulting narrow-linewidth resonances and an unexpected thermo-optic effect make it possible to detect single biomolecules using a label-free biosensor principle. Future generations of biosensors and labs-on-chip for point-of-care and high-throughput screening applications will require large numbers of parallel measurement channels, necessitating optical micro-resonators in array format produced very cost-effectively.
The Metabolism of the Volatile Amines
Tobe, Barry A.; Goldman, Bernard S.
1963-01-01
The effects of certain drugs on metabolism of ammonia by the liver and kidneys in dogs were investigated by a technique in which both hepatic inflow and outflow bloods could be repeatedly sampled in unanesthetized healthy animals. Specific representatives of the classes of drugs studied included thiopental (barbiturates), morphine (opiates and analgesics), promazine (tranquillizers), and chlorothiazide (oral diuretics). The three drugs commonly used as sedatives were all found to impair the ability of the liver to metabolize ammonia. The diuretic, by contrast, increased the amount of ammonia put into the systemic circulation by the kidneys. Ethanol appeared to have little or no direct effect on ammonia metabolism. The possibility exists that the occurrence of acute hepatic encephalopathy in patients with severe liver disease may be avoided in many cases if these drugs are administered with proper care. Results also indicated that current concepts of the pharmacological action of sedatives, opiates and tranquillizers may require revision. PMID:14069611
Effects of the density and homogeneity in NIRS crop moisture estimation
NASA Astrophysics Data System (ADS)
Lenzini, Nicola; Rovati, Luigi; Ferrari, Luca
2017-06-01
Near-infrared spectroscopy (NIRS) is widely used in fruit and vegetable quality evaluation. This technique is also used for the analysis of alfalfa, a crop of great importance in agriculture. In particular for storage, moisture content is a key parameter for the crop, and for this reason its monitoring during the harvesting phase is very important. Usually optical methods like NIRS are well suited to laboratory settings where the specimen is properly prepared, while their application during the harvesting phase presents several difficulties. Many influencing factors, such as density and degree of homogeneity, can affect the moisture evaluation. In this paper we present the NIRS analysis of alfalfa specimens with different values of moisture and density, as well as the obtained results. To study scattering and absorption phenomena, the forward- and backward-scattered light from the sample has been spectrally analyzed.
Oliveri, Paolo
2017-08-22
Qualitative data modelling is a fundamental branch of pattern recognition, with many applications in analytical chemistry, and embraces two main families: discriminant and class-modelling methods. The first strategy is appropriate when at least two classes are meaningfully defined in the problem under study, while the second strategy is the right choice when the focus is on a single class. For this reason, class-modelling methods are also referred to as one-class classifiers. Although, in the food analytical field, most of the issues would be properly addressed by class-modelling strategies, the use of such techniques is rather limited and, in many cases, discriminant methods are forced onto one-class problems, introducing a bias in the outcomes. Key aspects related to the development, optimisation and validation of suitable class models for the characterisation of food products are critically analysed and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
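The distinction can be made concrete with a small contrast on toy data (the classifiers, parameters and data below are illustrative choices, not methods recommended by the paper): a one-class model fitted only to the target class can reject samples from a class it has never seen, whereas a discriminant model must assign them to one of its training classes.

```python
# Toy contrast between class modelling and discriminant classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
target = rng.normal([0.0, 0.0], 0.5, (100, 2))     # the single class of interest
other = rng.normal([3.0, 3.0], 0.5, (100, 2))      # a second, known class
unseen = rng.normal([-4.0, 4.0], 0.5, (5, 2))      # a class never seen in training

one_class = OneClassSVM(nu=0.05, gamma='scale').fit(target)
print('class model on unseen samples :', one_class.predict(unseen))   # -1 = rejected

lda = LinearDiscriminantAnalysis().fit(np.vstack([target, other]),
                                       np.r_[np.zeros(100), np.ones(100)])
print('discriminant on unseen samples:', lda.predict(unseen))         # forced 0 or 1
```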
PIXE microbeam analysis of the metallic debris release around endosseous implants
NASA Astrophysics Data System (ADS)
Buso, G. P.; Galassini, S.; Moschini, G.; Passi, P.; Zadro, A.; Uzunov, N. M.; Doyle, B. L.; Rossi, P.; Provencio, P.
2005-10-01
The mechanical friction that occurs during the surgical insertion of endosseous implants, both in dentistry and orthopaedics, may cause the detachment of metal debris which is dislodged into the peri-implant tissues and can lead to adverse clinical effects. This phenomenon is more likely to happen with coated or roughened implants, which are the most widely employed. In the present study, dental implant screws made of commercially pure titanium and coated using the titanium plasma-spray (TPS) technique were examined. The implants were inserted in the tibia of rabbits, and removed "en bloc" with the surrounding bone after one month. After proper processing and mounting on plastic holders, samples from the bones were analysed with the EDXRF setup at the National Laboratories of Legnaro, INFN, Italy, and subsequently with the 3 MeV proton microbeam setup at Sandia National Laboratories. Elemental maps were drawn, showing the occasional presence of metal particles in the peri-implant bone.
Rare earth doped M-type hexaferrites; ferromagnetic resonance and magnetization dynamics
NASA Astrophysics Data System (ADS)
Sharma, Vipul; Kumari, Shweta; Kuanr, Bijoy K.
2018-05-01
M-type hexagonal barium ferrites belong to a category of magnetic materials that play a key role in electromagnetic wave propagation in various microwave devices. Due to their large magnetic anisotropy and large magnetization, their operating frequency extends above 50 GHz. Doping is a way to vary their magnetic properties to such an extent that the ferromagnetic resonance (FMR) response can be tuned over a broad frequency band. We have carried out a complete FMR study of hexaferrite nanoparticles (NPs) doped with the rare earth elements neodymium (Nd) and samarium (Sm), with cobalt (Co) as the base dopant. X-ray diffractometry, vibrating sample magnetometry (VSM), and ferromagnetic resonance (FMR) techniques were used to characterize the microstructure and magnetic properties of the doped hexaferrite nanoparticles. Using proper theoretical electromagnetic models, various parameters were extracted from the FMR data which play an important role in designing and fabricating high-frequency microwave devices.
NASA Astrophysics Data System (ADS)
Majerek, Dariusz; Guz, Łukasz; Suchorab, Zbigniew; Łagód, Grzegorz; Sobczuk, Henryk
2017-07-01
Mold that develops on moistened building barriers is a major cause of Sick Building Syndrome (SBS). Fungal contamination is normally evaluated using standard biological methods which are time-consuming and require a lot of manual labor. Fungi emit Volatile Organic Compounds (VOCs) that can be detected in indoor air using several detection techniques, e.g. chromatography. VOCs can also be detected using gas sensor arrays. All array sensors generate particular voltage signals that ought to be analyzed using properly selected statistical methods of interpretation. This work focuses on applying statistical classification models to the signals from a gas sensor matrix in order to analyze air sampled from the headspace of various types of building materials at different levels of contamination, as well as from clean reference materials.
Blumencranz, Peter; Whitworth, Pat W; Deck, Kenneth; Rosenberg, Anne; Reintgen, Douglas; Beitsch, Peter; Chagpar, Anees; Julian, Thomas; Saha, Sukamal; Mamounas, Eleftherios; Giuliano, Armando; Simmons, Rache
2007-10-01
When sentinel node dissection reveals breast cancer metastasis, completion axillary lymph node dissection is ideally performed during the same operation. Intraoperative histologic techniques have low and variable sensitivity. A new intraoperative molecular assay (GeneSearch BLN Assay; Veridex, LLC, Warren, NJ) was evaluated to determine its efficiency in identifying significant sentinel lymph node metastases (>.2 mm). Positive or negative BLN Assay results generated from fresh 2-mm node slabs were compared with results from conventional histologic evaluation of adjacent fixed tissue slabs. In a prospective study of 416 patients at 11 clinical sites, the assay detected 98% of metastases >2 mm and 88% of metastasis greater >.2 mm, results superior to frozen section. Micrometastases were less frequently detected (57%) and assay positive results in nodes found negative by histology were rare (4%). The BLN Assay is properly calibrated for use as a stand alone intraoperative molecular test.
NASA Astrophysics Data System (ADS)
Erich, M.; Kokkoris, M.; Fazinić, S.; Petrović, S.
2018-02-01
This work reports on diamond crystal amorphization induced by 4 MeV carbon ions implanted in a 〈1 0 0〉-oriented crystal, and on its determination by application of the RBS/C and EBS/C techniques. The spectra from the implanted samples were recorded for 1.2, 1.5, 1.75 and 1.9 MeV protons. For the two latter energies, the strong resonance of the nuclear elastic scattering 12C(p,p0)12C at 1.737 MeV was exploited. The backscattering channeling spectra were successfully fitted and the ion-beam-induced crystal amorphization depth profile was determined using a phenomenological approach, which is based on properly defined Gompertz-type dechanneling functions for protons in the 〈1 0 0〉 diamond crystal channels and on the introduction of the concept of ion beam amorphization, implemented through our newly developed computer code CSIM.
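For readers unfamiliar with the functional form, a Gompertz-type dependence of a dechanneled fraction χ on depth z is commonly written as below; the specific parameterization and the meaning attached to the constants are assumptions, not the fitted functions used in CSIM.

```latex
% Generic Gompertz-type form (assumed parameterization, not the CSIM fits):
% a dechanneled fraction \chi(z) rising with depth z towards a saturation A.
\chi(z) = A \exp\!\left[ -b\, e^{-c z} \right], \qquad A,\ b,\ c > 0.
```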
Barile, Claudia; Casavola, Caterina; Pappalettera, Giovanni; Pappalettere, Carmine
2014-01-01
Hole drilling is the most widespread method for measuring residual stress. It is based on the principle that drilling a hole in the material causes a local stress relaxation; the initial residual stress can be calculated by measuring strain in correspondence with each drill depth. Recently optical techniques were introduced to measure strain; in this case, the accuracy of the final results depends, among other factors, on the proper choice of the area of analysis. Deformations are in fact analyzed within an annulus determined by two parameters: the internal and the external radius. In this paper, the influence of the choice of the area of analysis was analysed. A known stress field was introduced on a Ti grade 5 sample and then the stress was measured in correspondence with different values of the internal and the external radius of analysis; results were finally compared with the expected theoretical value.
Primitive Saltmaking and Marine Science Education.
ERIC Educational Resources Information Center
Spence, Lundie; Copeland, B. J.
1985-01-01
Describes the procedures employed to make salt from seawater. Reviews the basic principles of seawater chemistry and discusses the techniques used to measure salinity. Identifies major saltworks locations and indicates the proper conditions needed for solar production of salt. (ML)
The Care and Maintenance of Videodiscs and Players.
ERIC Educational Resources Information Center
Paris, Judith; Boss, Richard W.
1982-01-01
Explores the effects of library use on both capacitance and laser-optical videodisc systems and outlines proper cleaning, servicing, and storage techniques. The article is excerpted from "Conservation in the Library," a book edited by Susan Swartzberg. (Author/JJD)
Code of Federal Regulations, 2011 CFR
2011-07-01
... engine and sampling systems. (7) Sample emissions throughout the duty cycle. (8) Record post-test data. (9) Perform post-test procedures to verify proper operation of certain equipment and analyzers. (10... PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.501 Overview. (a) Use the procedures...
Code of Federal Regulations, 2010 CFR
2010-07-01
... engine and sampling systems. (7) Sample emissions throughout the duty cycle. (8) Record post-test data. (9) Perform post-test procedures to verify proper operation of certain equipment and analyzers. (10... PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.501 Overview. (a) Use the procedures...
Code of Federal Regulations, 2014 CFR
2014-07-01
... engine and sampling systems. (7) Sample emissions throughout the duty cycle. (8) Record post-test data. (9) Perform post-test procedures to verify proper operation of certain equipment and analyzers. (10... PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.501 Overview. (a) Use the procedures...
Code of Federal Regulations, 2013 CFR
2013-07-01
... engine and sampling systems. (7) Sample emissions throughout the duty cycle. (8) Record post-test data. (9) Perform post-test procedures to verify proper operation of certain equipment and analyzers. (10... PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.501 Overview. (a) Use the procedures...
Code of Federal Regulations, 2012 CFR
2012-07-01
... engine and sampling systems. (7) Sample emissions throughout the duty cycle. (8) Record post-test data. (9) Perform post-test procedures to verify proper operation of certain equipment and analyzers. (10... PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.501 Overview. (a) Use the procedures...
Groundwater sampling: Chapter 5
Wang, Qingren; Munoz-Carpena, Rafael; Foster, Adam; Migliaccio, Kati W.; Li, Yuncong; Migliaccio, Kati
2011-01-01
Discussing an array of water quality topics, from water quality regulations and criteria, to project planning and sampling activities, this book outlines a framework for improving water quality programs. Using this framework, you can easily put the proper training and tools in place for better management of water resources.
Dash, Hirak Ranjan; Das, Surajit
2018-02-01
Forensic biology is a sub-discipline of biological science, combined with other branches of science, used in the criminal justice system. Any nucleated cell or tissue harbouring DNA, either live or dead, can be used as a forensic exhibit and a source of investigation through DNA typing. These biological materials of human origin are rich sources of proteins, carbohydrates, lipids, trace elements and water and thus provide a favourable milieu for the growth of microbes. Persistent microbial growth augments the degradation process and is amplified with the passage of time and improper storage of the biological materials. Degradation of these biological materials poses a huge challenge in the downstream processes of forensic DNA typing techniques, such as short tandem repeat (STR) DNA typing. Microbial degradation yields improper or no PCR amplification, heterozygous peak imbalance, DNA contamination from non-human sources, degradation of DNA by microbial by-products, etc. Consequently, even the most precise STR DNA typing technique is nullified and a definite opinion can hardly be given for degraded forensic exhibits. Thus, suitable precautionary measures should be taken for proper storage and processing of biological exhibits to minimize their decay by micro-organisms.
Xiao, Xiang; Zhu, Hao; Liu, Wei-Jie; Yu, Xiao-Ting; Duan, Lian; Li, Zheng; Zhu, Chao-Zhe
2017-01-01
The International 10/20 system is an important head-surface-based positioning system for transcranial brain mapping techniques, e.g., fNIRS and TMS. As guidance for probe placement, the 10/20 system permits both proper ROI coverage and spatial consistency among multiple subjects and experiments in an MRI-free context. However, the traditional manual approach to the identification of 10/20 landmarks faces problems in reliability and time cost. In this study, we propose a semi-automatic method to address these problems. First, a novel head surface reconstruction algorithm reconstructs the head geometry from a set of points sampled uniformly and sparsely on the subject's head. Second, virtual 10/20 landmarks are determined on the reconstructed head surface in computational space. Finally, a visually guided real-time navigation system guides the experimenter to each of the identified 10/20 landmarks on the physical head of the subject. Compared with the traditional manual approach, our proposed method provides a significant improvement in both reliability and time cost and thus could contribute to improving both the effectiveness and efficiency of 10/20-guided MRI-free probe placement.
Nanoscale Chemical Imaging of Zeolites Using Atom Probe Tomography.
Weckhuysen, Bert Marc; Schmidt, Joel; Peng, Linqing; Poplawsky, Jonathan
2018-05-02
Understanding structure-composition-property relationships in zeolite-based materials is critical to engineering improved solid catalysts. However, this can be difficult to realize, as even single zeolite crystals can exhibit heterogeneities spanning several orders of magnitude, with consequences for, e.g., reactivity, diffusion and stability. Great progress has been made in characterizing these porous solids using tomographic techniques, though each method has an ultimate spatial resolution limitation. Atom Probe Tomography (APT) is the only technique so far capable of producing 3-D compositional reconstructions with sub-nm-scale resolution, and has only recently been applied to zeolite-based catalysts. Herein, we discuss the use of APT to study zeolites, including the critical aspects of sample preparation, data collection, assignment of mass spectral peaks (such as the predominant CO peak), the limitations of spatial resolution for the recovery of crystallographic information, and proper data analysis. All sections are illustrated with examples from recent literature, as well as previously unpublished data and analyses, to demonstrate practical strategies for overcoming potential pitfalls in applying APT to zeolites, thereby highlighting new insights gained from the APT method. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Suppressing Anomalous Localized Waffle Behavior in Least Squares Wavefront Reconstructors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavel, D
2002-10-08
A major difficulty with wavefront slope sensors is their insensitivity to certain phase aberration patterns, the classic example being the waffle pattern in the Fried sampling geometry. As the number of degrees of freedom in AO systems grows larger, the possibility of troublesome waffle-like behavior over localized portions of the aperture is becoming evident. Reconstructor matrices have associated with them, either explicitly or implicitly, an orthogonal mode space over which they operate, called the singular mode space. If not properly preconditioned, the reconstructor's mode set can consist almost entirely of modes that each have some localized waffle-like behavior. In this paper we analyze the behavior of least-squares reconstructors with regard to their mode spaces. We introduce a new technique that is successful in producing a mode space that segregates the waffle-like behavior into a few "high order" modes, which can then be projected out of the reconstructor matrix. This technique can be adapted so as to remove any specific modes that are undesirable in the final reconstructor (such as piston, tip, and tilt for example) as well as suppress (the more nebulously defined) localized waffle behavior.
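The generic idea of removing unwanted modes from a least-squares reconstructor can be sketched with a singular value decomposition. This is a minimal illustration under assumed placeholder data, not the preconditioning technique introduced in the paper: it simply drops weakly sensed singular modes from the pseudo-inverse so they are never reconstructed.

```python
# Minimal sketch (not the paper's preconditioning technique): build a least-squares
# wavefront reconstructor from a slope-influence ("poke") matrix A, then drop the
# weakly sensed singular modes (waffle-like behavior) before inversion.
import numpy as np

rng = np.random.default_rng(0)
n_slopes, n_actuators = 128, 64
A = rng.standard_normal((n_slopes, n_actuators))   # placeholder poke matrix: slopes = A @ phase

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only well-sensed modes; discard modes whose singular value is tiny
# relative to the largest one.
keep = s > 1e-3 * s[0]
s_inv = np.where(keep, 1.0 / s, 0.0)

# Truncated pseudo-inverse: badly sensed (waffle-like) modes are never reconstructed.
R = Vt.T @ np.diag(s_inv) @ U.T                    # reconstructor: phase_hat = R @ slopes

phase_true = rng.standard_normal(n_actuators)
slopes = A @ phase_true
phase_hat = R @ slopes
print("residual RMS:", np.sqrt(np.mean((phase_hat - phase_true) ** 2)))
```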
Applications of mass spectrometry techniques to autoclave curing of materials
NASA Technical Reports Server (NTRS)
Smith, A. C.
1983-01-01
Mass spectrometer analysis of gases evolved from polymer materials during a cure cycle can provide a wealth of information useful for studying cure properties and procedures. In this paper data is presented for two materials to support the feasibility of using mass spectrometer gas analysis techniques to enhance the knowledge of autoclave curing of composite materials and provide additional information for process control evaluation. It is expected that this technique will also be useful in working out the details involved in determining the proper cure cycle for new or experimental materials.
Characteristic-eddy decomposition of turbulence in a channel
NASA Technical Reports Server (NTRS)
Moin, Parviz; Moser, Robert D.
1989-01-01
Lumley's proper orthogonal decomposition technique is applied to the turbulent flow in a channel. Coherent structures are extracted by decomposing the velocity field into characteristic eddies with random coefficients. A generalization of the shot-noise expansion is used to determine the characteristic eddies in homogeneous spatial directions. Three different techniques are used to determine the phases of the Fourier coefficients in the expansion: (1) one based on the bispectrum, (2) a spatial compactness requirement, and (3) a functional continuity argument. Similar results are found from each of these techniques.
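The basic computation behind a proper orthogonal decomposition can be sketched with the snapshot method. The example below uses synthetic one-dimensional data, not the channel-flow database, and does not implement the shot-noise expansion or the phase-recovery techniques discussed in the abstract.

```python
# Generic POD sketch via the snapshot method (synthetic data, not the channel-flow
# database or the shot-noise expansion discussed in the paper).
import numpy as np

rng = np.random.default_rng(1)
n_points, n_snapshots = 500, 80
x = np.linspace(0, 2 * np.pi, n_points)
# Synthetic velocity snapshots: two coherent "eddies" plus noise.
snapshots = (np.outer(np.sin(x), rng.standard_normal(n_snapshots))
             + 0.5 * np.outer(np.sin(2 * x), rng.standard_normal(n_snapshots))
             + 0.05 * rng.standard_normal((n_points, n_snapshots)))

mean_field = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_field                     # POD acts on the fluctuating field

# Columns of U are the POD (characteristic-eddy) modes; s**2 is proportional to modal energy.
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)
print("energy captured by first 3 modes:", energy[:3].round(3))
```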
NASA Technical Reports Server (NTRS)
Ukeiley, L.; Varghese, M.; Glauser, M.; Valentine, D.
1991-01-01
A 'lobed mixer' device that enhances mixing through secondary flows and streamwise vorticity is presently studied within the framework of multifractal-measures theory, in order to deepen understanding of velocity time trace data gathered on its operation. Proper orthogonal decomposition-based knowledge of coherent structures has been applied to obtain the generalized fractal dimensions and multifractal spectrum of several proper eigenmodes for data samples of the velocity time traces; this constitutes a marked departure from previous multifractal theory applications to self-similar cascades. In certain cases, a single dimension may suffice to capture the entire spectrum of scaling exponents for the velocity time trace.
Recommendations for research design of telehealth studies.
Chumbler, Neale R; Kobb, Rita; Brennan, David M; Rabinowitz, Terry
2008-11-01
Properly designed randomized controlled trials (RCTs) are the gold standard to use when examining the effectiveness of telehealth interventions on clinical outcomes. Some published telehealth studies have employed well-designed RCTs. However, such methods are not always feasible and practical in particular settings. This white paper addresses not only the need for properly designed RCTs, but also offers alternative research designs, such as quasi-experimental designs, and statistical techniques that can be employed to rigorously assess the effectiveness of telehealth studies. This paper further offers design and measurement recommendations aimed at and relevant to administrative decision-makers, policymakers, and practicing clinicians.
Maryam, Ehsani; Farida, Abesi; Farhad, Akbarzade; Soraya, Khafri
2013-11-01
Obtaining the proper working length in endodontic treatment is essential. The aim of this study was to compare the working length (WL) assessment of small diameter K-files using two different digital imaging methods. The samples for this in-vitro experimental study consisted of 40 extracted single-rooted premolars. After access cavity preparation, ISO no. 6, 8, and 10 stainless steel K-files were inserted in the canals at three different lengths, evaluated in a blinded manner: at the level of the apical foramen (actual), 1 mm short of the apical foramen, and 2 mm short of the apical foramen. A digital caliper was used to measure the length of the files, which was considered the Gold Standard. Five observers (two oral and maxillofacial radiologists and three endodontists) observed the digital radiographs, which were obtained using PSP and CCD digital imaging sensors. The collected data were analyzed by SPSS 17 and Repeated Measures Paired T-test. In WL assessment of small diameter K-files, a statistically significant relationship was seen among the observers of the two digital imaging techniques (P<0.001). However, no significant difference was observed between the two digital techniques in WL assessment of small diameter K-files (P<0.05). PSP and CCD digital imaging techniques were similar in WL assessment of canals using no. 6, 8, and 10 K-files.
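A paired comparison of this kind (each imaging method measured against a caliper gold standard on the same files) can be set up as in the short sketch below; the numbers are hypothetical placeholders, not the study's measurements.

```python
# Minimal sketch of a paired comparison between two imaging methods and a caliper
# gold standard (placeholder numbers, not the study's data).
import numpy as np
from scipy import stats

gold = np.array([20.1, 19.5, 21.0, 18.7, 20.4, 19.9])   # caliper lengths (mm), hypothetical
psp  = np.array([20.0, 19.4, 20.8, 18.9, 20.3, 19.7])   # PSP readings, hypothetical
ccd  = np.array([19.9, 19.6, 20.9, 18.6, 20.5, 19.8])   # CCD readings, hypothetical

# Bias of each technique relative to the gold standard
print("PSP vs gold:", stats.ttest_rel(psp, gold))
print("CCD vs gold:", stats.ttest_rel(ccd, gold))
# Direct head-to-head comparison of the two digital techniques
print("PSP vs CCD:", stats.ttest_rel(psp, ccd))
```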
Monitoring and understanding crustal deformation by means of GPS and InSAR data
NASA Astrophysics Data System (ADS)
Zerbini, Susanna; Prati, Claudio; Bruni, Sara; Errico, Maddalena; Musicò, Elvira; Novali, Fabrizio; Santi, Efisio
2014-05-01
Monitoring deformation of the Earth's crust by using data acquired by both the GNSS and SAR techniques allows describing crustal movements with high spatial and temporal resolution. This is a key contribution to achieving deeper insight into geodynamic processes. Combination of the two techniques provides a very powerful means; however, before combining the different data sets it is important to properly understand their respective contributions. For this purpose, strictly simultaneous and long time series would be necessary. This is not, in general, a common case due to the relatively long SAR satellite revisit time. A positive exception is represented by the data set of COSMO SKYMed (CSK) images made available for this study by the Italian Space Agency (ASI). The flyover area encompasses the city of Bologna and the smaller nearby town of Medicina, where permanent GPS stations are operational. At the times of the CSK flyovers, we compared the GPS and SAR Up and East coordinates of a few stations, as well as differential tropospheric delays derived by both techniques. The GPS time series were carefully screened and corrected for the presence of discontinuities by adopting a dedicated statistical procedure. The comparisons of both the estimated deformation and the tropospheric delays are encouraging and highlight the need for a more evenly sampled SAR data set.
NASA Astrophysics Data System (ADS)
Chockalingam, Letchumanan
2005-01-01
The data of the Gunung Ledang region of Malaysia acquired through LANDSAT are considered for mapping certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated and discussed from the standpoint of their validity in properly isolating features of hydrogeological interest. Because these techniques exploit the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied to provide comparisons between the results derived from spectral-based and structure-based filtering techniques.
RADIOANALYTICAL AND MIXED WASTE ANALYTICAL SUPPORT FOR STATES, REGIONS, AND OTHER FEDERAL AGENCIES
Provide technical advice and support to Regions and other Federal Agencies on types of analyses, proper sampling, preservation, shipping procedures, and detection limits for samples for radionuclides and stable metals. Provide in-house data review and validation to ensure the qua...
NASA Astrophysics Data System (ADS)
Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher2, M.; Lerner, D.
2006-12-01
Characterization of uncertainty associated with groundwater quality models is often of critical importance, as for example in cases where environmental models are employed in risk assessment. Insufficient data, inherent variability and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving parametric variability and uncertainty of different natures. General MCS, or variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp values, with vagueness assumed to be randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need for assessment of the plausible range of model outputs. An improved systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study aims to introduce Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness and statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness, with confidence intervals given by α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled, with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results will produce indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing uncertainty propagation of parameter values for estimation of the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
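The probabilistic (noncognitive) layer of this approach rests on ordinary Latin Hypercube Sampling, which the short sketch below illustrates; the fuzzy α-cut layer described by the authors is not reproduced, and the two parameter distributions are hypothetical examples.

```python
# Minimal sketch of standard Latin Hypercube Sampling for the probabilistic
# (noncognitive) parameters only; the fuzzy alpha-cut layer described by the
# authors is not reproduced here.
import numpy as np
from scipy import stats

def latin_hypercube(n_samples, n_dims, rng=None):
    """Return an (n_samples, n_dims) array of LHS points on the unit hypercube."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=(n_samples, n_dims))                 # jitter within each stratum
    cut = (np.arange(n_samples)[:, None] + u) / n_samples     # one sample per stratum per column
    for j in range(n_dims):                                   # decorrelate the dimensions
        cut[:, j] = rng.permutation(cut[:, j])
    return cut

# Map unit-cube samples to physical parameters via the inverse CDF
# (both parameter distributions below are hypothetical).
lhs = latin_hypercube(100, 2, rng=42)
hydraulic_k = stats.lognorm(s=0.5, scale=1e-5).ppf(lhs[:, 0])   # hypothetical hydraulic conductivity
porosity    = stats.uniform(0.2, 0.2).ppf(lhs[:, 1])            # hypothetical porosity in [0.2, 0.4]
print(hydraulic_k[:3], porosity[:3])
```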
Immune system changes during simulated planetary exploration on Devon Island, high arctic
Crucian, Brian; Lee, Pascal; Stowe, Raymond; Jones, Jeff; Effenhauser, Rainer; Widen, Raymond; Sams, Clarence
2007-01-01
Background: Dysregulation of the immune system has been shown to occur during spaceflight, although the detailed nature of the phenomenon and the clinical risks for exploration class missions have yet to be established. Also, the growing clinical significance of immune system evaluation combined with epidemic infectious disease rates in third world countries provides a strong rationale for the development of field-compatible clinical immunology techniques and equipment. In July 2002 NASA performed a comprehensive immune assessment on field team members participating in the Haughton-Mars Project (HMP) on Devon Island in the high Canadian Arctic. The purpose of the study was to evaluate the effect of mission-associated stressors on the human immune system. To perform the study, the development of techniques for processing immune samples in remote field locations was required. Ten HMP-2002 participants volunteered for the study. A field protocol was developed at NASA-JSC for performing sample collection, blood staining/processing for immunophenotype analysis, whole-blood mitogenic culture for functional assessments and cell-sample preservation on-location at Devon Island. Specific assays included peripheral leukocyte distribution, constitutively activated T cells, intracellular cytokine profiles, plasma cortisol and EBV viral antibody levels. Study timepoints were 30 days prior to mission start, mid-mission and 60 days after mission completion. Results: The protocol developed for immune sample processing in remote field locations functioned properly. Samples were processed on Devon Island, and stabilized for subsequent analysis at the Johnson Space Center in Houston. The data indicated that some phenotype, immune function and stress hormone changes occurred in the HMP field participants that were largely distinct from pre-mission baseline and post-mission recovery data. These immune changes appear similar to those observed in astronauts following spaceflight. Conclusion: The immune system changes described during the HMP field deployment validate the use of the HMP as a ground-based spaceflight/planetary exploration analog for some aspects of human physiology. The sample processing protocol developed for this study may have applications for immune studies in remote terrestrial field locations. Elements of this protocol could possibly be adapted for future in-flight immunology studies conducted during space missions. PMID:17521440
Mahon, Michael B; Campbell, Kaitlin U; Crist, Thomas O
2017-06-01
Selection of proper sampling methods for measuring a community of interest is essential whether the study goals are to conduct a species inventory, environmental monitoring, or a manipulative experiment. Insect diversity studies often employ multiple collection methods at the expense of researcher time and funding. Ants (Formicidae) are widely used in environmental monitoring owing to their sensitivity to ecosystem changes. When sampling ant communities, two passive techniques are recommended in combination: pitfall traps and Winkler litter extraction. These recommendations are often based on studies from highly diverse tropical regions or when a species inventory is the goal. Studies in temperate regions often focus on measuring consistent community response along gradients of disturbance or among management regimes; therefore, multiple sampling methods may be unnecessary. We compared the effectiveness of pitfalls and Winkler litter extraction in an eastern temperate forest for measuring ant species richness, composition, and occurrence of ant functional groups in response to experimental manipulations of two key forest ecosystem drivers, white-tailed deer and an invasive shrub (Amur honeysuckle). We found no significant effect of sampling method on the outcome of the ecological experiment; however, we found differences between the two sampling methods in the resulting ant species richness and functional group occurrence. Litter samples approximated the overall combined species richness and composition, but pitfalls were better at sampling large-bodied (Camponotus) species. We conclude that employing both methods is essential only for species inventories or monitoring ants in the Cold-climate Specialists functional group. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Isotope Ratios Reveal Trickery in the Produce Aisle
ERIC Educational Resources Information Center
Journal of Chemical Education, 2007
2007-01-01
A new technique for the proper verification of organic food items, and for identifying those that should be barred from sale as organic, is proposed. Analysis of the nitrogen isotope ratio present in the food is found to be a perfect standard for checking whether food products are genuinely organic.
FIELD MEASUREMENT OF DISSOLVED OXYGEN: A COMPARISON OF TECHNIQUES
The measurement and interpretation of geochemical redox parameters are key components of ground water remedial investigations. Dissolved oxygen (DO) is perhaps the most robust geochemical parameter in redox characterization; however, recent work has indicated a need for proper da...
Standardization of a traditional polyherbo-mineral formulation - Brahmi vati.
Mishra, Amrita; Mishra, Arun K; Ghosh, Ashoke K; Jha, Shivesh
2013-01-01
The present study deals with the standardization of an in-house standard preparation and three marketed samples of Brahmi vati, a traditional medicine known to be effective in mental disorders, convulsions, weak memory, high fever and hysteria. Preparation and standardization have been done by following modern scientific quality control procedures for the raw material and the finished products. Scanning electron microscopic (SEM) analysis showed the reduction of metals and minerals (particle size range 2-5 µm), which indicates the proper preparation of bhasmas, the important ingredient of Brahmi vati. Findings of EDX analysis of all samples of Brahmi vati suggested the absence of gold, an important constituent of Brahmi vati, in two marketed samples. All the samples of Brahmi vati were subjected to quantitative estimation of Bacoside A (marker compound) by the HPTLC technique. Extraction of the samples was done in methanol and the chromatograms were developed in butanol:glacial acetic acid:water (4.5:0.5:5 v/v) and detected at 225 nm. The regression analysis of calibration plots of Bacoside A exhibited a linear relationship in the concentration range of 50-300 ng, while the percentage recovery was found to be 96.06% w/w, thus proving the accuracy and precision of the analysis. The Bacoside A content in the in-house preparation was found to be higher than that of the commercial samples. The proposed HPTLC method was found to be rapid, simple and accurate for quantitative estimation of Bacoside A in different formulations. The results of this study could be used as model data in the standardization of Brahmi vati.
NASA Astrophysics Data System (ADS)
Cizdziel, James V.; Tolbert, Candice; Brown, Garry
2010-02-01
A Direct Mercury Analyzer (DMA) based on sample combustion, concentration of mercury by amalgamation with gold, and cold vapor atomic absorption spectrometry (CVAAS) was coupled to a mercury-specific cold vapor atomic fluorescence spectrometer (CVAFS). The purpose was to evaluate combustion-AFS, a technique which is not commercially available, for low-level analysis of mercury in environmental and biological samples. The experimental setup allowed for comparison of dual measurements of mercury (AAS followed by AFS) for a single combustion event. The AFS instrument control program was modified to properly time capture of mercury from the DMA, avoiding deleterious combustion products from reaching its gold traps. Calibration was carried out using both aqueous solutions and solid reference materials. The absolute detection limits for mercury were 0.002 ng for AFS and 0.016 ng for AAS. Recoveries for reference materials ranged from 89% to 111%, and the precision was generally found to be <10% relative standard deviation (RSD). The two methods produced similar results for samples of hair, finger nails, coal, soil, leaves and food stuffs. However, for samples with mercury near the AAS detection limit (e.g., filter paper spotted with whole blood and segments of tree rings) the signal was still quantifiable with AFS, demonstrating the lower detection limit and greater sensitivity of AFS. This study shows that combustion-AFS is feasible for the direct analysis of low levels of mercury in solid samples that would otherwise require time-consuming and contamination-prone digestion.
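As a generic illustration of how absolute detection limits such as these are commonly estimated, the sketch below applies the 3σ-of-the-blank criterion to a linear calibration; the blank readings and calibration points are hypothetical placeholders, not the paper's data.

```python
# Generic 3-sigma detection limit estimate from blank replicates and a calibration
# slope (placeholder numbers, not the paper's calibration data).
import numpy as np

blank_signals = np.array([0.8, 1.1, 0.9, 1.0, 0.7, 1.2])   # hypothetical blank readings (a.u.)
cal_mass_ng   = np.array([0.0, 0.1, 0.2, 0.5, 1.0])        # hypothetical Hg masses (ng)
cal_signal    = np.array([1.0, 55.0, 108.0, 270.0, 541.0]) # hypothetical instrument responses (a.u.)

slope, intercept = np.polyfit(cal_mass_ng, cal_signal, 1)  # linear calibration
lod_ng = 3.0 * blank_signals.std(ddof=1) / slope           # 3 * sigma_blank / sensitivity
print(f"estimated absolute LOD: {lod_ng:.4f} ng")
```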
Code of Federal Regulations, 2013 CFR
2013-10-01
... samples are analyzed directly by high performance liquid chromatography (HPLC). Detection limits: 0.01% by... proper selection of HPLC parameters. 2.4. Samples must be free of any particulates that may clog the... clarification kit. 3. Apparatus 3.1. Liquid chromatograph equipped with a UV detector. 3.2. HPLC Column that...
Code of Federal Regulations, 2012 CFR
2012-10-01
... samples are analyzed directly by high performance liquid chromatography (HPLC). Detection limits: 0.01% by... proper selection of HPLC parameters. 2.4. Samples must be free of any particulates that may clog the... clarification kit. 3. Apparatus 3.1. Liquid chromatograph equipped with a UV detector. 3.2. HPLC Column that...
Code of Federal Regulations, 2014 CFR
2014-10-01
... samples are analyzed directly by high performance liquid chromatography (HPLC). Detection limits: 0.01% by... proper selection of HPLC parameters. 2.4. Samples must be free of any particulates that may clog the... clarification kit. 3. Apparatus 3.1. Liquid chromatograph equipped with a UV detector. 3.2. HPLC Column that...
Viegas, Carla; Faria, Tiago; Pacífico, Cátia; Dos Santos, Mateus; Monteiro, Ana; Lança, Carla; Carolino, Elisabete; Viegas, Susana; Cabo Verde, Sandra
2017-01-01
The aim of this work was to assess the microbiota (fungi and bacteria) and particulate matter in optical shops, contributing to a specific protocol to ensure a proper assessment. Air samples were collected through an impaction method. Surface and equipment swab samples were also collected side-by-side. Measurements of particulate matter were performed using portable direct-reading equipment. A walkthrough survey and checklist were also applied in each shop. Regarding air sampling, eight of the 13 shops analysed were above the legal requirement, and 10 of the 26 surface samples were overloaded. In three of the 13 shops, no fungal contamination was detected on the analysed equipment. The bacterial air load was above the threshold in one of the 13 analysed shops; however, bacterial counts were detected in all sampled equipment. The airborne fungal and bacterial loads appeared to influence all of the other surface and equipment samples. These results reinforce the need to improve air quality, not only to comply with the legal requirements, but also to ensure proper hygienic conditions. Public health intervention is needed to assure the quality and safety of the rooms and equipment in optical shops that perform health interventions on patients. PMID:28505144
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.; Abbott, Kathy H.
1987-01-01
To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.
Application of Raman Spectroscopy and Infrared Spectroscopy in the Identification of Breast Cancer.
Depciuch, Joanna; Kaznowska, Ewa; Zawlik, Izabela; Wojnarowska, Renata; Cholewa, Marian; Heraud, Philip; Cebulski, Józef
2016-02-01
Raman spectroscopy and infrared (IR) spectroscopy are both techniques that probe molecular vibrations. These techniques not only provide information about chemical species through the identification of functional groups and spectral analysis of so-called "fingerprints"; they also allow for the qualitative and quantitative analysis of chemical substances in a sample. Both of these spectral techniques are frequently used in biology and medicine for diagnosing illnesses and monitoring therapy. Breast cancer in women is often a malignant tumor, accounting for 1.38 million new cases and 458 000 deaths worldwide in 2013. The most important risk factors for breast cancer development are: sex, age, family history, specific benign breast conditions, ionizing radiation, and lifestyle. The main purpose of breast cancer screening tests is to establish an early diagnosis and to apply proper treatment. Diagnoses of breast cancer are based on: (1) physical techniques (e.g., ultrasonography, mammography, elastography, magnetic resonance, positron emission tomography [PET]); (2) histopathological techniques; (3) biological techniques; and (4) optical techniques (e.g., photoacoustic imaging, fluorescence tomography). However, none of these techniques provides unique or especially revealing answers. The aim of our study is comparative spectroscopic measurements on patients with the following: normal non-cancerous breast tissue; breast cancer tissues before chemotherapy; breast cancer tissues after chemotherapy; and normal breast tissues taken from around the cancerous breast region. Spectra collected from breast cancer patients show changes in the amounts of carotenoids and fats. We also observed changes in carbohydrate and protein levels (e.g., lack of amino acids, changes in the concentration of amino acids, structural changes) in comparison with normal breast tissues. These findings confirm that Raman spectroscopy and IR spectroscopy are very useful diagnostic tools that will shed new light on the etiology of breast cancer. © The Author(s) 2016.
Handbook of Forecasting Techniques. Part 2. Description of 31 Techniques
1977-08-01
...a discipline, or some other coherent group. Panels have often produced good results, but care must be taken to avoid bandwagon effects, blockage of... A "bandwagon" effect often occurs in panels, so that one person's viewpoint overwhelms the opinions of others and/or plausible alternatives never get proper... as an ancient one, however. Since Newton, the western world has increasingly acknowledged the universality of cause-effect explanations, with cause...
Low cost silicon-on-ceramic photovoltaic solar cells
NASA Technical Reports Server (NTRS)
Koepke, B. G.; Heaps, J. D.; Grung, B. L.; Zook, J. D.; Sibold, J. D.; Leipold, M. H.
1980-01-01
A technique has been developed for coating low-cost mullite-based refractory substrates with thin layers of solar cell quality silicon. The technique involves first carbonizing one surface of the ceramic and then contacting it with molten silicon. The silicon wets the carbonized surface and, under the proper thermal conditions, solidifies as a large-grained sheet. Solar cells produced from this composite silicon-on-ceramic material have exhibited total area conversion efficiencies of ten percent.
NASA Astrophysics Data System (ADS)
Khare, P.; Marcotte, A.; Sheu, R.; Ditto, J.; Gentner, D. R.
2017-12-01
Intermediate- and semi-volatile organic compounds (IVOCs and SVOCs) have high secondary organic aerosol (SOA) yields, as well as significant ozone formation potentials. Yet, their emission sources and oxidation pathways remain largely understudied due to limitations in current analytical capabilities. Online mass spectrometers are able to collect real-time data, but their limited mass resolving power renders molecular-level characterization of IVOCs and SVOCs from the unresolved complex mixture unfeasible. With proper sampling techniques and powerful analytical instrumentation, our offline tandem mass spectrometry (i.e. MS×MS) techniques provide molecular-level and structural identification over wide polarity and volatility ranges. We have designed a novel analytical system for offline analysis of gas-phase SOA precursors collected on custom-made multi-bed adsorbent tubes. Samples are desorbed into helium via a gradual temperature ramp and sample flow is split equally for direct-MS×MS analysis and separation via gas chromatography (GC). The effluent from GC separation is split again for analysis via atmospheric pressure chemical ionization quadrupole time-of-flight mass spectrometry (APCI-Q×TOF) and traditional electron ionization mass spectrometry (EI-MS). The compounds for direct-MS×MS analysis are delivered via a transfer line maintained at 70 °C directly to the APCI-Q×TOF, thus preserving the molecular integrity of thermally-labile, or other highly-reactive, organic compounds. Both our GC-MS×MS and direct-MS×MS analyses report high accuracy parent ion masses as well as information on molecular structure via MS×MS, which together increase the resolution of unidentified complex mixtures. We demonstrate instrument performance and present preliminary results from urban atmospheric samples collected from New York City with a wide range of compounds, including highly-functionalized organic compounds previously understudied in outdoor air. Our work offers new insights into emerging emission sources in urban environments that can have a major impact on public health and also improves understanding of anthropogenic SOA precursor emissions.
ON THE CONNECTION OF THE APPARENT PROPER MOTION AND THE VLBI STRUCTURE OF COMPACT RADIO SOURCES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moor, A.; Frey, S.; Lambert, S. B.
2011-06-15
Many of the compact extragalactic radio sources that are used as fiducial points to define the celestial reference frame are known to have proper motions detectable with long-term geodetic/astrometric very long baseline interferometry (VLBI) measurements. These changes can be as high as several hundred microarcseconds per year for certain objects. When imaged with VLBI at milliarcsecond (mas) angular resolution, these sources (radio-loud active galactic nuclei) typically show structures dominated by a compact, often unresolved "core" and a one-sided "jet". The positional instability of compact radio sources is believed to be connected with changes in their brightness distribution structure. For the first time, we test this assumption in a statistical sense on a large sample rather than on only individual objects. We investigate a sample of 62 radio sources for which reliable long-term time series of astrometric positions as well as detailed 8 GHz VLBI brightness distribution models are available. We compare the characteristic direction of their extended jet structure and the direction of their apparent proper motion. We present our data and analysis method, and conclude that there is indeed a correlation between the two characteristic directions. However, there are cases where the ~1-10 mas scale VLBI jet directions are significantly misaligned with respect to the apparent proper motion direction.
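The core quantity in such a comparison is the misalignment between two position angles. The short sketch below is a generic illustration with hypothetical values, not the paper's 62-source data set or its statistical analysis.

```python
# Minimal sketch of the comparison described above: the misalignment between the
# VLBI jet position angle and the apparent proper-motion direction
# (hypothetical values, not the 62-source sample).
import numpy as np

def misalignment_deg(pa_jet_deg, pa_pm_deg):
    """Smallest angle between two directed position angles, in [0, 180] degrees."""
    d = np.abs(np.asarray(pa_jet_deg) - np.asarray(pa_pm_deg)) % 360.0
    return np.where(d > 180.0, 360.0 - d, d)

pa_jet = np.array([10.0, 250.0, 95.0])    # hypothetical jet position angles (deg E of N)
pa_pm  = np.array([15.0, 80.0, 100.0])    # hypothetical proper-motion position angles (deg)

# A pile-up of misalignments near 0 or 180 degrees would indicate motion along the jet axis.
print(misalignment_deg(pa_jet, pa_pm))
```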
Likhitkar, Manoj S; Kulkarni, Shantaram V; Burande, Aravind; Solanke, Vishal; Kumar, C Sushil; Kamble, Babasaheb
2016-01-01
The success of root canal treatment depends on proper debridement, instrumentation, proper accessibility, and proper restoration. The presence of a smear layer is considered to be a significant factor. This in vitro study was conducted to assess the effect of the presence/absence of a smear layer on the microleakage of root canal filled teeth using different instruments and obturation methods. One hundred extracted mandibular premolars with closed apices and single roots were chosen and then divided into six groups, A to F, consisting of 15 teeth each. The control group included 10 teeth; 5 positive and 5 negative. The teeth were decoronated at the cementoenamel junction. Groups A, B, C, and D were instrumented with engine-driven rotary Protaper NiTi files. Groups E and F were instrumented with conventional stainless steel hand files. Groups A, C, and E were flushed with 3 ml of 17% EDTA to remove the smear layer prior to obturation. All teeth were flushed with 5.25% sodium hypochlorite solution and obturated with AH-Plus sealer with lateral condensation technique for Groups C, D, E, F and with thermoplasticized gutta-percha technique for Groups A and B. Using an electrochemical technique, leakages in the obturated canals were assessed for 45 days. The results were tabulated using Student's t-test (paired and unpaired t-test) with the Statistical Package for the Social Sciences Software Version 21 (IBM Company, New York, USA). Group A showed the lowest mean value at intervals of 10, 20, 30, and 45 days. There was no current flow in the negative controls during the test period. There was leakage in the positive controls within a few minutes of immersion. The results showed that rotary instrumentation contributed toward an exceptional preparation of root canals compared to hand instrumentation. Elimination of the smear layer enhanced the resistance to microleakage; thermoplasticized gutta-percha obturation technique produced a better seal compared to the lateral condensation technique.
Addo, Henry O; Dun-Dery, Elvis J; Afoakwa, Eugenia; Elizabeth, Addai; Ellen, Amposah; Rebecca, Mwinfaug
2017-07-03
Domestic waste generation has contributed significantly to hampering national waste management efforts. It poses a serious threat to national development and requires proper treatment and management within and outside households. The problem of improper waste management has always been a challenge in Ghana, compelling several national surveys to report on the practice of waste management. However, little is known about how much waste is generated and managed within households, and there is a serious dearth of information for national policy and planning. This paper seeks to document the handling and practice of waste management, including collection, storage, transportation and disposal, along with the types and amount of waste generated by households and the related health outcomes. The study was a descriptive cross-sectional study and used a multi-stage sampling technique to sample 700 households. The study was planned and implemented from January to May 2015 and involved the use of structured questionnaires for data collection over the period. Factors such as demographic characteristics, amount of waste generated, types of waste bins used within households, waste recycling, cost of disposing waste, and distance to dumpsite were all assessed. The paper shows that each surveyed household generated 0.002 t of waste per day, of which 29% was both organic and inorganic. Though more than half of the respondents (53.6%) had a positive attitude towards waste management, only 29.1% practiced waste management. The study reveals that there is no proper management of domestic waste except in the few households that segregate waste. The study identified several elements as determinants of waste management practice. Female respondents were less likely to practice waste management (AOR 0.45; 95% CI 0.29, 0.79), and household size also determined respondents' practice (AOR 0.26; CI 0.09, 0.77). Practice of recycling (AOR 0.03; CI 0.02, 0.08) and distance to dumpsite (AOR 0.45; CI 0.20, 0.99) were also significant predictors of waste management practice. Cholera, a hygiene-related disease, was three times more likely to determine households' waste management practice (AOR 3.22; CI 1.33, 7.84). Considering the low waste management practice among households, there is a need for improved policy and enhanced education on proper waste management practice among households.
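Adjusted odds ratios of this kind are typically obtained from a multivariable logistic regression. The sketch below is a generic illustration on simulated placeholder data, not the Ghana survey; all variable names are hypothetical.

```python
# Generic sketch of estimating adjusted odds ratios (AOR) from survey data with
# logistic regression (simulated placeholder data, not the Ghana survey).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 700
df = pd.DataFrame({
    "practices_wm":      rng.integers(0, 2, n),   # outcome: practices waste management (hypothetical)
    "female":            rng.integers(0, 2, n),   # hypothetical predictors
    "household_size":    rng.integers(1, 10, n),
    "recycles":          rng.integers(0, 2, n),
    "far_from_dumpsite": rng.integers(0, 2, n),
})

model = smf.logit(
    "practices_wm ~ female + household_size + recycles + far_from_dumpsite",
    data=df,
).fit(disp=False)

aor = np.exp(model.params)        # adjusted odds ratios
ci = np.exp(model.conf_int())     # 95% confidence intervals on the odds-ratio scale
print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```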
Diagnosis and treatment of common metabolic spinal disorders in the geriatric population.
Eck, J C; Humphreys, S C
1998-12-01
Bone is constantly resorbed and remodeled throughout life. After approximately age 30, there is a net loss of bone mass. This places the geriatric population at an increased risk of pathologic bone disorders that can lead to fractures and deformity. In this paper, we review bone metabolism and remodeling and introduce the proper diagnostic techniques. The most common pathologic spinal disorders are introduced, with emphasis on presentation and treatment options. To prevent excessive bone loss, patients should be educated on proper nutrition (calcium and vitamin D requirements) and lifestyle (avoiding alcohol and cigarette smoking). Sex hormone and drug therapies are available to reduce bone loss. New bisphosphonates such as alendronate sodium (Fosamax) have been effective in increasing bone mass. Early diagnosis and proper treatment of pathologic bone disorders can reduce the incidence of fracture and allow the patient a more productive and comfortable life.
Detection and tracking of gas plumes in LWIR hyperspectral video sequence data
NASA Astrophysics Data System (ADS)
Gerhart, Torin; Sunu, Justin; Lieu, Lauren; Merkurjev, Ekaterina; Chang, Jen-Mei; Gilles, Jérôme; Bertozzi, Andrea L.
2013-05-01
Automated detection of chemical plumes presents a segmentation challenge. The segmentation problem for gas plumes is difficult due to the diffusive nature of the cloud. The advantage of considering hyperspectral images in the gas plume detection problem over conventional RGB imagery is the presence of non-visual data, allowing for a richer representation of information. In this paper we present an effective method of visualizing hyperspectral video sequences containing chemical plumes and investigate the effectiveness of segmentation techniques on these post-processed videos. Our approach uses a combination of dimension reduction and histogram equalization to prepare the hyperspectral videos for segmentation. First, Principal Components Analysis (PCA) is used to reduce the dimension of the entire video sequence. This is done by projecting each pixel onto the first few Principal Components, resulting in a type of spectral filter. Next, a Midway method for histogram equalization is used. These methods redistribute the intensity values in order to reduce flicker between frames. This properly prepares these high-dimensional video sequences for more traditional segmentation techniques. We compare the ability of various clustering techniques to properly segment the chemical plume. These include K-means, spectral clustering, and the Ginzburg-Landau functional.
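The dimension-reduction-then-cluster pipeline can be sketched compactly. The following is a minimal illustration on synthetic data standing in for an LWIR hyperspectral cube, using K-means only; it is not the paper's Midway equalization, spectral clustering, or Ginzburg-Landau implementation.

```python
# Minimal sketch of the generic pipeline described above: reduce each pixel's
# spectrum with PCA, then cluster pixels (K-means here) to segment the scene.
# Synthetic data stand in for the LWIR hyperspectral cube.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
rows, cols, bands = 60, 80, 120
cube = rng.normal(size=(rows, cols, bands))
cube[20:40, 30:60, :] += np.linspace(0.0, 2.0, bands)   # synthetic "plume" spectral signature

pixels = cube.reshape(-1, bands)                         # one spectrum per pixel
scores = PCA(n_components=5).fit_transform(pixels)       # spectral filter / dimension reduction
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
segmentation = labels.reshape(rows, cols)
print("pixels assigned to each cluster:", np.bincount(labels))
```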
Improvements in powered air purifying respirator protection in an ABSL-3E facility
USDA-ARS?s Scientific Manuscript database
The study of and experimentation with zoonotic pathogens such as highly pathogenic avian influenza (HPAI) requires risk mitigation strategies including laboratory engineering controls and safety equipment, personal protective equipment (PPE), and proper practices and techniques. Incidences of potent...
ERIC Educational Resources Information Center
Connors, G. Patrick
Chondromalacia is the degeneration of the hyaline cartilage on the under surface of the kneecap. Its causes include patella maltracking (the kneecap does not glide properly over the joint), posttraumatic condition, and chronic overuse. The treatment can be a controlled rehabilitation program, various bracing techniques, foot orthoses, or, in…
15 CFR 291.3 - Environmental tools and techniques projects.
Code of Federal Regulations, 2011 CFR
2011-01-01
... demonstrated access to relevant technical or information sources external to the organization. (3) Degree of.... Applicants should specify plans for proper organization, staffing, and management of the implementation... managing organization to conduct the proposed activities; qualifications of the project team and its...
Health Instruction Packages: Injections.
ERIC Educational Resources Information Center
Dunkleman, Ellie; And Others
Text, illustrations, and exercises are utilized in this set of four learning modules designed to instruct nursing students in techniques and equipment utilized for intramuscular injections. The first module, "Equipment for Intramuscular Injections" by Ellie Dunkleman, presents guidelines for selecting needles of the proper length and…
ERIC Educational Resources Information Center
Casey, Joe
This document contains five units for a course in computer numerical control (CNC) for computer-aided manufacturing. It is intended to familiarize students with the principles and techniques necessary to create proper CNC programs manually. Each unit consists of an introduction, instructional objectives, learning materials, learning activities,…
Risk Assessment Strategies and Techniques for Combined Exposures
Author: Cynthia V. Rider, Ph.D., and Jane Ellen Simmons, Ph.D.Abstract: Consideration of cumulative risk is necessary to evaluate properly the safety of, and the risks associated with, combined exposures. These combined exposures ("mixtures") commonly occur from exposure to: envi...
Enhanced methodology of focus control and monitoring on scanner tool
NASA Astrophysics Data System (ADS)
Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.
2017-03-01
As technology nodes shrink from 14 nm to 7 nm, reliable tool monitoring techniques in advanced semiconductor fabs become more critical for achieving high yield and quality. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect particles, defects, and tool stability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner periodically to ensure proper tool stability. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provides a common reference for scanner setup and user processes. In order to further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of setting a baseline reference. This baseline technique, with either the conventional BaseLiner low numerical aperture (NA=1.20) mode or the advanced illumination high NA mode (NA=1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring across multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposure for new product/layer best focus (BF) setup.
Drug samples in dermatology: special considerations and recommendations for the future.
Alikhan, Ali; Sockolov, Mary; Brodell, Robert T; Feldman, Steven R
2010-06-01
The use of drug samples is a controversial issue in medicine. We sought to determine the pros and cons of drug sampling, and how drug sampling in general medicine differs from dermatology. Literature searches were conducted on PubMed, Google, and Yahoo!. Articles were found pertaining to drug sampling in general, and for dermatology specifically. Numerous pros and cons for drug sampling were found in the literature search. We divided these by cost-related issues, such as the industry-wide cost of sampling and the use of sampling to assist the underinsured and poor, and quality of care issues, such as adherence, patient education, and safety considerations. Articles also suggested that dermatology may differ from general medicine as topical treatments have fewer side effects, are more complicated to use, and come in different vehicles. We identified few studies specifically focused on issues relevant to sampling in dermatology. There are strong arguments for and against drug sampling involving both cost and quality of care issues. Dermatology-specific medications clearly differ from oral medications in several regards. We ultimately conclude that the benefits of drug sampling outweigh the risks, but give recommendations on how drug sampling can be done ethically and effectively, including limiting personal use, not selling samples, properly documenting sample release, teaching patients about proper use, teaching students and residents ethical use of samples, working with pharmaceutical representatives in an ethical manner, prescribing the drug that is best for the patient, and securing samples appropriately to prevent theft and misuse. Copyright 2010 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai
2015-01-01
The use of wavelength variable selection before partial least squares discriminant analysis (PLS-DA) for qualitative identification of solid state fermentation degree by the FT-NIR spectroscopy technique was investigated in this study. Two wavelength variable selection methods, competitive adaptive reweighted sampling (CARS) and stability competitive adaptive reweighted sampling (SCARS), were employed to select the important wavelengths. PLS-DA was applied to calibrate the identification model using the wavelength variables selected by CARS and SCARS for identification of solid state fermentation degree. Experimental results showed that the numbers of wavelength variables selected by CARS and SCARS were 58 and 47, respectively, from the 1557 original wavelength variables. Compared with the results of full-spectrum PLS-DA, both wavelength variable selection methods could enhance the performance of the identification models. Meanwhile, compared with the CARS-PLS-DA model, the SCARS-PLS-DA model achieved better results, with an identification rate of 91.43% in the validation process. The overall results demonstrate that a PLS-DA model constructed using wavelength variables selected by a proper wavelength variable selection method can identify solid state fermentation degree more accurately. Copyright © 2015 Elsevier B.V. All rights reserved.
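PLS-DA restricted to a selected subset of wavelength variables can be sketched as below. This is a minimal illustration on synthetic spectra: an arbitrary random index subset stands in for the CARS/SCARS selection step, and none of the data or settings come from the study.

```python
# Minimal sketch of PLS-DA on a selected subset of wavelength variables
# (a random index subset stands in for the CARS/SCARS selection step;
# synthetic spectra, not the FT-NIR fermentation data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 120, 1557
X = rng.normal(size=(n_samples, n_wavelengths))
y = rng.integers(0, 2, n_samples)                       # two fermentation "degrees" (classes)
X[y == 1, 100:150] += 0.8                               # class-dependent spectral feature

selected = rng.choice(n_wavelengths, size=47, replace=False)   # stand-in for SCARS output
train, test = np.arange(0, 90), np.arange(90, n_samples)

# PLS-DA implemented as PLS regression on the 0/1 class label, thresholded at 0.5.
pls = PLSRegression(n_components=5).fit(X[train][:, selected], y[train])
y_pred = (pls.predict(X[test][:, selected]).ravel() > 0.5).astype(int)
print("identification rate:", (y_pred == y[test]).mean())
```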
New Low-Mass Members of Nearby Young Moving Groups
NASA Astrophysics Data System (ADS)
Schlieder, Joshua; Simon, Michal; Rice, Emily; Lepine, Sebastien
2012-08-01
We are now ready to expand our program to identify new low-mass members of nearby young moving groups (NYMGs) to stars of mass ≤0.3 M_⊙. This is important to: (1) complete the census of low-mass stars near the Sun, (2) provide high priority targets for disk and exoplanet studies by direct imaging, and (3) provide a well-characterized sample of nearby, young stars for detailed study of their physical and kinematic properties. Our proven technique starts with a proper motion selection algorithm, proceeds to vet the sample for indicators of youth, and requires as its last step the measurement of candidate member radial velocities (RVs). So far, we have measured more than 100 candidate RVs using CSHELL on the NASA-IRTF and PHOENIX on Gemini-South, yielding more than 50 likely new moving group members. Here we propose to continue our RV follow-up of candidate NYMG members using PHOENIX on the KPNO 4m. We aim to measure RVs and determine spectral types of 23 faint (V≥15, H≥9), late-type (≥M4) candidates of the β Pic (10 Myrs), AB Dor (70 Myrs), Tuc/Hor (30 Myrs), and TW Hydrae (8 Myrs) moving groups.
Potable water scarcity: options and issues in the coastal areas of Bangladesh.
Islam, Atikul; Sakakibara, Hiroyuki; Karim, Rezaul; Sekine, Masahiko
2013-09-01
In the coastal areas of Bangladesh, scarcity of drinking water is acute, as freshwater aquifers are not available at suitable depths and surface water is highly saline. Households are mainly dependent on rainwater harvesting, pond sand filters and pond water for drinking purposes. Thus, individuals in these areas often suffer from waterborne diseases. In this paper, water consumption behaviour in two southwestern coastal districts of Bangladesh has been investigated. The data for this study were collected through a survey conducted on 750 rural households in 39 villages of the study area. The sample was selected using a random sampling technique. Households' choice of water source is complex and seasonally dependent. Water sourcing patterns, households' preferences among water sourcing options and the economic feasibility of options suggest that a combination of household and community-based options could be suitable for year-round water supply. The distance and time required for water collection were found to be obstacles to collecting water from community-based options. Both household and community-based options need regular maintenance. In addition to installation of water supply facilities, it is necessary to make residents aware of proper operation and maintenance of the facilities.
NASA Astrophysics Data System (ADS)
Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai
2015-10-01
The use of wavelength variable selection before partial least squares discriminant analysis (PLS-DA) for qualitative identification of solid state fermentation degree by the FT-NIR spectroscopy technique was investigated in this study. Two wavelength variable selection methods, competitive adaptive reweighted sampling (CARS) and stability competitive adaptive reweighted sampling (SCARS), were employed to select the important wavelengths. PLS-DA was applied to calibrate the identification model using the wavelength variables selected by CARS and SCARS for identification of solid state fermentation degree. Experimental results showed that the numbers of wavelength variables selected by CARS and SCARS were 58 and 47, respectively, from the 1557 original wavelength variables. Compared with the results of full-spectrum PLS-DA, both wavelength variable selection methods could enhance the performance of the identification models. Meanwhile, compared with the CARS-PLS-DA model, the SCARS-PLS-DA model achieved better results, with an identification rate of 91.43% in the validation process. The overall results demonstrate that a PLS-DA model constructed using wavelength variables selected by a proper wavelength variable selection method can identify solid state fermentation degree more accurately.
An illicit economy: scavenging and recycling of medical waste.
Patwary, Masum A; O'Hare, William Thomas; Sarker, M H
2011-11-01
This paper discusses a significant illicit economy, including black and grey aspects, associated with medical waste scavenging and recycling in a megacity, considering hazards to the specific group involved in scavenging as well as hazards to the general population of city dwellers. Data were collected in Dhaka, Bangladesh, using a variety of techniques based on formal representative sampling for fixed populations (such as recycling operatives) and adaptive sampling for roaming populations (such as scavengers). Extremely hazardous items (including date expired medicines, used syringes, knives, blades and saline bags) were scavenged, repackaged and resold to the community. Some HCE employees were also observed to sell hazardous items directly to scavengers, and both employees and scavengers were observed to supply contaminated items to an informal plastics recycling industry. This trade was made possible by the absence of segregation, secure storage and proper disposal of medical waste. Corruption, a lack of accountability and individual responsibility were also found to be contributors. In most cases the individuals involved with these activities did not understand the risks. Although motivation was often for personal gain or in support of substance abuse, participants sometimes felt that they were providing a useful service to the community. Copyright © 2011 Elsevier Ltd. All rights reserved.
Tea, Illa; Tcherkez, Guillaume
2017-01-01
The natural isotope abundance in bulk organic matter or tissues is not a sufficient basis to investigate physiological properties, biosynthetic mechanisms, and nutrition sources of biological systems. In fact, isotope effects in metabolism lead to a heterogeneous distribution of ²H, ¹⁸O, ¹³C, and ¹⁵N isotopes in metabolites. Therefore, compound-specific isotopic analysis (CSIA) is crucial to biological and medical applications of stable isotopes. Here, we review methods to implement CSIA for ¹⁵N and ¹³C from plant, animal, and human samples and discuss technical solutions that have been used for the conversion to CO₂ and N₂ for IRMS analysis, derivatization and isotope effect measurements. It appears that despite the flexibility of instruments used for CSIA, there is no universal method, simply because the chemical nature of metabolites of interest varies considerably. Also, CSIA methods are often limited by isotope effects in sample preparation or the addition of atoms from the derivatizing reagents, and this implies that corrections must be made to calculate a proper δ-value. Therefore, CSIA has an enormous potential for biomedical applications, but its utilization requires precautions for its successful application. © 2017 Elsevier Inc. All rights reserved.
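One routinely used correction of this kind is the carbon mass balance for atoms added by a derivatizing reagent. The sketch below shows that standard relation with hypothetical numbers; the specific analyte and reagent are placeholders and are not taken from the chapter.

```python
# Standard mass-balance correction for carbon added by a derivatizing reagent
# (placeholder values; n = number of carbon atoms, d = delta-13C in per mil):
#   n_total * d_measured = n_analyte * d_analyte + n_reagent * d_reagent
def correct_delta13c(d_measured, n_analyte, n_reagent, d_reagent):
    """Solve the mass balance for the underivatized analyte's delta-13C."""
    n_total = n_analyte + n_reagent
    return (n_total * d_measured - n_reagent * d_reagent) / n_analyte

# Hypothetical example: an analyte with 3 carbons derivatized with a reagent
# adding 2 carbons of known isotopic composition.
print(correct_delta13c(d_measured=-28.4, n_analyte=3, n_reagent=2, d_reagent=-32.0))
```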
Liu, Lian; Zhang, Shao-Wu; Huang, Yufei; Meng, Jia
2017-08-31
As a newly emerged research area, RNA epigenetics has drawn increasing attention recently because of the participation of RNA methylation and other modifications in a number of crucial biological processes. Thanks to high-throughput sequencing techniques such as MeRIP-Seq, transcriptome-wide RNA methylation profiles are now available in the form of count-based data, with which it is often of interest to study the dynamics at the epitranscriptomic layer. However, the sample size of an RNA methylation experiment is usually very small due to its cost; additionally, there usually exist a large number of genes whose methylation level cannot be accurately estimated due to their low expression level, making differential RNA methylation analysis a difficult task. We present QNB, a statistical approach for differential RNA methylation analysis with count-based small-sample sequencing data. Compared with previous approaches such as the DRME model, which is based on a statistical test covering only the IP samples with two negative binomial distributions, QNB is based on four independent negative binomial distributions whose variances and means are linked by local regressions; in this way, the input control samples are also properly taken into account. In addition, unlike the DRME approach, which relies only on the input control sample for estimating the background, QNB uses a more robust estimator of gene expression that combines information from both input and IP samples, which could largely improve the testing performance for very lowly expressed genes. QNB showed improved performance on both simulated and real MeRIP-Seq datasets when compared with competing algorithms. The QNB model is also applicable to other datasets related to RNA modifications, including but not limited to RNA bisulfite sequencing, m1A-Seq, Par-CLIP, RIP-Seq, etc.
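To make the shape of the underlying count data concrete, the sketch below runs a deliberately naive per-site comparison of pooled IP versus input counts between two conditions using Fisher's exact test. This is emphatically not the QNB model (it ignores biological variability and the negative binomial structure), and the counts are hypothetical.

```python
# NOT the QNB model (which links four negative binomial distributions); this is a
# naive per-site baseline that compares pooled IP vs input counts between two
# conditions with Fisher's exact test, ignoring biological variability. It is shown
# only to illustrate the shape of the MeRIP-Seq count data QNB operates on.
from scipy.stats import fisher_exact

# Hypothetical read counts for one methylation site, summed over replicates.
ip_a, input_a = 180, 300      # condition A
ip_b, input_b = 90, 310       # condition B

table = [[ip_a, input_a],
         [ip_b, input_b]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
```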
Rainwater harvesting in American Samoa: current practices and indicative health risks.
Kirs, Marek; Moravcik, Philip; Gyawali, Pradip; Hamilton, Kerry; Kisand, Veljo; Gurr, Ian; Shuler, Christopher; Ahmed, Warish
2017-05-01
Roof-harvested rainwater (RHRW) is an important alternative source of water that many island communities can use for drinking and other domestic purposes when groundwater and/or surface water sources are contaminated, limited, or simply not available. The aim of this pilot-scale study was to investigate current RHRW practices in American Samoa (AS) and to evaluate and compare the quality of water from common potable water sources including RHRW stored in tanks, untreated stream water, untreated municipal well water, and treated municipal tap water samples. Samples were analyzed using culture-based methods, quantitative polymerase chain reaction (qPCR), and 16S amplicon sequencing-based methods. Based on indicator bacteria (total coliform and Escherichia coli) concentrations, the quality of RHRW was slightly lower than well and chlorinated tap water but exceeded that of untreated stream water. Although no Giardia or Leptospira spp. were detected in any of the RHRW samples, 86% of the samples were positive for Cryptosporidium spp. All stream water samples tested positive for Cryptosporidium spp. Opportunistic pathogens (Pseudomonas aeruginosa and Mycobacterium intracellulare) were also detected in the RHRW samples (71 and 21% positive samples, respectively). Several potentially pathogenic genera of bacteria were also detected in RHRW by amplicon sequencing. Each RHRW system was characterized by distinct microbial communities, 77% of operational taxonomic units (OTUs) were detected only in a single tank, and no OTU was shared by all the tanks. Risk of water-borne illness increased in the following order: chlorinated tap water/well water < RHRW < stream water. Frequent detection of opportunistic pathogens indicates that RHRW should be treated before use. Stakeholder education on RHRW system design options as well as on importance of regular cleaning and proper management techniques could improve the quality of the RHRW in AS.