METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN
An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...
Final Report for X-ray Diffraction Sample Preparation Method Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ely, T. M.; Meznarich, H. K.; Valero, T.
WRPS-1500790, “X-ray Diffraction Saltcake Sample Preparation Method Development Plan/Procedure,” was originally prepared with the intent of improving the specimen preparation methodology used to generate saltcake specimens suitable for XRD-based solid phase characterization. At the time that this test plan document was originally developed, packed powder in cavity supports with collodion binder was the established XRD specimen preparation method. An alternative specimen preparation method that is less vulnerable, if not completely invulnerable, to preferred orientation effects was desired as a replacement for this method.
Theory of the Trojan-Horse Method - From the Original Idea to Actual Applications
NASA Astrophysics Data System (ADS)
Typel, Stefan
2018-01-01
The origin and the main features of the Trojan-horse (TH) method are delineated starting with the original idea of Gerhard Baur. Basic theoretical considerations, general experimental conditions and possible problems are discussed. Significant steps in experimental studies towards the implementation of the TH method and the development of the theoretical description are presented. This led to the successful application of the TH approach by Claudio Spitaleri and his group to determine low-energy cross sections that are relevant for astrophysics. An outlook on possible future developments is given.
Masada, Sayaka
2016-07-01
Herbal medicines have been developed and used in many parts of the world for thousands of years. Although locally grown indigenous plants were originally used for traditional herbal preparations, Western herbal products are now becoming popular in Japan with the increasing interest in health. At the same time, there are growing concerns about the substitution of ingredients and adulteration of herbal products, highlighting the need for the authentication of the origin of plants used in herbal products. This review describes studies on Cimicifuga and Vitex products developed in Europe and Japan, focusing on establishing analytical methods to evaluate the origins of material plants and finished products. These methods include a polymerase chain reaction-restriction fragment length polymorphism method and a multiplex amplification refractory mutation system method. A genome-based authentication method and liquid chromatography-mass spectrometry-based authentication for black cohosh products, and the identification of two characteristic diterpenes of agnus castus fruit and a shrub chaste tree fruit-specific triterpene derivative are also described.
Jiang, Hao; Zhao, Dehua; Cai, Ying; An, Shuqing
2012-01-01
In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images to which a CT model was applied at other times or locations had to originate from the same satellite sensor as the images used to develop the model, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (Method of 0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by Method of 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our results suggest that Method of 0.1% index scaling provides a feasible way to apply CT models directly to images from sensors or time periods that differ from those of the images used to develop the original models.
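A minimal sketch of the kind of extreme-percentile scaling described above, assuming the spectral-index image is held as a NumPy array; the exact percentile cut-offs and the rescaling to [0, 1] are illustrative assumptions, not the published procedure.

```python
import numpy as np

def scale_index_image(si, lower_pct=0.1, upper_pct=99.9):
    """Normalize a spectral-index image using extreme-percentile scaling.

    Pixels below/above the chosen percentiles are clipped, so a few extreme
    values no longer control the dynamic range of the normalized image.
    The percentile values here are illustrative assumptions.
    """
    lo = np.nanpercentile(si, lower_pct)
    hi = np.nanpercentile(si, upper_pct)
    clipped = np.clip(si, lo, hi)
    return (clipped - lo) / (hi - lo)

# Example: normalize an NDVI-like index image so that classification-tree
# thresholds learned on one sensor can be reused on another.
ndvi_sensor_a = np.random.uniform(-0.2, 0.9, size=(100, 100))
ndvi_norm = scale_index_image(ndvi_sensor_a)
```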
Lung bioaccessibility of contaminants in particulate matter of geological origin.
Guney, Mert; Chapuis, Robert P; Zagury, Gerald J
2016-12-01
Human exposure to particulate matter (PM) has been associated with adverse health effects. While inhalation exposure to airborne PM is a prominent research subject, exposure to PM of geological origin (i.e., generated from soil/soil-like material) has received less attention. This review discusses the contaminants in PM of geological origin and their relevance for human exposure and then evaluates lung bioaccessibility assessment methods and their use. PM of geological origin can contain toxic elements as well as organic contaminants. Observed/predicted PM lung clearance times are long, which may lead to prolonged contact with lung environment. Thus, certain exposure scenarios warrant the use of in vitro bioaccessibility testing to predict lung bioavailability. Limited research is available on lung bioaccessibility test development and test application to PM of geological origin. For in vitro tests, test parameter variation between different studies and concerns about physiological relevance indicate a crucial need for test method standardization and comparison with relevant animal data. Research is recommended on (1) developing robust in vitro lung bioaccessibility methods, (2) assessing bioaccessibility of various contaminants (especially polycyclic aromatic hydrocarbons (PAHs)) in PM of diverse origin (surface soils, mine tailings, etc.), and (3) risk characterization to determine relative importance of exposure to PM of geological origin.
Using Aerospace Technology To Design Orthopedic Implants
NASA Technical Reports Server (NTRS)
Saravanos, D. A.; Mraz, P. J.; Davy, D. T.
1996-01-01
Technology originally developed to optimize designs of composite-material aerospace structural components used to develop method for optimizing designs of orthopedic implants. Development effort focused on designing knee implants, long-term goal to develop method for optimizing designs of orthopedic implants in general.
Gaudin, Valérie
2017-04-15
Antibiotic residues may be found in food of animal origin, since veterinary drugs are used for preventive and curative purposes to treat animals. The control of veterinary drug residues in food is necessary to ensure consumer safety. Screening methods are the first step in the control of antibiotic residues in food of animal origin. Conventional screening methods are based on different technologies: microbiological methods, immunological methods, or physico-chemical methods (e.g. thin-layer chromatography, HPLC, LC-MS/MS). Screening methods should be simple, quick, inexpensive and specific, with low detection limits and high sample throughput. Biosensors can meet some of these requirements. Therefore, the development of biosensors for the screening of antibiotic residues has been increasing since the 1980s. The present review provides extensive and up-to-date findings on biosensors for the screening of antibiotic residues in food products of animal origin. Biosensors consist of a bioreceptor and a transducer. In the detection of antibiotic residues, even though antibodies were the first bioreceptors to be used, an increasing number of new kinds of bioreceptors (enzymes, aptamers, MIPs) are being developed; their advantages and drawbacks are discussed in this review. The different categories of transducers (electrochemical, mass-based biosensors, optical and thermal) and their potential applications for the screening of antibiotic residues in food are presented. Moreover, the advantages and drawbacks of the different types of transducers are discussed. Lastly, the outlook for the future development of biosensors for the control of antibiotic residues in food is highlighted. Copyright © 2016. Published by Elsevier B.V.
Development of a Compound Optimization Approach Based on Imperialist Competitive Algorithm
NASA Astrophysics Data System (ADS)
Wang, Qimei; Yang, Zhihong; Wang, Yong
In this paper, an improved approach is developed for the imperialist competitive algorithm to achieve greater performance. The Nelder-Mead simplex method is applied to execute alternately with the original procedures of the algorithm. The approach is tested on twelve widely used benchmark functions and is also compared with other related studies. It is shown that the proposed approach has a faster convergence rate, better search ability, and higher stability than the original algorithm and other related methods.
Monitoring Marine Microbial Fouling
NASA Technical Reports Server (NTRS)
Colwell, R.
1985-01-01
Two techniques developed for studying marine fouling. Methods originally developed to study fouling of materials used in Space Shuttle solid fuel booster rockets. Methods used to determine both relative fouling rates and efficacy of cleaning methods to remove fouling on various surfaces including paints, metals, and sealants intended for marine use.
DETECTION OF CRYPTOSPORIDIUM OOCYSTS IN WATER MATRICES
Since the advent and recognition of waterborne outbreaks of cryptosporidiosis great effort has been expended on development of methods for detecting Cryptosporidium oocysts in water. Oocysts recovery rates using a method originally developed for detecting Giardia cysts ranged fr...
Mu, Zhaobin; Feng, Xiaoxiao; Zhang, Yun; Zhang, Hongyan
2016-02-01
A multi-residue method based on modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) sample preparation, followed by liquid chromatography tandem mass spectrometry (LC-MS/MS), was developed and validated for the determination of three selected fungicides (propiconazole, pyraclostrobin, and isopyrazam) in seven animal origin foods. The overall recoveries at the three spiking levels of 0.005, 0.05, and 0.5 mg kg⁻¹ ranged from 72.3% to 101.4%, with relative standard deviation (RSD) values between 0.7 and 14.9%. The method shows good linearity at concentrations between 0.001 and 1 mg L⁻¹, with a coefficient of determination (R²) >0.99 for each target analyte. The limits of detection (LODs) for the target analytes were between 0.04 and 1.26 μg kg⁻¹, and the limits of quantification (LOQs) were between 0.13 and 4.20 μg kg⁻¹. The matrix effect for each individual compound was evaluated through the ratios of the areas obtained in solvent and matrix standards. The optimized method provided a negligible matrix effect for propiconazole, within 20%, whereas for pyraclostrobin and isopyrazam the matrix effect was relatively significant, with a maximum value of 49.8%. The developed method has been successfully applied to the analysis of 210 animal origin samples obtained from 16 provinces of China. The results suggested that the developed method was satisfactory for trace analysis of the three fungicides in animal origin foods.
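The matrix effect evaluation mentioned above compares analyte peak areas in matrix and solvent standards; a small illustrative calculation of that ratio is sketched below (the area values are invented for the example, not taken from the study).

```python
def matrix_effect_percent(area_matrix, area_solvent):
    """Signal suppression/enhancement expressed as a percentage.

    0 means no matrix effect; positive values indicate enhancement,
    negative values indicate suppression.
    """
    return (area_matrix / area_solvent - 1.0) * 100.0

# Illustrative numbers only: a ~50% enhancement, comparable in magnitude to
# the maximum matrix effect reported for pyraclostrobin/isopyrazam.
print(matrix_effect_percent(area_matrix=1.498e6, area_solvent=1.0e6))  # ~49.8
```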
FODEM: A Multi-Threaded Research and Development Method for Educational Technology
ERIC Educational Resources Information Center
Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki
2012-01-01
Formative development method (FODEM) is a multithreaded design approach that originated to support the design and development of various types of educational technology innovations, such as learning tools and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…
Higashimoto, Makiko; Takahashi, Masahiko; Jokyu, Ritsuko; Syundou, Hiromi; Saito, Hidetsugu
2007-11-01
An HCV core antigen (Ag) detection assay system, Lumipulse Ortho HCV Ag, has been developed and is commercially available in Japan, with a lower detection limit of 50 fmol/l, which is equivalent to 20 KIU/ml in the quantitative PCR assay. The HCV core Ag assay has the advantage of a broader dynamic range compared with the PCR assay, but its sensitivity is lower. We developed a novel HCV core Ag concentration method using polyethylene glycol (PEG), which improves the sensitivity fivefold over the original assay. Reproducibility was examined by five consecutive measurements of HCV patient serum, in which the results of the original and concentrated HCV core Ag methods were 56.8 ± 8.1 fmol/l (mean ± SD), CV 14.2%, and 322.9 ± 45.5 fmol/l, CV 14.0%, respectively. The results for HCV-negative samples were all 0.1 fmol/l with the original HCV core Ag assay, and the results were the same with the concentration method. The results of the concentration method were 5.7 times higher than those of the original assay, almost equal to the expected theoretical rate. The results for serially diluted samples also matched the expected values in both the original and concentration assays. We confirmed that the sensitivity of the HCV core Ag concentration method was almost the same as that of the high-range PCR assay in a comparative study using serially monitored samples from five HCV patients during interferon therapy. A novel concentration method using PEG in the HCV core Ag assay system appears useful for assessing and monitoring interferon treatment for HCV.
Mwogi, Thomas S.; Biondich, Paul G.; Grannis, Shaun J.
2014-01-01
Motivated by the need for readily available data for testing an open-source health information exchange platform, we developed and evaluated two methods for generating synthetic messages. The methods used HL7 version 2 messages obtained from the Indiana Network for Patient Care. Data from both methods were analyzed to assess how effectively the output reflected original ‘real-world’ data. The Markov Chain method (MCM) used an algorithm based on a transitional probability matrix, while the Music Box model (MBM) randomly selected messages of a particular trigger type from the original data to generate new messages. The MBM was faster, generated shorter messages, and exhibited less variation in message length. The MCM required more computational power and generated longer messages with greater length variability. Both methods exhibited adequate coverage, producing a high proportion of messages consistent with original messages. Both methods yielded similar rates of valid messages. PMID:25954458
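A minimal sketch of a first-order Markov-chain generator of the kind the MCM describes, shown here over hypothetical HL7 v2 segment types; the segment names and transition probabilities are invented for illustration and are not derived from the Indiana Network for Patient Care data.

```python
import random

# Hypothetical transition probabilities between HL7 v2 segment types.
transitions = {
    "MSH": {"PID": 1.0},
    "PID": {"PV1": 0.7, "OBR": 0.3},
    "PV1": {"OBR": 1.0},
    "OBR": {"OBX": 1.0},
    "OBX": {"OBX": 0.6, "END": 0.4},
}

def generate_segment_sequence(start="MSH", max_len=20):
    """Walk the transition matrix to produce one synthetic segment sequence."""
    seq = [start]
    while seq[-1] != "END" and len(seq) < max_len:
        nxt = transitions.get(seq[-1])
        if not nxt:
            break
        states, probs = zip(*nxt.items())
        seq.append(random.choices(states, weights=probs, k=1)[0])
    return [s for s in seq if s != "END"]

print(generate_segment_sequence())  # e.g. ['MSH', 'PID', 'OBR', 'OBX', 'OBX']
```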
Creative Workshop as a Form of Contemporary Art and a Space for Subjective Development
ERIC Educational Resources Information Center
Józefowski, Eugeniusz
2015-01-01
The article presents the original concept of the author's creative workshop which is treated as an art form and the method of education. It contains a presentation of the structure of the original workshop developed by the author in the context of multi-layered relations occurring in the interconnected areas of art and education leading to…
Reliable femoral frame construction based on MRI dedicated to muscles position follow-up.
Dubois, G; Bonneau, D; Lafage, V; Rouch, P; Skalli, W
2015-10-01
In vivo follow-up of muscle shape variation represents a challenge when evaluating muscle development due to disease or treatment. Recent developments in muscle reconstruction techniques indicate MRI as a clinical tool for the follow-up of the thigh muscles. Comparing 3D muscle shapes from two different sequences is not easy because there is no common frame. This study proposes an innovative method for the reconstruction of a reliable femoral frame based on the femoral head and both condyle centers. To make the definition of the condylar spheres more robust, an original method was developed that combines the estimation of the diameters of both condyles from the lateral antero-posterior distance with the estimation of the sphere centers from an optimization process. The influence of spacing between MR slices and of origin positions was studied. For all axes, the proposed method presented an angular error lower than 1° with a slice spacing of 10 mm, and the optimal position of the origin was identified at 56% of the distance between the femoral head center and the barycenter of both condyles. The high reliability of this method provides a robust frame for clinical follow-up based on MRI.
Pathways to Lean Software Development: An Analysis of Effective Methods of Change
ERIC Educational Resources Information Center
Hanson, Richard D.
2014-01-01
This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…
Recent developments in the Dorfman-Berbaum-Metz procedure for multireader ROC study analysis.
Hillis, Stephen L; Berbaum, Kevin S; Metz, Charles E
2008-05-01
The Dorfman-Berbaum-Metz (DBM) method has been one of the most popular methods for analyzing multireader receiver-operating characteristic (ROC) studies since it was proposed in 1992. Despite its popularity, the original procedure has several drawbacks: it is limited to jackknife accuracy estimates, it is substantially conservative, and it is not based on a satisfactory conceptual or theoretical model. Recently, solutions to these problems have been presented in three papers. Our purpose is to summarize and provide an overview of these recent developments. We present and discuss the recently proposed solutions for the various drawbacks of the original DBM method. We compare the solutions in a simulation study and find that they result in improved performance for the DBM procedure. We also compare the solutions using two real data studies and find that the modified DBM procedure that incorporates these solutions yields more significant results and clearer interpretations of the variance component parameters than the original DBM procedure. We recommend using the modified DBM procedure that incorporates the recent developments.
Review of analytical methods for the quantification of iodine in complex matrices.
Shelor, C Phillip; Dasgupta, Purnendu K
2011-09-19
Iodine is an essential element of human nutrition. Nearly a third of the global population has insufficient iodine intake and is at risk of developing Iodine Deficiency Disorders (IDD). Most countries have iodine supplementation and monitoring programs. Urinary iodide (UI) is the biomarker used for epidemiological studies; only a few methods are currently used routinely for analysis. These methods either require expensive instrumentation with qualified personnel (inductively coupled plasma-mass spectrometry, instrumental nuclear activation analysis) or oxidative sample digestion to remove potential interferences prior to analysis by a kinetic colorimetric method originally introduced by Sandell and Kolthoff ~75 years ago. The Sandell-Kolthoff (S-K) method is based on the catalytic effect of iodide on the reaction between Ce⁴⁺ and As³⁺. No available technique fully fits the needs of developing countries; research into inexpensive, reliable methods and instrumentation is needed. There have been multiple reviews of methods used for epidemiological studies and specific techniques. However, a general review of iodine determination on a wide-ranging set of complex matrices is not available. While this review is not comprehensive, we cover the principal developments since the original development of the S-K method. Copyright © 2011 Elsevier B.V. All rights reserved.
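For context, the redox reaction underlying the S-K method is commonly written as follows (supplied here for illustration, not quoted from the review); the rate at which the coloured Ce(IV) is reduced, followed photometrically, increases with the iodide concentration:

```latex
2\,\mathrm{Ce}^{4+} + \mathrm{As}^{3+}
\;\xrightarrow{\ \mathrm{I}^{-}\ \text{(catalyst)}\ }\;
2\,\mathrm{Ce}^{3+} + \mathrm{As}^{5+}
```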
Synthesizing Regression Results: A Factored Likelihood Method
ERIC Educational Resources Information Center
Wu, Meng-Jia; Becker, Betsy Jane
2013-01-01
Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…
Downscaling Thermal Infrared Radiance for Subpixel Land Surface Temperature Retrieval
Liu, Desheng; Pu, Ruiliang
2008-01-01
Land surface temperature (LST) retrieved from satellite thermal sensors often consists of mixed temperature components. Retrieving subpixel LST is therefore needed in various environmental and ecological studies. In this paper, we developed two methods for downscaling coarse resolution thermal infrared (TIR) radiance for the purpose of subpixel temperature retrieval. The first method was developed on the basis of a scale-invariant physical model on TIR radiance. The second method was based on a statistical relationship between TIR radiance and land cover fraction at high spatial resolution. The two methods were applied to downscale simulated 990-m ASTER TIR data to 90-m resolution. When validated against the original 90-m ASTER TIR data, the results revealed that both downscaling methods were successful in capturing the general patterns of the original data and resolving considerable spatial details. Further quantitative assessments indicated a strong agreement between the true values and the estimated values by both methods. PMID:27879844
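A rough sketch of the idea behind the second (statistical) method, under the simplifying assumption that coarse-pixel TIR radiance is a linear mixture of per-class radiances weighted by land-cover fractions; the plain least-squares fit and all variable names are illustrative, not the published implementation.

```python
import numpy as np

# coarse_radiance: (N,) TIR radiance of N coarse pixels
# fractions: (N, K) fraction of each of K land-cover classes in each coarse pixel
def downscale_tir(coarse_radiance, fractions, fine_cover):
    """Estimate per-class radiances from coarse pixels, then map them onto a
    fine-resolution land-cover classification (fine_cover holds class indices)."""
    class_radiance, *_ = np.linalg.lstsq(fractions, coarse_radiance, rcond=None)
    return class_radiance[fine_cover]

# Toy example (all numbers invented): 3 cover classes, 200 coarse pixels,
# and a 30x30 fine-resolution cover map.
rng = np.random.default_rng(0)
frac = rng.dirichlet(np.ones(3), size=200)
coarse = frac @ np.array([8.5, 9.7, 11.2]) + rng.normal(0, 0.05, 200)
fine_cover = rng.integers(0, 3, size=(30, 30))
fine_radiance = downscale_tir(coarse, frac, fine_cover)
```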
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, Jamie L.; Kuhn, Kevin John; Byerly, Benjamin
Nuclear forensic publications, performance tests, and research and development efforts typically target the bulk global inventory of intentionally safeguarded materials, such as plutonium (Pu) and uranium (U). Other materials, such as neptunium (Np), pose a nuclear security risk as well. Trafficking leading to recovery of an interdicted Np sample is a realistic concern, especially for materials originating in countries that reprocess fuel. Using complementary forensic methods, potential signatures for an unknown Np oxide sample were investigated. Measurement results were assessed against published Np processes to present hypotheses as to the original intended use, method of production, and origin for this Np oxide.
Alday, Erick A Perez; Colman, Michael A; Langley, Philip; Zhang, Henggui
2017-03-01
Atrial tachyarrhythmias, such as atrial fibrillation (AF), are characterised by irregular electrical activity in the atria, generally associated with erratic excitation underlain by re-entrant scroll waves, fibrillatory conduction of multiple wavelets or rapid focal activity. Epidemiological studies have shown an increase in AF prevalence in the developed world associated with an ageing society, highlighting the need for effective treatment options. Catheter ablation therapy, commonly used in the treatment of AF, requires spatial information on atrial electrical excitation. The standard 12-lead electrocardiogram (ECG) provides a method for non-invasive identification of the presence of arrhythmia, due to irregularity in the ECG signal associated with atrial activation compared to sinus rhythm, but has limitations in providing specific spatial information. There is therefore a pressing need to develop novel methods to identify and locate the origin of arrhythmic excitation. Invasive methods provide direct information on atrial activity, but may induce clinical complications. Non-invasive methods avoid such complications, but their development presents a greater challenge due to the non-direct nature of monitoring. Algorithms based on the ECG signals in multiple leads (e.g. a 64-lead vest) may provide a viable approach. In this study, we used a biophysically detailed model of the human atria and torso to investigate the correlation between the morphology of the ECG signals from a 64-lead vest and the location of the origin of rapid atrial excitation arising from rapid focal activity and/or re-entrant scroll waves. A focus-location algorithm was then constructed from this correlation. The algorithm had success rates of 93% and 76%, respectively, for correctly identifying the origin of focal and re-entrant excitation with a spatial resolution of 40 mm. The general approach allows its application to any multi-lead ECG system. This represents a significant extension to our previously developed algorithms to predict AF origins associated with focal activities.
Multirate sampled-data yaw-damper and modal suppression system design
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.
1990-01-01
A multirate control law synthesis algorithm based on an infinite-time quadratic cost function was developed, along with a method for analyzing the robustness of multirate systems. A generalized multirate sampled-data control law structure (GMCLS) was introduced. A new infinite-time-based parameter optimization multirate sampled-data control law synthesis method and solution algorithm were developed. A singular-value-based method for determining gain and phase margins for multirate systems was also developed. The finite-time-based parameter optimization multirate sampled-data control law synthesis algorithm originally intended to be applied to the aircraft problem was instead demonstrated by application to a simpler problem involving the control of the tip position of a two-link robot arm. The GMCLS, the infinite-time-based parameter optimization multirate control law synthesis method and solution algorithm, and the singular-value-based method for determining gain and phase margins were all demonstrated by application to the aircraft control problem originally proposed for this project.
Hilke Schroeder; Richard Cronn; Yulai Yanbaev; Tara Jennings; Malte Mader; Bernd Degen; Birgit Kersten; Dusan Gomory
2016-01-01
To detect and avoid illegal logging of valuable tree species, identification methods for the origin of timber are necessary. We used next-generation sequencing to identify chloroplast genome regions that differentiate the origin of white oaks from the three continents: Asia, Europe, and North America. By using the chloroplast genome of Asian Q. mongolica...
Assessment of the floral origin of honey by SDS-page immunoblot techniques.
Baroni, María V; Chiabrando, Gustavo A; Costa, Cristina; Wunderlin, Daniel A
2002-03-13
We report on the development of a novel alternative method for the assessment of the floral origin of honey samples based on the study of honey proteins using immunoblot assays. The main goal of our work was to evaluate the use of honey proteins as chemical markers of the floral origin of honey. Considering that honeybee proteins should be common to all types of honey, we decided to verify the usefulness of pollen proteins as floral origin markers in honey. We used polyclonal anti-pollen antibodies raised in rabbits by repeated immunization with Sunflower (Helianthus annuus) and Eucalyptus (Eucalyptus sp.) pollen extracts. The IgG fraction was purified by immunoaffinity. These antibodies were verified with nitrocellulose-blotted pollen and unifloral honey protein extracts. In immunoblot assays, the anti-Sunflower pollen antibodies bound to the 36 and 33 kDa proteins of Sunflower unifloral honey and of honey containing Sunflower pollen, and the anti-Eucalyptus sp. pollen antibodies bound to the 38 kDa proteins of Eucalyptus sp. unifloral honey. Satisfactory results were obtained in differentiating between the types of pollen analyzed and between Sunflower honey and Eucalyptus honey, with little cross-reactivity with honeys of other floral origins and with good detection sensitivity. This immunoblot method opens an interesting field for the development of new antibodies from different plants, which could serve as an alternative or complementary method to the usual melissopalynological analysis to assess honey floral origin.
Investigating human geographic origins using dual-isotope (87Sr/86Sr, δ18O) assignment approaches.
Laffoon, Jason E; Sonnemann, Till F; Shafie, Termeh; Hofman, Corinne L; Brandes, Ulrik; Davies, Gareth R
2017-01-01
Substantial progress in the application of multiple isotope analyses has greatly improved the ability to identify nonlocal individuals amongst archaeological populations over the past decades. More recently the development of large scale models of spatial isotopic variation (isoscapes) has contributed to improved geographic assignments of human and animal origins. Persistent challenges remain, however, in the accurate identification of individual geographic origins from skeletal isotope data in studies of human (and animal) migration and provenance. In an attempt to develop and test more standardized and quantitative approaches to geographic assignment of individual origins using isotopic data two methods, combining 87Sr/86Sr and δ18O isoscapes, are examined for the Circum-Caribbean region: 1) an Interval approach using a defined range of fixed isotopic variation per location; and 2) a Likelihood assignment approach using univariate and bivariate probability density functions. These two methods are tested with enamel isotope data from a modern sample of known origin from Caracas, Venezuela and further explored with two archaeological samples of unknown origin recovered from Cuba and Trinidad. The results emphasize both the potential and limitation of the different approaches. Validation tests on the known origin sample exclude most areas of the Circum-Caribbean region and correctly highlight Caracas as a possible place of origin with both approaches. The positive validation results clearly demonstrate the overall efficacy of a dual-isotope approach to geoprovenance. The accuracy and precision of geographic assignments may be further improved by better understanding of the relationships between environmental and biological isotope variation; continued development and refinement of relevant isoscapes; and the eventual incorporation of a broader array of isotope proxy data.
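A simplified sketch of the likelihood-assignment idea: each candidate location carries predicted isoscape means and uncertainties, and a measured (87Sr/86Sr, δ18O) pair is scored with a bivariate normal density. The isoscape numbers below and the assumption of independent errors for the two isotope systems are illustrative only, not values from the study.

```python
import numpy as np
from scipy.stats import multivariate_normal

def location_likelihoods(sample, locations):
    """Return the bivariate-normal density of one measured (Sr, O) pair at
    each candidate location; a higher density suggests a more plausible origin."""
    scores = {}
    for name, (mu_sr, mu_o, sd_sr, sd_o) in locations.items():
        cov = np.diag([sd_sr**2, sd_o**2])  # assumes independent Sr and O errors
        scores[name] = multivariate_normal(mean=[mu_sr, mu_o], cov=cov).pdf(sample)
    return scores

# Hypothetical isoscape predictions: (mean 87Sr/86Sr, mean d18O, sd_Sr, sd_O)
locations = {
    "Caracas":  (0.7045, -4.5, 0.0005, 0.6),
    "Cuba":     (0.7080, -5.8, 0.0008, 0.7),
    "Trinidad": (0.7091, -3.9, 0.0007, 0.5),
}
print(location_likelihoods(np.array([0.7046, -4.2]), locations))
```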
Assigning African elephant DNA to geographic region of origin: Applications to the ivory trade
Wasser, Samuel K.; Shedlock, Andrew M.; Comstock, Kenine; Ostrander, Elaine A.; Mutayoba, Benezeth; Stephens, Matthew
2004-01-01
Resurgence of illicit trade in African elephant ivory is placing the elephant at renewed risk. Regulation of this trade could be vastly improved by the ability to verify the geographic origin of tusks. We address this need by developing a combined genetic and statistical method to determine the origin of poached ivory. Our statistical approach exploits a smoothing method to estimate geographic-specific allele frequencies over the entire African elephants' range for 16 microsatellite loci, using 315 tissue and 84 scat samples from forest (Loxodonta africana cyclotis) and savannah (Loxodonta africana africana) elephants at 28 locations. These geographic-specific allele frequency estimates are used to infer the geographic origin of DNA samples, such as could be obtained from tusks of unknown origin. We demonstrate that our method alleviates several problems associated with standard assignment methods in this context, and the absolute accuracy of our method is high. Continent-wide, 50% of samples were located within 500 km, and 80% within 932 km of their actual place of origin. Accuracy varied by region (median accuracies: West Africa, 135 km; Central Savannah, 286 km; Central Forest, 411 km; South, 535 km; and East, 697 km). In some cases, allele frequencies vary considerably over small geographic regions, making much finer discriminations possible and suggesting that resolution could be further improved by collection of samples from locations not represented in our study. PMID:15459317
Nakano, Keiichi; Tamura, Shogo; Otuka, Kohei; Niizeki, Noriyasu; Shigemura, Masahiko; Shimizu, Chikara; Matsuno, Kazuhiko; Kobayashi, Seiichi; Moriyama, Takanori
2013-07-15
Three-dimensional gel electrophoresis (3-DE), which combines agarose gel electrophoresis and isoelectric focusing/SDS-PAGE, was developed to characterize monoclonal proteins (M-proteins). However, the original 3-DE method has not been optimized and its specificity has not been demonstrated. The main goal of this study was to optimize the 3-DE procedure and then compare it with 2-DE. We developed a highly sensitive 3-DE method in which M-proteins are extracted from a first-dimension agarose gel, by diffusing into 150 mM NaCl, and the recovery of M-proteins was 90.6%. To validate the utility of the highly sensitive 3-DE, we compared it with the original 3-DE method. We found that highly sensitive 3-DE provided for greater M-protein recovery and was more effective in terms of detecting spots on SDS-PAGE gels than the original 3-DE. Moreover, highly sensitive 3-DE separates residual normal IgG from M-proteins, which could not be done by 2-DE. Applying the highly sensitive 3-DE to clinical samples, we found that the characteristics of M-proteins vary tremendously between individuals. We believe that our highly sensitive 3-DE method described here will prove useful in further studies of the heterogeneity of M-proteins. Copyright © 2013 Elsevier Inc. All rights reserved.
An Evolutionary Framework for Understanding the Origin of Eukaryotes.
Blackstone, Neil W
2016-04-27
Two major obstacles hinder the application of evolutionary theory to the origin of eukaryotes. The first is more apparent than real: the endosymbiosis that led to the mitochondrion is often described as "non-Darwinian" because it deviates from the incremental evolution championed by the modern synthesis. Nevertheless, endosymbiosis can be accommodated by a multi-level generalization of evolutionary theory, which Darwin himself pioneered. The second obstacle is more serious: all of the major features of eukaryotes were likely present in the last eukaryotic common ancestor thus rendering comparative methods ineffective. In addition to a multi-level theory, the development of rigorous, sequence-based phylogenetic and comparative methods represents the greatest achievement of modern evolutionary theory. Nevertheless, the rapid evolution of major features in the eukaryotic stem group requires the consideration of an alternative framework. Such a framework, based on the contingent nature of these evolutionary events, is developed and illustrated with three examples: the putative intron proliferation leading to the nucleus and the cell cycle; conflict and cooperation in the origin of eukaryotic bioenergetics; and the inter-relationship between aerobic metabolism, sterol synthesis, membranes, and sex. The modern synthesis thus provides sufficient scope to develop an evolutionary framework to understand the origin of eukaryotes.
Liachko, Ivan; Youngblood, Rachel A.; Keich, Uri; Dunham, Maitreya J.
2013-01-01
DNA replication origins are necessary for the duplication of genomes. In addition, plasmid-based expression systems require DNA replication origins to maintain plasmids efficiently. The yeast autonomously replicating sequence (ARS) assay has been a valuable tool in dissecting replication origin structure and function. However, the dearth of information on origins in diverse yeasts limits the availability of efficient replication origin modules to only a handful of species and restricts our understanding of origin function and evolution. To enable rapid study of origins, we have developed a sequencing-based suite of methods for comprehensively mapping and characterizing ARSs within a yeast genome. Our approach finely maps genomic inserts capable of supporting plasmid replication and uses massively parallel deep mutational scanning to define molecular determinants of ARS function with single-nucleotide resolution. In addition to providing unprecedented detail into origin structure, our data have allowed us to design short, synthetic DNA sequences that retain maximal ARS function. These methods can be readily applied to understand and modulate ARS function in diverse systems. PMID:23241746
Scale Development for Perceived School Climate for Girls' Physical Activity
ERIC Educational Resources Information Center
Birnbaum, Amanda S.; Evenson, Kelly R.; Motl, Robert W.; Dishman, Rod K.; Voorhees, Carolyn C.; Sallis, James F.; Elder, John P.; Dowda, Marsha
2005-01-01
Objectives: To test an original scale assessing perceived school climate for girls' physical activity in middle school girls. Methods: Confirmatory factor analysis (CFA) and structural equation modeling (SEM). Results: CFA retained 5 of 14 original items. A model with 2 correlated factors, perceptions about teachers' and boys' behaviors,…
A discussion on the origin of quantum probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel
We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.
Nuclear forensic analysis of a non-traditional actinide sample
Doyle, Jamie L.; Kuhn, Kevin John; Byerly, Benjamin; ...
2016-06-15
Nuclear forensic publications, performance tests, and research and development efforts typically target the bulk global inventory of intentionally safeguarded materials, such as plutonium (Pu) and uranium (U). Other materials, such as neptunium (Np), pose a nuclear security risk as well. Trafficking leading to recovery of an interdicted Np sample is a realistic concern, especially for materials originating in countries that reprocess fuel. Using complementary forensic methods, potential signatures for an unknown Np oxide sample were investigated. Measurement results were assessed against published Np processes to present hypotheses as to the original intended use, method of production, and origin for this Np oxide.
Nuclear forensic analysis of a non-traditional actinide sample.
Doyle, Jamie L; Kuhn, Kevin; Byerly, Benjamin; Colletti, Lisa; Fulwyler, James; Garduno, Katherine; Keller, Russell; Lujan, Elmer; Martinez, Alexander; Myers, Steve; Porterfield, Donivan; Spencer, Khalil; Stanley, Floyd; Townsend, Lisa; Thomas, Mariam; Walker, Laurie; Xu, Ning; Tandon, Lav
2016-10-01
Nuclear forensic publications, performance tests, and research and development efforts typically target the bulk global inventory of intentionally safeguarded materials, such as plutonium (Pu) and uranium (U). Other materials, such as neptunium (Np), pose a nuclear security risk as well. Trafficking leading to recovery of an interdicted Np sample is a realistic concern, especially for materials originating in countries that reprocess fuel. Using complementary forensic methods, potential signatures for an unknown Np oxide sample were investigated. Measurement results were assessed against published Np processes to present hypotheses as to the original intended use, method of production, and origin for this Np oxide. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Cheng, Jian; Yue, Huiqiang; Yu, Shengjiao; Liu, Tiegang
2018-06-01
In this paper, an adjoint-based high-order h-adaptive direct discontinuous Galerkin method is developed and analyzed for the two-dimensional steady-state compressible Navier-Stokes equations. Particular emphasis is devoted to the analysis of adjoint consistency for three different direct discontinuous Galerkin discretizations: the original direct discontinuous Galerkin method (DDG), the direct discontinuous Galerkin method with interface correction (DDG(IC)), and the symmetric direct discontinuous Galerkin method (SDDG). Theoretical analysis shows that the extra interface correction term adopted in the DDG(IC) and SDDG methods plays a key role in preserving adjoint consistency. To be specific, for the model problem considered in this work, we prove that the original DDG method is not adjoint consistent, while the DDG(IC) method and the SDDG method can be adjoint consistent with appropriate treatment of boundary conditions and correct modifications of the underlying output functionals. The performance of the three DDG methods is carefully investigated and evaluated through typical test cases. Based on the theoretical analysis, an adjoint-based h-adaptive DDG(IC) method is further developed and evaluated; numerical experiments show its potential for adjoint-based adaptation in simulating compressible flows.
[Theatre systems as a basis for developing medical and rehabilitation methods in psychiatry].
Stroganov, A E
2004-01-01
On the basis of systems that exist and are widely used in theatrical practice, a new direction in medical and rehabilitation psychiatry, transdramatherapy, was developed. The approach is illustrated by an original psychotherapeutic method of epos therapy, directed at the treatment of neurotic disorders, which has already been created and tested.
Simple and fast multiplex PCR method for detection of species origin in meat products.
Izadpanah, Mehrnaz; Mohebali, Nazanin; Elyasi Gorji, Zahra; Farzaneh, Parvaneh; Vakhshiteh, Faezeh; Shahzadeh Fazeli, Seyed Abolhassan
2018-02-01
Identification of animal species is one of the major concerns in food regulatory control and quality assurance systems. Different approaches have been used to identify the species origin of animal-derived feedstuffs. This study aimed to develop a multiplex PCR approach to detect the origin of meat and meat products. Specific primers were designed based on the conserved region of the mitochondrial Cytochrome C Oxidase subunit I (COX1) gene. This method could successfully distinguish the origin of pig, camel, sheep, donkey, goat, cow, and chicken in one single reaction. Since the PCR products derived from each species have a unique molecular weight, the amplified products could be identified by electrophoresis and analyzed based on their size. Due to the synchronized amplification of segments within a single PCR reaction, multiplex PCR is considered a simple, fast, and inexpensive technique that can be applied for the identification of meat products in the food industry. Nowadays, this technique is considered a practical method to identify species origin, which could further be applied to animal feedstuff identification.
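Because each species-specific primer pair yields an amplicon of characteristic size, calling species from the electrophoresis result amounts to matching observed band sizes against a reference table within a tolerance. The sketch below illustrates that logic only; the band sizes are hypothetical, not those of the published primer set.

```python
# Hypothetical amplicon sizes (bp) for each species-specific COX1 primer pair.
REFERENCE_BANDS = {
    "pig": 130, "camel": 170, "sheep": 210,
    "donkey": 250, "goat": 290, "cow": 330, "chicken": 370,
}

def call_species(observed_sizes_bp, tolerance_bp=10):
    """Match observed band sizes to reference amplicons within a tolerance."""
    calls = []
    for size in observed_sizes_bp:
        for species, ref in REFERENCE_BANDS.items():
            if abs(size - ref) <= tolerance_bp:
                calls.append(species)
    return sorted(set(calls))

print(call_species([128, 333]))  # ['cow', 'pig'] -> a mixed beef/pork sample
```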
Methods for Data-based Delineation of Spatial Regions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, John E.
In data analysis, it is often useful to delineate or segregate areas of interest from the general population of data in order to concentrate further analysis efforts on smaller areas. Three methods are presented here for automatically generating polygons around spatial data of interest. Each method addresses a distinct data type. These methods were developed for and implemented in the sample planning tool called Visual Sample Plan (VSP). Method A is used to delineate areas of elevated values in a rectangular grid of data (raster). The data used for this method are spatially related. Although VSP uses data from a kriging process for this method, it will work for any type of data that is spatially coherent and appears on a regular grid. Method B is used to surround areas of interest characterized by individual data points that are congregated within a certain distance of each other. Areas where data are “clumped” together spatially will be delineated. Method C is used to recreate the original boundary in a raster of data that separated data values from non-values. This is useful when a rectangular raster of data contains non-values (missing data) that indicate they were outside of some original boundary. If the original boundary is not delivered with the raster, this method will approximate the original boundary.
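A minimal sketch of the grouping step behind Method B: sample points that lie within a fixed distance of one another, directly or through intermediate points, are assigned to the same group before a polygon is drawn around each group. The union-find grouping below is an illustrative stand-in, not the algorithm VSP actually implements.

```python
import numpy as np

def group_points(points, max_dist):
    """Label points so that any two points within max_dist of each other
    (directly or through intermediate points) share a group id."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) <= max_dist:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]

pts = np.array([[0, 0], [1, 0], [0.5, 0.4], [10, 10], [10.5, 9.8]])
print(group_points(pts, max_dist=1.5))  # two groups: first three points, last two
```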
Convenient, Sensitive and High-Throughput Method for Screening Botanic Origin
NASA Astrophysics Data System (ADS)
Yuan, Yuan; Jiang, Chao; Liu, Libing; Yu, Shulin; Cui, Zhanhu; Chen, Min; Lin, Shufang; Wang, Shu; Huang, Luqi
2014-06-01
In this work, a rapid (within 4-5 h), sensitive and visible new method for assessing botanic origin is developed by combining loop-mediated isothermal amplification with cationic conjugated polymers. The two Chinese medicinal materials (Jin-Yin-Hua and Shan-Yin-Hua) with similar morphology and chemical composition were clearly distinguished by gene SNP genotyping assays. The identification of plant species in patented Chinese drugs containing Lonicera buds is successfully performed using this detection system. The method is also robust enough to be used in high-throughput screening. This new method is very helpful for identifying herbal materials and for assessing the safety and quality of botanic products.
Multi-Stage Convex Relaxation Methods for Machine Learning
2013-03-01
Many problems in machine learning can be naturally formulated as non-convex optimization problems. However, such direct nonconvex formulations have...original nonconvex formulation. We will develop theoretical properties of this method and algorithmic consequences. Related convex and nonconvex machine learning methods will also be investigated.
Speciation of animal fat: Needs and challenges.
Hsieh, Yun-Hwa Peggy; Ofori, Jack Appiah
2017-05-24
The use of pork fat is a concern for Muslims and Jews, who for religious reasons avoid consuming anything that is pig-derived. The use of bovine materials, including beef fat, is prohibited in Hinduism and may also pose a risk of carrying the infectious agent for bovine spongiform encephalopathy. Vegetable oils are sometimes adulterated with animal fats, and pork fat with beef fat, for economic gain. The development of methods to determine the species origin of fat has therefore become a priority due to the complex and global nature of the food trade, which creates opportunities for the fraudulent use of these animal fats as food ingredients. However, determining the species origin of fats in processed foods or composite blends is an arduous task because the adulterant has a composition that is very similar to that of the original fat or oil. This review examines some of the methods that have been developed for fat speciation, including both fat-based and DNA-based methods, their shortcomings, and the need for additional alternatives. Protein-based methods, specifically immunoassays targeting residual proteins in adipose tissue, that are being explored by researchers as a new tool for fat speciation will also be discussed.
Gerald Caplan: A Tribute to the Originator of Mental Health Consultation
ERIC Educational Resources Information Center
Erchul, William P.
2009-01-01
Gerald Caplan (1917-2008), world-renowned child and community psychiatrist, was the originator of the modern practice of mental health consultation. In addition to consultation, Caplan developed and refined many conceptual models and methods for practice for use in community mental health, psychology, and education. This tribute article focuses on…
Gene selection for microarray data classification via subspace learning and manifold regularization.
Tang, Chang; Cao, Lijuan; Zheng, Xiao; Wang, Minhui
2017-12-19
With the rapid development of DNA microarray technology, a large amount of genomic data has been generated. Classification of these microarray data is a challenging task since gene expression data often contain thousands of genes but only a small number of samples. In this paper, an effective gene selection method is proposed to select the best subset of genes for microarray data, with the irrelevant and redundant genes removed. Compared with the original data, the selected gene subset can benefit the classification task. We formulate the gene selection task as a manifold-regularized subspace learning problem. In detail, a projection matrix is used to project the original high-dimensional microarray data into a lower-dimensional subspace, with the constraint that the original genes can be well represented by the selected genes. Meanwhile, the local manifold structure of the original data is preserved by a Laplacian graph regularization term on the low-dimensional data space. The projection matrix can serve as an importance indicator for the different genes. An iterative update algorithm is developed for solving the problem. Experimental results on six publicly available microarray datasets and one clinical dataset demonstrate that the proposed method performs better than other state-of-the-art methods in terms of microarray data classification.
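One plausible way to write such a manifold-regularized subspace-learning objective (an illustrative formulation, not necessarily the authors' exact one), with X the gene-by-sample data matrix, W the projection matrix whose row norms rank the genes, L the graph Laplacian of a sample-similarity graph, and alpha, beta trade-off weights:

```latex
\min_{W}\ \lVert X - W W^{\top} X \rVert_F^{2}
\;+\; \alpha\, \operatorname{tr}\!\bigl(W^{\top} X L X^{\top} W\bigr)
\;+\; \beta\, \lVert W \rVert_{2,1}
```

The first term asks the selected subspace to reconstruct the original expression data, the trace term preserves the local manifold structure of the samples in the projected space, and the row-sparsity term drives whole rows of W (i.e., genes) toward zero so that the surviving rows define the selected gene subset.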
A mixed solvent system for preparation of spherically agglomerated crystals of ascorbic acid.
Ren, Fuzheng; Zhou, Yaru; Liu, Yan; Fu, Jinping; Jing, Qiufang; Ren, Guobin
2017-09-01
The objective of this research was to develop a novel solvent system to prepare spherically agglomerated crystals (SAC) of ascorbic acid with improved flowability for direct compression. A spherical agglomeration method was developed by selecting the mixed solvents (n-butyl acetate and ethyl acetate) as the poor solvent, and the process was further optimized using a triangular phase diagram and particle vision measurement. The physicochemical properties of the SAC were characterized and compared with the original drug crystals. The amount of poor solvent, the ratio of the solvent mixture, and the drug concentration proved critical for preparing SAC with desirable properties. The solid state of the SAC was the same as that of the original crystals according to DSC, XRD, and FT-IR results. There was no significant difference in drug solubility and dissolution rate between the SAC and the original crystals. The flowability and packability of the SAC, as well as the tensile strength and elastic recovery of tablets made from the SAC, were all significantly improved compared with the original crystals and the tablets made from them. It is concluded that the present method is suitable for preparing SAC of ascorbic acid for direct compression.
Chew, David S. H.; Choi, Kwok Pui; Leung, Ming-Ying
2005-01-01
Many empirical studies show that there are unusual clusters of palindromes, closely spaced direct and inverted repeats around the replication origins of herpesviruses. In this paper, we introduce two new scoring schemes to quantify the spatial abundance of palindromes in a genomic sequence. Based on these scoring schemes, a computational method to predict the locations of replication origins is developed. When our predictions are compared with 39 known or annotated replication origins in 19 herpesviruses, close to 80% of the replication origins are located within 2% of the genome length. A list of predicted locations of replication origins in all the known herpesviruses with complete genome sequences is reported. PMID:16141192
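A toy sketch of one way a palindrome-based score could work: count palindromic substrings of a minimum length inside a sliding window and flag the highest-scoring windows as candidate origin regions. The window size, minimum length, and counting rule below are illustrative and are not the two scoring schemes defined in the paper.

```python
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def is_palindrome(s):
    """DNA palindrome: the sequence equals its own reverse complement."""
    return all(COMPLEMENT.get(b) == c for b, c in zip(s, reversed(s)))

def window_scores(genome, window=2000, step=500, min_len=10):
    """Score each window by the number of palindromic substrings of length
    min_len starting inside it (a crude stand-in for the paper's schemes)."""
    scores = []
    for start in range(0, max(1, len(genome) - window), step):
        win = genome[start:start + window]
        count = sum(is_palindrome(win[i:i + min_len])
                    for i in range(len(win) - min_len + 1))
        scores.append((start, count))
    return sorted(scores, key=lambda t: t[1], reverse=True)
```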
Electrical Resistivity Imaging
Electrical resistivity imaging (ERI) is a geophysical method originally developed within the mining industry where it has been used for decades to explore for and characterize subsurface mineral deposits. It is one of the oldest geophysical methods with the first documented usag...
[Leg ulcers of venous origin and their development around the year 1955].
Marmasse, J
1984-01-01
"Eventual sclerosis of varicose veins, elastic support, methodical ambulation": the teachings of R. Tournay remain the golden rule for healing leg ulcers of venous origin. Their frequent relapse has been perceptibly reduced by the therapeutic developments of the "Sixties", notably routine phlebosurgical collaboration in many cases of varicose ulcers (Conference on Stripping, Paris, 1960), and the use of stockings calculated scientifically to benefit healed phlebitic ulcers (Van der Molen, Passien).
NASA Astrophysics Data System (ADS)
Ignat, V.
2016-08-01
Advanced industrial countries are affected by technology theft. German industry annually loses more than 50 billion euros. The main causes are industrial espionage and the fraudulent copying of patents and industrial products. Many Asian countries are profiteering, saving up to 65% of production costs. Most affected are small and medium-sized enterprises, which do not have sufficient economic power to assert themselves against some powerful countries. International organizations such as Interpol and the World Customs Organization (WCO) work together to combat international economic crime. Protection can be achieved by registering patents or by specific technical methods for recognizing product originality. More suitable protections have been developed, such as holograms, magnetic stripes, barcodes, CE marking, digital watermarks, DNA or nano-technologies, security labels, radio-frequency identification, micro color codes, matrix codes, and cryptographic encodings. The automotive industry has developed the "Manufacturers against Product Piracy" method. A sticker on the package identifies original products using a verifiable Data Matrix barcode. The code can be recorded with a smartphone camera; the smartphone is connected via the Internet to a database where the identification numbers of the original parts are stored.
Software development for teleroentgenogram analysis
NASA Astrophysics Data System (ADS)
Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.
2017-09-01
A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department. The program also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine learning (neural networks) in the software, which will make the calculation of teleroentgenograms easier because the methodological points will be placed automatically.
An Evolutionary Framework for Understanding the Origin of Eukaryotes
Blackstone, Neil W.
2016-01-01
Two major obstacles hinder the application of evolutionary theory to the origin of eukaryotes. The first is more apparent than real—the endosymbiosis that led to the mitochondrion is often described as “non-Darwinian” because it deviates from the incremental evolution championed by the modern synthesis. Nevertheless, endosymbiosis can be accommodated by a multi-level generalization of evolutionary theory, which Darwin himself pioneered. The second obstacle is more serious—all of the major features of eukaryotes were likely present in the last eukaryotic common ancestor thus rendering comparative methods ineffective. In addition to a multi-level theory, the development of rigorous, sequence-based phylogenetic and comparative methods represents the greatest achievement of modern evolutionary theory. Nevertheless, the rapid evolution of major features in the eukaryotic stem group requires the consideration of an alternative framework. Such a framework, based on the contingent nature of these evolutionary events, is developed and illustrated with three examples: the putative intron proliferation leading to the nucleus and the cell cycle; conflict and cooperation in the origin of eukaryotic bioenergetics; and the inter-relationship between aerobic metabolism, sterol synthesis, membranes, and sex. The modern synthesis thus provides sufficient scope to develop an evolutionary framework to understand the origin of eukaryotes. PMID:27128953
Grundy, H H; Reece, P; Buckley, M; Solazzo, C M; Dowle, A A; Ashford, D; Charlton, A J; Wadsley, M K; Collins, M J
2016-01-01
Gelatine is a component of a wide range of foods. It is manufactured as a by-product of the meat industry from bone and hide, mainly from bovine and porcine sources. Accurate food labelling enables consumers to make informed decisions about the food they buy. Since labelling currently relies heavily on due diligence involving a paper trail, there could be benefits in developing a reliable test method for the consumer industries in terms of the species origin of gelatine. We present a method to determine the species origin of gelatines by peptide mass spectrometry methods. An evaluative comparison is also made with ELISA and PCR technologies. Commercial gelatines were found to contain undeclared species. Furthermore, undeclared bovine peptides were observed in commercial injection matrices. This analytical method could therefore support the food industry in terms of determining the species authenticity of gelatine in foods. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Danch, J. M.
2008-12-01
Originally designed to allow secondary students with special needs to participate in original scientific research, the Methods of Science Curriculum was piloted in 2008. Students participating included those with special needs, English language learners, and the general population. Students were incrementally graduated from traditional inquiry activities towards authentic student-generated research projects. Students were evaluated via class work grades, an in-school symposium and a pre/post test. 100 percent of participants successfully completed and presented their original research. The pre/post evaluation demonstrated improvement for 91 percent of participants. An unanticipated result was the performance and growth of English language learners, possibly because of the emphasis on the creative and active process of science rather than vocabulary. A teacher-training program is being developed for expansion of the curriculum to additional schools in 2009.
NASA Astrophysics Data System (ADS)
Zhou, Chuan; Chan, Heang-Ping; Chightai, Aamer; Wei, Jun; Hadjiiski, Lubomir M.; Agarwal, Prachi; Kuriakose, Jean W.; Kazerooni, Ella A.
2013-03-01
Automatic tracking and segmentation of the coronary arterial tree is the basic step for computer-aided analysis of coronary disease. The goal of this study is to develop an automated method to identify the origins of the left coronary artery (LCA) and right coronary artery (RCA) as the seed points for tracking of the coronary arterial trees. The heart region and the contrast-filled structures in the heart region are first extracted using morphological operations and EM estimation. To identify the ascending aorta, we developed a new multiscale aorta search (MAS) method in which the aorta is identified based on a priori knowledge of its circular shape. Because the shape of the ascending aorta in the cCTA axial view is roughly a circle but its size can vary over a wide range for different patients, multiscale circular-shape priors are used to search for the best matching circular object in each CT slice, guided by the Hausdorff distance (HD) as the matching indicator. The location of the aorta is identified by finding the minimum HD in the heart region over the set of multiscale circular priors. An adaptive region growing method is then used to extend the initially identified aorta down to the aortic valves. The origins at the aortic sinus are finally identified by a morphological gray-level top-hat operation applied to the region-grown aorta, with a morphological structuring element designed for coronary arteries. For the 40 test cases, the aorta was correctly identified in 38 cases (95%). The aorta was grown to the aortic root in 36 cases, and 36 LCA origins and 34 RCA origins were identified within 10 mm of the locations marked by radiologists.
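As an illustration of the multiscale circular-prior matching step, the hedged sketch below scores candidate circles against edge points with the symmetric Hausdorff distance. The candidate centers, radii, and the pixel_mm spacing are assumed parameters, and the authors' actual implementation may differ.

```python
# Illustrative sketch (not the authors' implementation): search one CT slice for
# the circle, over a range of radii, that best matches extracted edge points,
# scored by the symmetric Hausdorff distance.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def circle_template(cx, cy, r, n=72):
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return np.column_stack([cx + r * np.cos(t), cy + r * np.sin(t)])

def best_circle(edge_points, centers, radii_mm, pixel_mm=0.7):
    """edge_points: (N, 2) edge pixel coordinates in one slice.
    centers: candidate (cx, cy) positions; radii_mm: multiscale circular priors."""
    best = (np.inf, None)
    for cx, cy in centers:
        for r in radii_mm:
            tmpl = circle_template(cx, cy, r / pixel_mm)
            # Symmetric Hausdorff distance between template and edge points
            hd = max(directed_hausdorff(tmpl, edge_points)[0],
                     directed_hausdorff(edge_points, tmpl)[0])
            if hd < best[0]:
                best = (hd, (cx, cy, r))
    return best  # minimum HD and the matching circle (center, radius)
```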
An integrated bioanalytical method development and validation approach: case studies.
Xue, Y-J; Melo, Brian; Vallejo, Martha; Zhao, Yuwen; Tang, Lina; Chen, Yuan-Shek; Keller, Karin M
2012-10-01
We proposed an integrated bioanalytical method development and validation approach: (1) method screening based on the analyte's physicochemical properties and metabolism information to determine the most appropriate extraction/analysis conditions; (2) preliminary stability evaluation using both quality control and incurred samples to establish sample collection, storage and processing conditions; (3) mock validation to examine method accuracy and precision and incurred sample reproducibility; and (4) method validation to confirm the results obtained during method development. This integrated approach was applied to the determination of compound I in rat plasma and compound II in rat and dog plasma. The effectiveness of the approach was demonstrated by the superior quality of three method validations: (1) a zero run failure rate; (2) >93% of quality control results within 10% of nominal values; and (3) 99% of incurred samples within 9.2% of the original values. In addition, rat and dog plasma methods for compound II were successfully applied to analyze more than 900 plasma samples obtained from Investigational New Drug (IND) toxicology studies in rats and dogs with near perfect results: (1) a zero run failure rate; (2) excellent accuracy and precision for standards and quality controls; and (3) 98% of incurred samples within 15% of the original values. Copyright © 2011 John Wiley & Sons, Ltd.
Bello, Alessandra; Bianchi, Federica; Careri, Maria; Giannetto, Marco; Mori, Giovanni; Musci, Marilena
2007-11-05
A new NIR method based on multivariate calibration for the determination of ethanol in industrially packed wholemeal bread was developed and validated. GC-FID was used as the reference method for determining the actual ethanol concentration of different samples of wholemeal bread with known amounts of added ethanol, ranging from 0 to 3.5% (w/w). Stepwise discriminant analysis was carried out on the NIR dataset in order to reduce the number of original variables by selecting those able to discriminate between samples of different ethanol concentrations. With the selected variables, a multivariate calibration model was then obtained by multiple linear regression. The prediction power of the linear model was optimized by a new "leave one out" method, so that the number of original variables was further reduced.
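A minimal sketch of the calibration and leave-one-out evaluation steps is given below, assuming the discriminant-based variable selection has already produced the reduced matrix X_selected; the function name and the RMSECV summary are illustrative rather than the authors' exact procedure.

```python
# Hedged sketch: fit multiple linear regression on selected NIR variables and
# estimate prediction power by leave-one-out cross-validation.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def loo_calibration(X_selected, y_ethanol):
    """X_selected: samples x selected NIR variables; y_ethanol: % (w/w) by GC-FID."""
    model = LinearRegression()
    y_pred = cross_val_predict(model, X_selected, y_ethanol, cv=LeaveOneOut())
    rmsecv = np.sqrt(np.mean((y_pred - y_ethanol) ** 2))  # leave-one-out error
    model.fit(X_selected, y_ethanol)                       # final calibration
    return model, rmsecv
```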
Convenient, sensitive and high-throughput method for screening botanic origin.
Yuan, Yuan; Jiang, Chao; Liu, Libing; Yu, Shulin; Cui, Zhanhu; Chen, Min; Lin, Shufang; Wang, Shu; Huang, Luqi
2014-06-23
In this work, a rapid (within 4-5 h), sensitive and visible new method for assessing botanic origin is developed by combining loop-mediated isothermal amplification with cationic conjugated polymers. Two Chinese medicinal materials (Jin-Yin-Hua and Shan-Yin-Hua) with similar morphology and chemical composition were clearly distinguished by SNP genotyping assays. Identification of the plant species in patented Chinese drugs containing Lonicera buds was successfully performed using this detection system. The method is also robust enough to be used in high-throughput screening. This new method is very helpful for identifying herbal materials and is beneficial for assessing the safety and quality of botanic products.
MICROBIAL SOURCE TRACKING: DIFFERENT USES AND APPROACHES
Microbial Source Tracking (MST) methods are used to determine the origin of fecal pollution impacting natural water systems. Several methods require the isolation of pure cultures in order to develop phenotypic or genotypic fingerprint libraries of both source and water bacterial...
NASA Astrophysics Data System (ADS)
Ischak, M.; Setioko, B.; Nurgandarum, D.
2018-01-01
The current growth trend of Jakarta as a metropolitan city is the construction of large-scale planned settlements, often referred to as new towns, carried out by major developers. The processes of land tenure and new town construction directly border the pre-existing original settlements and shape the pattern, or types, of original settlements in the context of their relationship with the new town. This research was intended to measure the scale of sustainability resulting from land expansion by new town developers, measured from the perspective of the original settlers who still remain. The research method used was descriptive and explorative: sustainability criteria that best match the research context were formulated and used as a tool to measure the sustainability level of new town development at the research site, the new town of Gading Serpong, Tangerang. The research concludes that, despite the apparent displacement and restriction of original settlements' lands, new town development overall meets the sustainability criteria when viewed from the residents of the three types of original settlements.
A GPU-accelerated implicit meshless method for compressible flows
NASA Astrophysics Data System (ADS)
Zhang, Jia-Le; Ma, Zhi-Hua; Chen, Hong-Quan; Cao, Cheng
2018-05-01
This paper develops a recently proposed GPU based two-dimensional explicit meshless method (Ma et al., 2014) by devising and implementing an efficient parallel LU-SGS implicit algorithm to further improve the computational efficiency. The capability of the original 2D meshless code is extended to deal with 3D complex compressible flow problems. To resolve the inherent data dependency of the standard LU-SGS method, which causes thread-racing conditions destabilizing numerical computation, a generic rainbow coloring method is presented and applied to organize the computational points into different groups by painting neighboring points with different colors. The original LU-SGS method is modified and parallelized accordingly to perform calculations in a color-by-color manner. The CUDA Fortran programming model is employed to develop the key kernel functions to apply boundary conditions, calculate time steps, evaluate residuals as well as advance and update the solution in the temporal space. A series of two- and three-dimensional test cases including compressible flows over single- and multi-element airfoils and a M6 wing are carried out to verify the developed code. The obtained solutions agree well with experimental data and other computational results reported in the literature. Detailed analysis on the performance of the developed code reveals that the developed CPU based implicit meshless method is at least four to eight times faster than its explicit counterpart. The computational efficiency of the implicit method could be further improved by ten to fifteen times on the GPU.
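A generic sketch of the coloring idea follows: a greedy graph coloring assigns colors so that neighboring points never share one, which is the property the rainbow coloring in the paper relies on; the specific coloring algorithm used by the authors is not reproduced here.

```python
# Generic sketch of the idea behind rainbow coloring: points of one color have
# no data dependency among themselves and can be updated in parallel within a
# color-by-color LU-SGS sweep, avoiding thread races.
def greedy_coloring(neighbors):
    """neighbors: dict mapping point index -> iterable of neighboring point indices.
    Returns dict point -> color (0, 1, 2, ...)."""
    color = {}
    for p in sorted(neighbors):              # any deterministic ordering works
        used = {color[q] for q in neighbors[p] if q in color}
        c = 0
        while c in used:
            c += 1
        color[p] = c
    return color

# Points are then processed color by color: all points of color 0 in one parallel
# kernel launch, then color 1, and so on.
```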
NASA Technical Reports Server (NTRS)
Murphy, Patrick C. (Technical Monitor); Klein, Vladislav
2005-01-01
The program objectives are fully defined in the original proposal entitled Program of Research in Flight Dynamics in GW at NASA Langley Research Center, which originated on March 20, 1975, and in the renewals of the research program from January 1, 2003 to September 30, 2005. The program in its present form includes three major topics: 1. the improvement of existing methods and the development of new methods for wind tunnel and flight data analysis, 2. the application of these methods to wind tunnel and flight test data obtained from advanced airplanes, 3. the correlation of flight results with wind tunnel measurements and theoretical predictions.
An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle
NASA Astrophysics Data System (ADS)
Wang, Yue; Gao, Dan; Mao, Xuming
2018-03-01
A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and validation method for the requirements are introduced in detail, with the hope of providing experience for other civil jet product designs.
Using the Saturn V and Titan III Vibroacoustic Databanks for Random Vibration Criteria Development
NASA Technical Reports Server (NTRS)
Ferbee, R C.
2009-01-01
This is an update to TN D-7159, "Development and Application of Vibroacoustic Structural Data Banks in Predicting Vibration Design and Test Criteria for Rocket Vehicle Structures", which was originally published in 1973. Errors in the original document have been corrected and additional data from the Titan III program have been included. Methods for using the vibroacoustic databanks for vibration test criteria development are shown, as well as all of the data with drawings and pictures of the measurement locations. An Excel spreadsheet with the data included is available from the author.
On the Effectiveness of Security Countermeasures for Critical Infrastructures.
Hausken, Kjell; He, Fei
2016-04-01
A game-theoretic model is developed where an infrastructure of N targets is protected against terrorism threats. An original threat score is determined by the terrorist's threat against each target and the government's inherent protection level and original protection. The final threat score is impacted by the government's additional protection. We investigate and verify the effectiveness of countermeasures using empirical data and two methods. The first is to estimate the model's parameter values to minimize the sum of the squared differences between the government's additional resource investment predicted by the model and the empirical data. The second is to develop a multivariate regression model where the final threat score varies approximately linearly relative to the original threat score, sectors, and threat scenarios, and depends nonlinearly on the additional resource investment. The model and method are offered as tools, and as a way of thinking, to determine optimal resource investments across vulnerable targets subject to terrorism threats. © 2014 Society for Risk Analysis.
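The first estimation method can be sketched as an ordinary least-squares fit of the model parameters; model_investment below is a hypothetical stand-in for the game-theoretic model's prediction function, not the authors' code.

```python
# Illustrative sketch of the first estimation method: choose model parameters
# that minimize the sum of squared differences between model-predicted and
# empirical additional resource investments.
import numpy as np
from scipy.optimize import least_squares

def fit_parameters(theta0, threat_scores, observed_investment, model_investment):
    """model_investment(theta, threat_scores) -> predicted investment per target."""
    def residuals(theta):
        return model_investment(theta, threat_scores) - observed_investment
    result = least_squares(residuals, theta0)
    return result.x, np.sum(result.fun ** 2)  # fitted parameters, residual sum of squares
```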
Wang, Chuanxian; Qu, Li; Liu, Xia; Zhao, Chaomin; Zhao, Fengjuan; Huang, Fuzhen; Zhu, Zhenou; Han, Chao
2017-02-01
An analytical method has been developed for the detection of a metabolite of nifursol, 3,5-dinitrosalicylic acid hydrazide, in foodstuffs of animal origin (chicken liver, pork liver, lobster, shrimp, eel, sausage, and honey). The method combines liquid chromatography and tandem mass spectrometry with liquid-liquid extraction. Samples were hydrolyzed with hydrochloric acid and derivatized with 2-nitrobenzaldehyde at 37°C for 16 h. The solutions of derivatives were adjusted to pH 7.0-7.5, and the metabolite was extracted with ethyl acetate. 3,5-Dinitrosalicylic acid hydrazide determination was performed in negative electrospray ionization mode. Both isotope-labeled internal standard and matrix-matched calibration solutions were used to correct for matrix effects. Limits of quantification were 0.5 μg/kg for all samples. The average recoveries, measured at three concentration levels (0.5, 2.0, and 10 μg/kg), were in the range of 75.8-108.4% with relative standard deviations below 9.8%. The developed method exhibits high sensitivity and selectivity for the routine determination and confirmation of the presence of a metabolite of nifursol in foodstuffs of animal origin. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Quasicontinuum Method: Overview, applications and current directions
NASA Astrophysics Data System (ADS)
Miller, Ronald E.; Tadmor, E. B.
2002-10-01
The Quasicontinuum (QC) Method, originally conceived and developed by Tadmor, Ortiz and Phillips [1] in 1996, has since seen a great deal of development and application by a number of researchers. The idea of the method is a relatively simple one. With the goal of modeling an atomistic system without explicitly treating every atom in the problem, the QC provides a framework whereby degrees of freedom are judiciously eliminated and force/energy calculations are expedited. This is combined with adaptive model refinement to ensure that full atomistic detail is retained in regions of the problem where it is required while continuum assumptions reduce the computational demand elsewhere. This article provides a review of the method, from its original motivations and formulation to recent improvements and developments. A summary of the important mechanics of materials results that have been obtained using the QC approach is presented. Finally, several related modeling techniques from the literature are briefly discussed. As an accompaniment to this paper, a website designed to serve as a clearinghouse for information on the QC method has been established at www.qcmethod.com. The site includes information on QC research, links to researchers, downloadable QC code and documentation.
Guo, Xiuhan; Cai, Rui; Wang, Shisheng; Tang, Bo; Li, Yueqing; Zhao, Weijie
2018-01-01
Sea cucumber is the major tonic seafood worldwide, and geographical origin traceability is an important part of its quality and safety control. In this work, a non-destructive method for origin traceability of sea cucumber (Apostichopus japonicus) from northern China Sea and East China Sea using near infrared spectroscopy (NIRS) and multivariate analysis methods was proposed. Total fat contents of 189 fresh sea cucumber samples were determined and partial least-squares (PLS) regression was used to establish the quantitative NIRS model. The ordered predictor selection algorithm was performed to select feasible wavelength regions for the construction of PLS and identification models. The identification model was developed by principal component analysis combined with Mahalanobis distance and scaling to the first range algorithms. In the test set of the optimum PLS models, the root mean square error of prediction was 0.45, and correlation coefficient was 0.90. The correct classification rates of 100% were obtained in both identification calibration model and test model. The overall results indicated that NIRS method combined with chemometric analysis was a suitable tool for origin traceability and identification of fresh sea cucumber samples from nine origins in China.
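The two chemometric building blocks named in the abstract, PLS regression for fat content and PCA with Mahalanobis distance for origin identification, can be sketched as follows; component counts and preprocessing are assumptions, and the ordered predictor selection step is omitted.

```python
# Hedged sketch of the two chemometric steps: PLS regression of total fat on NIR
# spectra, and origin identification via PCA scores with Mahalanobis distance.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

def fit_fat_model(spectra, fat, n_components=8):
    pls = PLSRegression(n_components=n_components).fit(spectra, fat)
    rmse = np.sqrt(np.mean((pls.predict(spectra).ravel() - fat) ** 2))
    return pls, rmse

def mahalanobis_classifier(spectra, origins, n_pc=5):
    """Return a function assigning a new spectrum to the nearest origin class in
    PCA-score space (assumes enough samples per class to invert the covariance)."""
    pca = PCA(n_components=n_pc).fit(spectra)
    scores = pca.transform(spectra)
    stats = {}
    for o in np.unique(origins):
        s = scores[origins == o]
        stats[o] = (s.mean(axis=0), np.linalg.inv(np.cov(s, rowvar=False)))
    def classify(spectrum):
        z = pca.transform(spectrum.reshape(1, -1))[0]
        d = {o: float((z - mu) @ icov @ (z - mu)) for o, (mu, icov) in stats.items()}
        return min(d, key=d.get)  # smallest squared Mahalanobis distance
    return classify
```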
PPM mixtures of formaldehyde in gas cylinders: Stability and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, K.C.; Miller, S.B.; Patterson, L.M.
1999-07-01
Scott Specialty Gases has been successful in producing stable calibration gases of formaldehyde at low concentration. Critical to this success has been the development of a treatment process for high-pressure aluminum cylinders. Formaldehyde cylinders having concentrations of 20 ppm and 4 ppm were found to show only a small decline in concentration over a period of approximately 12 months. Since no NIST-traceable formaldehyde standards (or Standard Reference Materials) are available, all Scott's formaldehyde cylinders were originally certified by the traditional impinger method. This method involves an extremely tedious purification procedure for 2,4-dinitrophenylhydrazine (2,4-DNPH). A modified version of the impinger method has been developed that does not require extensive reagent purification for formaldehyde analysis. Extremely low formaldehyde blanks have been obtained with the modified method. The HPLC conditions in the original method were used for chromatographic separations. The modified method results in a lower analytical uncertainty for the formaldehyde standard mixtures. Consequently, it is possible to discern small differences between analytical results that are important for stability studies.
Mapping replication origins in yeast chromosomes.
Brewer, B J; Fangman, W L
1991-07-01
The replicon hypothesis, first proposed in 1963 by Jacob and Brenner, states that DNA replication is controlled at sites called origins. Replication origins have been well studied in prokaryotes. However, the study of eukaryotic chromosomal origins has lagged behind, because until recently there has been no method for reliably determining the identity and location of origins from eukaryotic chromosomes. Here, we review a technique we developed with the yeast Saccharomyces cerevisiae that allows both the mapping of replication origins and an assessment of their activity. Two-dimensional agarose gel electrophoresis and Southern hybridization with total genomic DNA are used to determine whether a particular restriction fragment acquires the branched structure diagnostic of replication initiation. The technique has been used to localize origins in yeast chromosomes and assess their initiation efficiency. In some cases, origin activation is dependent upon the surrounding context. The technique is also being applied to a variety of eukaryotic organisms.
Simplified Least Squares Shadowing sensitivity analysis for chaotic ODEs and PDEs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chater, Mario, E-mail: chaterm@mit.edu; Ni, Angxiu, E-mail: niangxiu@mit.edu; Wang, Qiqi, E-mail: qiqi@mit.edu
This paper develops a variant of the Least Squares Shadowing (LSS) method, which has successfully computed the derivative for several chaotic ODEs and PDEs. The development in this paper aims to simplify the Least Squares Shadowing method by improving how time dilation is treated. Instead of adding an explicit time dilation term as in the original method, the new variant uses windowing, which can be more efficient and simpler to implement, especially for PDEs.
The Clawpack Community of Codes
NASA Astrophysics Data System (ADS)
Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.
2014-12-01
Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws, but over the years it has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely through the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since become both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.
Tang, Jin-Fa; Li, Wei-Xia; Zhang, Fan; Li, Yu-Hui; Cao, Ying-Jie; Zhao, Ya; Li, Xue-Lin; Ma, Zhi-Jie
2017-01-01
Nowadays, Radix Polygoni Multiflori (RPM, Heshouwu in Chinese) from different geographical origins is used in the clinic. In order to characterize the chemical profiles of RPM samples of different geographical origins, ultra-high performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-QTOF/MS) combined with a chemometrics method (partial least squares discriminant analysis, PLS-DA) was applied in the present study. The chromatograms, chemical composition and MS information of RPM samples from 18 geographical origins were acquired and profiled by UPLC-QTOF/MS. The chemical markers contributing to the differentiation of RPM samples were observed and characterized by the supervised PLS-DA method. Differences in chemical composition among the RPM samples derived from the 18 geographical origins were observed. Nine chemical markers were tentatively identified which could be used as specific chemical markers for the differentiation of geographical RPM samples. The UPLC-QTOF/MS method coupled with chemometric analysis has the potential to be used for discriminating TCMs of different geographical origins. The results will help to develop strategies for the conservation and utilization of RPM samples.
Color preservation for tone reproduction and image enhancement
NASA Astrophysics Data System (ADS)
Hsin, Chengho; Lee, Zong Wei; Lee, Zheng Zhan; Shin, Shaw-Jyh
2014-01-01
Applications based on luminance processing often face the problem of recovering the original chrominance in the output color image. A common approach to reconstruct a color image from the luminance output is by preserving the original hue and saturation. However, this approach often produces a highly colorful image which is undesirable. We develop a color preservation method that not only retains the ratios of the input tri-chromatic values but also adjusts the output chroma in an appropriate way. Linearizing the output luminance is the key idea to realize this method. In addition, a lightness difference metric together with a colorfulness difference metric are proposed to evaluate the performance of the color preservation methods. It shows that the proposed method performs consistently better than the existing approaches.
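The hue/saturation-preserving baseline that the authors improve upon can be sketched with a simple ratio-based reconstruction; the exponent s used here to temper colorfulness is an illustrative device and not the paper's specific chroma adjustment.

```python
# Generic sketch of ratio-based color preservation (not the authors' exact method):
# rebuild an output RGB image from a processed luminance channel by scaling the
# input tri-chromatic values, with an exponent s < 1 to tame excessive colorfulness.
import numpy as np

def recolor(rgb_in, lum_in, lum_out, s=0.7, eps=1e-6):
    """rgb_in: HxWx3 input image; lum_in/lum_out: input and tone-mapped luminance."""
    ratio = rgb_in / (lum_in[..., None] + eps)     # per-channel chromatic ratios
    rgb_out = (ratio ** s) * lum_out[..., None]    # s = 1 preserves hue/saturation exactly
    return np.clip(rgb_out, 0.0, 1.0)
```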
Summary of Technical Operations, 1991
1992-01-01
...exploit commonality. The project is using the Feature-Oriented Domain Analysis (FODA) method, developed by the project in 1990, to perform this... the development of new movement control software. The analysis will also serve as a means of improving the FODA method. The results of this analysis... STARS environment. The NASA Program Office has officially decided to expand the use of Rate Monotonic Analysis (RMA), which was originally isolated to...
Liu, Derek; Sloboda, Ron S
2014-05-01
Boyer and Mok proposed a fast calculation method employing the Fourier transform (FT), for which calculation time is independent of the number of seeds but seed placement is restricted to calculation grid points. Here, an interpolation method is described that enables unrestricted seed placement while preserving the computational efficiency of the original method. The Iodine-125 seed dose kernel was sampled and selected values were modified to optimize interpolation accuracy for clinically relevant doses. For each seed, the kernel was shifted to the nearest grid point via convolution with a unit impulse, implemented in the Fourier domain. The remaining fractional shift was performed using a piecewise third-order Lagrange filter. Implementation of the interpolation method greatly improved FT-based dose calculation accuracy. The dose distribution was accurate to within 2% beyond 3 mm from each seed. Isodose contours were indistinguishable from explicit TG-43 calculation. Dose-volume metric errors were negligible. Computation time for the FT interpolation method was essentially the same as Boyer's method. An FT interpolation method for permanent prostate brachytherapy TG-43 dose calculation was developed which expands upon Boyer's original method and enables unrestricted seed placement. The proposed method substantially improves the clinically relevant dose accuracy with negligible additional computation cost, preserving the efficiency of the original method.
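A one-dimensional sketch of the two-stage shift may help: the integer part is applied as a Fourier-domain multiplication by the transform of a unit impulse, and the remaining fraction by a 4-tap third-order Lagrange filter. The circular boundary handling and tap layout here are assumptions for illustration; the paper applies the idea to the seed dose kernel in 3D.

```python
# 1-D illustration of the shift described in the abstract (not the authors' code).
import numpy as np

def shift_kernel(kernel, shift):
    """Shift a 1-D kernel by `shift` samples: integer part via FFT, fraction via Lagrange."""
    n = len(kernel)
    n_int = int(np.floor(shift))
    frac = shift - n_int

    # Integer part: multiply by the Fourier transform of a unit impulse at n_int,
    # equivalent to a circular shift by n_int samples.
    impulse = np.zeros(n)
    impulse[n_int % n] = 1.0
    k_int = np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(impulse)))

    # Fractional part: evaluate k_int at positions j - frac with a 4-tap
    # (third-order) Lagrange interpolation filter.
    taps = np.arange(-2, 2)
    u = -frac
    h = np.array([np.prod([(u - m) / (t - m) for m in taps if m != t]) for t in taps])
    padded = np.pad(k_int, (2, 1), mode="wrap")
    return sum(h[i] * padded[2 + taps[i]: 2 + taps[i] + n] for i in range(4))
```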
Fabrication Process for Large Size Mold and Alignment Method for Nanoimprint System
NASA Astrophysics Data System (ADS)
Ishibashi, Kentaro; Kokubo, Mitsunori; Goto, Hiroshi; Mizuno, Jun; Shoji, Shuichi
Nanoimprint technology is considered a mass-production method for displays in cellular phones and notebook computers, for example to form anti-reflection structure (ARS) patterns. A large mold with nanometer-order patterns is therefore essential. We describe the fabrication process for a large mold and the alignment method for a UV nanoimprint system. We developed an original mold fabrication process using nanoimprint and etching techniques; seamless patterns with a 200 nm period were formed over a 66 × 45 mm2 area. Because the accuracy of pattern connection depends on the alignment method, we also constructed an original alignment system consisting of a CCD camera system, an X-Y-θ table, moiré-fringe measurement, and image processing. The accuracy of this alignment system was within 20 nm.
An efficient study design to test parent-of-origin effects in family trios.
Yu, Xiaobo; Chen, Gao; Feng, Rui
2017-11-01
Increasing evidence has shown that genes may cause prenatal, neonatal, and pediatric diseases depending on their parental origins. Statistical models that incorporate parent-of-origin effects (POEs) can improve the power of detecting disease-associated genes and help explain the missing heritability of diseases. In many studies, children have been sequenced for genome-wide association testing, but it may be unaffordable to sequence their parents and evaluate POEs. Motivated by this reality, we proposed a budget-friendly study design of sequencing children and only genotyping their parents with single nucleotide polymorphism arrays. We developed a powerful likelihood-based method, which takes into account both sequence reads and linkage disequilibrium to infer the parental origins of children's alleles and estimate their POEs on the outcome. We evaluated the performance of our proposed method and compared it with an existing method using only genotypes, through extensive simulations. Our method showed higher power than the genotype-based method. When either the mean read depth or the paired-end read length was reasonably large, our method achieved ideal power. When single parents' genotypes were unavailable or parental genotypes at the testing locus were not typed, both methods lost power compared with when complete data were available, but the power loss from our method was smaller than that from the genotype-based method. We also extended our method to accommodate mixed genotype, low-, and high-coverage sequence data from children and their parents. In the presence of sequence errors, low-coverage parental sequence data may lead to lower power than parental genotype data. © 2017 WILEY PERIODICALS, INC.
Neurobehavioral Assessment before Birth
ERIC Educational Resources Information Center
DiPietro, Janet A.
2005-01-01
The complexities of neurobehavioral assessment of the fetus, which can be neither directly viewed nor manipulated, cannot be overstated. Impetus to develop methods for measuring fetal neurobehavioral development has been provided by the recognition that individual differences in neurobehavioral functioning do not originate with birth and…
ERIC Educational Resources Information Center
Adamson, Charles
A two-semester English-for-Special-Purposes (ESP) course designed for nursing students at a Japanese university is described, including the origins and development of the course, text development, and teaching methods. The content-based course was designed to meet licensure requirements for English language training, emphasized listening and…
Questionnaire Adapting: Little Changes Mean a Lot.
Sousa, Vanessa E C; Matson, Jeffrey; Dunn Lopez, Karen
2017-09-01
Questionnaire development involves rigorous testing to ensure reliability and validity. Due to time and cost constraints of developing new questionnaires, researchers often adapt existing questionnaires to better fit the purpose of their study. However, the effect of such adaptations is unclear. We conducted cognitive interviews as a method to evaluate the understanding of original and adapted questionnaire items to be applied in a future study. The findings revealed that all subjects (a) comprehended the original and adapted items differently, (b) changed their scores after comparing the original to the adapted items, and (c) were unanimous in stating that the adapted items were easier to understand. Cognitive interviewing allowed us to assess the interpretation of adapted items in a useful and efficient manner before use in data collection.
RNA catalysis and the origins of life
NASA Technical Reports Server (NTRS)
Orgel, Leslie E.
1986-01-01
The role of RNA catalysis in the origins of life is considered in connection with the discovery of ribozymes, which are RNA molecules that catalyze sequence-specific hydrolysis and transesterification reactions of RNA substrates. Due to this discovery, theories positing protein-free replication as preceding the appearance of the genetic code are more plausible. The scope of RNA catalysis in biology and chemistry is discussed, and it is noted that the development of methods to select (or predict) RNA sequences with preassigned catalytic functions would be a major contribution to the study of life's origins.
The Finnish healthcare services lean management.
Hihnala, Susanna; Kettunen, Lilja; Suhonen, Marjo; Tiirinki, Hanna
2018-02-05
Purpose: The purpose of this paper is to discuss health services managers' experiences of management in a special health-care unit and development efforts from the point of view of the Lean method. Additionally, the aim is to deepen the knowledge of the managers' work and the nature of Lean method development processes in the workplace. The research focuses on those aspects and results of the Lean method that are currently being used in health-care environments. Design/methodology/approach: The data were collected through a number of thematic interviews. The participants were nurse managers (n = 7) and medical managers (n = 7) who applied Lean management in their work at the University Hospital in the Northern Ostrobothnia Health Care District. The data were analysed with a qualitative content analysis. Findings: A common set of values in specialized health-care services, development of activities, and challenges for management in the use of the Lean manager development model to improve personal management skills. Practical implications: Managers in specialized health-care services can develop and systematically manage with the help of the Lean method. This emphasizes assumptions, from the point of view of management, about systems development when the organization uses the Lean method. The research outcomes originate from specialized health-care settings in Finland in which the Lean method and its associated management principles have been implemented and applied to the delivery of health care. Originality/value: The study shows that the research results and in-depth knowledge of Lean method principles can be applied to health-care management and development processes. The research also describes health services managers' experiences of using the Lean method. In the future, these results can be used to improve Lean management skills, identify personal professional competencies and develop skills required in development processes. The research findings can also be used in the training of health services managers in the health-care industry worldwide and to help them cope with the pressure to change repeatedly.
Chaillet, Nils; Nyström, Marjatta; Demirjian, Arto
2005-09-01
Dental maturity was studied with 9577 dental panoramic tomograms of healthy subjects from 8 countries, aged between 2 and 25 years. Demirjian's method based on 7 teeth was used for determining dental maturity scores and for establishing gender-specific tables of maturity scores and development graphs. The aim of this study was to provide dental maturity standards for cases in which the ethnic origin is unknown and to compare the efficiency and applicability of this method for forensic science and clinical dentistry. The second aim was to compare the dental maturity of these different populations. We noted a high efficiency for the international Demirjian method at the 99% CI (0.85% misclassified and a mean accuracy of +/- 2.15 years between 2 and 18 years), which makes it useful for forensic purposes. Nevertheless, this international method is less accurate than a Demirjian method developed for a specific country, because of the inter-ethnic variability introduced by combining 8 countries in the dental database. There are inter-ethnic differences, classified into three major groups: Australians have the fastest dental maturation and Koreans the slowest.
77 FR 28882 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-16
... innovative methods, techniques, and approaches for dealing with occupational safety and health problems... ICR was not submitted to OMB before the original expiration date. Since the Right to Know movement in the late 1970s, NIOSH has been developing methods and materials to notify subjects of its...
Reliability of Test Scores in Nonparametric Item Response Theory.
ERIC Educational Resources Information Center
Sijtsma, Klaas; Molenaar, Ivo W.
1987-01-01
Three methods for estimating reliability are studied within the context of nonparametric item response theory. Two were proposed originally by Mokken and a third is developed in this paper. Using a Monte Carlo strategy, these three estimation methods are compared with four "classical" lower bounds to reliability. (Author/JAZ)
78 FR 57860 - Draft NIH Genomic Data Sharing Policy Request for Public Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
... underlying disease, development of statistical research methods, the study of populations origins). If so... community will be notified through appropriate communication methods (e.g., The NIH Guide for Grants and... Sharing Policy Request for Public Comments SUMMARY: The National Institutes of Health (NIH) is seeking...
Educational Methods for Deaf-Blind and Severely Handicapped Students. Volume II.
ERIC Educational Resources Information Center
Texas Education Agency, Austin. Div. of Special Education.
The book presents 17 papers originally presented at a 1978 Texas conference on educational methods for deaf-blind and severely handicapped students, covering the areas of motor development, auditory assessment and hearing loss, communication, cognition, prevocational training, and functional living skills. Titles and authors include "An Overview…
Two-step extraction method for lead isotope fractionation to reveal anthropogenic lead pollution.
Katahira, Kenshi; Moriwaki, Hiroshi; Kamura, Kazuo; Yamazaki, Hideo
2018-05-28
This study developed a two-step extraction method that elutes the Pb adsorbed on the surface of sediment particles into a first solution with aqua regia and extracts the Pb absorbed inside the particles into a second solution with a mixed acid of nitric acid, hydrofluoric acid and hydrogen peroxide. We applied the method to sediments in an enclosed water area and found that the isotope ratios of Pb in the second solution represented those of natural origin. This advantage of the method makes it possible to distinguish Pb of natural origin from that of anthropogenic sources on the basis of the isotope ratios. The results showed that the method was useful for discussing Pb sources and that the anthropogenic Pb in the sediment samples analysed was mainly derived from China because of transboundary air pollution.
Laser triangulation method for measuring the size of parking claw
NASA Astrophysics Data System (ADS)
Liu, Bo; Zhang, Ming; Pang, Ying
2017-10-01
With the development of science and technology and the maturity of measurement techniques, 3D profile measurement technology has developed rapidly. Three-dimensional measurement is widely used in mold manufacturing, industrial inspection, and automated processing and manufacturing. In many situations in scientific research and industrial production it is necessary to convert original mechanical parts into 3D data models on a computer quickly and accurately. Many methods have been developed to measure contour size; laser triangulation is one of the most widely used.
Sparse dynamical Boltzmann machine for reconstructing complex networks with binary dynamics
NASA Astrophysics Data System (ADS)
Chen, Yu-Zhong; Lai, Ying-Cheng
2018-03-01
Revealing the structure and dynamics of complex networked systems from observed data is a problem of current interest. Is it possible to develop a completely data-driven framework to decipher the network structure and different types of dynamical processes on complex networks? We develop a model named sparse dynamical Boltzmann machine (SDBM) as a structural estimator for complex networks that host binary dynamical processes. The SDBM attains its topology according to that of the original system and is capable of simulating the original binary dynamical process. We develop a fully automated method based on compressive sensing and a clustering algorithm to construct the SDBM. We demonstrate, for a variety of representative dynamical processes on model and real world complex networks, that the equivalent SDBM can recover the network structure of the original system and simulates its dynamical behavior with high precision.
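The abstract describes a compressive-sensing-based construction; a generic sparse-recovery flavor of that idea (not the SDBM itself) is to fit an L1-regularized logistic regression per node on the observed binary time series, keeping only the nonzero coefficients as inferred links.

```python
# Generic compressive-sensing-flavored sketch (not the SDBM): recover each node's
# sparse set of incoming links from binary time-series data.
import numpy as np
from sklearn.linear_model import LogisticRegression

def reconstruct_links(states, node, C=0.1, threshold=1e-3):
    """states: T x N binary matrix of observed node states over T time steps.
    Assumes the target node switches state at least once in the record."""
    X = np.delete(states[:-1], node, axis=1)   # other nodes' states at time t
    y = states[1:, node]                       # this node's state at time t+1
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=C).fit(X, y)
    weights = clf.coef_.ravel()
    others = [j for j in range(states.shape[1]) if j != node]
    return {others[k]: w for k, w in enumerate(weights) if abs(w) > threshold}
```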
Sanders, Sharon; Flaws, Dylan; Than, Martin; Pickering, John W; Doust, Jenny; Glasziou, Paul
2016-01-01
Scoring systems are developed to assist clinicians in making a diagnosis. However, their uptake is often limited because they are cumbersome to use, requiring information on many predictors, or complicated calculations. We examined whether, and how, simplifications affected the performance of a validated score for identifying adults with chest pain in an emergency department who have low risk of major adverse cardiac events. We simplified the Emergency Department Assessment of Chest pain Score (EDACS) by three methods: (1) giving equal weight to each predictor included in the score, (2) reducing the number of predictors, and (3) using both methods--giving equal weight to a reduced number of predictors. The diagnostic accuracy of the simplified scores was compared with the original score in the derivation (n = 1,974) and validation (n = 909) data sets. There was no difference in the overall accuracy of the simplified versions of the score compared with the original EDACS as measured by the area under the receiver operating characteristic curve (0.74 to 0.75 for simplified versions vs. 0.75 for the original score in the validation cohort). With score cut-offs set to maintain the sensitivity of the combination of score and tests (electrocardiogram and cardiac troponin) at a level acceptable to clinicians (99%), simplification reduced the proportion of patients classified as low risk from 50% with the original score to between 22% and 42%. Simplification of a clinical score resulted in similar overall accuracy but reduced the proportion classified as low risk and therefore eligible for early discharge compared with the original score. Whether the trade-off is acceptable, will depend on the context in which the score is to be used. Developers of clinical scores should consider simplification as a method to increase uptake, but further studies are needed to determine the best methods of deriving and evaluating simplified scores. Copyright © 2016 Elsevier Inc. All rights reserved.
Berenbrock, Charles E.
2015-01-01
The effects of reduced cross-sectional data points on steady-flow profiles were also determined. Thirty-five cross sections of the original steady-flow model of the Kootenai River were used. These two methods were tested for all cross sections with each cross section resolution reduced to 10, 20 and 30 data points, that is, six tests were completed for each of the thirty-five cross sections. Generally, differences from the original water-surface elevation were smaller as the number of data points in reduced cross sections increased, but this was not always the case, especially in the braided reach. Differences were smaller for reduced cross sections developed by the genetic algorithm method than the standard algorithm method.
Gary H. Elsner
1971-01-01
A computerized method for gathering market area information from campground permits has been developed. Point-of-origin and length-of-stay of campground users can be estimated and summarized quickly and inexpensively. The method should be equally useful for public as well as private campgrounds-provided basic registration data are available and can be processed...
Li, Nan; Wang, Jiahui; Shen, Qing; Han, Chunhui; Zhang, Jing; Li, Fengqin; Xu, Jin; Jiang, Tao
2013-11-01
To develop a real-time PCR method for the identification and detection of domestic horse meat (Equus caballus) in animal-origin products. The primer and TaqMan probe were designed and synthesized according to the EU reference laboratory, and an 87 bp fragment was amplified for horse ingredients. The specificity and sensitivity were tested by artificially spiking horse meat into other domestic meats, such as cattle, sheep, pork, chicken, duck and rabbit. 122 samples of cattle and sheep products were randomly collected from the Beijing market and tested for horse meat. The real-time PCR in this study has high specificity and sensitivity for horse meat. No cross-reaction was observed between horse and sheep, pork, chicken, duck or rabbit meat. There was slight cross-reaction between horse and cattle when the CT value reached 33.81. The method can detect 0.1% horse meat mixed with other domestic animal-origin products. No horse meat ingredients were detected in the 122 samples in this survey; there was no horse meat mixed into cattle and sheep products on the Beijing market.
Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1989-01-01
An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.
Gliogenesis: historical perspectives, 1839-1985.
Webster, Henry deF; Aström, Karl E
2009-01-01
This historical review of gliogenesis begins with Schwann's introduction of the cell doctrine in 1839. Subsequent microscopic studies revealed the cellular structure of many organs and tissues, but the CNS was thought to be different. In 1864, Virchow created the concept that nerve cells are held together by a "Nervenkitt", which he called "glia" (for glue). He and his contemporaries thought that "glia" was an unstructured, connective tissue-like ground substance that separated nerve cells from each other and from blood vessels. Deiters, a pupil of Virchow, discovered that this ground substance contained cells, which he described and illustrated. Improvements in microscopes and the discovery of metallic impregnation methods finally showed convincingly that the "glia" was not a binding substance. Instead, it was composed of cells, each separate and distinct from neighboring cells and each with its own characteristic array of processes. Light microscopic studies of developing and mature nervous tissue led to the discovery of different types of glial cells: astroglia, oligodendroglia, microglia, and ependymal cells in the CNS, and Schwann cells in the peripheral nervous system (PNS). Subsequent studies characterized the origins and development of each type of glial cell. A new era began with the introduction of electron microscopy, immunostaining, and in vitro maintenance of both central and peripheral nervous tissue. Other methods and models greatly expanded our understanding of how glia multiply, migrate, and differentiate. By 1985, almost a century and a half of study had produced substantial progress in our understanding of glial cells, including their origins and development. Major advances were associated with the discovery of new methods. These are summarized first. Then the origins and development of astroglia, oligodendroglia, microglia, ependymal cells, and Schwann cells are described and discussed. In general, morphology is emphasized. Findings related to cytodifferentiation, cellular interactions, functions, and regulation of developing glia have also been included.
NASA Astrophysics Data System (ADS)
Uríčková, Veronika; Sádecká, Jana
2015-09-01
The identification of the geographical origin of beverages is one of the most important issues in food chemistry. Spectroscopic methods provide a relative rapid and low cost alternative to traditional chemical composition or sensory analyses. This paper reviews the current state of development of ultraviolet (UV), visible (Vis), near infrared (NIR) and mid infrared (MIR) spectroscopic techniques combined with pattern recognition methods for determining geographical origin of both wines and distilled drinks. UV, Vis, and NIR spectra contain broad band(s) with weak spectral features limiting their discrimination ability. Despite this expected shortcoming, each of the three spectroscopic ranges (NIR, Vis/NIR and UV/Vis/NIR) provides average correct classification higher than 82%. Although average correct classification is similar for NIR and MIR regions, in some instances MIR data processing improves prediction. Advantage of using MIR is that MIR peaks are better defined and more easily assigned than NIR bands. In general, success in a classification depends on both spectral range and pattern recognition methods. The main problem still remains the construction of databanks needed for all of these methods.
A Study on the State of Development of Higher Education Confucius Institutes from Mainland China
ERIC Educational Resources Information Center
Jeng-Yi, Lin
2016-01-01
Confucius Institutes are the target of this study, and the study uses document analysis method to analyze the origins of their establishment, purpose, organizational structure, administration model, and state of development. This study finds that Confucius Institutes are facing the following problems: problems in the development and management of…
Geographic variability of Escherichia coli ribotypes from animals in Idaho and Georgia.
Hartel, Peter G; Summer, Jacob D; Hill, Jennifer L; Collins, J Victoria; Entry, James A; Segars, William I
2002-01-01
Several genotypic methods have been developed for determining the host origin of fecal bacteria in contaminated waters. Some of these methods rely on a host origin database to identify environmental isolates. It is not well understood to what degree these host origin isolates are geographically variable (i.e., cosmopolitan or endemic). This is important because a geographically limited host origin database may or may not be universally applicable. The objective of our study was to use one genotypic method, ribotyping, to determine the geographic variability of the fecal bacterium, Escherichia coli, from one location in Idaho and three locations in Georgia for cattle (Bos taurus), horse (Equus caballus), swine (Sus scrofa), and chicken (Gallus gallus domesticus). A total of 568 fecal E. coli isolates from Kimberly, ID (125 isolates), Athens, GA (210 isolates), Brunswick, GA (102 isolates), and Tifton, GA (131 isolates), yielded 213 ribotypes. The percentage of ribotype sharing within an animal species increased with decreased distance between geographic locations for cattle and horses, but not for swine and chicken. When the E. coli ribotypes among the four host species were compared at one location, the percent of unshared ribotypes was 86, 89, 81, and 79% for Kimberly, Athens, Brunswick, and Tifton, respectively. These data suggest that there is good ribotype separation among host animal species at each location. The ability to match environmental isolates to a host origin database may depend on a large number of environmental and host origin isolates that ideally are not geographically separated.
Fast sweeping method for the factored eikonal equation
NASA Astrophysics Data System (ADS)
Fomel, Sergey; Luo, Songting; Zhao, Hongkai
2009-09-01
We develop a fast sweeping method for the factored eikonal equation. The solution of a general eikonal equation is decomposed as the product of two factors: the first factor is the solution to a simple eikonal equation (such as distance) or a previously computed solution to an approximate eikonal equation; the second factor is a necessary modification/correction. Appropriate discretization and a fast sweeping strategy are designed for the equation of the correction part. The key idea is to enforce the causality of the original eikonal equation during the Gauss-Seidel iterations. Using extensive numerical examples we demonstrate that (1) the convergence behavior of the fast sweeping method for the factored eikonal equation is the same as for the original eikonal equation, i.e., the number of Gauss-Seidel iterations is independent of the mesh size, and (2) the numerical solution from the factored eikonal equation is more accurate than the numerical solution directly computed from the original eikonal equation, especially for point sources.
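For reference, the factorization can be written out in its standard form (the paper's exact discretization may differ). With slowness $s(\mathbf{x})$, the eikonal equation and the ansatz are

$$|\nabla T(\mathbf{x})| = s(\mathbf{x}), \qquad T(\mathbf{x}) = T_0(\mathbf{x})\,\tau(\mathbf{x}),$$

so that substitution yields

$$T_0^2\,|\nabla\tau|^2 + 2\,T_0\,\tau\,\nabla T_0\cdot\nabla\tau + \tau^2\,|\nabla T_0|^2 = s^2(\mathbf{x}),$$

which is the equation solved for the smooth correction factor $\tau$ by upwind Gauss-Seidel sweeps, with the causality of the original equation enforced in the discretization.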
Intrinsically bent DNA in replication origins and gene promoters.
Gimenes, F; Takeda, K I; Fiorini, A; Gouveia, F S; Fernandez, M A
2008-06-24
Intrinsically bent DNA is an alternative conformation of the DNA molecule caused by the presence of dA/dT tracts, 2 to 6 bp long, in phase with the helical turn or repeated at intervals of 10 to 11 bp. Other than flexibility, intrinsic bending sites induce DNA curvature in particular chromosome regions such as replication origins and promoters. Intrinsically bent DNA sites are important in initiating DNA replication, and are sometimes found near regions associated with the nuclear matrix. Many methods have been developed to localize bent sites, for example, circular permutation, computational analysis, and atomic force microscopy. This review discusses intrinsically bent DNA sites associated with replication origins and gene promoter regions in prokaryotic and eukaryotic cells. We also describe methods for identifying bent DNA sites by circular permutation and computational analysis.
A method for reducing the order of nonlinear dynamic systems
NASA Astrophysics Data System (ADS)
Masri, S. F.; Miller, R. K.; Sassi, H.; Caughey, T. K.
1984-06-01
An approximate method that uses conventional condensation techniques for linear systems together with the nonparametric identification of the reduced-order model generalized nonlinear restoring forces is presented for reducing the order of discrete multidegree-of-freedom dynamic systems that possess arbitrary nonlinear characteristics. The utility of the proposed method is demonstrated by considering a redundant three-dimensional finite-element model half of whose elements incorporate hysteretic properties. A nonlinear reduced-order model, of one-third the order of the original model, is developed on the basis of wideband stationary random excitation and the validity of the reduced-order model is subsequently demonstrated by its ability to predict with adequate accuracy the transient response of the original nonlinear model under a different nonstationary random excitation.
Medical image security using modified chaos-based cryptography approach
NASA Astrophysics Data System (ADS)
Talib Gatta, Methaq; Al-latief, Shahad Thamear Abd
2018-05-01
The progressive development of telecommunication and networking technologies has led to the increased popularity of telemedicine, which involves the storage and transfer of medical images and related information, so security concerns have emerged. This paper presents a method to secure medical images, since they play a major role in healthcare organizations. The main idea in this work is based on chaotic sequences, providing an efficient encryption method that allows the original image to be reconstructed from the encrypted image with high quality and minimal distortion of its content, so that treatment and diagnosis are not affected. Experimental results demonstrate the efficiency of the proposed method using statistical measures and the strong correlation between the original and decrypted images.
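As a minimal, hypothetical sketch of the general idea (the paper's actual cipher is not specified here), a chaotic logistic-map keystream can be XORed with the pixel bytes; the map parameter r and initial value x0 act as the secret key.

```python
import numpy as np

def logistic_keystream(n, r=3.99, x0=0.6789):
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)          # chaotic logistic-map iteration
        out[i] = int(x * 256) % 256    # quantize state to one byte
    return out

def encrypt(image):                     # decryption is the same operation
    flat = image.ravel()
    ks = logistic_keystream(flat.size)
    return (flat ^ ks).reshape(image.shape)

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # placeholder image
cipher = encrypt(img)
assert np.array_equal(encrypt(cipher), img)  # XOR keystream cipher is invertible
```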
Quasi-Linear Parameter Varying Representation of General Aircraft Dynamics Over Non-Trim Region
NASA Technical Reports Server (NTRS)
Shin, Jong-Yeob
2007-01-01
For applying linear parameter varying (LPV) control synthesis and analysis to a nonlinear system, it is required that a nonlinear system be represented in the form of an LPV model. In this paper, a new representation method is developed to construct an LPV model from a nonlinear mathematical model without the restriction that an operating point must be in the neighborhood of equilibrium points. An LPV model constructed by the new method preserves local stabilities of the original nonlinear system at "frozen" scheduling parameters and also represents the original nonlinear dynamics of a system over a non-trim region. An LPV model of the motion of FASER (Free-flying Aircraft for Subscale Experimental Research) is constructed by the new method.
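In general terms (a standard form, not the specific FASER model), an LPV representation writes the dynamics as

$$\dot{x} = A(\rho(t))\,x + B(\rho(t))\,u, \qquad y = C(\rho(t))\,x + D(\rho(t))\,u,$$

where, for a quasi-LPV model, the scheduling parameter $\rho(t)$ is itself a function of the state (for example, angle of attack), and the "frozen"-$\rho$ linear models are required to reproduce the local stability properties of the original nonlinear system.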
Mathematics Teachers' Views of Accountability Testing Revealed through Lesson Study
ERIC Educational Resources Information Center
Yarema, Connie H.
2010-01-01
The practice of lesson study, a professional development model originating in Japan, aligns well with recommendations from research for teacher professional development. Lesson study is also an inductive research method that uncovers student thinking and, in parallel, grants teacher-educators the opportunity to study teachers' thinking about…
Transfer research and impact studies program
NASA Technical Reports Server (NTRS)
Freeman, J. E. (Editor)
1975-01-01
Methods developed for stimulating interest in the transfer of NASA-originated technology are described. These include: new information packaging concepts; technology transfer via people transfer; information management systems; data bank operations; and professional communication activities.
Static aeroelastic analysis and tailoring of a single-element racing car wing
NASA Astrophysics Data System (ADS)
Sadd, Christopher James
This thesis presents the research from an Engineering Doctorate research programme in collaboration with Reynard Motorsport Ltd, a manufacturer of racing cars. Racing car wing design has traditionally considered structures to be rigid. However, structures are never perfectly rigid and the interaction between aerodynamic loading and structural flexibility has a direct impact on aerodynamic performance. This interaction is often referred to as static aeroelasticity and the focus of this research has been the development of a computational static aeroelastic analysis method to improve the design of a single-element racing car wing. A static aeroelastic analysis method has been developed by coupling a Reynolds-Averaged Navier-Stokes CFD analysis method with a Finite Element structural analysis method using an iterative scheme. Development of this method has included assessment of CFD and Finite Element analysis methods and development of data transfer and mesh deflection methods. Experimental testing was also completed to further assess the computational analyses. The computational and experimental results show a good correlation and these studies have also shown that a Navier-Stokes static aeroelastic analysis of an isolated wing can be performed at an acceptable computational cost. The static aeroelastic analysis tool was used to assess methods of tailoring the structural flexibility of the wing to increase its aerodynamic performance. These tailoring methods were then used to produce two final wing designs to increase downforce and reduce drag respectively. At the average operating dynamic pressure of the racing car, the computational analysis predicts that the downforce-increasing wing has a downforce coefficient of C_l = -1.377 in comparison to C_l = -1.265 for the original wing. The computational analysis predicts that the drag-reducing wing has a drag coefficient of C_d = 0.115 in comparison to C_d = 0.143 for the original wing.
Developing tools for digital radar image data evaluation
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.; Raggam, J.
1986-01-01
The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.
NASA Technical Reports Server (NTRS)
Siclari, Michael J.
1988-01-01
A computer code called NCOREL (for Nonconical Relaxation) has been developed to solve for supersonic full potential flows over complex geometries. The method first solves for the conical flow at the apex and then marches downstream in a spherical coordinate system. Implicit relaxation techniques are used to numerically solve the full potential equation at each subsequent crossflow plane. Many improvements have been made to the original code, including more reliable numerics for computing wing-body flows with multiple embedded shocks, inlet flow-through simulation, a wake model and entropy corrections. Line relaxation or approximate factorization schemes are optionally available. Other new features are improved internal grid generation using analytic conformal mappings, a simple geometric input in the Harris wave drag format that was originally developed for panel methods, and an internal geometry package.
Improving Schools through Inservice Test Construction: The Rossville Model.
ERIC Educational Resources Information Center
Gilman, David Alan
A method for improving curriculum and schools through the local development of competency tests in basic skills--the Competency-Rossville Model (CRM)--is outlined. The method was originated in the school system of Rossville (Illinois) and has been tested in five other midwestern school systems. The approach leads the faculty of the school, with…
Magnetic anomalies in east Pacific using MAGSAT data
NASA Technical Reports Server (NTRS)
Harrison, C. G. A. (Principal Investigator)
1983-01-01
Methods for solving problems encountered in separating the core field from the crustal field are summarized, as well as methods developed for inverting total magnetic field data to obtain source functions for oceanic areas. Accounting for magnetization contrasts and for the magnetization values measured in rocks of marine origin is also discussed.
The Origin of a Jury in Ancient Greece and England
ERIC Educational Resources Information Center
Tumanov, Dmitriy Yu.; Sakhapov, Rinat R.; Faizrahmanov, Damir I.; Safin, Robert R.
2016-01-01
The purpose of the study is to analyze the implementation of the democratic principles in the court and judicial process in the trial by jury by the example of the history and development of this institution in Russia. The authors used different methods and approaches, in particular, historical, systemic and Aristotelian method, concrete…
De Micco, Veronica; Ruel, Katia; Joseleau, Jean-Paul; Aronne, Giovanna
2010-08-01
During cell wall formation and degradation, it is possible to detect cellulose microfibrils assembled into thicker and thinner lamellar structures, respectively, following inverse parallel patterns. The aim of this study was to analyse such patterns of microfibril aggregation and cell wall delamination. The thickness of microfibrils and lamellae was measured on digital images of both growing and degrading cell walls viewed by means of transmission electron microscopy. To objectively detect, measure and classify microfibrils and lamellae into thickness classes, a method based on the application of computerized image analysis combined with graphical and statistical methods was developed. The method allowed common classes of microfibrils and lamellae to be identified in cell walls from different origins. During both the formation and degradation of cell walls, preferential formation of structures with specific thicknesses was evidenced. The results obtained with the developed method allowed objective analysis of patterns of microfibril aggregation and evidenced a trend of doubling/halving of lamellar structures during cell wall formation/degradation in materials of different origins that had undergone different treatments.
NASA Astrophysics Data System (ADS)
Egorov, D. I.
2017-06-01
Our study focuses on an original method for investigating biological tissues with spectral OCT (optical coherence tomography) using hyperchromatic lenses. Using a hyperchromatic lens, i.e., a lens with uncorrected longitudinal chromatic aberration, allows scanning in depth through the object by changing the wavelength of the emitter. In this case, the depth of the scan is determined not by the depth of field of the microlens but by the amount of axial chromatic aberration. In our study, we demonstrate the advantages of this method for investigating biological tissues. Spectral OCT schemes with a hyperchromatic lens could increase the depth of spectral scanning, eliminate the use of multi-channel systems with a set of microscope objectives, and reduce measurement time. In our paper, we present the developed method for calculating hyperchromatic lenses and hybrid hyperchromatic lenses consisting of a diffractive and a refractive component for spectral OCT systems. We also present the results of the aberration calculations for the designed microscope lenses. We show examples of developed hyperchromatic lenses with and without the diffractive element.
Controller design via structural reduced modeling by FETM
NASA Technical Reports Server (NTRS)
Yousuff, Ajmal
1987-01-01
The Finite Element-Transfer Matrix (FETM) method has been developed to reduce the computations involved in analysis of structures. This widely accepted method, however, has certain limitations, and does not address the issues of control design. To overcome these, a modification of the FETM method has been developed. The new method easily produces reduced models tailored toward subsequent control design. Other features of this method are its ability to: (1) extract open loop frequencies and mode shapes with less computations, (2) overcome limitations of the original FETM method, and (3) simplify the design procedures for output feedback, constrained compensation, and decentralized control. This report presents the development of the new method, generation of reduced models by this method, their properties, and the role of these reduced models in control design. Examples are included to illustrate the methodology.
Ultraclean and Direct Transfer of a Wafer-Scale MoS2 Thin Film onto a Plastic Substrate.
Phan, Hoang Danh; Kim, Youngchan; Lee, Jinhwan; Liu, Renlong; Choi, Yongsuk; Cho, Jeong Ho; Lee, Changgu
2017-02-01
An ultraclean method to directly transfer a large-area MoS2 film from the original growth substrate to a flexible substrate by using epoxy glue is developed. The transferred film is observed to be free of wrinkles and cracks and to be as smooth as the film synthesized on the original substrate. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Arimura, Tatsuyuki; Hosoi, Masako; Tsukiyama, Yoshihiro; Yoshida, Toshiyuki; Fujiwara, Daiki; Tanaka, Masanori; Tamura, Ryuichi; Nakashima, Yasunori; Sudo, Nobuyuki; Kubo, Chiharu
2012-04-01
The present study aimed to develop a Japanese version of the Short-Form McGill Pain Questionnaire (SF-MPQ-J) that focuses on cross-culturally equivalence to the original English version and to test its reliability and validity. Cross-sectional design. In study 1, SF-MPQ was translated and adapted into Japanese. It included construction of response scales equivalent to the original using a variation of the Thurstone method of equal-appearing intervals. A total of 147 undergraduate students and 44 pain patients participated in the development of the Japanese response scales. To measure the equivalence of pain descriptors, 62 pain patients in four diagnostic groups were asked to choose pain descriptors that described their pain. In study 2, chronic pain patients (N=126) completed the SF-MPQ-J, the Long-Form McGill Pain Questionnaire Japanese version (LF-MPQ-J), and the 11-point numerical rating scale of pain intensity. Correlation analysis examined the construct validity of the SF-MPQ-J. The results from study 1 were used to develop SF-MPQ-J, which is linguistically equivalent to the original questionnaire. Response scales from SF-MPQ-J represented the original scale values. All pain descriptors, except one, were used by >33% in at least one of the four diagnostic groups. Study 2 exhibited adequate internal consistency and test-retest reliability, with the construct validity of SF-MPQ-J comparable to the original. These findings suggested that SF-MPQ-J is reliable, valid, and cross-culturally equivalent to the original questionnaire. Researchers might consider using this scale in multicenter, multi-ethnical trials or cross-cultural studies that include Japanese-speaking patients. Wiley Periodicals, Inc.
Controller design via structural reduced modeling by FETM
NASA Technical Reports Server (NTRS)
Yousuff, A.
1986-01-01
The Finite Element - Transfer Matrix (FETM) method has been developed to reduce the computations involved in analysis of structures. This widely accepted method, however, has certain limitations, and does not directly produce reduced models for control design. To overcome these shortcomings, a modification of FETM method has been developed. The modified FETM method easily produces reduced models that are tailored toward subsequent control design. Other features of this method are its ability to: (1) extract open loop frequencies and mode shapes with less computations, (2) overcome limitations of the original FETM method, and (3) simplify the procedures for output feedback, constrained compensation, and decentralized control. This semi annual report presents the development of the modified FETM, and through an example, illustrates its applicability to an output feedback and a decentralized control design.
Brinckmann, J A
2013-11-01
Pharmacopoeial monographs providing specifications for composition, identity, purity, quality, and strength of a botanical are developed based on analysis of presumably authenticated botanical reference materials. The specimens should represent the quality traditionally specified for the intended use, which may require different standards for medicinal versus food use. Development of quality standards monographs may occur through collaboration between a sponsor company or industry association and a pharmacopoeial expert committee. The sponsor may base proposed standards and methods on their own preferred botanical supply which may, or may not, be geo-authentic and/or correspond to qualities defined in traditional medicine formularies and pharmacopoeias. Geo-authentic botanicals are those with specific germplasm, cultivated or collected in their traditional production regions, of a specified biological age at maturity, with specific production techniques and processing methods. Consequences of developing new monographs that specify characteristics of an 'introduced' cultivated species or of a material obtained from one unique origin could lead to exclusion of geo-authentic herbs and may have therapeutic implications for clinical practice. In this review, specifications of selected medicinal plants with either a geo-authentic or geographical indication designation are discussed and compared against official pharmacopoeial standards for same genus and species regardless of origin. Copyright © 2012 John Wiley & Sons, Ltd.
A Protocol for Epigenetic Imprinting Analysis with RNA-Seq Data.
Zou, Jinfeng; Xiang, Daoquan; Datla, Raju; Wang, Edwin
2018-01-01
Genomic imprinting is an epigenetic regulatory mechanism that operates through the expression of certain genes from the maternal or paternal allele in a parent-of-origin-specific manner. Imprinted genes have been identified in diverse biological systems; they are implicated in some human diseases and in embryonic and seed developmental programs in plants. The molecular programs and mechanisms underlying imprinting are yet to be explored in depth in plants. The recent advances in RNA-Seq-based methods and technologies offer an opportunity to systematically analyze epigenetic imprinting at the whole-genome level in model and crop plants. We are interested in using the Arabidopsis model system to investigate gene expression patterns associated with parent of origin and their implications for imprinting during embryo and seed development. Toward this, we have generated early embryo development RNA-Seq-based transcriptome datasets in F1s from a genetic cross between two diverse Arabidopsis thaliana ecotypes, Col-0 and Tsu-1. With these data, we developed a protocol for evaluating the maternal and paternal contributions of genes during the early stages of embryo development after fertilization. This protocol is also designed to consider contamination from other potential seed tissues, sequencing quality, proper processing of sequenced reads and variant calling, and appropriate inference of the parental contributions based on the parent-of-origin-specific single-nucleotide polymorphisms within the expressed genes. The approach, methods and protocol developed in this study can be used for evaluating the effects of epigenetic imprinting in plants.
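A minimal sketch of the central bookkeeping step, under assumed data structures (gene-level allele-specific read counts at Col-0/Tsu-1 diagnostic SNPs in a Col-0 mother x Tsu-1 father F1), might look like the following; the published protocol additionally filters for tissue contamination, sequencing quality and variant-calling artefacts, which are omitted here.

```python
from scipy.stats import binomtest

# gene -> (reads matching the Col-0 allele, reads matching the Tsu-1 allele);
# gene IDs and counts below are placeholders for illustration only
allele_counts = {"AT1G01010": (180, 20), "AT1G01020": (95, 105)}

for gene, (maternal, paternal) in allele_counts.items():
    total = maternal + paternal
    frac = maternal / total
    p = binomtest(maternal, total, p=0.5).pvalue   # test against a 1:1 embryo ratio
    call = "candidate imprinted" if p < 0.01 else "biparental"
    print(f"{gene}: maternal fraction {frac:.2f} ({call}, p={p:.2e})")
```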
NASA Astrophysics Data System (ADS)
Hu, Leqian; Ma, Shuai; Yin, Chunling
2018-03-01
In this work, fluorescence spectroscopy combined with multi-way pattern recognition techniques was developed for determining the geographical origin of kudzu root and for detecting and quantifying adulterants in kudzu root. Excitation-emission (EEM) spectra were obtained for 150 pure kudzu root samples of different geographical origins and 150 adulterated kudzu root samples with different adulteration proportions by recording emission from 330 to 570 nm with excitation in the range of 320-480 nm. Multi-way principal components analysis (M-PCA) and multilinear partial least squares discriminant analysis (N-PLS-DA) methods were used to decompose the excitation-emission matrix datasets. The 150 pure kudzu root samples could be differentiated exactly from each other according to their geographical origins by the M-PCA and N-PLS-DA models. For the adulterated kudzu root samples, N-PLS-DA gave better and more reliable classification results than the M-PCA model. The results obtained in this study indicate that EEM spectroscopy coupled with multi-way pattern recognition can be used as an easy, rapid and novel tool to distinguish the geographical origin of kudzu root and to detect adulterated kudzu root. Besides, this method is also suitable for determining the geographic origin and detecting the adulteration of other foodstuffs that produce fluorescence.
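As a hedged illustration of how such three-way data are commonly handled (one simple realization of multi-way PCA by unfolding, not necessarily the authors' software), the EEM cube can be reshaped to a sample-by-variable matrix and decomposed; the array dimensions below are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA

eem = np.random.rand(150, 17, 121)           # samples x excitation x emission (placeholder)
X = eem.reshape(eem.shape[0], -1)            # unfold the cube to (samples, ex*em)
scores = PCA(n_components=3).fit_transform(X)
print(scores.shape)                          # (150, 3) sample scores for origin plots
```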
Koski, Antti; Tossavainen, Timo; Juhola, Martti
2004-01-01
Electrocardiogram (ECG) signals are the most prominent biomedical signal type used in clinical medicine. Their compression is important and widely researched in the medical informatics community. In the previous literature, compression efficacy has been investigated only in terms of how much known or newly developed methods reduced the storage required by the compressed forms of the original ECG signals. Sometimes statistical signal evaluations based on, for example, root mean square error were studied. In previous research we developed a refined method for signal compression and tested it jointly with several known techniques on other biomedical signals. Our method of so-called successive approximation quantization used with wavelets was one of the most successful in those tests. In this paper, we studied to what extent these lossy compression methods altered the values of medical parameters (medical information) computed from the signals. Since the methods are lossy, some information is lost due to the compression when a high enough compression ratio is reached. We found that ECG signals sampled at 400 Hz could be compressed to one fourth of their original storage space while the values of their medical parameters changed by less than 5% due to compression, which indicates reliable results.
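A minimal sketch of lossy wavelet compression of an ECG trace, with simple coefficient thresholding standing in for the paper's successive-approximation quantization; the synthetic signal and the 25% retention rate are assumptions for illustration only.

```python
import numpy as np
import pywt

fs = 400                                         # sampling rate, as in the abstract
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t)                # placeholder for a real ECG record

coeffs = pywt.wavedec(ecg, "db4", level=5)
arr, slices = pywt.coeffs_to_array(coeffs)
keep = np.abs(arr) >= np.quantile(np.abs(arr), 0.75)     # keep the largest 25%
recon = pywt.waverec(
    pywt.array_to_coeffs(np.where(keep, arr, 0.0), slices, output_format="wavedec"),
    "db4",
)
rmse = np.sqrt(np.mean((ecg - recon[: ecg.size]) ** 2))
print(f"kept {keep.mean():.0%} of coefficients, RMSE = {rmse:.4f}")
```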
Simplified Computation for Nonparametric Windows Method of Probability Density Function Estimation.
Joshi, Niranjan; Kadir, Timor; Brady, Michael
2011-08-01
Recently, Kadir and Brady proposed a method for estimating probability density functions (PDFs) for digital signals which they call the Nonparametric (NP) Windows method. The method involves constructing a continuous space representation of the discrete space and sampled signal by using a suitable interpolation method. NP Windows requires only a small number of observed signal samples to estimate the PDF and is completely data driven. In this short paper, we first develop analytical formulae to obtain the NP Windows PDF estimates for 1D, 2D, and 3D signals, for different interpolation methods. We then show that the original procedure to calculate the PDF estimate can be significantly simplified and made computationally more efficient by a judicious choice of the frame of reference. We have also outlined specific algorithmic details of the procedures enabling quick implementation. Our reformulation of the original concept has directly demonstrated a close link between the NP Windows method and the Kernel Density Estimator.
FFT-enhanced IHS transform method for fusing high-resolution satellite images
Ling, Y.; Ehlers, M.; Usery, E.L.; Madden, M.
2007-01-01
Existing image fusion techniques such as the intensity-hue-saturation (IHS) transform and principal components analysis (PCA) methods may not be optimal for fusing the new generation commercial high-resolution satellite images such as Ikonos and QuickBird. One problem is color distortion in the fused image, which causes visual changes as well as spectral differences between the original and fused images. In this paper, a fast Fourier transform (FFT)-enhanced IHS method is developed for fusing new generation high-resolution satellite images. This method combines a standard IHS transform with FFT filtering of both the panchromatic image and the intensity component of the original multispectral image. Ikonos and QuickBird data are used to assess the FFT-enhanced IHS transform method. Experimental results indicate that the FFT-enhanced IHS transform method may improve upon the standard IHS transform and the PCA methods in preserving spectral and spatial information. © 2006 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).
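The following sketch illustrates the general idea under simplifying assumptions (mean-of-bands intensity, an ideal elliptical low-pass mask in the Fourier domain); it is not the authors' exact filter design.

```python
import numpy as np

def fft_lowpass(img, cutoff=0.1):
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    y, x = np.ogrid[-h // 2 : h - h // 2, -w // 2 : w - w // 2]
    mask = (x**2 / (cutoff * w) ** 2 + y**2 / (cutoff * h) ** 2) <= 1.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def fuse(ms_bands, pan, cutoff=0.1):
    intensity = ms_bands.mean(axis=0)                       # simple IHS-style intensity
    new_intensity = fft_lowpass(intensity, cutoff) + (pan - fft_lowpass(pan, cutoff))
    return ms_bands + (new_intensity - intensity)           # substitute the new intensity

ms = np.random.rand(4, 256, 256)     # placeholder upsampled multispectral bands
pan = np.random.rand(256, 256)       # placeholder panchromatic image
print(fuse(ms, pan).shape)
```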
Single-strain-gage force/stiffness buckling prediction techniques on a hat-stiffened panel
NASA Technical Reports Server (NTRS)
Hudson, Larry D.; Thompson, Randolph C.
1991-01-01
Predicting the buckling characteristics of a test panel is necessary to ensure panel integrity during a test program. A single-strain-gage buckling prediction method was developed on a hat-stiffened, monolithic titanium buckling panel. The method is an adaptation of the original force/stiffness method which requires back-to-back gages. The single-gage method was developed because the test panel did not have back-to-back gages. The method was used to predict buckling loads and temperatures under various heating and loading conditions. The results correlated well with a finite element buckling analysis. The single-gage force/stiffness method was a valid real-time and post-test buckling prediction technique.
Theory and Method in Cross-Cultural Psychology
ERIC Educational Resources Information Center
Malpass, Roy S.
1977-01-01
Cross cultural psychology is considered as a methodological strategy, as a means of evaluating hypotheses of unicultural origins with evidence of more panhuman relevance, and as a means of developing new theoretical psychological phenomena. (Author)
A Data-Centric Strategy for Modern Biobanking.
Quinlan, Philip R; Gardner, Stephen; Groves, Martin; Emes, Richard; Garibaldi, Jonathan
2015-01-01
Biobanking has been in existence for many decades and over that time has developed significantly. Biobanking originated from a need to collect, store and make available biological samples for a range of research purposes. It has changed as the understanding of biological processes has increased and new sample handling techniques have been developed to ensure samples were fit for purpose. As a result of these developments, modern biobanking is now facing two substantial new challenges. Firstly, new research methods such as next generation sequencing can generate datasets at a vastly greater scale and resolution than previous methods. Secondly, as the understanding of diseases increases, researchers require a far richer data set about the donors from which the samples originate. To retain a sample-centric strategy in a research environment that is increasingly dictated by data will place a biobank at a significant disadvantage and may even result in the samples collected going unused. As a result, biobanking is required to change strategic focus from a sample-dominated perspective to a data-centric strategy.
Development of a Short Form of the Boston Naming Test for Individuals with Aphasia
ERIC Educational Resources Information Center
del Toro, Christina M.; Bislick, Lauren P.; Comer, Matthew; Velozo, Craig; Romero, Sergio; Rothi, Leslie J. Gonzalez; Kendall, Diane L.
2011-01-01
Purpose: The purpose of this study was to develop a short form of the Boston Naming Test (BNT; Kaplan, Goodglass, & Weintraub, 2001) for individuals with aphasia and compare it with 2 existing short forms originally analyzed with responses from people with dementia and neurologically healthy adults. Method: Development of the new BNT-Aphasia Short…
Connecting Sociocultural Theory and Educational Practice: Galperin's Approach
ERIC Educational Resources Information Center
Arievitch, Igor M.; Haenen, Jacques P. P.
2005-01-01
Learning and instruction have always been important topics in the sociocultural school of thought founded by Vygotsky and further developed by his followers. Taking sociocultural ideas as a starting point, Piotr Galperin developed an original conceptual system and a new method of investigation that made teaching and learning a central part of…
Rahman, Md Musfiqur; Abd El-Aty, A M; Na, Tae-Woong; Park, Joon-Seong; Kabir, Md Humayun; Chung, Hyung Suk; Lee, Han Sol; Shin, Ho-Chul; Shim, Jae-Han
2017-08-15
A simultaneous analytical method was developed for the determination of methiocarb and its metabolites, methiocarb sulfoxide and methiocarb sulfone, in five livestock products (chicken, pork, beef, table egg, and milk) using liquid chromatography-tandem mass spectrometry. Due to the rapid degradation of methiocarb and its metabolites, a quick sample preparation method was developed using acetonitrile and salts followed by purification via dispersive solid-phase extraction (d-SPE). Seven-point calibration curves were constructed separately in each matrix, and good linearity was observed in each matrix-matched calibration curve with a coefficient of determination (R²) ≥ 0.991. The limits of detection and quantification were 0.0016 and 0.005 mg/kg, respectively, for all tested analytes in the various matrices. The method was validated in triplicate at three fortification levels (equivalent to 1, 2, and 10 times the limit of quantification) with a recovery rate ranging between 76.4 and 118.0% and a relative standard deviation ≤ 10.0%. The developed method was successfully applied to market samples, and no residues of methiocarb and/or its metabolites were observed in the tested samples. In sum, this method can be applied for the routine analysis of methiocarb and its metabolites in foods of animal origin. Copyright © 2017 Elsevier B.V. All rights reserved.
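For context, the routine calculations behind such validation figures (calibration fit, coefficient of determination, percent recovery) can be sketched as follows; all numbers are made up for demonstration and do not come from the study.

```python
import numpy as np

spiked = np.array([0.005, 0.01, 0.02, 0.05, 0.1, 0.2, 0.5])          # mg/kg standards
response = np.array([1.1e3, 2.2e3, 4.3e3, 10.9e3, 21.8e3, 43.2e3, 108.5e3])

slope, intercept = np.polyfit(spiked, response, 1)                    # calibration line
r2 = np.corrcoef(spiked, response)[0, 1] ** 2
print(f"calibration: y = {slope:.1f}x + {intercept:.1f}, R^2 = {r2:.4f}")

fortified, measured_peak = 0.005, 1.08e3                              # one spiked sample
found = (measured_peak - intercept) / slope
print(f"recovery = {100 * found / fortified:.1f} %")
```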
ERIC Educational Resources Information Center
Danilova, L. A.
This four-chapter monograph, translated from a 1977 book originally written in Russian for a Russian audience, describes the methodology and results of a study of cognitive activity in children with cerebral palsy. An initial chapter reviews research on impairments in cognitive activity and speech defects in such children and on methods of…
Improve forest inventory with access data-measure transport distance and cost to market.
Dennis P. Bradley
1972-01-01
Describes a method for relating forest inventory volumes to transport distances and costs. The process, originally developed in Sweden, includes a computer program that can be used to summarize volumes by transport costs per cord to specified delivery point. The method has many potential applications in all aspects of resource analysis.
The Reflecting Team: A Training Method for Family Counselors
ERIC Educational Resources Information Center
Chang, Jeff
2010-01-01
The reflecting team (RT) is an innovative method used in the training and supervision of family counselors. In this article, I trace the history, development, and current uses of RTs and review current findings on RTs. In my opinion, many users of RTs have diverged from their original theoretical principles and have adopted RTs mainly as a…
Chrysler improved numerical differencing analyzer for third generation computers CINDA-3G
NASA Technical Reports Server (NTRS)
Gaski, J. D.; Lewis, D. R.; Thompson, L. R.
1972-01-01
New and versatile method has been developed to supplement or replace use of original CINDA thermal analyzer program in order to take advantage of improved systems software and machine speeds of third generation computers. CINDA-3G program options offer variety of methods for solution of thermal analog models presented in network format.
[Academic origin of round magnetic needle and standardization operation].
Cheng, Yan-Ting; Zhang, Tian-Sheng; Meng, Li-Qiang; Shi, Rui-Qi; Ji, Lai-Xi
2014-07-01
The origin and development of the round magnetic needle were explored, and the structure of the round magnetic needle was introduced in detail, including the handle, the body and the tip of the needle. The clinical operation of the round magnetic needle was standardized from the aspects of the method of holding the needle, manipulation skill, tapping position, strength of manipulation, application scope and matters needing attention, which lays a foundation for the popularization and application of the round magnetic needle.
NASA Astrophysics Data System (ADS)
Diego, M. C. R.; Purwanto, Y. A.; Sutrisno; Budiastra, I. W.
2018-05-01
Research related to the non-destructive method of near-infrared (NIR) spectroscopy for aromatic oils is still in development in Indonesia. The objectives of the study were to determine the characteristics of the near-infrared spectra of patchouli oil and to classify it based on its origin. The samples were selected from seven different places in Indonesia (Bogor and Garut from West Java; Aceh and Jambi from Sumatra; and Konawe, Masamba and Kolaka from Sulawesi). The spectral data of patchouli oil were obtained with an FT-NIR spectrometer at wavelengths of 1000-2500 nm, after which the samples were subjected to composition analysis using gas chromatography-mass spectrometry. The transmittance and absorbance spectra were analyzed and then principal component analysis (PCA) was carried out. Discriminant analysis (DA) of the principal components was developed to classify patchouli oil based on its origin. The results show that PCA of both the transmittance and absorbance spectra gives similar discrimination of the seven types of patchouli oil, owing to their distribution and behavior. DA on the three principal components of both processed spectra could classify patchouli oil accurately. These results show that NIR spectroscopy can be successfully used as a reliable method to classify patchouli oil based on its origin.
THEMIS Surface-Atmosphere Separation Strategy and Preliminary Results
NASA Technical Reports Server (NTRS)
Bandfield, J. L.; Smith, M. D.; Christensen, P. R.
2002-01-01
Methods refined and adapted from the TES investigation are used to develop a surface-atmosphere separation strategy for THEMIS image analysis and atmospheric temperature and opacity retrievals. Additional information is contained in the original extended abstract.
Goods Movement: Regional Analysis and Database Final Report
DOT National Transportation Integrated Search
1993-03-26
The project reported here was undertaken to create and test methods for synthesizing truck flow patterns in urban areas from partial and fragmentary observations. More specifically, the project sought to develop a way to estimate origin-destination (...
An original approach to elastic constants determination using a self-developed EMAT system
NASA Astrophysics Data System (ADS)
Jenot, Frédéric; Rivart, Frédéric; Camus, Liévin
2018-04-01
Electromagnetic Acoustic Transducers (EMATs) allow non-contact ultrasonic measurements in order to characterize structures for a wide range of applications. Considering non-ferromagnetic metal materials, excitation of elastic waves is due to Lorentz forces that result from an applied magnetic field and induced eddy currents in a near surface region of the sample. EMAT's design is based on a magnet structure associated with a coil leading to multiple configurations, which are able to excite bulk and guided acoustic waves. In this work, we first present a self-developed EMAT system composed of multiple emission and reception channels. In a second part, we propose an original method in order to determine the elastic constants of an isotropic material. To achieve this goal, Rayleigh and shear waves are used and the advantages of this method are clearly highlighted. The results obtained are then compared with conventional measurements achieved with piezoelectric transducers.
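The link between the measured Rayleigh and shear wave speeds and the isotropic elastic constants can be sketched with standard relations (the paper's exact inversion procedure is not reproduced here). Using Viktorov's approximation for the Rayleigh speed,

$$\frac{V_R}{V_S} \approx \frac{0.87 + 1.12\,\nu}{1 + \nu},$$

the Poisson ratio $\nu$ follows from the measured ratio $V_R/V_S$; the shear and Young moduli are then

$$\mu = \rho V_S^2, \qquad E = 2\mu(1+\nu),$$

with $\rho$ the material density.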
[Up-to-date methods of cell therapy in treatment of burns].
Smirnov, S V; Kiselev, I V; Vasil'ev, A V; Terskikh, V V
2003-01-01
A complex of methods for the repair of wounded and burned skin by transplantation of keratinocytes and fibroblasts grown in vitro is proposed. Five different tissue constructions may be used. An original method of skin recovery based on the use of a living skin equivalent has been developed. Results of its clinical application are presented. It is concluded that cell constructions may be used in the combined treatment of wounds and burns.
Application of ICME Methods for the Development of Rapid Manufacturing Technologies
NASA Astrophysics Data System (ADS)
Maiwald-Immer, T.; Göhler, T.; Fischersworring-Bunk, A.; Körner, C.; Osmanlic, F.; Bauereiß, A.
Rapid manufacturing technologies have lately been gaining interest as alternative manufacturing methods. Due to the large parameter sets applicable in these manufacturing methods and their impact on achievable material properties and quality, supporting the development of the manufacturing process with simulation is highly attractive. This is especially true for aerospace applications, with their high quality demands and controlled scatter in the resulting material properties. The simulation techniques applicable to these manufacturing methods are manifold. The paper will focus on the melt pool simulation for an SLM (selective laser melting) process, which was originally developed for EBM (electron beam melting). It will be discussed in the overall context of a multi-scale simulation within a virtual process chain.
Extension of the ratio method to low energy
Colomer, Frederic; Capel, Pierre; Nunes, F. M.; ...
2016-05-25
The ratio method has been proposed as a means to remove the reaction model dependence in the study of halo nuclei. Originally, it was developed for higher energies, but given the potential interest in applying the method at lower energy, in this work we explore its validity at 20 MeV/nucleon. The ratio method takes the ratio of the breakup angular distribution and the summed angular distribution (which includes elastic, inelastic and breakup) and uses this observable to constrain the features of the original halo wave function. In this work we use the Continuum Discretized Coupled Channel method and the Coulomb-corrected Dynamical Eikonal Approximation for the study. We study the reactions of 11Be on 12C, 40Ca and 208Pb at 20 MeV/nucleon. We compare the various theoretical descriptions and explore the dependence of our result on the core-target interaction. Lastly, our study demonstrates that the ratio method is valid at these lower beam energies.
Guo, Jing; Yue, Tianli; Yuan, Yahong
2012-10-01
Apple juice is a complex mixture of volatile and nonvolatile components. To develop discrimination models on the basis of volatile composition for an efficient classification of apple juices according to apple variety and geographical origin, chromatographic volatile profiles of 50 apple juice samples belonging to 6 varieties and from 5 counties of Shaanxi (China) were obtained by headspace solid-phase microextraction coupled with gas chromatography. The volatile profiles were processed as continuous and nonspecific signals through multivariate analysis techniques. Different preprocessing methods were applied to the raw chromatographic data, and a blind chemometric analysis of the preprocessed chromatographic profiles was carried out. Stepwise linear discriminant analysis (SLDA) revealed satisfactory discrimination of apple juices according to variety and geographical origin, providing 100% and 89.8% success rates, respectively, in terms of prediction ability. Finally, the discriminant volatile compounds selected by SLDA were identified by gas chromatography-mass spectrometry. The proposed strategy was able to verify the variety and geographical origin of apple juices using only a reduced number of discriminant retention times selected by the stepwise procedure. This result encourages similar procedures to be considered in the quality control of apple juices. This work presented a method for an efficient discrimination of apple juices according to apple variety and geographical origin using HS-SPME-GC-MS together with chemometric tools. The discrimination models developed could help to achieve greater control over the quality of the juice and to detect possible adulteration of the product. © 2012 Institute of Food Technologists®
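A hedged sketch of the same style of analysis (forward stepwise selection of retention times followed by linear discriminant analysis), assuming a samples-by-retention-times matrix and known variety labels; the study's own SLDA implementation is not reproduced.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
peaks = rng.random((60, 120))                 # placeholder chromatographic signals
variety = np.repeat(np.arange(6), 10)         # placeholder variety labels (balanced)

lda = LinearDiscriminantAnalysis()
selector = SequentialFeatureSelector(lda, n_features_to_select=8,
                                     direction="forward", cv=5)
selector.fit(peaks, variety)
selected = np.flatnonzero(selector.get_support())
acc = cross_val_score(lda, peaks[:, selected], variety, cv=5).mean()
print(f"retention times kept: {selected}, prediction ability: {acc:.1%}")
```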
Multimodel Ensemble Methods for Prediction of Wake-Vortex Transport and Decay
NASA Technical Reports Server (NTRS)
Korner, Stephan; Ahmad, Nashat N.; Holzapfel, Frank; VanValkenburg, Randal L.
2017-01-01
Several multimodel ensemble methods are selected and further developed to improve the deterministic and probabilistic prediction skills of individual wake-vortex transport and decay models. The different multimodel ensemble methods are introduced, and their suitability for wake applications is demonstrated. The selected methods include direct ensemble averaging, Bayesian model averaging, and Monte Carlo simulation. The different methodologies are evaluated employing data from wake-vortex field measurement campaigns conducted in the United States and Germany.
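A minimal sketch of the two simplest combination strategies mentioned above, using placeholder circulation-decay curves; the weights standing in for the Bayesian-model-averaging posterior are assumed, not estimated from data.

```python
import numpy as np

t = np.linspace(0.0, 120.0, 60)                        # time after vortex generation, s
model_a = np.exp(-t / 80.0)                            # placeholder model outputs
model_b = np.exp(-t / 60.0)                            # (normalized circulation)
model_c = np.exp(-t / 100.0)
predictions = np.vstack([model_a, model_b, model_c])

direct = predictions.mean(axis=0)                      # direct ensemble averaging
weights = np.array([0.5, 0.2, 0.3])                    # assumed posterior model weights
weighted = weights @ predictions                       # BMA-style weighted combination
print(direct[:3], weighted[:3])
```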
a Data Field Method for Urban Remotely Sensed Imagery Classification Considering Spatial Correlation
NASA Astrophysics Data System (ADS)
Zhang, Y.; Qin, K.; Zeng, C.; Zhang, E. B.; Yue, M. X.; Tong, X.
2016-06-01
Spatial correlation between pixels is important information for remotely sensed imagery classification. Data field methods and spatial autocorrelation statistics have been utilized to describe and model the spatial information of local pixels. The original data field method can represent the spatial interactions of neighbourhood pixels effectively. However, its focus on measuring the grey level change between the central pixel and the neighbourhood pixels results in exaggerating the contribution of the central pixel to the whole local window. Besides, Geary's C has also been proven to characterise and quantify well the spatial correlation between each pixel and its neighbourhood pixels. But the extracted object is badly delineated, with the distracting salt-and-pepper effect of isolated misclassified pixels. To correct this defect, we introduce the data field method for filtering and noise limitation. Moreover, the original data field method is enhanced by considering each pixel in the window as the central pixel to compute statistical characteristics between it and its neighbourhood pixels. The last step employs a support vector machine (SVM) for classification on multiple features (e.g. the spectral feature and the spatial correlation feature). In order to validate the effectiveness of the developed method, experiments are conducted on different remotely sensed images containing multiple complex object classes. The results show that the developed method outperforms the traditional method in terms of classification accuracies.
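For reference, Geary's C in its standard global form is

$$C = \frac{(N-1)\sum_i \sum_j w_{ij}\,(x_i - x_j)^2}{2\,W \sum_i (x_i - \bar{x})^2}, \qquad W = \sum_i \sum_j w_{ij},$$

where $x_i$ is the grey value of pixel $i$, $w_{ij}$ is 1 when pixel $j$ lies in the neighbourhood of pixel $i$ and 0 otherwise, and values of $C$ below 1 indicate positive spatial correlation. The locally windowed variant used in the paper follows the same construction but is not reproduced here.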
Roubinov, Danielle S.; Luecken, Linda J.; Gonzales, Nancy A.; Crnic, Keith A.
2015-01-01
Objectives An increasing body of research has documented the significant influence of father involvement on children’s development and overall well-being. However, extant research has predominately focused on middle-class Caucasian samples with little examination of fathering in ethnic minority and low-income families, particularly during the infancy period. The present study evaluated measures of early father involvement (paternal engagement, accessibility, and responsibility) that were adapted to capture important cultural values relevant to the paternal role in Mexican origin families. Methods A sample of 180 Mexican origin mothers (M age = 28.3) and 83 Mexican origin fathers (M age = 31.5) were interviewed during the perinatal period. Results Descriptive analyses indicated that Mexican origin fathers are involved in meaningful levels of direct interaction with their infant. A two-factor model of paternal responsibility was supported by factor analyses, consisting of a behavioral responsibility factor aligned with previous literature and culturally-derived positive machismo factor. Qualities of the romantic relationship, cultural orientation, and maternal employment status were related to indices of father involvement. Conclusions These preliminary results contribute to understanding of the transition to fatherhood among low-income Mexican origin men and bring attention to the demographic, social, and cultural contexts in which varying levels of father involvement may emerge. PMID:26237543
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jan Hesthaven
2012-02-06
Final report for DOE Contract DE-FG02-98ER25346 entitled Parallel High Order Accuracy Methods Applied to Non-Linear Hyperbolic Equations and to Problems in Materials Sciences. Principal Investigator Jan S. Hesthaven, Division of Applied Mathematics, Brown University, Box F, Providence, RI 02912, Jan.Hesthaven@Brown.edu, February 6, 2012. Note: This grant was originally awarded to Professor David Gottlieb and the majority of the work envisioned reflects his original ideas. However, when Prof Gottlieb passed away in December 2008, Professor Hesthaven took over as PI to ensure proper mentoring of students and postdoctoral researchers already involved in the project. This unusual circumstance has naturally impacted the project and its timeline. However, as the report reflects, the planned work has been accomplished and some activities beyond the original scope have been pursued with success. Project overview and main results: The effort in this project focuses on the development of high order accurate computational methods for the solution of hyperbolic equations with application to problems with strong shocks. While the methods are general, emphasis is on applications to gas dynamics with strong shocks.
NASA Technical Reports Server (NTRS)
Arnold, Steven M. (Technical Monitor); Bansal, Yogesh; Pindera, Marek-Jerzy
2004-01-01
The High-Fidelity Generalized Method of Cells is a new micromechanics model for unidirectionally reinforced periodic multiphase materials that was developed to overcome the original model's shortcomings. The high-fidelity version predicts the local stress and strain fields with dramatically greater accuracy relative to the original model through the use of a better displacement field representation. Herein, we test the high-fidelity model's predictive capability in estimating the elastic moduli of periodic composites characterized by repeating unit cells obtained by rotation of an infinite square fiber array through an angle about the fiber axis. Such repeating unit cells may contain a few or many fibers, depending on the rotation angle. In order to analyze such multi-inclusion repeating unit cells efficiently, the high-fidelity micromechanics model's framework is reformulated using the local/global stiffness matrix approach. The excellent agreement with the corresponding results obtained from the standard transformation equations confirms the new model's predictive capability for periodic composites characterized by multi-inclusion repeating unit cells lacking planes of material symmetry. Comparison of the effective moduli and local stress fields with the corresponding results obtained from the original Generalized Method of Cells dramatically highlights the original model's shortcomings for certain classes of unidirectional composites.
Houiste, Céline; Auguste, Cécile; Macrez, Céline; Dereux, Stéphanie; Derouet, Angélique; Anger, Pascal
2009-02-01
Low-molecular-weight heparins (LMWHs) are widely used in the management of thrombosis and acute coronary syndromes. They are obtained by the enzymatic or chemical depolymerization of porcine intestinal heparin. Enoxaparin sodium, a widely used LMWH, has a unique and reproducible oligosaccharide profile which is determined by the origin of the starting material and a tightly controlled manufacturing process. Although other enoxaparin-like LMWHs do exist, specific release criteria including the origin of the crude heparin utilized for their production, have not been established. A quantitative polymerase chain reaction method has been developed to ensure the purity of the porcine origin of crude heparin, with a DNA detection limit as low as 1 ppm for bovine, or 10 ppm for ovine contaminants. This method is routinely used as the release acceptance criterion during enoxaparin sodium manufacturing. Furthermore, when the process removes DNA, other analytical techniques can be used to assess any contamination. Disaccharide profiling after exhaustive depolymerization can determine the presence of at least 10% bovine or 20% ovine material; multivariate analysis is useful to perform the data analysis. Consistent with the availability of newer technology, these methods should be required as acceptance criteria for crude heparins used in the manufacture of LMWHs to ensure their safety, quality, and immunologic profile.
Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation
ERIC Educational Resources Information Center
Ayre, Colin; Scally, Andrew John
2014-01-01
The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
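For context, Lawshe's content validity ratio for a single item is computed as (the standard definition, not specific to this article)

$$\mathrm{CVR} = \frac{n_e - N/2}{N/2},$$

where $n_e$ is the number of panellists rating the item "essential" and $N$ is the total number of panellists; for example, 9 of 10 panellists rating an item essential gives $\mathrm{CVR} = (9 - 5)/5 = 0.8$. The critical values discussed in the article set the minimum CVR an item must reach, given $N$, to be retained.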
NASA Astrophysics Data System (ADS)
Sanchez, J.
2018-06-01
In this paper, the application and analysis of the asymptotic approximation method to a single degree-of-freedom system is presented. The original concepts are summarized, and the necessary probabilistic concepts are developed and applied to single degree-of-freedom systems. These concepts are then united, and the theoretical and computational models are developed. To determine the viability of the proposed method in a probabilistic context, numerical experiments are conducted, consisting of a frequency analysis, an analysis of the effects of measurement noise, and a statistical analysis. In addition, two examples are presented and discussed.
Original analytic solution of a half-bridge modelled as a statically indeterminate system
NASA Astrophysics Data System (ADS)
Oanta, Emil M.; Panait, Cornel; Raicu, Alexandra; Barhalescu, Mihaela
2016-12-01
The paper presents an original computer-based analytical model of a half-bridge belonging to a circular settling tank. The primary unknown is computed using the force method, the coefficients of the canonical equation being calculated either by discretizing the bending moment diagram into trapezoids or by using the relations specific to polygons. A second algorithm based on the method of initial parameters is also presented. Analyzing the new solution, we came to the conclusion that most of the computer code developed for the other model may be reused. The results are useful for evaluating the behavior of the structure and for comparison with the results of the finite element models.
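For a single redundant unknown, the canonical equation referred to above takes the familiar textbook form (general flexibility-method relations, not the paper's specific coefficients):

$$\delta_{11} X_1 + \Delta_{1P} = 0 \;\Rightarrow\; X_1 = -\frac{\Delta_{1P}}{\delta_{11}}, \qquad \delta_{11} = \int \frac{m_1^2}{EI}\,dx, \quad \Delta_{1P} = \int \frac{m_1 M_P}{EI}\,dx,$$

where $m_1$ is the bending moment due to a unit redundant force, $M_P$ is the moment due to the applied loads on the released primary structure, and the integrals are the quantities evaluated numerically by discretizing the moment diagrams into trapezoids or by the polygon relations.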
Willers, H; Heilmann, H P; Beck-Bornholdt, H P
1998-02-01
This review studies the historical development of fractionated X-ray therapy in the German-speaking countries until World War II. Fractionated treatments appear to have their origin in the first attempts at fractionation performed by Freund in 1896. In the following years, fractionated treatments could not compete with the so-called single dose or intensive radiation treatment in Germany. However, until the 1920s there were repeated sporadic attempts, with various motivations, to distribute the radiation dose over a prolonged period of time. Only at the end of the 1920s were the conditions for a rise of fractionated irradiation favourable, mainly due to the fiasco of the intensive treatments and their subsequent decline. The impulse for this rise, however, came from France, where Coutard had developed an individual empirical treatment regimen with great clinical success. This technique and its modifications and developments spread quickly across Europe and America. In spite of the similarities between the various fractionation methods used during the first decades of the century, a review of the radiological literature of that time fails to show any logical scientific connection between these methods; rather, there was a mix and overlap of irradiation attempts of various motivations. Thus, radiation therapy, and thereby the current standard fractionation scheme of 1.8 to 2 Gy per fraction 5 times per week, obviously did not grow out of a scientific basis, but originated from individual observation of patients and empirical experience.
Structural analysis of paintings based on brush strokes
NASA Astrophysics Data System (ADS)
Sablatnig, Robert; Kammerer, Paul; Zolda, Ernestine
1998-05-01
The origin of works of art can often not be attributed to a certain artist. Likewise it is difficult to say whether paintings or drawings are originals or forgeries. In various fields of art new technical methods are used to examine the age, the state of preservation and the origin of the materials used. For the examination of paintings, radiological methods like X-ray and infra-red diagnosis, digital radiography, computer tomography, etc. and color analyses are employed to authenticate art. But all these methods do not relate certain characteristics in art work to a specific artist -- the artist's personal style. In order to study this personal style of a painter, experts in art history and image processing try to examine the 'structural signature' based on brush strokes within paintings, in particular in portrait miniatures. A computer-aided classification and recognition system for portrait miniatures is developed, which enables a semi-automatic classification and forgery detection based on content, color, and brush strokes. A hierarchically structured classification scheme is introduced which separates the classification into three different levels of information: color, shape of region, and structure of brush strokes.
Steponas Kolupaila's contribution to hydrological science development
NASA Astrophysics Data System (ADS)
Valiuškevičius, Gintaras
2017-08-01
Steponas Kolupaila (1892-1964) was an important figure in 20th century hydrology and one of the pioneers of scientific water gauging in Europe. His research on the reliability of hydrological data and measurement methods was particularly important and contributed to the development of empirical hydrological calculation methods. Kolupaila was one of the first to standardise water-gauging methods internationally. He created several original hydrological and hydraulic calculation methods (his discharge assessment method for the winter period was particularly significant). His innate abilities and frequent travel made Kolupaila a universal specialist in various fields and an active public figure. He revealed his multilayered scientific and cultural experiences in his most famous book, Bibliography of Hydrometry. This book introduced the unique European hydrological measurement and computation methods to the community of world hydrologists at that time and allowed the development and adaptation of these methods across the world.
TECHNICAL ASPECTS OF UNDERGROUND STORAGE TANK CLOSURE
The overall objective of the study was to develop a deeper understanding of UST residuals at closure: their quantities, origins, physical/chemical properties, ease of removal by various cleaning methods, and their environmental mobility and persistence. The investigation covered ...
The paper discusses a data attribute rating system (DARS), developed by EPA to assist in evaluating data associated with emission inventories. The paper presents DARS for evaluation by the potential user community. DARS was originally conceived as a method for evaluating country-sp...
Kelly, S; Wickstead, B; Gull, K
2011-04-07
We have developed a machine-learning approach to identify 3537 discrete orthologue protein sequence groups distributed across all available archaeal genomes. We show that treating these orthologue groups as binary detection/non-detection data is sufficient to capture the majority of archaeal phylogeny. We subsequently use the sequence data from these groups to infer a method- and substitution-model-independent phylogeny. By holding this phylogeny constrained and interrogating the intersection of this large dataset with both the Eukarya and the Bacteria using Bayesian and maximum-likelihood approaches, we propose and provide evidence for a methanogenic origin of the Archaea. By the same criteria, we also provide evidence in support of an origin for Eukarya either within or as sisters to the Thaumarchaea.
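The following toy sketch is not the authors' pipeline; it only illustrates the idea of treating orthologue groups as binary detection/non-detection profiles and grouping genomes by those profiles, using Jaccard distances and hierarchical clustering as a crude stand-in for proper phylogenetic inference. The genome names and the presence/absence matrix are random placeholders.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

# Hypothetical presence/absence matrix: rows = genomes, columns = orthologue groups.
genomes = ["Archaeon_A", "Archaeon_B", "Archaeon_C", "Archaeon_D"]
rng = np.random.default_rng(0)
presence = rng.integers(0, 2, size=(len(genomes), 3537))  # 1 = orthologue detected

# Jaccard distance on the binary detection profiles, then hierarchical clustering.
dist = pdist(presence, metric="jaccard")
tree = linkage(dist, method="average")
dendrogram(tree, labels=genomes, no_plot=True)  # set no_plot=False to draw the tree
```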
2009-01-01
Background: Quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has also been applied to infectious disease research; however, previous studies were insufficient because they underestimated articles published in non-English or regional journals. Methods: Using a combination of Scopus™ and PubMed, databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate for surveying a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded™ (SCI Infectious Disease Category) during 1998-2006. Subsequently, we used the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. Results: One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. The journals published 14,156 original articles and reviews of Asian origin and 118,158 worldwide, more than those registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 worldwide). In the Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the area, and no noticeable increase in articles was revealed during the study period. China, India and Taiwan had relatively large numbers and a high rate of increase of original articles among Asian countries. When publication of original articles was adjusted for country population and gross domestic product (GDP), Singapore and Taiwan were the most productive. Conclusion: A survey of 100 selected journals is more sensitive than the SCI Infectious Disease Category from the viewpoint of avoiding underestimation of the number of infectious disease research articles of Asian origin. The survey method is applicable to grasping global trends in disease research, although the method may require further development. PMID:19804650
Contextualising primate origins--an ecomorphological framework.
Soligo, Christophe; Smaers, Jeroen B
2016-04-01
Ecomorphology - the characterisation of the adaptive relationship between an organism's morphology and its ecological role - has long been central to theories of the origin and early evolution of the primate order. This is exemplified by two of the most influential theories of primate origins: Matt Cartmill's Visual Predation Hypothesis, and Bob Sussman's Angiosperm Co-Evolution Hypothesis. However, the study of primate origins is constrained by the absence of data directly documenting the events under investigation, and has to rely instead on a fragmentary fossil record and the methodological assumptions inherent in phylogenetic comparative analyses of extant species. These constraints introduce particular challenges for inferring the ecomorphology of primate origins, as morphology and environmental context must first be inferred before the relationship between the two can be considered. Fossils can be integrated into comparative analyses, and observations of extant model species and laboratory experiments on form-function relationships are critical for the functional interpretation of the morphology of extinct species. Recent developments have led to important advancements, including phylogenetic comparative methods based on more realistic models of evolution, improved methods for the inference of clade divergence times, and an improved fossil record. This contribution will review current perspectives on the origin and early evolution of primates, paying particular attention to their phylogenetic (including cladistic relationships and character evolution) and environmental (including chronology, geography, and physical environments) contextualisation, before attempting an up-to-date ecomorphological synthesis of primate origins. © 2016 Anatomical Society.
Student Health Services in Institutions of Higher Education. Bulletin, 1937, No. 7
ERIC Educational Resources Information Center
Rogers, James Frederick
1937-01-01
Originating 75 years ago as a means of safeguarding students from excesses in physical activities, the student-health service has had a phenomenal development. From an examination, by the methods then at hand, of heart, lungs, and spine and the prescription of gymnastic exercises according to the findings, this examination has developed, with the…
ERIC Educational Resources Information Center
Rizvi, Sadaf, Ed.
2011-01-01
This book provides an original perspective on a range of controversial issues in educational and social research through case studies of multi-disciplinary and mixed-method research involving children, teachers, schools and communities in Europe and the developing world. These case studies from researchers "across continents" and…
Texting for Health: The Use of Participatory Methods to Develop Healthy Lifestyle Messages for Teens
ERIC Educational Resources Information Center
Hingle, Melanie; Nichter, Mimi; Medeiros, Melanie; Grace, Samantha
2013-01-01
Objective: To develop and test messages and a mobile phone delivery protocol designed to influence the nutrition and physical activity knowledge, attitudes, and behavior of adolescents. Design: Nine focus groups, 4 classroom discussions, and an 8-week pilot study exploring message content, format, origin, and message delivery were conducted over…
Children's Biology: A Review of Research on Conceptual Development in the Life Sciences.
ERIC Educational Resources Information Center
Mintzes, Joel J.; Arnaudin, Mary W.
Sixty-eight studies on conceptual development in the biological sciences are reviewed. These studies originated in North America, Europe, Asia, Africa, and Australia. Each study was classified by type of concept(s) under investigation and by the research method employed (focusing on the mode of inquiry and the mode of assessment used). When…
ERIC Educational Resources Information Center
Obrecht, Dean H.
This report contrasts the results of a rigidly specified, pattern-oriented approach to learning Spanish with an approach that emphasizes the origination of sentences by the learner in direct response to stimuli. Pretesting and posttesting statistics are presented and conclusions are discussed. The experimental method, which required the student to…
Hulka, J F; Omran, K; Lieberman, B A; Gordon, A G
1979-12-15
Since the original spring clip sterilization studies were reported, a number of clinically important modifications to the spring clip and applicator have been developed. The spring-loaded clip, manufactured by Richard Wolf Medical Instruments Corporation of Chicago, Illinois, and Rocket of London, Inc., London, England, and New York, New York, can be applied with either a one- or two-incision applicator, and the clips and applicators currently available incorporate improvements to the original prototypes in design, manufacture, and quality control. The two-incision applicator is associated with significantly fewer misapplications, and the high pregnancy rates reported with the original clip and applicator have not occurred with the current designs. Comparative studies between the clip and band have revealed less operative bleeding and pain associated with the clip. The method is appropriate for all women requesting sterilization, but especially for those in the younger age group who may subsequently request reversal because of divorce and remarriage.
Bandoniene, Donata; Zettl, Daniela; Meisel, Thomas; Maneiko, Marija
2013-02-15
An analytical method was developed and validated for the classification of the geographical origin of pumpkin seeds and oil from Austria, China and Russia. The distribution of trace elements in pumpkin seeds and pumpkin seed oils in relation to the geographical origin of the soils of several agricultural farms in Austria was studied in detail. Samples from several geographic origins were taken from parts of the pumpkin (pumpkin flesh, seeds, the oil extracted from the seeds and the oil-extraction cake) as well as the topsoil on which the plants were grown. Plants from different geographical origins show variations in elemental patterns that are sufficiently large, reproducible over the years and the ripeness period, and not significantly influenced by the oil production procedure, to allow discrimination of geographical origin. A successful differentiation of oils from different regions in Austria, China and Russia, classified with multivariate data analysis, is demonstrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
de Rijke, E; Schoorl, J C; Cerli, C; Vonhof, H B; Verdegaal, S J A; Vivó-Truyols, G; Lopatka, M; Dekter, R; Bakker, D; Sjerps, M J; Ebskamp, M; de Koster, C G
2016-08-01
Two approaches were investigated to discriminate between bell peppers of different geographic origins. Firstly, δ(18)O of fruit water and of the corresponding source water was analyzed and correlated to the regional GNIP (Global Network of Isotopes in Precipitation) values. The water and GNIP data showed good correlation with the pepper data, with a constant isotope fractionation of about -4. Secondly, compound-specific stable hydrogen isotope data were used for classification. Using n-alkane fingerprinting data, both linear discriminant analysis (LDA) and a likelihood-based classification, using kernel-density smoothed data, were developed to discriminate between peppers from different origins. Both methods were evaluated using the δ(2)H values and the relative n-alkane composition as variables. Misclassification rates were calculated using a Monte-Carlo 5-fold cross-validation procedure. Comparable overall classification performance was achieved; however, the two methods showed sensitivity to different samples. The combined values of δ(2)H IRMS and complementary information regarding the relative abundance of four main alkanes in bell pepper fruit water have proven effective for geographic origin discrimination. Evaluation of the rarity of observing particular ranges for these characteristics could be used to make quantitative assertions regarding the geographic origin of bell peppers and could, therefore, have a role in verifying compliance with labeling of geographical origin. Copyright © 2016 Elsevier Ltd. All rights reserved.
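As a sketch of one of the two classification routes mentioned above (LDA) evaluated by a repeated, shuffled 5-fold cross-validation in the spirit of a Monte-Carlo procedure, the snippet below uses scikit-learn on a fabricated feature table; the column meanings (δ2H plus four n-alkane abundances) and the origin labels are assumptions, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Hypothetical feature table: delta-2H value and relative abundances of four
# n-alkanes per pepper sample; labels are assumed countries of origin.
rng = np.random.default_rng(1)
X = rng.normal(size=(90, 5))            # columns: d2H, C27, C29, C31, C33 (assumed)
y = rng.choice(["NL", "ES", "IL"], 90)  # geographic origin labels (assumed)

# Repeated shuffled 5-fold cross-validation: average the misclassification rate.
errors = []
for seed in range(100):
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=cv)
    errors.append(1.0 - acc.mean())
print(f"mean misclassification rate: {np.mean(errors):.3f}")
```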
ERIC Educational Resources Information Center
Smogorzewska, Joanna
2012-01-01
This article presents the results of a study comparing the originality, the length, the number of neologisms and the syntactic complexity of fairy tales created with "Storyline" and "Associations Pyramid." Both methods were developed to enhance children's language abilities and their creative thinking. One hundred twenty eight 5-year-old children…
Comparison of rotation algorithms for digital images
NASA Astrophysics Data System (ADS)
Starovoitov, Valery V.; Samal, Dmitry
1999-09-01
The paper presents a comparative study of several algorithms developed for digital image rotation. Without loss of generality, we studied grayscale images. We tested methods that preserve the gray values of the original images, methods that perform interpolation, and two procedures implemented in the Corel Photo-Paint and Adobe Photoshop software packages. Methods for the rotation of color images may be evaluated in a similar way.
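For illustration, a minimal sketch of one interpolating rotation method (inverse mapping with bilinear interpolation) is given below; it is not one of the specific algorithms compared in the paper, and the test image is synthetic.

```python
import numpy as np

def rotate_bilinear(img, angle_deg):
    """Rotate a grayscale image about its centre using inverse mapping
    with bilinear interpolation."""
    a = np.deg2rad(angle_deg)
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse rotation: for every output pixel find its source coordinate.
    xsrc = np.cos(a) * (xs - cx) + np.sin(a) * (ys - cy) + cx
    ysrc = -np.sin(a) * (xs - cx) + np.cos(a) * (ys - cy) + cy
    x0, y0 = np.floor(xsrc).astype(int), np.floor(ysrc).astype(int)
    valid = (x0 >= 0) & (x0 < w - 1) & (y0 >= 0) & (y0 < h - 1)
    fx, fy = xsrc - x0, ysrc - y0
    x0v, y0v, fxv, fyv = x0[valid], y0[valid], fx[valid], fy[valid]
    out[valid] = ((1 - fxv) * (1 - fyv) * img[y0v, x0v] +
                  fxv * (1 - fyv) * img[y0v, x0v + 1] +
                  (1 - fxv) * fyv * img[y0v + 1, x0v] +
                  fxv * fyv * img[y0v + 1, x0v + 1])
    return out

# Rotate forward and back; interpolation loss shows up as a small residual error.
test = np.random.default_rng(2).random((64, 64))
back = rotate_bilinear(rotate_bilinear(test, 30), -30)
print("mean absolute error (central region):", np.abs(test - back)[16:-16, 16:-16].mean())
```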
Situational Approaches to Direct Practice: Origin, Decline, and Re-Emergence
ERIC Educational Resources Information Center
Murdach, Allison D.
2007-01-01
During the 1890s and the first three decades of the 20th century, social work in the United States developed a community-based direct practice approach to family assistance and social reform. The basis for this method was a situational view of social life that emphasized the use of interpersonal and transactional methods to achieve social and…
Microstructural characterization of Ti-6Al-4V alloy subjected to the duplex SMAT/plasma nitriding.
Pi, Y; Faure, J; Agoda-Tandjawa, G; Andreazza, C; Potiron, S; Levesque, A; Demangel, C; Retraint, D; Benhayoune, H
2013-09-01
In this study, microstructural characterization of Ti-6Al-4V alloy subjected to the duplex surface mechanical attrition treatment (SMAT)/nitriding treatment, which improves its mechanical properties, was carried out using novel and original sample preparation methods. Instead of acid etching, which is of limited use for morphological characterization by scanning electron microscopy (SEM), an original ion polishing method was developed. Moreover, for structural characterization by transmission electron microscopy (TEM), an ion milling method based on the use of two ion guns was also applied for cross-section preparation. To demonstrate the efficiency of the two developed methods, morphological investigations were performed by traditional SEM and field emission gun SEM. This was followed by structural investigations using selected area electron diffraction (SAED) coupled with TEM and X-ray diffraction techniques. The results demonstrated that ion polishing revealed a variation of the microstructure with the surface treatment that could not be observed with acid etching preparation. TEM combined with SAED and X-ray diffraction provided information regarding the nanostructural and compositional changes induced by the duplex SMAT/nitriding process. Copyright © 2013 Wiley Periodicals, Inc.
Calculus domains modelled using an original bool algebra based on polygons
NASA Astrophysics Data System (ADS)
Oanta, E.; Panait, C.; Raicu, A.; Barhalescu, M.; Axinte, T.
2016-08-01
Analytical and numerical computer-based models require analytical definitions of the calculus domains. The paper presents a method to model a calculus domain based on a Boolean algebra which uses solid and hollow polygons. The general calculus relations of the geometrical characteristics that are widely used in mechanical engineering are tested using several shapes of the calculus domain in order to draw conclusions regarding the most effective methods to discretize the domain. The paper also tests the results of several commercial CAD software applications which are able to compute the geometrical characteristics, and interesting conclusions are drawn. The tests also targeted the accuracy of the results vs. the number of nodes on the curved boundary of the cross section. The study required the development of an original software application consisting of more than 1700 lines of computer code. In comparison with other calculus methods, discretization using convex polygons is a simpler approach. Moreover, this method does not lead to the very large numbers produced by the spline approximation, which required special software packages offering multiple, arbitrary precision. The knowledge resulting from this study may be used to develop complex computer-based models in engineering.
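A minimal sketch of the polygon-based evaluation of geometrical characteristics follows, in which hollow polygons enter with a negative sign so that composite (solid minus hole) domains can simply be summed. The formulas are the standard shoelace/Green's-theorem relations, not the paper's own code, and the example domain is invented.

```python
def polygon_properties(vertices, hollow=False):
    """Area, centroid and second moments of area of a simple polygon
    (counter-clockwise vertices) via the shoelace formulas; hollow polygons
    contribute with a negative sign for composite domains."""
    sign = -1.0 if hollow else 1.0
    A = Qx = Qy = Ixx = Iyy = 0.0
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        A += cross
        Qx += (y0 + y1) * cross            # first moment about the x axis
        Qy += (x0 + x1) * cross            # first moment about the y axis
        Ixx += (y0**2 + y0 * y1 + y1**2) * cross
        Iyy += (x0**2 + x0 * x1 + x1**2) * cross
    A *= 0.5
    return {"A": sign * A, "cx": Qy / (6 * A), "cy": Qx / (6 * A),
            "Ixx": sign * Ixx / 12, "Iyy": sign * Iyy / 12}

# Composite domain: a 4x2 solid rectangle with a 1x1 square hole.
solid = polygon_properties([(0, 0), (4, 0), (4, 2), (0, 2)])
hole = polygon_properties([(1, 0.5), (2, 0.5), (2, 1.5), (1, 1.5)], hollow=True)
print("net area:", solid["A"] + hole["A"])   # 8 - 1 = 7
```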
A new reserve growth model for United States oil and gas fields
Verma, M.K.
2005-01-01
Reserve (or field) growth, which is an appreciation of total ultimate reserves through time, is a well-recognized phenomenon, particularly in mature petroleum provinces. The importance of forecasting reserve growth accurately in a mature petroleum province made it necessary to develop improved growth functions, and a critical review of the original Arrington method was undertaken. Over a five-year period (1992-1996), the original Arrington method gave an oil reserve growth 1.03% higher than the actual value, whereas the proposed modified method gave a value within 0.3% of the actual growth; it was therefore accepted for the development of reserve growth models. Over the same period, the USGS 1995 National Assessment gave oil growth 39.3% higher and gas growth 33.6% lower than the actual values, whereas the new model based on the Modified Arrington method gave oil growth 11.9% higher and gas growth 29.8% lower than the actual values. The new models predict reserve growths of 4.2 billion barrels of oil (2.7%) and 30.2 trillion cubic feet of gas (5.4%) for the conterminous U.S. for the next five years (1997-2001). © 2005 International Association for Mathematical Geology.
[New approaches to the treatment of keratoconjunctivitis sicca].
Safonova, T N; Gladkova, O V; Novikov, I A; Boev, V I; Fedorov, A A
A new method has been developed for the treatment of severe forms of keratoconjunctivitis sicca (KCS) that involves the use of an original cyclosporine A (CyA)-saturated soft contact lens (SCL) together with preservative-free artificial tears therapy. The aim was to evaluate the effectiveness of the newly developed treatment for KCS based on the use of a medical SCL saturated with 0.05% CyA. The patients (43 men, 60 eyes) with severe KCS were divided into 2 groups. Group 1 included 21 patients (30 eyes), who received artificial tears and wore 0.05% CyA-saturated silicone-hydrogel SCLs. Group 2 included 22 patients (30 eyes), who wore unsaturated original SCLs and received CyA instillations 2 times daily, as well as artificial tears. Apart from a standard ophthalmic examination, the assessment included Schirmer's test, Norn's test, vital eye stain tests, tear osmometry, laser confocal tomography of the cornea, optical coherence tomography of the anterior segment with meniscometry, impression cytology of the conjunctiva, tear pH measurement, plating of the content of the conjunctival cavity, measurement of the width of the palpebral fissure, and calculation of the ocular surface disease index. Treatment results were followed up at 1, 3, 6, and 12 months. The use of 0.05% CyA-saturated SCLs halves the treatment time for patients with severe KCS (down to 1 week - 1 month) compared to unsaturated original SCLs combined with 0.05% CyA instillations, and reduces it 5-fold compared to 0.05% CyA instillations alone. The new method of KCS treatment, which involves the use of a medical SCL of original design (ensuring even distribution of 0.05% CyA across the ocular surface) and preservative-free artificial tears, has demonstrated high therapeutic effectiveness compared to existing methods.
Watanabe, T; Tokunaga, R; Iwahana, T; Tati, M; Ikeda, M
1978-01-01
The direct chelation-extraction method, originally developed by Hessel (1968) for blood lead analysis, has been successfully applied to urinalysis for manganese. The analyses of 35 urine samples containing up to 100 µg/l of manganese from manganese-exposed workers showed that the data obtained by this method agree well with those by wet digestion-flame atomic absorption spectrophotometry and also by flameless atomic absorption spectrophotometry. PMID:629893
Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes
NASA Astrophysics Data System (ADS)
Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.
2017-12-01
Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM yielded kriging predictions in more simulations (9/10) than GBM (4/10). The prediction from SBM was closer to the original prediction generated without bootstrapping and had less variance than that from GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of waters to migratory animals, food products, and humans.
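The snippet below is not the authors' R package; it is a rough Python sketch of the simple-bootstrap idea (resample observation sites with replacement, re-krige, summarize the replicates), assuming the pykrige package is available. All coordinates and isotope values are synthetic, and a tiny coordinate jitter is added purely to keep resampled duplicate sites from making the kriging system singular.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumed available; any kriging routine would do

rng = np.random.default_rng(3)
# Hypothetical precipitation d2H observations at scattered sites.
lon, lat = rng.uniform(-120, -70, 60), rng.uniform(25, 50, 60)
d2h = -8.0 * lat + rng.normal(0, 5, 60)

gridx, gridy = np.linspace(-120, -70, 40), np.linspace(25, 50, 30)
reps = []
for b in range(100):  # simple bootstrap: resample sites with replacement, re-krige
    idx = rng.integers(0, len(d2h), len(d2h))
    jx, jy = rng.normal(0, 1e-4, len(idx)), rng.normal(0, 1e-4, len(idx))  # avoid duplicates
    ok = OrdinaryKriging(lon[idx] + jx, lat[idx] + jy, d2h[idx], variogram_model="spherical")
    z, _ = ok.execute("grid", gridx, gridy)
    reps.append(np.asarray(z))

iso_mean = np.mean(reps, axis=0)   # bootstrapped isoscape
iso_sd = np.std(reps, axis=0)      # per-pixel uncertainty estimate
print("median per-pixel SD:", np.median(iso_sd))
```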
Solbu, Kasper; Daae, Hanne Line; Olsen, Raymond; Thorud, Syvert; Ellingsen, Dag Gunnar; Lindgren, Torsten; Bakke, Berit; Lundanes, Elsa; Molander, Paal
2011-05-01
Methods for measurement and the potential for occupational exposure to organophosphates (OPs) originating from turbine and hydraulic oils among flying personnel in the aviation industry are described. Different sampling methods were applied, including active within-day methods for OPs and VOCs, newly developed passive long-term sampling methods (deposition of OPs onto wipe surface areas and onto activated charcoal cloths), and measurements of OPs in high-efficiency particulate air (HEPA) recirculation filters (n = 6). In total, 95 and 72 within-day OP and VOC samples, respectively, were collected during 47 flights in six different models of turbine jet engine, propeller and helicopter aircraft (n = 40). In general, the OP air levels from the within-day samples were low. The most relevant OP in this regard originating from turbine and engine oils, tricresyl phosphate (TCP), was detected in only 4% of the samples (min-max
NASA Technical Reports Server (NTRS)
Klein, Vladislav
2002-01-01
The program objectives are fully defined in the original proposal entitled 'Program of Research in Flight Dynamics in GW at NASA Langley Research Center,' which originated on March 20, 1975, and in the renewals of the research program from December 1, 2000 to November 30, 2001. The program in its present form includes three major topics: 1) the improvement of existing methods and the development of new methods for wind tunnel and flight test data analysis, 2) the application of these methods to wind tunnel and flight test data obtained from advanced airplanes, and 3) the correlation of flight results with wind tunnel measurements and theoretical predictions. The Principal Investigator of the program is Dr. Vladislav Klein. Three Graduate Research Scholar Assistants (K. G. Mas, M. M. Eissa and N. M. Szyba) also participated in the program. Specific developments in the program during the period Dec. 1, 2001 through Nov. 30, 2002 included: 1) data analysis of a highly swept delta wing aircraft from wind and water tunnel data, and 2) aerodynamic characteristics of a radio-controlled aircraft from flight tests.
Kobayashi, Kazuhiro; Tanaka, Masaharu; Yatsukawa, Yoichi; Tanabe, Soichi; Tanaka, Mitsuru; Ohkouchi, Naohiko
2018-01-01
Recent growing health awareness is leading to increasingly conscious decisions by consumers regarding the production and traceability of food. Stable isotopic compositions provide useful information for tracing the origin of foodstuffs and the processes of food production. Plants exhibit different ratios of stable carbon isotopes (δ13C) because they utilize different photosynthetic (carbon fixation) pathways and grow in various environments. The origins of glutamic acid in foodstuffs can be differentiated on the basis of these photosynthetic characteristics. Here, we have developed a method to isolate glutamic acid in foodstuffs for determining the δ13C value by elemental analyzer-isotope-ratio mass spectrometry (EA/IRMS) without unintended isotopic fractionation. Briefly, following acid hydrolysis, samples were defatted and passed through activated carbon and a cation-exchange column. Then, glutamic acid was isolated using preparative HPLC. This method is applicable to measuring, with a low standard deviation, the δ13C values of glutamic acid from foodstuffs derived from C3 and C4 plants and marine algae.
Vemić, Ana; Rakić, Tijana; Malenović, Anđelija; Medenica, Mirjana
2015-01-01
The aim of this paper is to present the development of a liquid chromatographic method in which chaotropic salts are used as mobile phase additives, following QbD principles. The effect of critical process parameters (column chemistry, salt nature and concentration, acetonitrile content and column temperature) on the critical quality attributes (retention of the first and last eluting peaks and separation of the critical peak pairs) was studied applying the design of experiments-design space (DoE-DS) methodology. A D-optimal design was chosen in order to simultaneously examine both categorical and numerical factors in a minimal number of experiments. Two approaches to quality assurance were applied and compared: the uncertainty originating from the models was assessed by Monte Carlo simulations, propagating either an error equal to the variance of the model residuals or the error originating from the calculation of the model coefficients. Baseline separation of pramipexole and its five impurities was achieved, fulfilling all the required criteria, and method validation proved the method's reliability. Copyright © 2014 Elsevier B.V. All rights reserved.
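A schematic sketch of the first uncertainty-propagation idea (Monte Carlo simulation of the residual error across the design space) is given below; the quadratic response model, its coefficients, the residual standard deviation and the acceptance criterion are all invented placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical quadratic response model for the retention factor of the last
# eluting peak as a function of two coded factors (acetonitrile %, temperature).
beta = np.array([8.0, -1.5, -0.6, 0.3, 0.2, -0.1])   # fitted coefficients (assumed)
resid_sd = 0.15                                       # SD of the model residuals (assumed)

def predict(x1, x2, b):
    return b[0] + b[1]*x1 + b[2]*x2 + b[3]*x1*x2 + b[4]*x1**2 + b[5]*x2**2

# Scan the design space; at each point propagate the residual error by Monte Carlo
# and record the probability of meeting the (assumed) criterion k_last <= 9.
grid = np.linspace(-1, 1, 21)
prob_ok = np.zeros((21, 21))
for i, x1 in enumerate(grid):
    for j, x2 in enumerate(grid):
        sims = predict(x1, x2, beta) + rng.normal(0.0, resid_sd, 5000)
        prob_ok[i, j] = np.mean(sims <= 9.0)

# The design space is the region where the criterion is met with high probability.
print("fraction of scanned points with P(criterion) >= 0.95:", (prob_ok >= 0.95).mean())
```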
Tămăşan, M; Ozyegin, L S; Oktar, F N; Simon, V
2013-07-01
The study reports the preparation and characterization of powders consisting of different calcium phosphate phases, obtained from naturally derived raw materials of sea-shell origin reacted with H3PO4. Species of sea origin, such as corals and nacres, have attracted special interest in the bone tissue engineering area. Nacre shells are built up of calcium carbonate in the aragonite form, crystallized in an organic matrix. In this work two natural materials of marine origin (shells of the echinoderm Sputnik sea urchin - Phyllacanthus imperialis and the Trochidae mollusk Infundibulum concavus) were used to develop powders of calcium phosphate based biomaterials (as raw materials for bone scaffolds) by hotplate and ultrasound methods. Thermal analyses of the as-prepared materials were performed to assess the thermal behavior and the heat treatment temperatures. Samples from both sea shells, each prepared by the above-mentioned methods, were subjected to thermal treatments at 450 °C and 850 °C in order to evaluate the crystalline transformations of the calcium phosphate structures during heating. Various calcium phosphate phases were identified by X-ray diffraction analysis. In the samples originating from Sputnik sea urchins, brushite was found predominantly, with calcite as a small secondary phase, while in the Trochidae I. concavus samples mainly monetite and HA phases were identified. According to the XRD patterns, thermal treatment at 850 °C resulted in flat-plate whitlockite crystals - β-MgTCP [(Ca, Mg)3(PO4)2] - for both samples, regardless of the preparation method (ultrasound or hotplate) or the targeted Ca/P molar ratio. Scanning electron microscopy and Fourier transform infrared spectroscopy were also used to characterize these materials, and the results of these methods correlated well. Copyright © 2013 Elsevier B.V. All rights reserved.
Cruz, Mutya; Wang, Miao; Frisch-Daiello, Jessica; Han, Xianlin
2016-07-01
Extraction of lipids from biological samples is a critical step in lipidomics, especially for shotgun lipidomics where lipid extracts are directly infused into a mass spectrometer. The butanol-methanol (BUME) extraction method was originally developed to extract lipids from plasma samples with 1 % acetic acid. Considering some lipids are sensitive to acidic environments, we modified this protocol by replacing acetic acid with lithium chloride solution and extended the modified extraction to tissue samples. Although no significant reduction of plasmalogen levels in the acidic BUME extracts of rat heart samples was found, the modified method was established to extract various tissue samples, including rat liver, heart, and plasma. Essentially identical profiles of the majority of lipid classes were obtained from the extracts of the modified BUME and traditional Bligh-Dyer methods. However, it was found that neither the original, nor the modified BUME method was suitable for 4-hydroxyalkenal species measurement in biological samples.
Cruz, Mutya; Wang, Miao; Frisch-Daiello, Jessica; Han, Xianlin
2016-01-01
Extraction of lipids from biological samples is a critical step in lipidomics, especially for shotgun lipidomics where lipid extracts are directly infused into a mass spectrometer. The butanol-methanol (BUME) extraction method was originally developed to extract lipids from plasma samples with 1% acetic acid. Considering some lipids are sensitive to acidic environments, we modified this protocol by replacing acetic acid with lithium chloride solution and extended the modified extraction to tissue samples. Although no significant reduction of plasmalogen levels in the acidic BUME extracts of rat heart samples was found, the modified method was established to extract various tissue samples, including rat liver, heart, and plasma. Essentially identical profiles of the majority of lipid classes were obtained from the extracts of the modified BUME and traditional Bligh-Dyer methods. However, it was found that neither the original, nor the modified BUME method was suitable for 4-hydroxyalkenal species measurement in biological samples. PMID:27245345
Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang
2017-05-01
A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
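The chemometric steps named in the preceding abstract (similarity to a mean fingerprint, hierarchical cluster analysis, principal component analysis) can be sketched as follows; the 19 x 18 peak-area table is a made-up placeholder, not the Epimedium data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)
# Hypothetical fingerprint table: 19 batches x 18 common-peak areas.
peaks = rng.lognormal(mean=2.0, sigma=0.4, size=(19, 18))

# Similarity analysis: cosine similarity of each batch to the mean fingerprint.
ref = peaks.mean(axis=0)
similarity = peaks @ ref / (np.linalg.norm(peaks, axis=1) * np.linalg.norm(ref))

# HCA and PCA on autoscaled peak areas.
z = StandardScaler().fit_transform(peaks)
clusters = fcluster(linkage(pdist(z), method="ward"), t=3, criterion="maxclust")
scores = PCA(n_components=2).fit_transform(z)

print("similarity range:", similarity.min().round(3), "-", similarity.max().round(3))
print("HCA group sizes:", np.bincount(clusters)[1:])
```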
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best-performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
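The software above compares several discounting models via approximate Bayesian model selection; as a much smaller illustration of the ED50 idea, the sketch below fits only Mazur's hyperbolic model, for which ED50 = 1/k (the delay at which subjective value falls to half). The indifference-point data are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical indifference points: subjective value of a delayed reward
# (as a fraction of its face value) at several delays (days).
delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)
values = np.array([0.95, 0.85, 0.60, 0.40, 0.25, 0.15])

def hyperbolic(d, k):
    """Mazur's hyperbolic discounting model, V = 1 / (1 + k*D)."""
    return 1.0 / (1.0 + k * d)

(k,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
ed50 = 1.0 / k   # delay at which value drops to half, for the hyperbolic model
print(f"k = {k:.4f} per day, ED50 = {ed50:.1f} days")
```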
Isolation and In Vitro Culture of Murine and Human Alveolar Macrophages.
Nayak, Deepak K; Mendez, Oscar; Bowen, Sara; Mohanakumar, Thalachallour
2018-04-20
Alveolar macrophages are terminally differentiated, lung-resident macrophages of prenatal origin. Alveolar macrophages are unique in their long life and their important role in lung development and function, as well as their lung-localized responses to infection and inflammation. To date, no unified method for identification, isolation, and handling of alveolar macrophages from humans and mice exists. Such a method is needed for studies on these important innate immune cells in various experimental settings. The method described here, which can be easily adopted by any laboratory, is a simplified approach to harvesting alveolar macrophages from bronchoalveolar lavage fluid or from lung tissue and maintaining them in vitro. Because alveolar macrophages primarily occur as adherent cells in the alveoli, the focus of this method is on dislodging them prior to harvest and identification. The lung is a highly vascularized organ, and various cell types of myeloid and lymphoid origin inhabit, interact, and are influenced by the lung microenvironment. By using the set of surface markers described here, researchers can easily and unambiguously distinguish alveolar macrophages from other leukocytes, and purify them for downstream applications. The culture method developed herein supports both human and mouse alveolar macrophages for in vitro growth, and is compatible with cellular and molecular studies.
CAE "FOCUS" for modelling and simulating electron optics systems: development and application
NASA Astrophysics Data System (ADS)
Trubitsyn, Andrey; Grachev, Evgeny; Gurov, Victor; Bochkov, Ilya; Bochkov, Victor
2017-02-01
Electron optics is a theoretical basis of scientific instrument engineering. Mathematical simulation of the underlying processes is the basis of the contemporary design of complicated electron optics devices. Such numerical simulation problems are effectively solved by means of CAE systems. CAE "FOCUS", developed by the authors, includes fast and accurate methods: the boundary element method (BEM) for electric field calculation, the Runge-Kutta-Fehlberg method for computing charged particle trajectories with control of the calculation accuracy, and original methods for finding the conditions of angular and time-of-flight focusing. CAE "FOCUS" is organized as a collection of modules, each of which solves an independent (sub)task. A range of physical and analytical devices, in particular a high-power microfocus X-ray tube, has been developed using this software.
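As a rough illustration of the trajectory-integration step, the sketch below integrates an electron's equation of motion with scipy's adaptive RK45 integrator (a Dormand-Prince embedded Runge-Kutta pair, a close relative of Runge-Kutta-Fehlberg) under tight tolerances. The field function is a uniform-field stand-in for a BEM field solution; it is not CAE "FOCUS" code, and all values are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp

Q_M = -1.758820e11  # electron charge-to-mass ratio, C/kg

def efield(x, y):
    """Hypothetical stand-in for the BEM field solution: uniform field along x, V/m."""
    return np.array([1.0e4, 0.0])

def eom(t, s):
    x, y, vx, vy = s
    ex, ey = efield(x, y)
    return [vx, vy, Q_M * ex, Q_M * ey]

# Adaptive embedded Runge-Kutta integration with tolerances controlling accuracy.
sol = solve_ivp(eom, (0.0, 5e-9), [0.0, 0.0, 0.0, 1.0e5],
                method="RK45", rtol=1e-9, atol=1e-12)
x_end, y_end = sol.y[0, -1], sol.y[1, -1]
print(f"electron position after 5 ns: x = {x_end*1e3:.3f} mm, y = {y_end*1e3:.3f} mm")
```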
Cunha, Joana T; Ribeiro, Tânia I B; Rocha, João B; Nunes, João; Teixeira, José A; Domingues, Lucília
2016-11-15
Serra da Estrela Protected Designation of Origin (PDO) cheese is the most famous Portuguese cheese and has a high commercial value. However, adulteration of production with cheaper/lower-quality milks from non-autochthonous ovine breeds compromises the quality of the final product and undervalues the original PDO cheese. A Randomly Amplified Polymorphic DNA (RAPD) method was developed for efficient detection of adulterant breeds in the milk mixtures used for fraudulent production of this cheese. Furthermore, Sequence Characterized Amplified Region (SCAR) markers were designed with a view to detecting milk adulteration in processed dairy foods. The RAPD-SCAR technique is described here, for the first time, as potentially useful for determining milk origin in dairy products. In this sense, our findings will play an important role in the valorization of Serra da Estrela cheese, as well as of other high-quality dairy products prone to adulteration, contributing to the further development of the dairy industry. Copyright © 2016 Elsevier Ltd. All rights reserved.
Three-dimensional surface reconstruction for industrial computed tomography
NASA Technical Reports Server (NTRS)
Vannier, M. W.; Knapp, R. H.; Gayou, D. E.; Sammon, N. P.; Butterfield, R. L.; Larson, J. W.
1985-01-01
Modern high resolution medical computed tomography (CT) scanners can produce geometrically accurate sectional images of many types of industrial objects. Computer software has been developed to convert serial CT scans into a three-dimensional surface form, suitable for display on the scanner itself. This software, originally developed for imaging the skull, has been adapted for application to industrial CT scanning, where serial CT scans through an object of interest may be reconstructed to demonstrate spatial relationships in three dimensions that cannot be easily understood using the original slices. The methods of three-dimensional reconstruction and solid modeling are reviewed, and reconstruction in three dimensions from CT scans through familiar objects is demonstrated.
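The original software is not reproduced here; as a modern analogue of extracting a three-dimensional surface from a stack of serial CT slices, the sketch below uses scikit-image's marching cubes (a later algorithm, not the one in the paper) on a synthetic volume.

```python
import numpy as np
from skimage import measure  # marching cubes: a modern analogue, not the original code

# Hypothetical stack of serial CT slices: a dense spherical inclusion in a part.
z, y, x = np.mgrid[-32:32, -32:32, -32:32]
volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(float)

# Extract a triangulated iso-surface at the chosen CT-number threshold; spacing
# gives the slice thickness and in-plane pixel size (assumed values).
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5,
                                                       spacing=(1.0, 0.5, 0.5))
print(f"surface mesh: {len(verts)} vertices, {len(faces)} triangles")
# verts/faces can be written to STL/OBJ or rendered to inspect 3-D relationships
# that are hard to appreciate from the original slices.
```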
Analytical methods for gelatin differentiation from bovine and porcine origins and food products.
Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B
2012-01-01
Usage of gelatin in food products has been widely debated for several years, with respect to the source of the gelatin used, religion, and health. As a result, various analytical methods have been introduced and developed to differentiate gelatin made from porcine sources from that made from bovine sources. The analytical methods comprise a diverse range of equipment and techniques including spectroscopy, chemical precipitation, chromatography, and immunochemical methods. Each technique can differentiate gelatins to a certain extent, with advantages and limitations. This review focuses on an overview of the analytical methods available for differentiation of bovine and porcine gelatin and gelatin in food products, so that new method development can be established. © 2011 Institute of Food Technologists®
What is the Final Verification of Engineering Requirements?
NASA Technical Reports Server (NTRS)
Poole, Eric
2010-01-01
This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including changes to the original requirements. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of the development, as opposed to having some other organization continue the process once the design is complete, are discussed.
Diway, Bibian; Khoo, Eyen
2017-01-01
The development of timber tracking methods based on genetic markers can provide scientific evidence to verify the origin of timber products and fulfill the growing requirement for sustainable forestry practices. In this study, the origin of an important Dark Red Meranti wood, Shorea platyclados, was studied using a combination of seven chloroplast DNA markers and 15 short tandem repeat (STR) markers. A total of 27 natural populations of S. platyclados were sampled throughout Malaysia to establish population-level and individual-level identification databases. A haplotype map was generated from chloroplast DNA sequencing for population identification, resulting in 29 multilocus haplotypes based on 39 informative intraspecific variable sites. Subsequently, a DNA profiling database was developed from the 15 STRs, allowing for individual identification in Malaysia. Cluster analysis divided the 27 populations into two genetic clusters, corresponding to the regions of Eastern and Western Malaysia. The conservativeness tests showed that the Malaysia database is conservative after removal of bias from population subdivision and sampling effects. Independent self-assignment tests correctly assigned individuals to the database in 60.60−94.95% of cases overall for identified populations, and in 98.99−99.23% of cases for identified regions. Both the chloroplast DNA database and the STRs appear to be useful for tracking timber originating in Malaysia. Hence, this DNA-based method could serve as an effective additional tool in the existing forensic timber identification system for ensuring the sustainable management of this species into the future. PMID:28430826
Yan, Yongqiu; Lu, Yu; Jiang, Shiping; Jiang, Yu; Tong, Yingpeng; Zuo, Limin; Yang, Jun; Gong, Feng; Zhang, Ling; Wang, Ping
2018-01-01
Background: Noni juice has been extensively used as folk medicine for the treatment of arthritis, infections, pain, colds, cancers, and diabetes by Polynesians for many years. Due to the lack of standard scientific evaluation methods, various kinds of commercial Noni juice of different quality and price are available on the market. Objective: To establish a sensitive, reliable, and accurate high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry (HPLC-ESI-MS/MS) method for the separation, identification, and simultaneous quantitative analysis of bioactive constituents in Noni juice. Materials and Methods: The analytes and eight batches of commercially available samples from different origins were separated and analyzed by the HPLC-ESI-MS/MS method on an Agilent ZORBAX SB-C18 (150 mm × 4.6 mm i.d., 5 μm) column using a gradient elution of acetonitrile-methanol-0.05% glacial acetic acid in water (v/v) at a constant flow rate of 0.5 mL/min. Results: Seven components were identified and all of the assay parameters were within the required limits. All components showed correlation coefficients of R2 ≥ 0.9993 over the concentration ranges tested. The precision of the assay method was <0.91% and the repeatability was between 1.36% and 3.31%. The accuracy varied from 96.40% to 103.02% and the relative standard deviations of stability were <3.91%. Samples from the same origin showed similar content, while samples from different origins showed significantly different results. Conclusions: The developed method provides a reliable basis and will be useful in the establishment of a rational quality control standard for Noni juice. SUMMARY: A method for the separation, identification, and simultaneous quantitative analysis of seven bioactive constituents in Noni juice was originally developed by high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry. The presented method was successfully applied to the quality control of eight batches of commercially available samples of Noni juice. The method is simple, sensitive, reliable, accurate, and efficient, with strong specificity, good precision, and a high recovery rate, and provides a reliable basis for quality control of Noni juice. Abbreviations used: HPLC-ESI-MS/MS: High-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry, LOD: Limit of detection, LOQ: Limit of quantitation, S/N: Signal-to-noise ratio, RSD: Relative standard deviations, DP: Declustering potential, CE: Collision energy, MRM: Multiple reaction monitoring, RT: Retention time. PMID:29576704
Don C. Bragg
2017-01-01
Although the Crossett Experimental Forest (CEF) played a well-publicized role in the development of uneven-aged southern pine silviculture, work on a selection method in Arkansas (USA) did not originate there. In 1925, Leslie Pomeroy and Eugene Connor acquired the Ozark Badger Lumber Company and initiated an expert-driven selection management system compatible with...
Patrick H. Brose
2011-01-01
Timely development of newly germinated oak (Quercus spp.) seedlings into competitive-sized regeneration is an essential part of the oak regeneration process. The amount of sunlight reaching the forest floor partly governs this development, and foresters often use the shelterwood system to expose oak seedlings to varying degrees of insolation. To...
Early Musical Training in Bel Canto Vocal Technique: A Brief History and Philosophy.
ERIC Educational Resources Information Center
Gerstein, Christine Wondolowski
This paper offers a brief history and philosophy of the origins of bel canto vocal style and describes the pedagogical methods used to achieve bel canto ideals in singing. The document discusses the adoption and development of this technique and how it developed over long periods of preparation in the foregoing centuries before the Baroque era.…
MWM-Array Characterization of Mechanical Damage and Corrosion
DOT National Transportation Integrated Search
2011-02-09
The MWM-Array is an inductive sensor that operates like a transformer in a plane. The MWM-Array is based on the original MWM(R) (Meandering Winding Magnetometer) developed at MIT in the 1980s. A rapid multivariate inverse method converts impedance dat...
QUASI-PML FOR WAVES IN CYLINDRICAL COORDINATES. (R825225)
We prove that the straightforward extension of Berenger's original perfectly matched layer (PML) is not reflectionless at a cylindrical interface in the continuum limit. A quasi-PML is developed as an absorbing boundary condition (ABC) for the finite-difference time-domain method...
Kelly, S.; Wickstead, B.; Gull, K.
2011-01-01
We have developed a machine-learning approach to identify 3537 discrete orthologue protein sequence groups distributed across all available archaeal genomes. We show that treating these orthologue groups as binary detection/non-detection data is sufficient to capture the majority of archaeal phylogeny. We subsequently use the sequence data from these groups to infer a method- and substitution-model-independent phylogeny. By holding this phylogeny constrained and interrogating the intersection of this large dataset with both the Eukarya and the Bacteria using Bayesian and maximum-likelihood approaches, we propose and provide evidence for a methanogenic origin of the Archaea. By the same criteria, we also provide evidence in support of an origin for Eukarya either within or as sisters to the Thaumarchaea. PMID:20880885
Single Wall Carbon Nanotube Alignment Mechanisms for Non-Destructive Evaluation
NASA Technical Reports Server (NTRS)
Hong, Seunghun
2002-01-01
As proposed in our original proposal, we developed a new, innovative method to assemble millions of single wall carbon nanotube (SWCNT)-based circuit components as fast as conventional microfabrication processes. This method is based on a surface template assembly strategy. The new method solves one of the major bottlenecks in carbon nanotube based electrical applications and, potentially, may allow us to mass produce a large number of SWCNT-based integrated devices of critical interest to NASA.
[Entropic methods applied to the inverse problem in magnetoencephalography]
NASA Astrophysics Data System (ADS)
Lapalme, Ervig
2005-07-01
This thesis is devoted to biomagnetic source localization using magnetoencephalography. This problem is known to have an infinite number of solutions, so methods are required to take into account anatomical and functional information on the solution. The work presented in this thesis uses the maximum entropy on the mean method to constrain the solution. This method originates from statistical mechanics and information theory. The thesis is divided into two main parts, each containing three chapters. The first part reviews the magnetoencephalographic inverse problem: the theory needed to understand its context and the hypotheses used to simplify the problem. In the last chapter of this first part, the maximum entropy on the mean method is presented: its origins are explained, as well as how it is applied to our problem. The second part is the original work of this thesis, presenting three articles: one already published and two others submitted for publication. In the first article, a biomagnetic source model is developed and applied in a theoretical context, while still demonstrating the efficiency of the method. In the second article, we go one step further towards realistic modelling of cerebral activation. The main priors are estimated using the magnetoencephalographic data. This method proved to be very efficient in realistic simulations. In the third article, the previous method is extended to deal with time signals, thus exploiting the excellent time resolution offered by magnetoencephalography. Compared with our previous work, the temporal method is applied to real magnetoencephalographic data coming from a somatotopy experiment, and the results agree with previous physiological knowledge about this kind of cognitive process.
Crack resistance determination of material by wedge splitting a chevron-notched specimen
NASA Astrophysics Data System (ADS)
Deryugin, Ye. Ye.
2017-12-01
An original method is proposed for the crack resistance determination of a material by wedge splitting of a chevron-notched specimen. It was developed at the Institute of Strength Physics and Materials Science SB RAS in the laboratory of Physical Mesomechanics and Nondestructive Methods of Control. An example of the crack resistance test of technical titanium VT1-0 is considered.
Portrait of a rural health graduate: exploring alternative learning spaces.
Ross, Andrew; Pillay, Daisy
2015-05-01
Given that the staffing of rural facilities represents an international challenge, the support, training and development of students of rural origin at institutions of higher learning (IHLs) should be an integral dimension of health care provisioning. International studies have shown these students to be more likely than students of urban origin to return to work in rural areas. However, the crisis in formal school education in some countries, such as South Africa, means that rural students with the capacity to pursue careers in health care are least likely to access the necessary training at an IHL. In addition to challenges of access, throughput is relatively low at IHLs and is determined by a range of learning experiences. Insight into the storied educational experiences of health care professionals (HCPs) of rural origin has the potential to inform the training and development of rural-origin students. Six HCPs of rural origin were purposively selected. Using a narrative inquiry approach, data were generated from long interviews and a range of arts-based methods to create and reconstruct the storied narratives of the six participants. Codes, categories and themes were developed from the reconstructed stories. Reid's four-quadrant model of learning theory was used to focus on the learning experiences of one participant. Alternative learning spaces were identified, which were made available through particular social spaces outwith formal lecture rooms. These offered opportunities for collaboration and for the reconfiguring of the participants' agency to be, think and act differently. Through the practices enacted in particular learning spaces, relationships of caring, sharing, motivating and mentoring were formed, which contributed to personal, social, academic and professional development and success. Learning spaces outwith the formal lecture theatre are critical to the acquisition of good clinical skills and knowledge in the development of socially accountable HCPs of rural origin. © 2015 John Wiley & Sons Ltd.
2013-01-01
Background: Developed countries use generic competition to contain pharmaceutical expenditure. China, as a developing and transitional country, has not yet deemed an increase in the use of generic products important; instead, much effort has been made to decrease drug prices. This paper aims to explore dynamically the price and use comparison of generic and originator drugs in China, and to estimate the potential savings for patients from switching originator drugs to generics. Methods: A typical hospital in Chongqing, China, was selected to examine the price and use comparisons of 12 cardiovascular drugs from 2006 to 2011. Results: The market share of the 12 generic medicines studied in this paper was 34.37% by volume and 31.33% by value in the second half of 2011. The price ratio of generic to originator drugs was between 0.34 and 0.98, and the volume price index of originators to generics was 1.63. The potential savings for patients from switching originator drugs to generics is 65%. Conclusion: The market share of the generics was declining and the weighted mean price kept increasing in the face of strict price control. Given that hospitals both prescribe and dispense medicines, China's comprehensive healthcare policy makers should take measures on both the supply and demand sides to promote the consumption of generic medicines. PMID:24093493
Ageing airplane repair assessment program for Airbus A300
NASA Technical Reports Server (NTRS)
Gaillardon, J. M.; Schmidt, HANS-J.; Brandecker, B.
1992-01-01
This paper describes the current status of the repair categorization activities and includes details of the methodologies developed for determining the inspection program for the skin of pressurized fuselages. For inspection threshold determination, two methods based on a fatigue life approach are defined: a simplified and a detailed method. The detailed method considers 15 different parameters to assess the influences of material, geometry, size, location, aircraft usage, and workmanship on the fatigue life of the repair and the original structure. For the definition of the inspection intervals, a general method is developed which applies to all repairs concerned. For this, the initial flaw concept is used, considering 6 parameters and the detectable flaw sizes depending on the proposed nondestructive inspection methods. An alternative method is provided for small repairs, allowing visual inspection with shorter intervals.
Method of identification of patent trends based on descriptions of technical functions
NASA Astrophysics Data System (ADS)
Korobkin, D. M.; Fomenkov, S. A.; Golovanchikov, A. B.
2018-05-01
The use of the global patent space to determine the scientific and technological priorities for technical systems development (identifying patent trends) makes it possible to forecast the direction of technical systems development and, accordingly, to select patents on priority technical subjects as a source for updating the technical functions database and the physical effects database. The authors propose an original method that uses as trend terms not individual unigrams or n-grams (as is usual for existing methods and systems), but structured descriptions of technical functions in the form "Subject-Action-Object" (SAO), which in the authors' opinion are the basis of an invention.
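To make the SAO idea concrete, the following minimal Python sketch extracts simple subject-action-object triples with a dependency parser. It is an illustration only, not the authors' implementation: the use of spaCy, the model name "en_core_web_sm", and the example sentence are assumptions, and only plain active-voice clauses are handled.

```python
# Minimal sketch: extract "Subject-Action-Object" (SAO) triples from patent-like text
# with a dependency parse. Illustrative only; the authors' method is more elaborate.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def extract_sao(text):
    """Return (subject, action, object) triples for simple active-voice clauses."""
    triples = []
    doc = nlp(text)
    for token in doc:
        if token.pos_ == "VERB":
            subjects = [c for c in token.children if c.dep_ == "nsubj"]
            objects = [c for c in token.children if c.dep_ in ("dobj", "obj")]
            for s in subjects:
                for o in objects:
                    triples.append((s.lemma_, token.lemma_, o.lemma_))
    return triples

print(extract_sao("The piston compresses the gas inside the cylinder."))
# expected: [('piston', 'compress', 'gas')]
```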
Participatory design of healthcare technology with children.
Sims, Tara
2018-02-12
Purpose There are many frameworks and methods for involving children in design research. Human-Computer Interaction provides rich methods for involving children when designing technologies. The paper aims to discuss these issues. Design/methodology/approach This paper examines various approaches to involving children in design, considering whether they view children as study objects or as active participants. Findings The BRIDGE method is a sociocultural approach to product design that views children as active participants, enabling them to contribute to the design process as competent and resourceful partners. An example is provided in which BRIDGE was successfully applied to developing upper limb prostheses with children. Originality/value Approaching design in this way can provide children with opportunities to develop social, academic and design skills and to develop autonomy.
[Study of "Bishu Yakuen Ransho-Roku (the origin of herb garden in Owari Clan)].
Goto, T; Yamaguchi, S; Tanaka, T
1995-01-01
"Bishu Yakuen Ransho-Roku (The origin of herb garden in Owari Clan)" is in the possession of the Institution of Tokugawa Rinseishi in Tokyo. This paper was written about the origin of the herb garden established by Mr. Shinken Mimura, an herbalist in the Owari clan between 1735 and 1746. Mr. Shinken Mimura cultivated ginseng by according to the guide issued by the shogunate, but he found the methods unsuitable. Therefore, he made efforts to improve the cultivation of ginseng. As a result, he succeeded in the cultivation of good ginseng. He had contributed to the development of the production of ginseng in the Owari clan. He write this document so that his methods could be handed down for posterity. This document has two parts: one is the growth of ginseng in the form of a diary and the other is the conditions of cultivation as to seeding, fertilization, the counter-measures for damage due to blight and insects, and so on.
How to Compress Sequential Memory Patterns into Periodic Oscillations: General Reduction Rules
Zhang, Kechen
2017-01-01
A neural network with symmetric reciprocal connections always admits a Lyapunov function, whose minima correspond to the memory states stored in the network. Networks with suitable asymmetric connections can store and retrieve a sequence of memory patterns, but the dynamics of these networks cannot be characterized as readily as that of the symmetric networks due to the lack of established general methods. Here, a reduction method is developed for a class of asymmetric attractor networks that store sequences of activity patterns as associative memories, as in a Hopfield network. The method projects the original activity pattern of the network to a low-dimensional space such that sequential memory retrievals in the original network correspond to periodic oscillations in the reduced system. The reduced system is self-contained and provides quantitative information about the stability and speed of sequential memory retrievals in the original network. The time evolution of the overlaps between the network state and the stored memory patterns can also be determined from extended reduced systems. The reduction procedure can be summarized by a few reduction rules, which are applied to several network models, including coupled networks and networks with time-delayed connections, and the analytical solutions of the reduced systems are confirmed by numerical simulations of the original networks. Finally, a local learning rule that provides an approximation to the connection weights involving the pseudoinverse is also presented. PMID:24877729
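To make the setting concrete, here is a minimal NumPy sketch of an asymmetric network that stores a cyclic sequence of random patterns and whose state is tracked through its overlaps with the stored memories. It illustrates the kind of system the reduction is applied to, not the paper's reduction rules themselves; the network size, the delayed asymmetric drive, and the coupling strength are illustrative assumptions.

```python
# Minimal sketch of sequential memory retrieval in an asymmetric attractor network.
# Patterns are stored Hopfield-style; an asymmetric term pushes the state from
# pattern k toward pattern k+1, and the overlaps m_k track the retrieval sequence.
import numpy as np

rng = np.random.default_rng(0)
N, P = 500, 4                       # neurons, stored patterns (illustrative sizes)
xi = rng.choice([-1, 1], size=(P, N))

W_sym = xi.T @ xi / N                               # symmetric (auto-associative) part
W_seq = xi[np.roll(np.arange(P), -1)].T @ xi / N    # asymmetric (sequence-driving) part
lam = 1.2                                           # strength of the sequence term

s = xi[0].copy()                    # start at the first memory
slow = s.astype(float)              # slowly filtered state carrying the delayed drive
for t in range(60):
    slow = 0.8 * slow + 0.2 * s     # crude slow filter (delayed asymmetric drive)
    h = W_sym @ s + lam * W_seq @ slow
    s = np.sign(h).astype(int)
    if t % 10 == 0:
        overlaps = xi @ s / N       # m_k = <xi_k, s>/N, the low-dimensional coordinates
        print(t, np.round(overlaps, 2))
```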
Isotopic tracing of perchlorate in the environment
Sturchio, Neil C.; Böhlke, John Karl; Gu, Baohua; Hatzinger, Paul B.; Jackson, W. Andrew; Baskaran, Mark
2012-01-01
Isotopic measurements can be used for tracing the sources and behavior of environmental contaminants. Perchlorate (ClO4−) has been detected widely in groundwater, soils, fertilizers, plants, milk, and human urine since 1997, when improved analytical methods for analyzing ClO4− concentration became available for routine use. Perchlorate ingestion poses a risk to human health because of its interference with thyroidal hormone production. Consequently, methods for isotopic analysis of ClO4− have been developed and applied to assist evaluation of the origin and migration of this common contaminant. Isotopic data are now available for stable isotopes of oxygen and chlorine, as well as 36Cl isotopic abundances, in ClO4− samples from a variety of natural and synthetic sources. These isotopic data provide a basis for distinguishing sources of ClO4− found in the environment, and for understanding the origin of natural ClO4−. In addition, the isotope effects of microbial ClO4− reduction have been measured in laboratory and field experiments, providing a tool for assessing ClO4− attenuation in the environment. Isotopic data have been used successfully in some areas for identifying major sources of ClO4− contamination in drinking water supplies. Questions about the origin and global biogeochemical cycle of natural ClO4− remain to be addressed; such work would benefit from the development of methods for preparation and isotopic analysis of ClO4− in samples with low concentrations and complex matrices.
NASA Astrophysics Data System (ADS)
Monserrat, Carlos; Alcaniz-Raya, Mariano L.; Juan, M. Carmen; Grau Colomer, Vincente; Albalat, Salvador E.
1997-05-01
This paper describes a new method for 3D orthodontic treatment simulation developed for an orthodontics planning system (MAGALLANES). We developed an original system for 3D capture and reconstruction of dental anatomy that avoids the use of dental casts in orthodontic treatments. Two original techniques are presented: a direct one, in which data are acquired directly from the patient's mouth by means of low-cost 3D digitizers, and a mixed one, in which data are obtained by 3D digitizing of hydrocolloid molds. For this purpose we designed and manufactured an optimized optical measuring system based on laser structured light. We apply these 3D dental models to simulate the 3D movement of teeth, including rotations, during orthodontic treatment. The proposed algorithms make it possible to quantify the effect of the orthodontic appliance on tooth movement. The developed techniques have been integrated in a system named MAGALLANES. This original system presents several tools for 3D simulation and planning of orthodontic treatments. The prototype system has been tested in several orthodontic clinics with very good results.
Real-Time Generation of the Footprints both on Floor and Ground
NASA Astrophysics Data System (ADS)
Hirano, Yousuke; Tanaka, Toshimitsu; Sagawa, Yuji
This paper presents a real-time method for generating varied footprints according to the state of walking. In addition, the method is extended to cover both hard floors and soft ground. The results of the previous method were not very realistic, because that method places the same simple footprint along the motion path. Our method runs filters over the original footprint pattern on the GPU and then gradates the intensity of the pattern in two directions in order to create partially dark footprints. The parameters of the filter and the gradation are varied with movement speed and direction. The pattern is mapped onto a polygon; if the walker is pigeon-toed or bandy-legged, the polygon is rotated inward or outward, respectively. Finally, it is placed on the floor. Footprints on soft ground are concavities and convexities caused by walking, so an original footprint pattern for ground is defined as a height map. The height map is modified using the filter and gradation operations developed for floor footprints and is then converted to a bump map for fast display of the concavity and convexity of the footprints.
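The ground-footprint pipeline can be sketched in a few lines: treat the footprint pattern as a height map, soften it with a filter, apply a directional intensity gradation, and convert the result to per-pixel normals for bump mapping. The sketch below is a rough CPU-side illustration under stated assumptions (Gaussian filter, a crude rectangular sole pattern, arbitrary speed-to-blur mapping), not the paper's GPU implementation.

```python
# Minimal sketch of the ground-footprint idea: footprint pattern -> filtered height
# map -> directional gradation -> normal ("bump") map. Illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter

def footprint_heightmap_to_normals(pattern, speed=1.0, direction=(1.0, 0.0)):
    h = gaussian_filter(pattern, sigma=1.0 + 0.5 * speed)   # blur grows with speed (assumed)
    ys, xs = np.indices(h.shape)
    dx, dy = direction
    ramp = xs * dx + ys * dy
    ramp = (ramp - ramp.min()) / (ramp.ptp() + 1e-9)
    h = h * (0.5 + 0.5 * ramp)                      # partially faded footprint
    gy, gx = np.gradient(h)                         # height-map slopes
    n = np.dstack([-gx, -gy, np.ones_like(h)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)   # per-pixel unit normals for bump mapping
    return h, n

pattern = np.zeros((64, 32)); pattern[8:56, 6:26] = 1.0   # crude sole shape (hypothetical)
height, normals = footprint_heightmap_to_normals(pattern, speed=2.0)
```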
Quantitative comparison of alternative methods for coarse-graining biological networks
Bowman, Gregory R.; Meng, Luming; Huang, Xuhui
2013-01-01
Markov models and master equations are a powerful means of modeling dynamic processes like protein conformational changes. However, these models are often difficult to understand because of the enormous number of components and connections between them. Therefore, a variety of methods have been developed to facilitate understanding by coarse-graining these complex models. Here, we employ Bayesian model comparison to determine which of these coarse-graining methods provides the models that are most faithful to the original set of states. We find that the Bayesian agglomerative clustering engine and the hierarchical Nyström expansion graph (HNEG) typically provide the best performance. Surprisingly, the original Perron cluster cluster analysis (PCCA) method often provides the next best results, outperforming the newer PCCA+ method and the most probable paths algorithm. We also show that the differences between the models are qualitatively significant, rather than being minor shifts in the boundaries between states. The performance of the methods correlates well with the entropy of the resulting coarse-grainings, suggesting that finding states with more similar populations (i.e., avoiding low population states that may just be noise) gives better results. PMID:24089717
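The common bookkeeping shared by these coarse-graining approaches can be illustrated with a small sketch: given a microstate transition matrix and an assignment of microstates to macrostates, lump the matrix using stationary-population weights and compute the entropy of the resulting macrostate populations, the quantity the abstract reports as correlating with performance. This is a generic illustration, not BACE, HNEG, or PCCA; the example matrix and assignment are hypothetical.

```python
# Minimal sketch: lump a microstate transition matrix into macrostates and compute
# the entropy of the coarse-grained populations. Not any specific method from the
# paper, just the bookkeeping common to coarse-graining a Markov model.
import numpy as np

def stationary(T):
    vals, vecs = np.linalg.eig(T.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])   # eigenvector for eigenvalue ~1
    return pi / pi.sum()

def lump(T, assignment, n_macro):
    pi = stationary(T)
    Tc = np.zeros((n_macro, n_macro))
    Pc = np.zeros(n_macro)
    for A in range(n_macro):
        members_A = np.where(assignment == A)[0]
        Pc[A] = pi[members_A].sum()
        for B in range(n_macro):
            members_B = np.where(assignment == B)[0]
            flux = sum(pi[i] * T[i, j] for i in members_A for j in members_B)
            Tc[A, B] = flux / Pc[A]
    entropy = -np.sum(Pc * np.log(Pc))                 # entropy of macrostate populations
    return Tc, Pc, entropy

T = np.array([[0.90, 0.08, 0.01, 0.01],
              [0.10, 0.88, 0.01, 0.01],
              [0.02, 0.02, 0.90, 0.06],
              [0.02, 0.02, 0.08, 0.88]])
assignment = np.array([0, 0, 1, 1])        # hypothetical microstate -> macrostate map
Tc, Pc, S = lump(T, assignment, 2)
```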
Generalization of Equivalent Crystal Theory to Include Angular Dependence
NASA Technical Reports Server (NTRS)
Ferrante, John; Zypman, Fredy R.
2004-01-01
In the original Equivalent Crystal Theory, each atomic site in the real crystal is assigned an equivalent lattice constant, in general different from the ground state one. This parameter corresponds to a local compression or expansion of the lattice. The basic method considers these volumetric transformations and, in addition, introduces the possibility that the reference lattice is anisotropically distorted. These distortions, however, were introduced ad hoc. In this work, we generalize the original Equivalent Crystal Theory by systematically introducing site-dependent directional distortions of the lattice, which account for the dependence of the energy on anisotropic local density variations. This is done in the spirit of the original framework, but including a gradient term in the density. This approach is introduced to correct a deficiency of the original Equivalent Crystal Theory and other semiempirical methods in quantitatively obtaining the correct ratios of the surface energies of the low index planes (100), (110), and (111) of cubic metals. We develop here the basic framework and apply it to the calculation of Fe (110) and Fe (111) surface formation energies. The results, compared with first principles calculations, show an improvement over previous semiempirical approaches.
Lee, Raymond M
2011-01-01
In the 1940s, interviewing practice in sociology became decisively influenced by techniques that had originally been developed by researchers in other disciplines working within a number of therapeutic or quasi-therapeutic contexts, in particular the "nondirective interviewing" methods developed by Carl Rogers and the interviewing procedures developed during the Hawthorne studies. This article discusses the development of nondirective interviewing and looks at how in the 1930s and '40s the approach came to be used in sociology. It examines the factors leading to both the popularity of the method and its subsequent fall from favor. © 2011 Wiley Periodicals, Inc.
Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo
2013-09-01
The automated contrast-detail (C-D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction methods. Therefore, we have devised a new automated C-D analysis method by applying a support vector machine (SVM) and tested it for robustness to nonlinear image processing. We acquired CDRAD (a commercially available C-D test object) images at a tube voltage of 120 kV and milliampere-second products (mAs) of 0.5-5.0. A diffusion-equation-based technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced counterparts. We also compared the performance of our SVM method with that of the CDRAD Analyser algorithm. The mean C-D diagrams (plots of the mean smallest visible hole diameter vs. hole depth) obtained from our SVM method agreed well with those averaged across the six human observers for both the original and the noise-reduced CDRAD images, whereas the mean C-D diagrams from the CDRAD Analyser algorithm disagreed with those from the human observers for both image types. In conclusion, our proposed SVM method for C-D analysis will work well for images processed with nonlinear noise reduction methods as well as for the original radiographic images.
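The general shape of such an SVM-based C-D analysis can be sketched as follows: each CDRAD cell is reduced to a feature vector and a trained SVM predicts whether the inserted hole is visible. The feature definitions, simulated cells, and labels below are hypothetical stand-ins; the study's actual feature extraction and training data differ.

```python
# Minimal sketch of SVM-based contrast-detail scoring: each test-object cell is
# reduced to a feature vector (hypothetical features) and an SVM predicts whether
# the detail in that cell is "visible". Illustrative only.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

def cell_features(cell):
    """Hypothetical per-cell features: center-vs-border contrast and noise level."""
    center = cell[4:-4, 4:-4].mean()
    border = np.concatenate([cell[:2].ravel(), cell[-2:].ravel()]).mean()
    return [center - border, cell.std()]

# Simulated training cells: half contain a faint disk ("visible detail"), half do not.
X, y = [], []
for label in (0, 1):
    for _ in range(200):
        cell = rng.normal(0.0, 1.0, (16, 16))
        if label:
            cell[6:10, 6:10] += 0.8           # faint inserted detail
        X.append(cell_features(cell)); y.append(label)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(np.array(X), np.array(y))
# model.predict(...) can then be applied to cells from original or noise-reduced images.
```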
Comparison of Transmission Line Methods for Surface Acoustic Wave Modeling
NASA Technical Reports Server (NTRS)
Wilson, William; Atkinson, Gary
2009-01-01
Surface Acoustic Wave (SAW) technology is low cost, rugged, lightweight, extremely low power and can be used to develop passive wireless sensors. For these reasons, NASA is investigating the use of SAW technology for Integrated Vehicle Health Monitoring (IVHM) of aerospace structures. To facilitate rapid prototyping of passive SAW sensors for aerospace applications, SAW models have been developed. This paper reports on the comparison of three methods of modeling SAWs. The three models are the Impulse Response Method (a first order model), and two second order matrix methods; the conventional matrix approach, and a modified matrix approach that is extended to include internal finger reflections. The second order models are based upon matrices that were originally developed for analyzing microwave circuits using transmission line theory. Results from the models are presented with measured data from devices. Keywords: Surface Acoustic Wave, SAW, transmission line models, Impulse Response Method.
Yan, Yongqiu; Lu, Yu; Jiang, Shiping; Jiang, Yu; Tong, Yingpeng; Zuo, Limin; Yang, Jun; Gong, Feng; Zhang, Ling; Wang, Ping
2018-01-01
Noni juice has been extensively used as a folk medicine for the treatment of arthritis, infections, colds, cancers, and diabetes, and as an analgesic, by Polynesians for many years. Due to the lack of standard scientific evaluation methods, various kinds of commercial Noni juice of different quality and price are available on the market. The aim was to establish a sensitive, reliable, and accurate high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry (HPLC-ESI-MS/MS) method for the separation, identification, and simultaneous quantitative analysis of bioactive constituents in Noni juice. The analytes and eight batches of commercially available samples from different origins were separated and analyzed by the HPLC-ESI-MS/MS method on an Agilent ZORBAX SB-C18 (150 mm × 4.6 mm i.d., 5 μm) column using gradient elution with acetonitrile-methanol-0.05% glacial acetic acid in water (v/v) at a constant flow rate of 0.5 mL/min. Seven components were identified and all of the assay parameters were within the required limits. Correlation coefficients for all components were R2 ≥ 0.9993 over the concentration ranges tested. The precision of the assay method was <0.91% and the repeatability was between 1.36% and 3.31%. The accuracy varied from 96.40% to 103.02% and the relative standard deviations for stability were <3.91%. Samples of the same origin showed similar contents, while samples of different origins showed significantly different results. The developed method provides a reliable basis and should be useful in the establishment of a rational quality control standard for Noni juice. A method for the separation, identification, and simultaneous quantitative analysis of seven bioactive constituents in Noni juice was originally developed by high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry. The presented method was successfully applied to the quality control of eight batches of commercially available samples of Noni juice. It is a simple, sensitive, reliable, accurate, and efficient method with strong specificity, good precision, and a high recovery rate, and it provides a reliable basis for the quality control of Noni juice. Abbreviations used: HPLC-ESI-MS/MS: high-performance liquid chromatography with electrospray ionization triple quadrupole mass spectrometry; LOD: limit of detection; LOQ: limit of quantitation; S/N: signal-to-noise ratio; RSD: relative standard deviation; DP: declustering potential; CE: collision energy; MRM: multiple reaction monitoring; RT: retention time.
DOT National Transportation Integrated Search
2008-11-01
The Texas Department of Transportation (TxDOT) uses the modified triaxial design procedure to check : pavement designs from the flexible pavement system program. Since its original development more than : 50 years ago, little modification has been ma...
Suggestopaedia-Canada. 1977-2.
ERIC Educational Resources Information Center
Racle, Gabriel
This issue presents the original English version and a French translation of a text written at the Sofia Institute of Suggestology around April 1971. The document defines suggestology and suggestopedia, and traces the development of the suggestopedic method, from the first collective experiments aimed at attaining suggestive hypermnesia carried…
Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D
2015-06-01
The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.
Brain vascular image enhancement based on gradient adjust with split Bregman
NASA Astrophysics Data System (ADS)
Liang, Xiao; Dong, Di; Hui, Hui; Zhang, Liwen; Fang, Mengjie; Tian, Jie
2016-04-01
Light sheet microscopy (LSM) is a high-resolution fluorescence microscopy technique that makes it possible to observe the mouse brain vascular network clearly with immunostaining. However, micro-vessels are stained by few fluorescence antibodies and their signals are much weaker than those of large vessels, which makes micro-vessels unclear in LSM images. In this work, we developed a vascular image enhancement method to enhance micro-vessel details, which should be useful for vessel statistics analysis. Since the gradient describes the edge information of a vessel, the main idea of our method is to increase the gradient values of the enhanced image to improve micro-vessel contrast. Our method contains two steps: 1) calculate the gradient image of the LSM image, amplify high gradient values of the original image to enhance vessel edges and suppress low gradient values to remove noise, and then formulate a new L1-norm regularized optimization problem to find an image with the expected gradient while keeping the main structural information of the original image; 2) use the split Bregman iteration method to solve the L1-norm regularized problem and generate the final enhanced image. The main advantage of the split Bregman method is that it combines fast convergence with low memory cost. To verify the effectiveness of our method, we applied it to a series of mouse brain vascular images acquired from a commercial LSM system in our lab. The experimental results showed that our method could greatly enhance micro-vessel edges that were unclear in the original images.
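Step 1 of this idea (building the adjusted target gradient field) can be sketched in a few lines; the sketch below is a rough illustration under assumed parameters (quantile threshold, gain values), and it omits the split Bregman solve that produces the final image in the paper.

```python
# Minimal sketch of step 1 of the enhancement idea: compute the image gradient,
# amplify strong gradients (vessel edges) and suppress weak ones (noise). The
# L1-regularized reconstruction solved with split Bregman in the paper is omitted.
import numpy as np

def adjust_gradients(img, amplify=1.8, suppress=0.3, thresh_quantile=0.75):
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    thresh = np.quantile(mag, thresh_quantile)     # split "edge" vs "noise" gradients
    gain = np.where(mag >= thresh, amplify, suppress)
    return gx * gain, gy * gain                    # target gradient field g = (gx', gy')

# The paper then seeks u minimizing ||grad(u) - g||_1 + (mu/2)||u - f||_2^2 via
# split Bregman; gx_t, gy_t below would serve as that target field g.
f = np.random.default_rng(0).random((128, 128))    # stand-in for an LSM image
gx_t, gy_t = adjust_gradients(f)
```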
Bayesian evaluation of effect size after replicating an original study
van Aert, Robbie C. M.; van Assen, Marcel A. L. M.
2017-01-01
The vast majority of published results in the literature are statistically significant, which raises concerns about their reliability. The Reproducibility Project Psychology (RPP) and Experimental Economics Replication Project (EE-RP) both replicated a large number of published studies in psychology and economics. Both the original study and its replication were statistically significant in 36.1% of cases in RPP and 68.8% in EE-RP, suggesting many null effects among the replicated studies. However, evidence in favor of the null hypothesis cannot be examined with null hypothesis significance testing. We developed a Bayesian meta-analysis method called snapshot hybrid that is easy to use and understand and quantifies the amount of evidence in favor of a zero, small, medium and large effect. The method computes posterior model probabilities for a zero, small, medium, and large effect and adjusts for publication bias by taking into account that the original study is statistically significant. We first analytically approximate the method's performance, and demonstrate the necessity of controlling for the original study's significance to enable the accumulation of evidence for a true zero effect. Then we applied the method to the data of RPP and EE-RP, showing that the underlying effect sizes of the included studies in EE-RP are generally larger than in RPP, but that the sample sizes, especially of the studies included in RPP, are often too small to draw definite conclusions about the true effect size. We also illustrate how snapshot hybrid can be used to determine the required sample size of the replication, akin to power analysis in null hypothesis significance testing, and present an easy-to-use web application (https://rvanaert.shinyapps.io/snapshot/) and R code for applying the method. PMID:28388646
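The core idea can be illustrated numerically: the replication contributes an ordinary likelihood, while the original study's likelihood is truncated at its significance threshold, and posterior model probabilities over zero/small/medium/large effects follow from equal prior weights. The sketch below is a simplified illustration of this idea, not the published implementation; the effect-size grid, standard errors, and example numbers are assumptions.

```python
# Simplified illustration of the snapshot-hybrid idea: combine an original study
# (conditioned on being statistically significant) with its replication to get
# posterior probabilities for zero/small/medium/large true effects.
# Numbers and the effect-size scale are hypothetical; the published method differs in detail.
import numpy as np
from scipy.stats import norm

effects = {"zero": 0.0, "small": 0.2, "medium": 0.5, "large": 0.8}

def snapshot_probs(y_orig, se_orig, y_rep, se_rep, alpha=0.05):
    crit = norm.ppf(1 - alpha / 2) * se_orig            # two-sided significance cutoff
    post = {}
    for name, theta in effects.items():
        # Original study: density truncated to the significant region (y_orig > crit).
        lik_orig = norm.pdf(y_orig, theta, se_orig) / norm.sf(crit, theta, se_orig)
        # Replication: ordinary (untruncated) density.
        lik_rep = norm.pdf(y_rep, theta, se_rep)
        post[name] = lik_orig * lik_rep                  # equal prior model weights
    total = sum(post.values())
    return {k: v / total for k, v in post.items()}

print(snapshot_probs(y_orig=0.45, se_orig=0.18, y_rep=0.10, se_rep=0.12))
```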
Asadpour-Zeynali, Karim; Maryam Sajjadi, S; Taherzadeh, Fatemeh; Rahmanian, Reza
2014-04-05
Bilinear least squares (BLLS) is one of the most suitable algorithms for second-order calibration. The original BLLS method is not applicable to second-order pH-spectral data when an analyte has more than one spectroscopically active species. Bilinear least squares-residual bilinearization (BLLS-RBL) was developed to achieve the second-order advantage in the analysis of complex mixtures. Although the modified method is useful, the pure profiles cannot be obtained; only a linear combination of them is recovered. Moreover, for prediction of the analyte in an unknown sample, the original RBL algorithm may diverge instead of converging to the desired analyte concentrations; therefore, a Gauss-Newton RBL algorithm should be used, which is not as simple as the original protocol. Also, the analyte concentration can be predicted on the basis of each of the equilibrating species of the component of interest, and these predictions are not exactly the same. The aim of the present work is to tackle the non-uniqueness problem in the second-order calibration of monoprotic acid mixtures and the divergence of RBL. Each pH-absorbance matrix was pretreated by subtracting the first spectrum from the other spectra in the data set to produce a full-rank array called the variation matrix. The variation matrices were then analyzed uniquely by the original BLLS-RBL, which is more parsimonious than its modified counterpart. The proposed method was applied to simulated data as well as to the analysis of real data. Sunset Yellow and Carmosine, as monoprotic acids, were determined in a candy sample in the presence of unknown interference by this method. Copyright © 2013 Elsevier B.V. All rights reserved.
Tallman, Melissa; Amenta, Nina; Delson, Eric; Frost, Stephen R.; Ghosh, Deboshmita; Klukkert, Zachary S.; Morrow, Andrea; Sawyer, Gary J.
2014-01-01
Diagenetic distortion can be a major obstacle to collecting quantitative shape data on paleontological specimens, especially for three-dimensional geometric morphometric analysis. Here we utilize the recently published algorithmic symmetrization method of fossil reconstruction and compare it to the more traditional reflection & averaging approach. In order to have an objective test of this method, five casts of a female cranium of Papio hamadryas kindae were manually deformed while the plaster hardened. These were subsequently "retrodeformed" using both algorithmic symmetrization and reflection & averaging and then compared to the original, undeformed specimen. We found that in all cases, algorithmic retrodeformation improved the shape of the deformed cranium and in four out of five cases, the algorithmically symmetrized crania were more similar in shape to the original crania than the reflected & averaged reconstructions. In three out of five cases, the difference between the algorithmically symmetrized crania and the original cranium could be contained within the magnitude of variation among individuals in a single subspecies of Papio. Instances of asymmetric distortion, such as breakage on one side, or bending in the axis of symmetry, were well handled, whereas symmetrical distortion remained uncorrected. This technique was further tested on a naturally deformed and fossilized cranium of Paradolichopithecus arvernensis. Results, based on a principal components analysis and Procrustes distances, showed that the algorithmically symmetrized Paradolichopithecus cranium was more similar to other, less-deformed crania from the same species than was the original. These results illustrate the efficacy of this method of retrodeformation by algorithmic symmetrization for the correction of asymmetrical distortion in fossils. Symmetrical distortion remains a problem for all currently developed methods of retrodeformation. PMID:24992483
A Comparison of Surface Acoustic Wave Modeling Methods
NASA Technical Reports Server (NTRS)
Wilson, W. c.; Atkinson, G. M.
2009-01-01
Surface Acoustic Wave (SAW) technology is low cost, rugged, lightweight, extremely low power and can be used to develop passive wireless sensors. For these reasons, NASA is investigating the use of SAW technology for Integrated Vehicle Health Monitoring (IVHM) of aerospace structures. To facilitate rapid prototyping of passive SAW sensors for aerospace applications, SAW models have been developed. This paper reports on the comparison of three methods of modeling SAWs. The three models are the Impulse Response Method (a first order model) and two second order matrix methods: the conventional matrix approach, and a modified matrix approach that is extended to include internal finger reflections. The second order models are based upon matrices that were originally developed for analyzing microwave circuits using transmission line theory. Results from the models are presented with measured data from devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saeki, Hiroshi, E-mail: saeki@spring8.or.jp; Magome, Tamotsu, E-mail: saeki@spring8.or.jp
2014-10-06
To compensate for pressure-measurement errors caused by a synchrotron radiation environment, a precise method using a hot-cathode-ionization-gauge head with a correcting electrode was developed and tested in a simulation experiment with excess electrons in the SPring-8 storage ring. This precise method to improve measurement accuracy can correctly reduce the pressure-measurement errors caused by electrons originating from the external environment and from the primary gauge filament as influenced by the spatial conditions of the installed vacuum-gauge head. As the result of the simulation experiment to confirm the performance in reducing the errors caused by the external environment, the pressure-measurement error using this method was less than several percent in the pressure range from 10⁻⁵ Pa to 10⁻⁸ Pa. After that experiment, to confirm the performance in reducing the error caused by spatial conditions, an additional experiment was carried out using a sleeve and showed that the improved function was available.
Fast sweeping methods for hyperbolic systems of conservation laws at steady state II
NASA Astrophysics Data System (ADS)
Engquist, Björn; Froese, Brittany D.; Tsai, Yen-Hsi Richard
2015-04-01
The idea of using fast sweeping methods for solving stationary systems of conservation laws has previously been proposed for efficiently computing solutions with sharp shocks. We further develop these methods to allow for a more challenging class of problems including problems with sonic points, shocks originating in the interior of the domain, rarefaction waves, and two-dimensional systems. We show that fast sweeping methods can produce higher-order accuracy. Computational results validate the claims of accuracy, sharp shock curves, and optimal computational efficiency.
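The basic fast sweeping idea (Gauss-Seidel updates swept alternately through the grid with upwind differencing) can be illustrated on a much simpler problem than the systems treated in the paper. The sketch below applies it to the scalar 1D Eikonal equation |u'(x)| = f(x) with a point source, purely as a generic illustration of the sweeping strategy; the paper's solver for systems of conservation laws is substantially more involved.

```python
# Minimal illustration of the fast sweeping idea on the 1D Eikonal equation
# |u'(x)| = f(x), u(x0) = 0: Gauss-Seidel updates swept alternately left-to-right
# and right-to-left until convergence.
import numpy as np

def fast_sweep_eikonal_1d(f, h, source_index, n_sweeps=20):
    n = len(f)
    u = np.full(n, np.inf)
    u[source_index] = 0.0
    for _ in range(n_sweeps):
        for order in (range(1, n), range(n - 2, -1, -1)):   # two sweep directions
            for i in order:
                neighbors = []
                if i > 0:
                    neighbors.append(u[i - 1])
                if i < n - 1:
                    neighbors.append(u[i + 1])
                candidate = min(neighbors) + f[i] * h        # upwind update
                u[i] = min(u[i], candidate)
    return u

x = np.linspace(0.0, 1.0, 101)
u = fast_sweep_eikonal_1d(f=np.ones_like(x), h=x[1] - x[0], source_index=50)
# u approximates the distance |x - 0.5| when f = 1.
```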
Killing-Yano tensors in spaces admitting a hypersurface orthogonal Killing vector
NASA Astrophysics Data System (ADS)
Garfinkle, David; Glass, E. N.
2013-03-01
Methods are presented for finding Killing-Yano tensors, conformal Killing-Yano tensors, and conformal Killing vectors in spacetimes with a hypersurface orthogonal Killing vector. These methods are similar to a method developed by the authors for finding Killing tensors. In all cases one decomposes both the tensor and the equation it satisfies into pieces along the Killing vector and pieces orthogonal to the Killing vector. Solving the separate equations that result from this decomposition requires less computing than integrating the original equation. In each case, examples are given to illustrate the method.
Potential and viscous flow in VTOL, STOL or CTOL propulsion system inlets
NASA Technical Reports Server (NTRS)
Stockman, N. O.
1975-01-01
A method was developed for analyzing the flow in subsonic axisymmetric inlets at arbitrary conditions of freestream velocity, incidence angle, and inlet mass flow. An improved version of the method is discussed and comparisons of results obtained with the original and improved methods are given. Comparisons with experiments are also presented for several inlet configurations and for various conditions of the boundary layer from insignificant to separated. Applications of the method are discussed, with several examples given for specific cases involving inlets for VTOL lift fans and for STOL engine nacelles.
Improved definition of crustal anomalies for Magsat data
NASA Technical Reports Server (NTRS)
1981-01-01
A scheme was developed for separating the portions of the magnetic field measured by the Magsat 1 satellite that arise from internal and external sources. To test this method, a set of sample coefficients was used to compute the field values along a simulated satellite orbit. These data were then used to try to recover the original coefficients. Matrix inversion and recursive least squares methods were used to solve for the input coefficients, and the accuracies of the two methods are compared.
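The recursive least squares step of such a recovery test can be sketched generically: feed simulated observations through the RLS update one sample at a time and check that the input coefficients are recovered. The sketch below uses a random stand-in design matrix, not the Magsat spherical-harmonic model, and the noise level and sizes are illustrative assumptions.

```python
# Minimal recursive least squares (RLS) sketch: recover model coefficients from
# simulated observations one sample at a time. A random design matrix stands in
# for the actual field model; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_coeff, n_obs = 8, 500
true_c = rng.normal(size=n_coeff)
A = rng.normal(size=(n_obs, n_coeff))              # stand-in design matrix
y = A @ true_c + 0.01 * rng.normal(size=n_obs)     # simulated noisy observations

c = np.zeros(n_coeff)                              # RLS state: estimate and covariance
P = np.eye(n_coeff) * 1e6
for row, obs in zip(A, y):
    a = row.reshape(-1, 1)
    k = (P @ a) / (1.0 + a.T @ P @ a)              # gain
    c = c + k.flatten() * (obs - float(a.T @ c))   # update estimate
    P = P - k @ a.T @ P                            # update covariance

print(np.allclose(c, true_c, atol=1e-2))           # coefficients recovered
```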
Root Gravitropism: Quantification, Challenges, and Solutions.
Muller, Lukas; Bennett, Malcolm J; French, Andy; Wells, Darren M; Swarup, Ranjan
2018-01-01
Better understanding of root traits such as root angle and root gravitropism will be crucial for development of crops with improved resource use efficiency. This chapter describes a high-throughput, automated image analysis method to trace Arabidopsis (Arabidopsis thaliana) seedling roots grown on agar plates. The method combines a "particle-filtering algorithm with a graph-based method" to trace the center line of a root and can be adopted for the analysis of several root parameters such as length, curvature, and stimulus from original root traces.
Beyond single-stream with the Schrödinger method
NASA Astrophysics Data System (ADS)
Uhlemann, Cora; Kopp, Michael
2016-10-01
We investigate large scale structure formation of collisionless dark matter in the phase space description based on the Vlasov-Poisson equation. We present the Schrödinger method, originally proposed by Widrow & Kaiser (1993) as a numerical technique based on the Schrödinger-Poisson equation, as an analytical tool that is superior to the common standard pressureless fluid model. Whereas the dust model fails and develops singularities at shell crossing, the Schrödinger method encompasses multi-streaming and even virialization.
Manifold Learning for 3D Shape Description and Classification
2014-06-09
sportswear, personal protection clothing and equipment, office and health care devices, etc.; it is therefore desirable to develop an effective shape description method. (Figure 2, a toy example of submanifold decomposition, shows original data fused from two uncorrelated manifolds, blue and red.) A method is developed that is effective at extracting the two linear submanifolds, in comparison with existing manifold learning methods that only…
Detection of Parent-of-Origin Effects Using General Pedigree Data
Zhou, Ji-Yuan; Ding, Jie; Fung, Wing K.; Lin, Shili
2010-01-01
Genomic imprinting is an important epigenetic factor in complex traits study, which has generally been examined by testing for parent-of-origin effects of alleles. For a diallelic marker locus, the parental-asymmetry test (PAT) based on case-parents trios and its extensions to incomplete nuclear families (1-PAT and C-PAT) are simple and powerful for detecting parent-of-origin effects. However, these methods are suitable only for nuclear families and thus are not amenable to general pedigree data. Use of data from extended pedigrees, if available, may lead to more powerful methods than randomly selecting one two-generation nuclear family from each pedigree. In this study, we extend PAT to accommodate general pedigree data by proposing the pedigree PAT (PPAT) statistic, which uses all informative family trios from pedigrees. To fully utilize pedigrees with some missing genotypes, we further develop the Monte Carlo (MC) PPAT (MCPPAT) statistic based on MC sampling and estimation. Extensive simulations were carried out to evaluate the performance of the proposed methods. Under the assumption that the pedigrees and their associated affection patterns are randomly drawn from a population of pedigrees with at least one affected offspring, we demonstrated that MCPPAT is a valid test for parent-of-origin effects in the presence of association. Further, MCPPAT is much more powerful compared to PAT for trios or even PPAT for all informative family trios from the same pedigrees if there is missing data. Application of the proposed methods to a rheumatoid arthritis dataset further demonstrates the advantage of MCPPAT. PMID:19676055
Sinigalliano, Christopher D.; Ervin, Jared S.; Van De Werfhorst, Laurie C.; Badgley, Brian D.; Ballestée, Elisenda; Bartkowiaka, Jakob; Boehm, Alexandria B.; Byappanahalli, Muruleedhara N.; Goodwin, Kelly D.; Gourmelon, Michèle; Griffith, John; Holden, Patricia A.; Jay, Jenny; Layton, Blythe; Lee, Cheonghoon; Lee, Jiyoung; Meijer, Wim G.; Noble, Rachel; Raith, Meredith; Ryu, Hodon; Sadowsky, Michael J.; Schriewer, Alexander; Wang, Dan; Wanless, David; Whitman, Richard; Wuertz, Stefan; Santo Domingo, Jorge W.
2013-01-01
Here we report results from a multi-laboratory (n = 11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium originally developed to detect gull fecal contamination in coastal environments. The methods included a conventional end-point PCR method, a SYBR® Green qPCR method, and two TaqMan® qPCR methods. Different techniques for data normalization and analysis were tested. Data analysis methods had a pronounced impact on assay sensitivity and specificity calculations. Across-laboratory standardization of metrics including the lower limit of quantification (LLOQ), target detected but not quantifiable (DNQ), and target not detected (ND) significantly improved results compared to results submitted by individual laboratories prior to definition standardization. The unit of measure used for data normalization also had a pronounced effect on measured assay performance. Data normalization to DNA mass improved quantitative method performance as compared to enterococcus normalization. The MST methods tested here were originally designed for gulls but were found in this study to also detect feces from other birds, particularly feces composited from pigeons. Sequencing efforts showed that some pigeon feces from California contained sequences similar to C. marimammalium found in gull feces. These data suggest that the prevalence, geographic scope, and ecology of C. marimammalium in host birds other than gulls require further investigation. This study represents an important first step in the multi-laboratory assessment of these methods and highlights the need to broaden and standardize additional evaluations, including environmentally relevant target concentrations in ambient waters from diverse geographic regions.
Deutsch, Jean
2010-02-01
In this essay, I discuss the origin of Charles Darwin's interest in cirripedes (barnacles). Indeed, he worked intensively on cirripedes during the years in which he was developing the theory that eventually led to the publication of The Origin of Species. In the light of our present knowledge, I present Darwin's achievements in the morphology, systematics and biology of these small marine invertebrates, and also his mistakes. I suggest that the word that sheds the most light here is homology, and that his mistakes were due to following Richard Owen's method of determining homologies by reference to an ideal archetype. I discuss the ways in which his studies on cirripedes influenced the writing of The Origin. 2009 Académie des sciences. Published by Elsevier SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Ahooghalandari, Matin; Khiadani, Mehdi; Jahromi, Mina Esmi
2017-05-01
Reference evapotranspiration (ET0) is a critical component of water resources management and planning. Different methods, with different data requirements, have been developed to estimate ET0. In this study, the Hargreaves, Turc, Oudin, Copais, and Abtew methods and three forms of Valiantzas' formulas, developed in recent years, were used to estimate ET0 for the Pilbara region of Western Australia. The estimated ET0 values from these methods were compared with those from the FAO-56 Penman-Monteith (PM) method. The results showed that the Copais method and two of Valiantzas' equations, in their original forms, are suitable for estimating ET0 for the study area. A modified Honey-Bee Mating Optimization (MHBMO) algorithm was further implemented to calibrate three of Valiantzas' equations for this region in the southern hemisphere.
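As a concrete example of one of the temperature-based methods evaluated, the Hargreaves equation estimates ET0 from daily extreme temperatures and extraterrestrial radiation Ra expressed as equivalent evaporation. The sketch below implements that equation; the example temperatures and Ra value are hypothetical, not data from the study.

```python
# Hargreaves reference evapotranspiration (one of the temperature-based methods
# evaluated): ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin),
# with Ra expressed as equivalent evaporation (mm/day). Example values are hypothetical.
import math

def et0_hargreaves(tmax_c, tmin_c, ra_mm_per_day):
    tmean = (tmax_c + tmin_c) / 2.0
    return 0.0023 * ra_mm_per_day * (tmean + 17.8) * math.sqrt(tmax_c - tmin_c)

# Hypothetical hot, dry day:
print(round(et0_hargreaves(tmax_c=38.0, tmin_c=24.0, ra_mm_per_day=16.5), 2), "mm/day")
```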
2018-01-01
In recent years, the in ovo feeding approach in fertilized broiler (Gallus gallus) eggs was further developed and is currently widely applied in evaluating the effects of functional foods (primarily compounds of plant origin) on the functionality of the intestinal brush border membrane, as well as their potential prebiotic properties and interactions with intestinal microbial populations. This review collates information on potential nutrients and their effects on mineral absorption, gut development, brush border membrane functionality, and the immune system. In addition, the advantages and limitations of the in ovo feeding method in the assessment of potential prebiotic effects of plant-origin compounds are discussed. PMID:29597266
DNA Fingerprinting of Pearls to Determine Their Origins
Meyer, Joana B.; Cartier, Laurent E.; Pinto-Figueroa, Eric A.; Krzemnicki, Michael S.; Hänni, Henry A.; McDonald, Bruce A.
2013-01-01
We report the first successful extraction of oyster DNA from a pearl and use it to identify the source oyster species for the three major pearl-producing oyster species Pinctada margaritifera, P. maxima and P. radiata. Both mitochondrial and nuclear gene fragments could be PCR-amplified and sequenced. A polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) assay in the internal transcribed spacer (ITS) region was developed and used to identify 18 pearls of unknown origin. A micro-drilling technique was developed to obtain small amounts of DNA while maintaining the commercial value of the pearls. This DNA fingerprinting method could be used to document the source of historic pearls and will provide more transparency for traders and consumers within the pearl industry. PMID:24130725
Pathways to lean software development: An analysis of effective methods of change
NASA Astrophysics Data System (ADS)
Hanson, Richard D.
This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the originally identified scope. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were issues of resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.
A modified Finite Element-Transfer Matrix for control design of space structures
NASA Technical Reports Server (NTRS)
Tan, T.-M.; Yousuff, A.; Bahar, L. Y.; Konstandinidis, M.
1990-01-01
The Finite Element-Transfer Matrix (FETM) method was developed for reducing the computational efforts involved in structural analysis. While being widely used by structural analysts, this method does, however, have certain limitations, particularly when used for the control design of large flexible structures. In this paper, a new formulation based on the FETM method is presented. The new method effectively overcomes the limitations in the original FETM method, and also allows an easy construction of reduced models that are tailored for the control design. Other advantages of this new method include the ability to extract open loop frequencies and mode shapes with less computation, and simplification of the design procedures for output feedback, constrained compensation, and decentralized control. The development of this new method and the procedures for generating reduced models using this method are described in detail and the role of the reduced models in control design is discussed through an illustrative example.
Expulsion as an Issue of World History.
ERIC Educational Resources Information Center
Kedar, Benjamin Z.
1996-01-01
Locates the origin and development of corporate expulsion: the permanent, government-sponsored banishment of a category of subjects beyond the physical boundaries of a political entity--in medieval Western Europe. This method of consolidating political power, creating convenient scapegoats, and eliminating perceived internal threats soon spread to…
Theory and computation of optimal low- and medium-thrust transfers
NASA Technical Reports Server (NTRS)
Chuang, C.-H.
1994-01-01
This report describes the current state of development of methods for calculating optimal orbital transfers with large numbers of burns. The first method reported on is the homotopy-motivated, so-called direction-correction method. So far this method has been partially tested with one solver; the final step has yet to be implemented. The second is the patched transfer method. This method rests on simplifying approximations made to the original optimal control problem: the transfer is broken up into single-burn segments, each single burn is solved as a predictor step, and the whole problem is then solved with a corrector step.
González-Madroño, A; Mancha, A; Rodríguez, F J; Culebras, J; de Ulibarri, J I
2012-01-01
To ratify previous validations of the CONUT nutritional screening tool, two probabilistic models using the parameters included in CONUT were developed, to see whether CONUT's effectiveness could be improved. This is a two-step prospective study. In Step 1, 101 patients were randomly selected and assessed with both the SGA and CONUT. With the data obtained, an unconditional logistic regression model was developed, and two variants of CONUT were constructed: Model 1 was built by logistic regression; Model 2 was built by dividing the undernutrition probabilities obtained in Model 1 into seven regular intervals. In Step 2, 60 patients were selected and underwent the SGA, the original CONUT, and the newly developed models. The diagnostic efficacy of the original CONUT and the new models was tested by means of ROC curves. Samples 1 and 2 were then combined to measure the degree of agreement between the original CONUT and the SGA, and diagnostic efficacy parameters were calculated. No statistically significant differences were found between samples 1 and 2 regarding age, sex, or medical/surgical distribution, and undernutrition rates were similar (over 40%). The AUCs of the ROC curves were 0.862 for the original CONUT, and 0.839 and 0.874 for Models 1 and 2, respectively. The kappa index for CONUT versus SGA was 0.680. CONUT, with the original scores assigned by its authors, is as good as the mathematical models and is thus a valuable, highly useful and efficient tool for clinical undernutrition screening.
Regional W-Phase Source Inversion for Moderate to Large Earthquakes in China and Neighboring Areas
NASA Astrophysics Data System (ADS)
Zhao, Xu; Duputel, Zacharie; Yao, Zhenxing
2017-12-01
Earthquake source characterization has been significantly speeded up in the last decade with the development of rapid inversion techniques in seismology. Among these techniques, the W-phase source inversion method quickly provides point source parameters of large earthquakes using very long period seismic waves recorded at teleseismic distances. Although the W-phase method was initially developed to work at global scale (within 20 to 30 min after the origin time), faster results can be obtained when seismological data are available at regional distances (i.e., Δ ≤ 12°). In this study, we assess the use and reliability of regional W-phase source estimates in China and neighboring areas. Our implementation uses broadband records from the Chinese network supplemented by global seismological stations installed in the region. Using this data set and minor modifications to the W-phase algorithm, we show that reliable solutions can be retrieved automatically within 4 to 7 min after the earthquake origin time. Moreover, the method yields stable results down to Mw = 5.0 events, which is well below the size of earthquakes that are rapidly characterized using W-phase inversions at teleseismic distances.
The use of UV-visible reflectance spectroscopy as an objective tool to evaluate pearl quality.
Agatonovic-Kustrin, Snezana; Morton, David W
2012-07-01
Assessing the quality of pearls involves the use of various tools and methods, which are mainly visual and often quite subjective. Pearls are normally classified by origin and are then graded by luster, nacre thickness, surface quality, size, color and shape. The aim of this study was to investigate the capacity of Artificial Neural Networks (ANNs) to classify and estimate the quality of 27 different pearls from their UV-Visible spectra. Due to the opaque nature of pearls, spectroscopy measurements were performed using the Diffuse Reflectance UV-Visible spectroscopy technique. The spectra were acquired at two different locations on each pearl sample in order to assess surface homogeneity. The spectral data (inputs) were smoothed to reduce the noise, fed into ANNs and correlated to the pearl's quality/grading criteria (outputs). The developed ANNs were successful in predicting pearl type, mollusk growing species, possible luster and color enhancement, donor condition/type, recipient/host color, donor color, pearl luster, pearl color, and origin. The results of this study show that the developed UV-Vis spectroscopy-ANN method could be used as a more objective method of assessing pearl quality (grading) and may become a valuable tool for the pearl grading industry.
NASA Technical Reports Server (NTRS)
1992-01-01
Organon Teknika Corporation's REDY 2000 dialysis machine employs technology originally developed under NASA contract by Marquardt Corporation. The chemical process developed during the project could be applied to removing toxic waste from used dialysis fluid. This discovery led to the development of a kidney dialysis machine using "sorbent" dialysis, a method of removing urea from human blood by treating a dialysate solution. The process saves electricity and, because the need for a continuous water supply is eliminated, the patient has greater freedom.
Evaluation of speaker de-identification based on voice gender and age conversion
NASA Astrophysics Data System (ADS)
Přibil, Jiří; Přibilová, Anna; Matoušek, Jindřich
2018-03-01
Two basic tasks are covered in this paper. The first consists in the design and practical testing of a new method for voice de-identification that changes the apparent age and/or gender of a speaker by multi-segmental frequency scale transformation combined with prosody modification. The second task is aimed at verifying the applicability of a classifier based on Gaussian mixture models (GMMs) for detecting the original Czech and Slovak speakers after voice de-identification has been applied. The experiments performed confirm the functionality of the developed gender and age conversion for all selected types of de-identification, which can be objectively evaluated by the GMM-based open-set classifier. The original-speaker detection accuracy was also compared for sentences uttered by German and English speakers, showing the language independence of the proposed method.
Comparative study of flare control laws. [optimal control of b-737 aircraft approach and landing
NASA Technical Reports Server (NTRS)
Nadkarni, A. A.; Breedlove, W. J., Jr.
1979-01-01
A digital 3-D automatic control law was developed to achieve an optimal transition of a B-737 aircraft between various initial glide slope conditions and the desired final touchdown condition. A discrete, time-invariant, optimal, closed-loop control law, presented for a linear regulator problem, was extended to include a system acted upon by a constant disturbance. Two forms of control laws were derived to solve this problem. One method utilized feedback of appropriately defined integral states augmented with the original system equations. The second method formulated the problem as a control variable constraint, and the control variables were augmented with the original system. The control variable constraint control law yielded better performance than the feedback control law for the integral states chosen.
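The first of these two forms rests on a standard construction: a discrete-time LQR in which the state is augmented with integral states so that a constant disturbance is rejected. The sketch below illustrates that general construction on a toy two-state plant; the matrices and weights are hypothetical and are not the B-737 model from the study.

```python
# Sketch of a discrete-time LQR with integral-state augmentation, the general
# technique behind the integral-state feedback law described above. The toy system
# below is NOT the B-737 model from the study.
import numpy as np
from scipy.linalg import solve_discrete_are

# Toy plant x[k+1] = A x[k] + B u[k] (+ constant disturbance), output y = C x.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]])

# Augment with the integral of the output: z[k+1] = z[k] + y[k].
Aa = np.block([[A, np.zeros((2, 1))],
               [C, np.ones((1, 1))]])
Ba = np.vstack([B, np.zeros((1, 1))])

Q = np.diag([10.0, 1.0, 5.0])        # weights on plant states and integral state (assumed)
R = np.array([[1.0]])

P = solve_discrete_are(Aa, Ba, Q, R)
K = np.linalg.solve(R + Ba.T @ P @ Ba, Ba.T @ P @ Aa)   # u = -K [x; z]
print(K)
```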
Calibration of the Urbana lidar system
NASA Technical Reports Server (NTRS)
Cerny, T.; Sechrist, C. F., Jr.
1980-01-01
A method for calibrating data obtained by the Urbana sodium lidar system is presented. First, an expression relating the number of photocounts originating from a specific altitude range to the sodium concentration is developed. This relation is then simplified by normalizing the sodium photocounts with photocounts originating from the Rayleigh region of the atmosphere. To evaluate the calibration expression, the laser linewidth must be known; therefore, a method for measuring the laser linewidth using a Fabry-Perot interferometer is given. The laser linewidth was found to be 6 ± 2.5 pm. Problems due to photomultiplier tube overloading are discussed. Finally, calibrated data are presented. The sodium column abundance exhibits something close to a sinusoidal variation throughout the year, with the winter months showing an enhancement of a factor of 5 to 7 over the summer months.
The History of Optical Analysis of Milk: The Development and Use of Lactoscopes
NASA Astrophysics Data System (ADS)
Millan-Verdú, C.; Garrigós-Oltra, Ll.; Blanes-Nadal, G.; Domingo-Beltrán, M.
2003-07-01
The 19th century use of optical methods to detect the fraudulent adulteration of milk (by means of adding water) is presented in this article. The development and use of these optical methods was based on the principle of diaphanometry and illustrates the conflict that existed between two different approaches to scientific knowledge at that time. On the one hand, there were physicians who were more concerned with the practicality of the methods than on their accuracy; on the other hand, there were physicists who were fundamentally more interested in the accuracy of the results than the practicality of its use. This paper examines that conflict by analyzing the original lactoscope of Alfred Donné and the history of subsequent developments of the lactoscope.
de Almeida, Geane Silva; de Oliveira, Iara Brandão
2018-03-07
This work applied the Water Quality Index developed by the Canadian Council of Ministers of the Environment (WQI-CCME) to communicate the water quality per section of the Joanes River basin, State of Bahia, Brazil. The WQI-CCME is a statistical procedure that originally requires at least four monitoring campaigns per monitoring location and the measurement of at least four parameters. This paper presents a new aggregation method for calculating the WQI-CCME because applying the original method to the Joanes River would cause a large loss of information, given that the number of analyzed parameters varied between the monitoring campaigns of the government monitoring program. This work modified the original aggregation method, replacing it with data aggregation for a single monitoring campaign over a minimum of four monitoring locations per river section and a minimum of four parameters per monitoring location. Comparison of the WQI-CCME calculated for river sections with the WQI-CETESB index, developed by the Brazilian Environmental Sanitation and Technology Company (CETESB), confirmed the applicability of the new aggregation method. The WQI-CETESB is based on the WQI of the National Sanitation Foundation and uses nine fixed parameters. Because the WQI-CCME uses all of the analyzed parameters without restriction, it is more flexible, and its results seem more adequate for indicating the real river water quality. However, the WQI-CCME has a more stringent water quality scale than the WQI-CETESB, resulting in lower reported water quality. In conclusion, the WQI-CCME with the new aggregation method is adequate for communicating the water quality at a given time, per section of a river, provided the minimum of four analyses and four monitoring points is respected. As a result, without the need to wait for other campaigns, it reduces the cost of a monitoring program and the time needed to communicate the water quality. The adequacy of the WQI-CCME was similar to the findings of others.
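For orientation, the standard CCME WQI combines three factors (scope F1, frequency F2, amplitude F3) into 100 − sqrt(F1² + F2² + F3²)/1.732. The sketch below computes it for one campaign pooled over several monitoring points, in the spirit of the modified aggregation; the objectives and measurements are hypothetical, all objectives are treated as upper limits, and the original paper's exact data handling may differ.

```python
# Sketch of the CCME WQI calculation (scope F1, frequency F2, amplitude F3) applied
# to a single campaign pooled over several monitoring points. Objectives and
# measurements below are hypothetical; all objectives are treated as maxima.
import math

def ccme_wqi(tests):
    """tests: list of (variable, measured_value, objective_max), pooled per river section."""
    all_vars, failed_vars = set(), set()
    excursions, failed_tests = [], 0
    for var, value, objective in tests:
        all_vars.add(var)
        if value > objective:
            failed_vars.add(var)
            failed_tests += 1
            excursions.append(value / objective - 1.0)
    f1 = 100.0 * len(failed_vars) / len(all_vars)     # scope: fraction of failed variables
    f2 = 100.0 * failed_tests / len(tests)            # frequency: fraction of failed tests
    nse = sum(excursions) / len(tests)                # normalized sum of excursions
    f3 = nse / (0.01 * nse + 0.01)                    # amplitude
    return 100.0 - math.sqrt(f1**2 + f2**2 + f3**2) / 1.732

# Hypothetical section data: 4 parameters measured at 4 monitoring points.
data = [("BOD", 6.2, 5.0), ("P", 0.8, 1.0), ("turbidity", 120, 200), ("NH3", 3.1, 5.0),
        ("BOD", 4.9, 5.0), ("P", 1.4, 1.0), ("turbidity", 150, 200), ("NH3", 2.7, 5.0),
        ("BOD", 5.5, 5.0), ("P", 0.9, 1.0), ("turbidity", 90, 200),  ("NH3", 4.0, 5.0),
        ("BOD", 4.2, 5.0), ("P", 1.1, 1.0), ("turbidity", 210, 200), ("NH3", 3.3, 5.0)]
print(round(ccme_wqi(data), 1))
```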
Fukatsu, Hiroshi; Naganawa, Shinji; Yumura, Shinnichiro
2008-04-01
This study aimed to validate the performance of a novel image compression method using a neural network to achieve lossless compression. The encoding consists of the following blocks: a prediction block; a residual data calculation block; a transformation and quantization block; an organization and modification block; and an entropy encoding block. The predicted image is divided into four macro-blocks, using the original image as the training reference, and then redivided into sixteen sub-blocks. The predicted image is compared to the original image to create the residual image. The spatial and frequency data of the residual image are compared and transformed. Chest radiography, computed tomography (CT), magnetic resonance imaging, positron emission tomography, radioisotope mammography, ultrasonography, and digital subtraction angiography images were compressed using the AIC lossless compression method, and the compression rates were calculated. The compression rates were around 15:1 for chest radiography and mammography, 12:1 for CT, and around 6:1 for the other images. This method thus enables greater lossless compression than conventional methods. This novel method should improve the efficiency of handling the increasing volume of medical imaging data.
Pendar, Hodjat; Socha, John J.
2015-01-01
Flow-through respirometry systems provide accurate measurement of gas exchange over long periods of time. However, these systems have limitations in tracking rapid changes. When an animal infuses a metabolic gas into the respirometry chamber in a short burst, diffusion and airflow in the chamber gradually alter the original signal before it arrives at the gas analyzer. For single or multiple bursts, the recorded signal is smeared or mixed, which may result in dramatically altered recordings compared to the emitted signal. Recovering the original metabolic signal is a difficult task because the problem is inherently ill-conditioned. Here, we present two new methods to recover the fast dynamics of metabolic patterns from recorded data. We first re-derive the equations of the well-known Z-transform method (ZT method) to show the source of imprecision in this method. Then, we develop a new model of analysis for respirometry systems based on the experimentally determined impulse response, which is the response of the system to a very short unit input. As a result, we present a major modification of the ZT method (dubbed the ‘EZT method’) by using a new model for the impulse response, enhancing its precision to recover the true metabolic signals. The second method, the generalized Z-transform (GZT) method, was then developed by generalizing the EZT method; it can be applied to any flow-through respirometry system with any arbitrary impulse response. Experiments verified that the accuracy of recovering the true metabolic signals is significantly improved by the new methods. These new methods can be used more broadly for input estimation in a variety of physiological systems. PMID:26466361
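The impulse-response idea can be illustrated with a generic frequency-domain deconvolution: the recorded trace is modeled as the true signal convolved with the chamber's measured impulse response, and the true signal is recovered with a regularized inverse filter. This is a hedged sketch of that general approach, not the EZT or GZT algorithms; the regularization constant and the synthetic washout response are assumptions.

import numpy as np

def recover_signal(recorded, impulse_response, eps=1e-3):
    """Recover a fast metabolic signal from a smeared recording by regularized
    frequency-domain deconvolution with a measured impulse response (generic sketch)."""
    n = len(recorded)
    h = np.zeros(n)
    h[:len(impulse_response)] = impulse_response
    H = np.fft.rfft(h)
    R = np.fft.rfft(recorded)
    # Tikhonov-style regularization avoids blow-up where |H| is small
    X = R * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.fft.irfft(X, n)

# Example: a short CO2 burst smeared by an exponential washout response
t = np.arange(500)
true = np.zeros(500); true[50:60] = 1.0                # 10-sample burst
h = np.exp(-t[:200] / 30.0); h /= h.sum()              # assumed chamber impulse response
recorded = np.convolve(true, h)[:500]
estimate = recover_signal(recorded, h)
print(np.argmax(estimate), round(float(estimate.max()), 2))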
Design and preparation of polymeric scaffolds for tissue engineering.
Weigel, Thomas; Schinkel, Gregor; Lendlein, Andreas
2006-11-01
Polymeric scaffolds for tissue engineering can be prepared with a multitude of different techniques. Many diverse approaches have recently been under development. The adaptation of conventional preparation methods, such as electrospinning, induced phase separation of polymer solutions or porogen leaching, which were developed originally for other research areas, are described. In addition, the utilization of novel fabrication techniques, such as rapid prototyping or solid free-form procedures, with their many different methods to generate or to embody scaffold structures or the usage of self-assembly systems that mimic the properties of the extracellular matrix are also described. These methods are reviewed and evaluated with specific regard to their utility in the area of tissue engineering.
Tay, Wee Tek; Walsh, Thomas K.; Downes, Sharon; Anderson, Craig; Jermiin, Lars S.; Wong, Thomas K. F.; Piper, Melissa C.; Chang, Ester Silva; Macedo, Isabella Barony; Czepak, Cecilia; Behere, Gajanan T.; Silvie, Pierre; Soria, Miguel F.; Frayssinet, Marie; Gordon, Karl H. J.
2017-01-01
The Old World bollworm Helicoverpa armigera is now established in Brazil but efforts to identify incursion origin(s) and pathway(s) have met with limited success due to the patchiness of available data. Using international agricultural/horticultural commodity trade data and mitochondrial DNA (mtDNA) cytochrome oxidase I (COI) and cytochrome b (Cyt b) gene markers, we inferred the origins and incursion pathways into Brazil. We detected 20 mtDNA haplotypes from six Brazilian states, eight of which were new to our 97 global COI-Cyt b haplotype database. Direct sequence matches indicated five Brazilian haplotypes had Asian, African, and European origins. We identified 45 parsimoniously informative sites and multiple substitutions per site within the concatenated (945 bp) nucleotide dataset, implying that probabilistic phylogenetic analysis methods are needed. High diversity and signatures of uniquely shared haplotypes with diverse localities combined with the trade data suggested multiple incursions and introduction origins in Brazil. Increasing agricultural/horticultural trade activities between the Old and New Worlds represents a significant biosecurity risk factor. Identifying pest origins will enable resistance profiling that reflects countries of origin to be included when developing a resistance management strategy, while identifying incursion pathways will improve biosecurity protocols and risk analysis at biosecurity hotspots including national ports. PMID:28350004
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Abhijit; Voter, Arthur
2009-01-01
We develop a variation of the temperature accelerated dynamics (TAD) method, called the p-TAD method, that efficiently generates an on-the-fly kinetic Monte Carlo (KMC) process catalog with control over the accuracy of the catalog. It is assumed that transition state theory is valid. The p-TAD method guarantees that processes relevant at the timescales of interest to the simulation are present in the catalog with a chosen confidence. A confidence measure associated with the process catalog is derived. The dynamics is then studied using the process catalog with the KMC method. The effective accuracy of a p-TAD calculation is derived when a KMC catalog is reused for conditions different from those for which the catalog was originally generated. Different KMC catalog generation strategies that exploit the features of the p-TAD method and ensure higher accuracy and/or computational efficiency are presented. The accuracy and the computational requirements of the p-TAD method are assessed. Comparisons to the original TAD method are made. As an example, we study dynamics in sub-monolayer Ag/Cu(110) at the time scale of seconds using the p-TAD method. It is demonstrated that the p-TAD method overcomes several challenges plaguing the conventional KMC method.
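For readers unfamiliar with how a process catalog is consumed, a generic rejection-free KMC step is sketched below; the process names and rates are illustrative assumptions, and this is not the p-TAD catalog-generation machinery itself.

import math, random

def kmc_step(catalog, t, rng=random):
    """One rejection-free KMC step: pick a process with probability proportional
    to its rate from the catalog, then advance the simulation clock."""
    total = sum(rate for _, rate in catalog)
    r = rng.random() * total
    acc = 0.0
    chosen = catalog[-1][0]                     # fallback guards against rounding at the end
    for name, rate in catalog:
        acc += rate
        if r < acc:
            chosen = name
            break
    dt = -math.log(1.0 - rng.random()) / total  # exponentially distributed waiting time
    return chosen, t + dt

# Illustrative catalog of activated processes (names and rates are assumptions)
catalog = [("adatom_hop", 1.0e6), ("dimer_shear", 2.5e3), ("detach", 4.0e2)]
t = 0.0
for _ in range(5):
    process, t = kmc_step(catalog, t)
    print(f"{process:12s} t = {t:.3e} s")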
Data structures supporting multi-region adaptive isogeometric analysis
NASA Astrophysics Data System (ADS)
Perduta, Anna; Putanowicz, Roman
2018-01-01
Since the first paper, published in 2005, Isogeometric Analysis (IGA) has gained strong interest and found applications in many engineering problems. Despite the advancement of the method, there are still far fewer software implementations compared to the Finite Element Method. The paper presents an approach to the development of data structures that can support multi-region IGA with local mesh refinement (patch-based) and possible application in IGA-FEM models. The purpose of this paper is to share the original design concepts that the authors created while developing an IGA package, which other researchers may find beneficial for their own simulation codes.
NASA Astrophysics Data System (ADS)
Iwase, Shigeru; Futamura, Yasunori; Imakura, Akira; Sakurai, Tetsuya; Tsukamoto, Shigeru; Ono, Tomoya
2018-05-01
We propose an efficient computational method for evaluating the self-energy matrices of electrodes to study ballistic electron transport properties in nanoscale systems. To reduce the high computational cost incurred in large systems, a contour integral eigensolver based on the Sakurai-Sugiura method combined with the shifted biconjugate gradient method is developed to solve an exponential-type eigenvalue problem for complex wave vectors. A remarkable feature of the proposed algorithm is that the numerical procedure is very similar to that of conventional band structure calculations. We implement the developed method in the framework of the real-space higher-order finite-difference scheme with nonlocal pseudopotentials. Numerical tests for a wide variety of materials validate the robustness, accuracy, and efficiency of the proposed method. As an illustration of the method, we present the electron transport properties of freestanding silicene with a line defect originating from reversed buckled phases.
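To give a flavor of the contour-integral filtering that underlies Sakurai-Sugiura-type solvers, a minimal dense sketch for a standard symmetric eigenproblem is shown below. The actual paper targets an exponential-type eigenvalue problem solved with shifted biconjugate gradient iterations, so everything here (dense solves, the circular contour, probe count, tolerance) is an illustrative assumption rather than the authors' implementation.

import numpy as np

def contour_eigs(A, center, radius, n_quad=32, n_probe=8):
    """Approximate the eigenvalues of a real symmetric matrix A lying inside a
    circular contour by quadrature on the resolvent (a Sakurai-Sugiura-style
    filter), followed by Rayleigh-Ritz on the filtered subspace."""
    n = A.shape[0]
    V = np.random.default_rng(0).standard_normal((n, n_probe))
    S = np.zeros((n, n_probe), dtype=complex)
    for j in range(n_quad):
        theta = 2.0 * np.pi * (j + 0.5) / n_quad
        z = center + radius * np.exp(1j * theta)
        Y = np.linalg.solve(z * np.eye(n) - A, V)         # resolvent applied to probe vectors
        S += (radius * np.exp(1j * theta) / n_quad) * Y   # quadrature weight on the circle
    U, s, _ = np.linalg.svd(S.real, full_matrices=False)
    Q = U[:, s > 1e-2 * s[0]]                             # keep directions the filter retained
    return np.linalg.eigvalsh(Q.T @ A @ Q)

A = np.diag(np.arange(1.0, 11.0))                         # toy spectrum: 1, 2, ..., 10
print(contour_eigs(A, center=4.5, radius=2.0))            # expect approximately [3, 4, 5, 6]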
Using Importance-Performance Analysis To Evaluate Teaching Effectiveness.
ERIC Educational Resources Information Center
Attarian, Aram
This paper introduces Importance-Performance (IP) analysis as a method to evaluate teaching effectiveness in a university outdoor program. Originally developed for use in the field of marketing, IP analysis is simple and easy to administer, and provides the instructor with a visual representation of what teaching attributes are important, how…
Feminist Research Methodology Groups: Origins, Forms, Functions.
ERIC Educational Resources Information Center
Reinharz, Shulamit
Feminist Research Methodology Groups (FRMGs) have developed as a specific type of women's group in which feminist academics can find supportive audiences for their work while contributing to a feminist redefinition of research methods. An analysis of two FRMGs reveals common characteristics, dynamics, and outcomes. Both were limited to small…
Don't Stop Believing: Mapping Distance Learners' Research Journeys
ERIC Educational Resources Information Center
Brahme, Maria E.; Gabriel, Lizette; Stenis, Paul V.
2016-01-01
Journey mapping, a method of collecting data that illustrates individuals' paths toward a specific goal, was originally developed for use in retail/customer service environments. Much of the literature describes its application in examining customer behavior when navigating merchants' Websites, allowing researchers to examine the effectiveness,…
Adapting geostatistics to analyze spatial and temporal trends in weed populations
USDA-ARS?s Scientific Manuscript database
Geostatistics were originally developed in mining to estimate the location, abundance and quality of ore over large areas from soil samples to optimize future mining efforts. Here, some of these methods were adapted to weeds to account for a limited distribution area (i.e., inside a field), variatio...
Harry T. Valentine
2002-01-01
Randomized branch sampling (RBS) is a special application of multistage probability sampling (see Sampling, environmental), which was developed originally by Jessen [3] to estimate fruit counts on individual orchard trees. In general, the method can be used to obtain estimates of many different attributes of trees or other branched plants. The usual objective of RBS is...
USDA-ARS?s Scientific Manuscript database
Critical path analysis (CPA) is a method for estimating macroscopic transport coefficients of heterogeneous materials that are highly disordered at the micro-scale. Developed originally to model conduction in semiconductors, numerous researchers have noted that CPA might also have relevance to flow ...
Scale Development: Perceived Barriers to Public Use of School Recreational Facilities
ERIC Educational Resources Information Center
Spengler, John O.; Ko, Yong Jae; Connaughton, Daniel P.
2012-01-01
Objectives: To test an original scale assessing perceived barriers among school administrators to allowing community use of school recreational facilities outside of regular school hours. Methods: Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Results: Using EFA and CFA, we found that a model including factors of…
Educational Innovations as Staff Development: An Evaluative Approach.
ERIC Educational Resources Information Center
Marcelo, Carlos; And Others
1996-01-01
Reports a study that analyzed the processes of educational innovation in seven projects in the Andalusian region of Spain. Using qualitative and quantitative methods, the study determined that the quality of the impact of innovation on students, teachers, schools, and communities was determined by the original source, the relationship to perceived…
Fetal Origins of Child Non-Right-Handedness and Mental Health
ERIC Educational Resources Information Center
Rodriguez, Alina; Waldenstrom, Ulla
2008-01-01
Background: Environmental risk during fetal development for non-right-handedness, an index of brain asymmetry, and its relevance for child mental health is not fully understood. Methods: A Swedish population-based prospective pregnancy-offspring cohort was followed-up when children were five years old (N = 1714). Prenatal environmental risk…
Focke, Felix; Haase, Ilka; Fischer, Markus
2011-01-26
Usually, spices are identified morphologically using simple methods such as magnifying glasses or microscopic instruments. On the other hand, molecular biological methods such as the polymerase chain reaction (PCR) enable accurate and specific detection even in complex matrices. Generally, the origins of spices are plants with diverse genetic backgrounds and relationships. The processing methods used for the production of spices are complex and individual. Consequently, the development of a reliable DNA-based method for spice analysis is a challenging undertaking. However, once established, this method will be easily adapted to less difficult food matrices. In the current study, several alternative methods for the isolation of DNA from spices have been developed and evaluated in detail with regard to (i) purity (photometric), (ii) yield (fluorimetric methods), and (iii) amplifiability (PCR). Whole genome amplification methods were used to preamplify isolates to improve the ratio between amplifiable DNA and inhibiting substances. Specific primer sets were designed, and the PCR conditions were optimized to detect 18 spices selectively. Assays of self-made spice mixtures were performed to prove the applicability of the developed methods.
Temporal and Embryonic Lineage-Dependent Regulation of Human Vascular SMC Development by NOTCH3
Granata, Alessandra; Bernard, William G.; Zhao, Ning; Mccafferty, John; Lilly, Brenda
2015-01-01
Vascular smooth muscle cells (SMCs), which arise from multiple embryonic progenitors, have unique lineage-specific properties and this diversity may contribute to spatial patterns of vascular diseases. We developed in vitro methods to generate distinct vascular SMC subtypes from human pluripotent stem cells, allowing us to explore their intrinsic differences and the mechanisms involved in SMC development. Since Notch signaling is thought to be one of the several key regulators of SMC differentiation and function, we profiled the expression of Notch receptors, ligands, and downstream elements during the development of origin-specific SMC subtypes. NOTCH3 expression in our in vitro model varied in a lineage- and developmental stage-specific manner so that the highest expression in mature SMCs was in those derived from paraxial mesoderm (PM). This pattern was consistent with the high expression level of NOTCH3 observed in the 8–9 week human fetal descending aorta, which is populated by SMCs of PM origin. Silencing NOTCH3 in mature SMCs in vitro reduced SMC markers in cells of PM origin preferentially. Conversely, during early development, NOTCH3 was highly expressed in vitro in SMCs of neuroectoderm (NE) origin. Inhibition of NOTCH3 in early development resulted in a significant downregulation of specific SMC markers exclusively in the NE lineage. Corresponding to this prediction, the Notch3-null mouse showed reduced expression of Acta2 in the neural crest-derived SMCs of the aortic arch. Thus, Notch3 signaling emerges as one of the key regulators of vascular SMC differentiation and maturation in vitro and in vivo in a lineage- and temporal-dependent manner. PMID:25539150
Statistics of software vulnerability detection in certification testing
NASA Astrophysics Data System (ADS)
Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.
2018-05-01
The paper discusses practical aspects of introducing methods to detect software vulnerabilities into the day-to-day activities of an accredited testing laboratory. It presents the evaluation results of the vulnerability detection methods from the study of open source software and of software that is a test object of certification tests under information security requirements, including software for communication networks. Results of the study are given, showing the distribution of identified vulnerabilities by type of attack, country of origin, programming language used in development, vulnerability detection method, etc. The experience of foreign information security certification systems related to the detection of vulnerabilities in certified software is analyzed. The main conclusion based on the study is the need to implement secure software development practices in the development life cycle processes. Conclusions and recommendations for testing laboratories on the implementation of vulnerability analysis methods are laid down.
Ultra-low background DNA cloning system.
Goto, Kenta; Nagano, Yukio
2013-01-01
Yeast-based in vivo cloning is useful for cloning DNA fragments into plasmid vectors and is based on the ability of yeast to recombine the DNA fragments by homologous recombination. Although this method is efficient, it produces some by-products. We have developed an "ultra-low background DNA cloning system" on the basis of yeast-based in vivo cloning, by almost completely eliminating the generation of by-products and applying the method to commonly used Escherichia coli vectors, particularly those lacking yeast replication origins and carrying an ampicillin resistance gene (Amp(r)). First, we constructed a conversion cassette containing the DNA sequences in the following order: an Amp(r) 5' UTR (untranslated region) and coding region, an autonomous replication sequence and a centromere sequence from yeast, a TRP1 yeast selectable marker, and an Amp(r) 3' UTR. This cassette allowed conversion of the Amp(r)-containing vector into the yeast/E. coli shuttle vector through use of the Amp(r) sequence by homologous recombination. Furthermore, simultaneous transformation of the desired DNA fragment into yeast allowed cloning of this DNA fragment into the same vector. We rescued the plasmid vectors from all yeast transformants, and by-products containing the E. coli replication origin disappeared. Next, the rescued vectors were transformed into E. coli and the by-products containing the yeast replication origin disappeared. Thus, our method used yeast- and E. coli-specific "origins of replication" to eliminate the generation of by-products. Finally, we successfully cloned the DNA fragment into the vector with almost 100% efficiency.
Singularity computations. [finite element methods for elastoplastic flow
NASA Technical Reports Server (NTRS)
Swedlow, J. L.
1978-01-01
A direct description of the structure of a singularity specifies the radial and angular distributions of the field quantities as explicitly as practicable, along with some measure of the intensity of the singularity. This paper discusses such an approach based on the recent development of numerical methods for elastoplastic flow. Attention is restricted to problems where one variable or set of variables is finite at the origin of the singularity but a second set is not.
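A generic separated form of such a singular field, of the kind such a numerical method is meant to resolve, can be written as follows; the notation is illustrative and is not taken from the paper.

% Generic separated (radial x angular) form of a singular field near the origin:
% K is an intensity measure, lambda the singularity exponent (negative for an
% unbounded field), and f_{ij}(\theta) the angular distribution.
\[
  \sigma_{ij}(r,\theta) \;\approx\; K\, r^{\lambda} f_{ij}(\theta),
  \qquad r \to 0 .
\]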
Kim, Junho; Maeng, Ju Heon; Lim, Jae Seok; Son, Hyeonju; Lee, Junehawk; Lee, Jeong Ho; Kim, Sangwoo
2016-10-15
Advances in sequencing technologies have remarkably lowered the detection limit of somatic variants to low frequencies. However, calling mutations in this range is still confounded by many factors, including environmental contamination. Vector contamination is a continuously occurring issue and is especially problematic since vector inserts are hardly distinguishable from the sample sequences. Such inserts, which may harbor polymorphisms and engineered functional mutations, can result in calling false variants at the corresponding sites. Numerous vector-screening methods have been developed, but none could handle contamination from inserts because they focus on vector backbone sequences alone. We developed a novel method, Vecuum, that identifies vector-originated reads and the resultant false variants. Since vector inserts are generally constructed from intron-less cDNAs, Vecuum identifies vector-originated reads by inspecting the clipping patterns at exon junctions. False variant calls are further detected based on the biased distribution of mutant alleles to vector-originated reads. Tests on simulated and spike-in experimental data validated that Vecuum could detect 93% of vector contaminants and could remove up to 87% of variant-like false calls with 100% precision. Application to public sequence datasets demonstrated the utility of Vecuum in detecting false variants resulting from various types of external contamination. A Java-based implementation of the method is available at http://vecuum.sourceforge.net/. Contact: swkim@yuhs.ac. Supplementary data are available at Bioinformatics online.
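The allele-bias idea can be illustrated with a simple contingency-table test. The sketch below assumes the reads covering a variant site have already been split into vector-originated (exon-junction-clipped) and ordinary reads; it is a hedged illustration of the concept, not the actual Vecuum implementation, and the counts and significance threshold are placeholders.

from scipy.stats import fisher_exact

def vector_biased(mut_vec, ref_vec, mut_norm, ref_norm, alpha=1e-3):
    """Flag a variant call as likely vector-derived when its mutant alleles are
    significantly concentrated in vector-originated reads.
    mut_vec/ref_vec:   mutant/reference allele counts among vector-originated reads
    mut_norm/ref_norm: the same counts among ordinary reads."""
    table = [[mut_vec, ref_vec], [mut_norm, ref_norm]]
    odds, p = fisher_exact(table, alternative="greater")
    return p < alpha, p

# Example: 40 of 45 mutant alleles sit on vector-originated reads
flagged, p = vector_biased(mut_vec=40, ref_vec=2, mut_norm=5, ref_norm=150)
print(flagged, p)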
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brodin, P; Guha, C; Tome, W
Purpose: To determine patterns of failure in laryngeal cancer treated with definitive IMRT by comparing two different methods for identifying the recurrence epicenter on follow-up PET/CT. Methods: We identified 20 patients treated for laryngeal squamous cell carcinoma with definitive IMRT who had loco-regional recurrence diagnosed on PET/CT. Recurrence PET/CT scans were co-registered with the original treatment planning CT using deformable image registration with the VoxAlign deformation engine in MIM Software. Recurrence volumes were delineated on co-registered follow-up scans using a semi-automatic PETedge tool, and two separate methods were used to identify the recurrence point of origin: a) finding the point within the recurrence volume for which the maximum distance to the surface of the surrounding recurrence volume is smaller than for any other point; b) finding the point within the recurrence volume with the maximum standardized uptake value (SUVmax), without geometric restrictions. For each method, the failure pattern was determined by whether the recurrence origin fell within the original high-dose target volumes GTV70, CTV70, PTV70 (receiving 70 Gy), intermediate-risk PTV59 (receiving 59.4 Gy), or low-risk PTV54 (receiving 54.1 Gy) in the original treatment planning CT. Results: 23 primary/nodal recurrences from the 20 patients were analyzed. The three-dimensional distance between the two origins was on average 10.5 mm (standard deviation 10 mm). Most recurrences originated in the high-dose target volumes for both methods, with 13 (57%) and 11 (48%) in the GTV70 and 20 (87%) and 20 (87%) in the PTV70 for methods a) and b), respectively. There was good agreement between the two methods in classifying the origin target volumes, with 69% concordance for GTV70, 89% for CTV70, and 100% for PTV70. Conclusion: With strong agreement in patterns of failure between two separate methods for determining the recurrence origin, we conclude that most recurrences occurred within the high-dose treatment region, which influences potential risk-adaptive treatment strategies.
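The two point-of-origin definitions can be expressed compactly on a voxel grid. The sketch below takes the voxel farthest from the surface of the recurrence volume (via a Euclidean distance transform) as a stand-in for the geometric epicenter of definition (a), and the masked SUV maximum for definition (b); it is an illustration of the definitions, not the MIM-based workflow described above, and the synthetic arrays are placeholders.

import numpy as np
from scipy.ndimage import distance_transform_edt

def recurrence_origins(mask, suv):
    """mask: boolean recurrence volume; suv: PET SUV array of the same shape.
    Returns (a) the innermost voxel of the volume and (b) the SUVmax voxel."""
    # (a) voxel whose distance to the nearest surface of the recurrence volume is maximal
    dist = distance_transform_edt(mask)
    innermost = np.unravel_index(np.argmax(dist), mask.shape)
    # (b) voxel with the maximum SUV inside the recurrence volume
    suv_in = np.where(mask, suv, -np.inf)
    suv_max = np.unravel_index(np.argmax(suv_in), mask.shape)
    return innermost, suv_max

# Tiny synthetic example
mask = np.zeros((20, 20, 20), dtype=bool); mask[5:15, 5:15, 5:15] = True
suv = np.random.default_rng(1).random(mask.shape) * mask
print(recurrence_origins(mask, suv))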
Spalla, S; Baffi, C; Barbante, C; Turetta, C; Turretta, C; Cozzi, G; Beone, G M; Bettinelli, M
2009-10-30
In recent years, identification of the geographical origin of food has grown more important as consumers have become interested in knowing the provenance of the food that they purchase and eat. Certification schemes and labels have thus been developed to protect consumers and genuine producers from the improper use of popular brand names or renowned geographical origins. As the tomato is one of the major components of what is considered to be the healthy Mediterranean diet, it is important to be able to determine the geographical origin of tomatoes and tomato-based products such as tomato sauce. The aim of this work is to develop an analytical method for determining rare earth elements (REE) for the control of the geographic origin of tomatoes. The content of REE in tomato plant samples collected from an agricultural area in Piacenza, Italy, was determined using four different digestion procedures, with and without HF. Microwave dissolution with HNO3 + H2O2 proved to be the most suitable digestion procedure. Inductively coupled plasma quadrupole mass spectrometry (ICPQMS) and inductively coupled plasma sector field mass spectrometry (ICPSFMS) instruments, both coupled with a desolvation system, were used to determine the REE in tomato plants in two different laboratories. A matched calibration curve method was used for the quantification of the analytes. The detection limits (MDLs) of the method ranged from 0.03 ng g(-1) for Ho, Tm, and Lu to 2 ng g(-1) for La and Ce. The precision, in terms of relative standard deviation on six replicates, was good, with values ranging, on average, from 6.0% for LREE (light rare earth elements) to 16.5% for HREE (heavy rare earth elements). These detection limits allowed the determination of the very low concentrations of REE present in tomato berries. For the concentrations of REE in tomato plants, the following trend was observed: roots > leaves > stems > berries.
NASA Astrophysics Data System (ADS)
Gustafsson, C.; Nordström, F.; Persson, E.; Brynolfsson, J.; Olsson, L. E.
2017-04-01
Dosimetric errors in a magnetic resonance imaging (MRI) only radiotherapy workflow may be caused by system specific geometric distortion from MRI. The aim of this study was to evaluate the impact on planned dose distribution and delineated structures for prostate patients, originating from this distortion. A method was developed, in which computed tomography (CT) images were distorted using the MRI distortion field. The displacement map for an optimized MRI treatment planning sequence was measured using a dedicated phantom in a 3 T MRI system. To simulate the distortion aspects of a synthetic CT (electron density derived from MR images), the displacement map was applied to CT images, referred to as distorted CT images. A volumetric modulated arc prostate treatment plan was applied to the original CT and the distorted CT, creating a reference and a distorted CT dose distribution. By applying the inverse of the displacement map to the distorted CT dose distribution, a dose distribution in the same geometry as the original CT images was created. For 10 prostate cancer patients, the dose difference between the reference dose distribution and the inverse distorted CT dose distribution was analyzed in isodose level bins. The mean magnitude of the geometric distortion was 1.97 mm for the radial distance of 200-250 mm from isocenter. The mean percentage dose differences for all isodose level bins were ⩽0.02%, and the radiotherapy structure mean volume deviations were <0.2%. The method developed can quantify the dosimetric effects of MRI system specific distortion in a prostate MRI only radiotherapy workflow, separated from dosimetric effects originating from synthetic CT generation. No clinically relevant dose difference or structure deformation was found when 3D distortion correction and high acquisition bandwidth were used. The method could be used for any MRI sequence together with any anatomy of interest.
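A minimal sketch of the core operation (warping CT data with an MRI-derived displacement map and warping the resulting dose back with the inverse map) is given below. It assumes the displacement field is available as a voxel-displacement array and uses generic linear interpolation, so it is an illustration of the idea rather than the clinical workflow; the synthetic volumes are placeholders.

import numpy as np
from scipy.ndimage import map_coordinates

def apply_displacement(volume, disp):
    """Warp a 3D volume by a displacement field disp of shape (3, *volume.shape),
    given in voxel units (illustrative sketch)."""
    grid = np.indices(volume.shape).astype(float)
    coords = grid + disp                                 # where each output voxel samples from
    return map_coordinates(volume, coords, order=1, mode="nearest")

# Synthetic example: distort a stand-in "CT", compute a dummy dose there, warp the dose back
ct = np.random.default_rng(0).random((32, 32, 32))
disp = np.zeros((3, 32, 32, 32)); disp[0] += 0.8         # ~0.8-voxel shift along one axis
distorted_ct = apply_displacement(ct, disp)
dose_on_distorted = distorted_ct * 0.5                   # stand-in for a planned dose
dose_back = apply_displacement(dose_on_distorted, -disp) # inverse of a pure translation
# Residual after the round trip (interpolation smooths random data, so it is not exactly zero)
print(float(np.abs(dose_back - ct * 0.5)[4:-4, 4:-4, 4:-4].max()))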
Development of Scanning Ultrafast Electron Microscope Capability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, Kimberlee Chiyoko; Talin, Albert Alec; Chandler, David W.
Modern semiconductor devices rely on the transport of minority charge carriers. Direct examination of minority carrier lifetimes in real devices with nanometer-scale features requires a measurement method with simultaneously high spatial and temporal resolutions. Achieving nanometer spatial resolutions at sub-nanosecond temporal resolution is possible with pump-probe methods that utilize electrons as probes. Recently, a stroboscopic scanning electron microscope was developed at Caltech, and used to study carrier transport across a Si p-n junction [1, 2, 3]. In this report, we detail our development of a prototype scanning ultrafast electron microscope system at Sandia National Laboratories based on the original Caltech design. This effort represents Sandia's first exploration into ultrafast electron microscopy.
ZAHABIUN, Farzaneh; SADJJADI, Seyed Mahmoud; ESFANDIARI, Farideh
2015-01-01
Background: Permanent slide preparation of nematodes, especially small ones, is time consuming and difficult, and the margins of the specimens become scarious. To address this problem, a modified double glass mounting method was developed and compared with the classic method. Methods: A total of 209 nematode samples of human and animal origin were fixed and stained with Formaldehyde Alcohol Azocarmine Lactophenol (FAAL), followed by double glass mounting and the classic dehydration method using Canada balsam as the mounting medium. The slides were evaluated at different dates and times over more than four years. Photographs were taken at different magnifications during the evaluation period. Results: The double glass mounting method was stable during this time and comparable with the classic method. There were no changes in the morphologic structures of nematodes using the double glass mounting method, which showed well-defined and clear differentiation between the different organs of the nematodes. Conclusion: This method is cost effective and fast for mounting small nematodes compared to the classic method. PMID:26811729
A Novel Antibody Humanization Method Based on Epitopes Scanning and Molecular Dynamics Simulation
Zhao, Bin-Bin; Gong, Lu-Lu; Jin, Wen-Jing; Liu, Jing-Jun; Wang, Jing-Fei; Wang, Tian-Tian; Yuan, Xiao-Hui; He, You-Wen
2013-01-01
1-17-2 is a rat anti-human DEC-205 monoclonal antibody that induces internalization and delivers antigen to dendritic cells (DCs). The potential clinical application of this antibody is limited by its murine origin. Traditional humanization methods, such as complementarity determining region (CDR) grafting, often lead to decreased or even lost affinity. Here we have developed a novel antibody humanization method based on computer modeling and bioinformatics analysis. First, we used homology modeling technology to build a precise model of the Fab. A novel epitope scanning algorithm was designed to identify antigenic residues in the framework regions (FRs) that need to be mutated to their human counterparts in the humanization process. Then virtual mutation and molecular dynamics (MD) simulation were used to assess the conformational impact imposed by all the mutations. By comparing the root-mean-square deviations (RMSDs) of CDRs, we found five key residues whose mutations would destroy the original conformation of the CDRs. These residues need to be back-mutated to rescue the antibody binding affinity. Finally we constructed the antibodies in vitro and compared their binding affinity by flow cytometry and surface plasmon resonance (SPR) assay. The binding affinity of the refined humanized antibody was similar to that of the original rat antibody. Our results have established a novel method based on epitope scanning and MD simulation for antibody humanization. PMID:24278299
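The CDR-comparison criterion can be illustrated with a plain RMSD computation between two conformations of the same atoms. The coordinates below are random placeholders, and superposition (e.g., Kabsch alignment) is assumed to have been done beforehand; this is a generic sketch, not the authors' analysis protocol.

import numpy as np

def rmsd(a, b):
    """Root-mean-square deviation between two (N, 3) coordinate sets of the
    same atoms, assumed to be already superposed."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(((a - b) ** 2).sum(axis=1).mean()))

# Placeholder CDR backbone coordinates before/after a virtual mutation and MD run
before = np.random.default_rng(2).random((24, 3)) * 10.0
after = before + np.random.default_rng(3).normal(scale=0.4, size=before.shape)
print(round(rmsd(before, after), 2))   # a large value would flag a disrupted CDR conformation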
A method for inferring regional origins of neurodegeneration.
Torok, Justin; Maia, Pedro D; Powell, Fon; Pandya, Sneha; Raj, Ashish
2018-02-02
Alzheimer's disease, the most common form of dementia, is characterized by the emergence and spread of senile plaques and neurofibrillary tangles, causing widespread neurodegeneration. Though the progression of Alzheimer's disease is considered to be stereotyped, the significant variability within clinical populations obscures this interpretation on the individual level. Of particular clinical importance is understanding where exactly pathology, e.g. tau, emerges in each patient and how the incipient atrophy pattern relates to future spread of disease. Here we demonstrate a newly developed graph theoretical method of inferring prior disease states in patients with Alzheimer's disease and mild cognitive impairment using an established network diffusion model and an L1-penalized optimization algorithm. Although the 'seeds' of origin using our inference method successfully reproduce known trends in Alzheimer's disease staging on a population level, we observed that the high degree of heterogeneity between patients at baseline is also reflected in their seeds. Additionally, the individualized seeds are significantly more predictive of future atrophy than a single seed placed at the hippocampus. Our findings illustrate that understanding where disease originates in individuals is critical to determining how it progresses and that our method allows us to infer early stages of disease from atrophy patterns observed at diagnosis.
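A toy version of seed inference under a network diffusion model can be written as a sparse regression: atrophy at time t is modeled as exp(-beta*L*t) applied to a seed vector, and an L1 penalty selects a few seed regions. Everything below (the random toy connectome, the diffusion constant, the use of scikit-learn's Lasso) is an illustrative assumption, not the authors' algorithm or data.

import numpy as np
from scipy.linalg import expm
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n = 30                                                     # toy number of brain regions
W = rng.random((n, n)); W = (W + W.T) / 2.0; np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W                             # graph Laplacian of the toy connectome
A = expm(-0.1 * L)                                         # network diffusion kernel exp(-beta*L*t)

true_seed = np.zeros(n); true_seed[[3, 17]] = 1.0
atrophy = A @ true_seed + 0.005 * rng.standard_normal(n)   # observed atrophy pattern

# L1-penalized inversion: find a sparse, non-negative seed that reproduces the pattern
model = Lasso(alpha=0.01, positive=True, max_iter=100000).fit(A, atrophy)
top_two = np.argsort(model.coef_)[-2:]
print(sorted(top_two.tolist()))                            # should recover regions 3 and 17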
Cai, Kai; Xiang, Zhangmin; Li, Hongqin; Zhao, Huina; Lin, Yechun; Pan, Wenjie; Lei, Bo
2017-12-01
This work describes a rapid, stable, and accurate method for determining the free amino acids, biogenic amines, and ammonium in tobacco. The target analytes were extracted with microwave-assisted extraction and then derivatized with diethyl ethoxymethylenemalonate, followed by ultra high performance liquid chromatography analysis. The experimental design used to optimize the microwave-assisted extraction conditions showed that the optimal extraction time was 10 min with a temperature of 60°C. The stability of aminoenone derivatives was improved by keeping the pH near 9.0, and there was no obvious degradation during the 80°C heating and room temperature storage. Under optimal conditions, this method showed good linearity (R 2 > 0.999) and sensitivity (limits of detection 0.010-0.081 μg/mL). The extraction recoveries were between 88.4 and 106.5%, while the repeatability and reproducibility ranged from 0.48 to 5.12% and from 1.56 to 6.52%, respectively. The newly developed method was employed to analyze the tobacco from different geographical origins. Principal component analysis showed that four geographical origins of tobacco could be clearly distinguished and that each had their characteristic components. The proposed method also showed great potential for further investigations on nitrogen metabolism in plants.
Crawford, Charles G.; Martin, Jeffrey D.
2017-07-21
In October 2012, the U.S. Geological Survey (USGS) began measuring the concentration of the pesticide fipronil and three of its degradates (desulfinylfipronil, fipronil sulfide, and fipronil sulfone) by a new laboratory method using direct aqueous-injection liquid chromatography tandem mass spectrometry (DAI LC–MS/MS). This method replaced the previous method, in use since 2002, that used gas chromatography/mass spectrometry (GC/MS). The performance of the two methods is not comparable for fipronil and the three degradates. Concentrations of these four chemical compounds determined by the DAI LC–MS/MS method are substantially lower than those determined by the GC/MS method. A method was developed to correct for the difference in concentrations obtained by the two laboratory methods, based on a methods comparison field study done in 2012. For this study, environmental and field matrix spike samples from 48 stream sites across the United States were collected approximately three times each and analyzed by both methods. These data were used to develop a relation between the two laboratory methods for each compound using regression analysis. The relations were used to calibrate data obtained by the older method to the new method in order to remove any biases attributable to differences in the methods. The coefficients of the equations obtained from the regressions were used to calibrate over 16,600 observations of fipronil and the three degradates determined by the GC/MS method, retrieved from the USGS National Water Information System. The calibrated values were then compared to over 7,800 observations of fipronil and the three degradates determined by the DAI LC–MS/MS method, also retrieved from the National Water Information System. The original and calibrated values from the GC/MS method, along with measures of uncertainty in the calibrated values and the original values from the DAI LC–MS/MS method, are provided in an accompanying data release.
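The calibration idea, mapping legacy GC/MS concentrations onto the scale of the new DAI LC–MS/MS method with a regression fitted on paired samples, can be sketched as follows. The log-log linear form and all numbers are assumptions for illustration, not the published regression coefficients or USGS data.

import numpy as np

# Paired concentrations (ng/L) from samples analyzed by both methods (made-up values)
gcms = np.array([12.0, 25.0, 40.0, 80.0, 150.0, 300.0])
lcms = np.array([7.5, 16.0, 27.0, 55.0, 100.0, 210.0])

# Fit a log-log linear relation: log(LC-MS/MS) = b0 + b1 * log(GC/MS)
b1, b0 = np.polyfit(np.log(gcms), np.log(lcms), 1)

def calibrate(gcms_value):
    """Convert a legacy GC/MS fipronil concentration to the DAI LC-MS/MS scale."""
    return float(np.exp(b0 + b1 * np.log(gcms_value)))

print(round(calibrate(60.0), 1))   # a legacy observation expressed on the new method's scale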
Hu, Yue-Qing; Fung, Wing K
2003-08-01
The effect of a structured population on the likelihood ratio of a DNA mixture has been studied by the current authors and others. In practice, contributors to a DNA mixture may belong to different ethnic/racial origins, a situation especially common in multi-racial countries such as the USA and Singapore. We have developed computer software, available on the web, for evaluating DNA mixtures in multi-structured populations. The software can deal with various DNA mixture problems that cannot be handled by the methods given in a recent article by Fung and Hu.
Early Prediction of Intensive Care Unit-Acquired Weakness: A Multicenter External Validation Study.
Witteveen, Esther; Wieske, Luuk; Sommers, Juultje; Spijkstra, Jan-Jaap; de Waard, Monique C; Endeman, Henrik; Rijkenberg, Saskia; de Ruijter, Wouter; Sleeswijk, Mengalvio; Verhamme, Camiel; Schultz, Marcus J; van Schaik, Ivo N; Horn, Janneke
2018-01-01
An early diagnosis of intensive care unit-acquired weakness (ICU-AW) is often not possible due to impaired consciousness. To avoid a diagnostic delay, we previously developed a prediction model, based on single-center data from 212 patients (development cohort), to predict ICU-AW at 2 days after ICU admission. The objective of this study was to investigate the external validity of the original prediction model in a new, multicenter cohort and, if necessary, to update the model. Newly admitted ICU patients who were mechanically ventilated at 48 hours after ICU admission were included. Predictors were prospectively recorded, and the outcome ICU-AW was defined by an average Medical Research Council score <4. In the validation cohort, consisting of 349 patients, we analyzed performance of the original prediction model by assessment of calibration and discrimination. Additionally, we updated the model in this validation cohort. Finally, we evaluated a new prediction model based on all patients of the development and validation cohort. Of 349 analyzed patients in the validation cohort, 190 (54%) developed ICU-AW. Both model calibration and discrimination of the original model were poor in the validation cohort. The area under the receiver operating characteristics curve (AUC-ROC) was 0.60 (95% confidence interval [CI]: 0.54-0.66). Model updating methods improved calibration but not discrimination. The new prediction model, based on all patients of the development and validation cohort (total of 536 patients) had a fair discrimination, AUC-ROC: 0.70 (95% CI: 0.66-0.75). The previously developed prediction model for ICU-AW showed poor performance in a new independent multicenter validation cohort. Model updating methods improved calibration but not discrimination. The newly derived prediction model showed fair discrimination. This indicates that early prediction of ICU-AW is still challenging and needs further attention.
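The two performance aspects examined here, discrimination and calibration, can be computed generically as shown below; the predicted probabilities and outcomes are synthetic placeholders, not data from the development or validation cohorts.

import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 500)                          # 1 = developed ICU-AW (synthetic)
noise = rng.normal(scale=0.35, size=500)
y_prob = np.clip(0.5 * y_true + 0.3 + noise, 0.01, 0.99)  # a model's predicted risks (synthetic)

auc = roc_auc_score(y_true, y_prob)                       # discrimination
obs, pred = calibration_curve(y_true, y_prob, n_bins=5)   # calibration: observed vs predicted risk
print(f"AUC-ROC = {auc:.2f}")
for o, p in zip(obs, pred):
    print(f"predicted {p:.2f}  observed {o:.2f}")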
Human ear detection in the thermal infrared spectrum
NASA Astrophysics Data System (ADS)
Abaza, Ayman; Bourlai, Thirimachos
2012-06-01
In this paper, the problem of human ear detection in the thermal infrared (IR) spectrum is studied in order to illustrate the advantages and limitations of the most important steps of ear-based biometrics that can operate in day and night time environments. The main contributions of this work are two-fold. First, a dual-band database is assembled that consists of visible and thermal profile face images. The thermal data was collected using a high definition middle-wave infrared (3-5 microns) camera that is capable of acquiring thermal imprints of human skin. Second, a fully automated, thermal imaging based ear detection method is developed for real-time segmentation of human ears in either day or night time environments. The proposed method is based on Haar features forming a cascaded AdaBoost classifier (our modified version of the original Viola-Jones approach [1], which was designed to be applied mainly to visible band images). The main advantage of the proposed method, applied on our profile face image data set collected in the thermal band, is that it is designed to reduce the learning time required by the original Viola-Jones method from several weeks to several hours. Unlike other approaches reported in the literature, which have been tested but not designed to operate in the thermal band, our method yields a high detection accuracy that reaches ~91.5%. Further analysis on our data set showed that: (a) photometric normalization techniques do not directly improve ear detection performance; however, when using a certain photometric normalization technique (CLAHE) on falsely detected images, the detection rate improved by ~4%; (b) the high detection accuracy of our method did not degrade when we lowered the original spatial resolution of the thermal ear images. For example, even after using one third of the original spatial resolution (i.e., ~20% of the original computational time) of the thermal profile face images, the high ear detection accuracy of our method remained unaffected. This also sped up the detection time from 265 to 17 milliseconds per image. To the best of our knowledge, this is the first time that the problem of human ear detection in the thermal band has been investigated in the open literature.
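In practice, a trained cascade of Haar features is applied as a sliding, multi-scale detector. The sketch below uses OpenCV's generic cascade interface; the cascade file name, the input image name, and the detection parameters are hypothetical placeholders, not the authors' trained model or settings.

import cv2

# "thermal_ear_cascade.xml" is a hypothetical file standing in for a cascade trained on
# thermal profile-face images; the image path and parameters are likewise illustrative.
cascade = cv2.CascadeClassifier("thermal_ear_cascade.xml")
image = cv2.imread("profile_face_thermal.png", cv2.IMREAD_GRAYSCALE)

ears = cascade.detectMultiScale(image, scaleFactor=1.1, minNeighbors=4, minSize=(24, 24))
for (x, y, w, h) in ears:
    cv2.rectangle(image, (x, y), (x + w, y + h), 255, 2)   # mark each ear candidate
cv2.imwrite("detected_ears.png", image)
print(f"{len(ears)} ear candidate(s) detected")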
Zhou, Y.; Ren, Y.; Tang, D.; Bohor, B.
1994-01-01
Kaolinitic tonsteins of altered synsedimentary volcanic ash-fall origin are well developed in the Late Permian coal-bearing formations of eastern Yunnan Province. Because of their unique origin, wide lateral extent, relatively constant thickness and sharp contacts with enclosing strata, great importance has been attached to these isochronous petrographic markers. In order to compare tonsteins with co-existing, non-cineritic claystones and characterize the individuality of tonsteins from different horizons for coal bed correlation, a semi-quantitative method was developed that is based on statistical analyses of the concentration and morphology of zircons and their spatial distribution patterns. This zircon-based analytical method also serves as a means for reconstructing volcanic ash-fall dispersal patterns. The results demonstrate that zircons from claystones of two different origins (i.e., tonstein and non-cineritic claystone) differ greatly in their relative abundances, crystal morphologies and spatial distribution patterns. Tonsteins from the same area but from different horizons are characterized by their own unique statistical patterns in terms of zircon concentration values and morphologic parameters (crystal length, width and the ratio of these values), thus facilitating stratigraphic correlation. Zircons from the same tonstein horizon also show continuous variation in these statistical patterns as a function of areal distribution, making it possible to identify the main path and direction in which the volcanic source materials were transported by prevailing winds.
Effective evaluation of privacy protection techniques in visible and thermal imagery
NASA Astrophysics Data System (ADS)
Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael
2017-09-01
Privacy protection may be defined as replacing the original content in an image region with a (less intrusive) content having modified target appearance information to make it less recognizable by applying a privacy protection technique. Indeed, the development of privacy protection techniques also needs to be complemented with an established objective evaluation method to facilitate their assessment and comparison. Generally, existing evaluation methods rely on the use of subjective judgments or assume a specific target type in image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between original and privacy-protected image regions. We performed an extensive experimentation using six challenging datasets (having 12 video sequences), including a new dataset (having six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate effectiveness of the proposed method by evaluating six image-based privacy protection techniques and also show comparisons of the proposed method over existing methods.
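One way to compute the two scores on a protected image region is sketched below: utility as the structural similarity between the original and protected regions, and protection as one minus a simple appearance similarity. The histogram correlation used here is an assumed proxy for the appearance measure, and the synthetic region and pixel-averaging "protection" are placeholders, not the paper's exact formulation.

import numpy as np
from skimage.metrics import structural_similarity

def evaluate_region(original, protected):
    """original, protected: 2D grayscale arrays (same shape) covering the target region."""
    # Utility: structural similarity between the original and privacy-protected region
    utility = structural_similarity(original, protected,
                                    data_range=original.max() - original.min())
    # Protection: 1 - appearance similarity, with histogram correlation as an assumed proxy
    h1, _ = np.histogram(original, bins=32, range=(0, 255), density=True)
    h2, _ = np.histogram(protected, bins=32, range=(0, 255), density=True)
    appearance = float(np.corrcoef(h1, h2)[0, 1])
    return utility, 1.0 - appearance

rng = np.random.default_rng(0)
region = (rng.random((64, 64)) * 255).astype(float)
blurred = region.reshape(32, 2, 32, 2).mean(axis=(1, 3)).repeat(2, 0).repeat(2, 1)  # crude "protection"
print(evaluate_region(region, blurred))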
Methods for the Study of Gonadal Development.
Piprek, Rafal P
2016-01-01
Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development.
Origin and initiation mechanisms of neuroblastoma.
Tsubota, Shoma; Kadomatsu, Kenji
2018-05-01
Neuroblastoma is an embryonal malignancy that affects normal development of the adrenal medulla and paravertebral sympathetic ganglia in early childhood. Extensive studies have revealed the molecular characteristics of human neuroblastomas, including abnormalities at genome, epigenome and transcriptome levels. However, neuroblastoma initiation mechanisms and even its origin are long-standing mysteries. In this review article, we summarize the current knowledge about normal development of putative neuroblastoma sources, namely sympathoadrenal lineage of neural crest cells and Schwann cell precursors that were recently identified as the source of adrenal chromaffin cells. A plausible origin of enigmatic stage 4S neuroblastoma is also discussed. With regard to the initiation mechanisms, we review genetic abnormalities in neuroblastomas and their possible association to initiation mechanisms. We also summarize evidences of neuroblastoma initiation observed in genetically engineered animal models, in which epigenetic alterations were involved, including transcriptomic upregulation by N-Myc and downregulation by polycomb repressive complex 2. Finally, several in vitro experimental methods are proposed that hopefully will accelerate our comprehension of neuroblastoma initiation. Thus, this review summarizes the state-of-the-art knowledge about the mechanisms of neuroblastoma initiation, which is critical for developing new strategies to cure children with neuroblastoma.
Determination of 13C/12C Isotope Ratio in Alcohols of Different Origin by 1H Nuclei NMR-Spectroscopy
NASA Astrophysics Data System (ADS)
Dzhimak, S. S.; Basov, A. A.; Buzko, V. Yu.; Kopytov, G. F.; Kashaev, D. V.; Shashkov, D. I.; Shlapakov, M. S.; Baryshev, M. G.
2017-02-01
A new express method for the indirect assessment of the 13C/12C isotope ratio via 1H nuclei is developed to verify the authenticity of ethanol origin in alcohol-water-based fluids and to detect the falsification of various alcoholic beverages. It is established that, in water-based alcohol-containing systems, the side satellites of the ethanol methyl and methylene proton signals in the 1H NMR spectra correspond to protons bonded to 13C nuclei. There is a direct correlation between the intensities of the 1H NMR signals of the ethanol methyl and methylene protons and their side satellites; therefore, the data obtained can be used to assess the 13C/12C isotope ratio in water-based alcohol-containing systems. The analysis of the integrated intensities of the main and satellite signals of the methyl and methylene protons of ethanol obtained by 1H NMR makes it possible to differentiate between ethanol of synthetic and natural origin. Furthermore, the method proposed made it possible to differentiate between wheat and corn ethanol.
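The underlying arithmetic is simple: the fraction of carbon present as 13C at a given site is estimated from the integrated satellite intensity relative to the total (satellite plus main) intensity of that proton signal; deviations from the natural abundance of roughly 1.1% are what distinguish ethanol sources. A minimal sketch with made-up integrals follows.

def carbon13_fraction(main_integral, satellite_integral_total):
    """Estimate the 13C fraction at one carbon site from 1H NMR integrals:
    satellite_integral_total is the summed area of both 13C satellites,
    main_integral is the area of the central (12C-bound) signal."""
    return satellite_integral_total / (main_integral + satellite_integral_total)

# Made-up integrals for the methyl (CH3) and methylene (CH2) proton signals
for site, main, sats in [("CH3", 100.0, 1.12), ("CH2", 100.0, 1.05)]:
    frac = carbon13_fraction(main, sats)
    print(f"{site}: 13C fraction = {frac * 100:.2f} %  (natural abundance is about 1.1 %)")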
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitten, W.B.
This report covers the three main projects that collectively comprised the Advanced Ion Trap Mass Spectrometry Program. Chapter 1 describes the direct interrogation of individual particles by laser desorption within the ion trap mass spectrometer analyzer. The goals were (1) to develop an "intelligent trigger" capable of distinguishing particles of biological origin from those of nonbiological origin in the background and interferent particles and (2) to explore the capability for individual particle identification. Direct interrogation of particles by laser ablation and ion trap mass spectrometry was shown to have good promise for discriminating between particles of biological origin and those of nonbiological origin, although detailed protocols and operating conditions were not worked out. A library of more than 20,000 spectra of various types of biological particles has been assembled. Methods based on multivariate analysis and on neural networks were used to discriminate between particles of biological origin and those of nonbiological origin. It was possible to discriminate between at least some species of bacteria if mass spectra of several hundred similar particles were obtained. Chapter 2 addresses the development of a new ion trap mass analyzer geometry that offers the potential for a significant increase in ion storage capacity for a given set of analyzer operating conditions. This geometry may lead to the development of smaller, lower-power field-portable ion trap mass spectrometers while retaining laboratory-scale analytical performance. A novel ion trap mass spectrometer based on toroidal ion storage geometry has been developed. The analyzer geometry is based on the edge rotation of a quadrupolar ion trap cross section into the shape of a torus. Initial performance of this device was poor, however, due to the significant contribution of nonlinear fields introduced by the rotation of the symmetric ion-trapping geometry. These nonlinear resonances contributed to poor mass resolution and sensitivity and to erratic ion ejection behavior. To correct for these nonlinear effects, the geometry of the toroid ion trap analyzer has been modified to create an asymmetric torus, as first suggested by computer simulations that predicted significantly improved performance and unit mass resolution for this geometry. A reduced-sized version (one-fifth scale) has been fabricated but was not tested within the scope of this project. Chapter 3 describes groundbreaking progress toward the use of ion-ion chemistry to control the charge state of ions formed by the electrospray ionization process, which in turn enables precision analysis of whole proteins. In addition, this technique may offer the unique possibility of a priori identification of unknown biological material when employed with existing proteomics and genomic databases. Ion-ion chemistry within the ion trap was used to reduce the ions in highly charged states to states of +1 and +2 charges. Reduction in charge greatly simplifies identification of molecular weights of fragments from large biological molecules. This technique enables the analysis of whole proteins as biomarkers for the detection and identification of all three classes of biological weapons (bacteria, toxins, and viruses). In addition to methods development, tests were carried out with samples of tap water, local creek water, and soil (local red clay) spiked with melittin (bee venom), cholera toxin, and virus MS2. All three analytes were identified in tap water and soil; however, all three were problematic for detection in creek water at concentrations of 1 nM. More development of methods is needed.
NASA Technical Reports Server (NTRS)
Adams, Gaynor J.; Dugan, Duane W.
1952-01-01
A method of analysis based on slender-wing theory is developed to investigate the characteristics in roll of slender cruciform wings and wing-body combinations. The method makes use of the conformal mapping processes of classical hydrodynamics which transform the region outside a circle into the region outside an arbitrary arrangement of line segments intersecting at the origin. The method of analysis may be utilized to solve other slender cruciform wing-body problems involving arbitrarily assigned boundary conditions. (author)
Roh, Min K; Gillespie, Dan T; Petzold, Linda R
2010-11-07
The weighted stochastic simulation algorithm (wSSA) was developed by Kuwahara and Mura [J. Chem. Phys. 129, 165101 (2008)] to efficiently estimate the probabilities of rare events in discrete stochastic systems. The wSSA uses importance sampling to enhance the statistical accuracy in the estimation of the probability of the rare event. The original algorithm biases the reaction selection step with a fixed importance sampling parameter. In this paper, we introduce a novel method where the biasing parameter is state-dependent. The new method features improved accuracy, efficiency, and robustness.
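The following is a minimal Python sketch of a weighted-SSA step with a state-dependent bias, intended only to illustrate the idea; the birth-death network, the bias function, and all parameter values are hypothetical, and the code is not the published algorithm.

import math
import random

def wssa_step(x, weight, propensities, biases, stoichiometry):
    a = propensities(x)
    gamma = biases(x)                       # state-dependent biasing parameters
    b = [g * aj for g, aj in zip(gamma, a)]
    a0, b0 = sum(a), sum(b)
    if a0 == 0.0 or b0 == 0.0:
        return x, 0.0, weight, False
    tau = -math.log(1.0 - random.random()) / a0   # time advance uses the unbiased total propensity
    r, acc, j = random.random() * b0, 0.0, 0
    for j, bj in enumerate(b):              # reaction selection uses the biased propensities
        acc += bj
        if acc >= r:
            break
    weight *= (a[j] / a0) / (b[j] / b0)     # likelihood-ratio correction keeps the estimator unbiased
    return x + stoichiometry[j], tau, weight, True

# Hypothetical birth-death example: estimate P(X reaches 30 before t = 10), boosting the birth reaction
propensities = lambda x: [1.0, 0.025 * x]                   # [birth, death]
biases = lambda x: [1.0 + 0.02 * max(0, 30 - x), 1.0]       # bias grows while X is below the target
stoichiometry = [+1, -1]
estimate, n_runs = 0.0, 2000
for _ in range(n_runs):
    x, t, w = 10, 0.0, 1.0
    while t < 10.0 and x < 30:
        x, dt, w, fired = wssa_step(x, w, propensities, biases, stoichiometry)
        if not fired:
            break
        t += dt
    if x >= 30:
        estimate += w
print("rare-event probability estimate:", estimate / n_runs)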
[Isolation and identification methods of enterobacteria group and its technological advancement].
Furuta, Itaru
2007-08-01
In the last half-century, isolation and identification methods for enterobacteria groups have markedly improved with technological advancement. Clinical microbiology testing has changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original approach to identifying enterobacteria groups and remain essential for recognizing bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as the utilization of carbohydrates and the indole, methyl red, citrate, and urease tests. Commercial identification kits and automated instruments with computer-based analysis are also discussed as current methods, which provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing methods using PCR analysis and immunochemical methods using monoclonal antibodies, can be developed further.
Face recognition based on symmetrical virtual image and original training image
NASA Astrophysics Data System (ADS)
Ke, Jingcheng; Peng, Yali; Liu, Shigang; Li, Jun; Pei, Zhao
2018-02-01
In face representation-based classification methods, we are able to obtain a high recognition rate if a face has enough available training samples. However, in practical applications, we only have limited training samples to use. In order to obtain enough training samples, many methods simultaneously use the original training samples and corresponding virtual samples to strengthen the ability to represent the test sample. One approach directly uses the original training samples and the corresponding mirror samples to recognize the test sample. However, when the test sample is nearly symmetrical while the original training samples are not, the integration of the original training and mirror samples might not represent the test sample well. To tackle the above-mentioned problem, in this paper we propose a novel method to obtain a kind of virtual sample generated by averaging the original training samples and the corresponding mirror samples. Then, the original training samples and the virtual samples are integrated to recognize the test sample. Experimental results on five face databases show that the proposed method is able to partly overcome the challenges of the various poses, facial expressions and illuminations of the original face images.
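A minimal NumPy sketch of the virtual-sample construction described above (assumed array shapes and stand-in data, not the authors' code): each image is averaged with its horizontal mirror and the result is appended to the training set.

import numpy as np

def augment_with_symmetric_virtual_samples(X):
    """X: array of shape (n_samples, height, width) holding grayscale face images."""
    mirrored = X[:, :, ::-1]                  # horizontal mirror of every image
    virtual = 0.5 * (X + mirrored)            # virtual sample = average of original and mirror
    return np.concatenate([X, virtual], axis=0)

# Hypothetical usage with random stand-in data
rng = np.random.default_rng(0)
train = rng.random((40, 32, 32))              # 40 images of 32x32 pixels
enlarged = augment_with_symmetric_virtual_samples(train)
print(enlarged.shape)                         # (80, 32, 32)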
Introducing Systems Approaches
NASA Astrophysics Data System (ADS)
Reynolds, Martin; Holwell, Sue
Systems Approaches to Managing Change brings together five systems approaches to managing complex issues, each having a proven track record of over 25 years. The five approaches are: System Dynamics (SD), developed originally in the late 1950s by Jay Forrester; Viable Systems Model (VSM), developed originally in the late 1960s by Stafford Beer; Strategic Options Development and Analysis (SODA, with cognitive mapping), developed originally in the 1970s by Colin Eden; Soft Systems Methodology (SSM), developed originally in the 1970s by Peter Checkland; and Critical Systems Heuristics (CSH), developed originally in the late 1970s by Werner Ulrich.
High-speed bioimaging with frequency-division-multiplexed fluorescence confocal microscopy
NASA Astrophysics Data System (ADS)
Mikami, Hideharu; Harmon, Jeffrey; Ozeki, Yasuyuki; Goda, Keisuke
2017-04-01
We present methods of fluorescence confocal microscopy that enable an unprecedentedly high frame rate of >10,000 fps. The methods are based on a frequency-division multiplexing technique, which was originally developed in the field of communication engineering. Specifically, we achieved a broad bandwidth (~400 MHz) of detection signals using a dual-AOD method and overcame limitations in frame rate due to the scanning device by using a multi-line focusing method, resulting in a significant increase in frame rate. The methods have potential biomedical applications such as observation of sub-millisecond dynamics in biological tissues, in-vivo three-dimensional imaging, and fluorescence imaging flow cytometry.
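To illustrate the frequency-division-multiplexing principle only (not the authors' instrument), the following Python sketch demodulates a single detector trace into per-channel amplitudes by mixing with each carrier and averaging; the sampling rate, carrier frequencies, and amplitudes are hypothetical.

import numpy as np

fs = 100e6                                   # detector sampling rate (hypothetical), Hz
t = np.arange(0, 20e-6, 1 / fs)              # 20 microseconds of data
carriers = np.array([5e6, 7e6, 9e6])         # carrier frequencies of three channels, Hz
true_amplitudes = np.array([0.2, 1.0, 0.6])  # fluorescence strength of each channel

# Synthesize a detector trace: sum of modulated channels plus noise
trace = (true_amplitudes[:, None] * (1 + np.cos(2 * np.pi * carriers[:, None] * t))).sum(axis=0)
trace += 0.05 * np.random.default_rng(1).normal(size=t.size)

# Demodulate: the mean of trace*cos(2*pi*f*t) approaches amplitude/2 for each carrier
recovered = [2 * np.mean(trace * np.cos(2 * np.pi * f * t)) for f in carriers]
print(np.round(recovered, 3))                # close to true_amplitudes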
Sugar composition of French royal jelly for comparison with commercial and artificial sugar samples.
Daniele, Gaëlle; Casabianca, Hervé
2012-09-15
A gas chromatographic method was developed to quantify the major and minor sugars of 400 Royal Jellies (RJs). Their contents were compared in relation to geographical origin and different production methods. A reliable database was established from the analysis of 290 RJs harvested in different French areas that took into account the diversity of geographical origin, harvesting season, and forage sources available in the environment corresponding to the natural food of the bees: pollen and nectar. Around 30 RJ samples produced by Italian beekeepers, about sixty from the French market, and around thirty derived from feeding experiments were analysed and compared with our database. Fructose and glucose contents are in the range 2.3-7.8% and 3.4-7.7%, respectively, whatever the RJ's origin. On the contrary, differences in minor sugar composition are observed. Indeed, sucrose and erlose contents in French RJs are less than 1.7% and 0.3%, respectively, whereas they reach 3.9% and 2.0% in some commercial samples and 5.1% and 1.7% in RJs produced from feeding experiments. This study could be used to discriminate different production methods and provide an additional tool for identifying unknown commercial RJs. Copyright © 2012 Elsevier Ltd. All rights reserved.
Fekrazad, Reza; Naghdi, Nafiseh; Nokhbatolfoghahaei, Hanieh; Bagheri, Hossein
2016-01-01
Several methods have been employed for cancer treatment, including surgery, chemotherapy and radiation therapy. Today, recent advances in medical science and the development of new technologies have led to the introduction of new methods such as hormone therapy, photodynamic therapy (PDT), treatments using nanoparticles and, eventually, combinations of lasers and nanoparticles. The unique features of lasers, such as their photo-thermal properties, and the particular characteristics of nanoparticles, given their extremely small size, may provide an interesting combined therapeutic effect. The purpose of this study was to review the simultaneous application of lasers and metal nanoparticles for the treatment of cancers with epithelial origin. A comprehensive search of electronic sources including PubMed, Google Scholar and Science Direct was carried out for the period 2000 to 2013. Among the initial 400 articles, 250 applied nanoparticles and lasers in combination, of which more than 50 covered the treatment of cancer with epithelial origin. In the future, the combination of laser and nanoparticles may be used as a new or alternative method for cancer therapy or diagnosis. Obviously, to exclude the effects of the laser's wavelength and the nanoparticles' properties, more animal studies and clinical trials are required, given the current lack of definitive studies. PMID:27330701
Lecrenier, M. C.; Ledoux, Q.; Berben, G.; Fumière, O.; Saegerman, C.; Baeten, V.; Veys, P.
2014-01-01
Molecular biology techniques such as PCR constitute powerful tools for the determination of the taxonomic origin of bones. DNA degradation and contamination by exogenous DNA, however, jeopardise bone identification. Despite the vast array of techniques used to decontaminate bone fragments, the isolation and determination of bone DNA content are still problematic. Within the framework of the eradication of transmissible spongiform encephalopathies (including BSE, commonly known as “mad cow disease”), a fluorescence in situ hybridization (FISH) protocol was developed. Results from the described study showed that this method can be applied directly to bones without a demineralisation step and that it allows the identification of bovine and ruminant bones even after severe processing. The results also showed that the method is independent of exogenous contamination and that it is therefore entirely appropriate for this application. PMID:25034259
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko; Ikegaya, Hiroshi; Ozawa, Takeaki
2017-09-19
Often in criminal investigations, discrimination of the types of body fluid evidence is crucially important to ascertain how a crime was committed. Compared to current methods using biochemical techniques, vibrational spectroscopic approaches can provide versatile applicability to identify various body fluid types without sample invasion. However, their applicability is limited to pure body fluid samples, because important signals from body fluids incorporated in a substrate are strongly affected by interference from substrate signals. Herein, we describe a novel approach to recover body fluid signals that are embedded in strong substrate interferences using attenuated total reflection Fourier transform infrared (ATR FT-IR) spectroscopy and an innovative multivariate spectral processing method. This technique supported the detection of covert features of body fluid signals and then identified the origins of body fluid stains on substrates. We discriminated between ATR FT-IR spectra of postmortem blood (PB) and those of antemortem blood (AB) by creating a multivariate statistics model. From ATR FT-IR spectra of PB and AB stains on interfering substrates (polyester, cotton, and denim), blood-originated signals were extracted by a weighted linear regression approach that we developed, using principal components of both blood and substrate spectra. The blood-originated signals were finally classified by the discriminant model, demonstrating high discriminant accuracy. The present method can identify body fluid evidence independently of the substrate type, which is expected to promote the application of vibrational spectroscopic techniques in forensic body fluid analysis.
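A simplified Python sketch of substrate-signal removal in the same spirit (not the authors' weighted-regression algorithm): the stain spectrum is regressed onto the mean and principal components of substrate-only spectra, and the residual is kept as the fluid-related signal. All spectra below are synthetic stand-ins.

import numpy as np

def extract_fluid_signal(stain_spectrum, substrate_spectra, n_components=3):
    """stain_spectrum: (n_points,); substrate_spectra: (n_substrates, n_points)."""
    mean_sub = substrate_spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(substrate_spectra - mean_sub, full_matrices=False)
    basis = np.vstack([mean_sub, vt[:n_components]])           # substrate model: mean + principal components
    coeffs, *_ = np.linalg.lstsq(basis.T, stain_spectrum, rcond=None)
    return stain_spectrum - basis.T @ coeffs                    # residual carries the body-fluid signal

# Hypothetical synthetic data: smooth substrate spectra plus a Gaussian "blood" band at channel 250
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)
patterns = np.vstack([np.sin(3 * np.pi * x), np.cos(5 * np.pi * x), x])
substrates = rng.random((20, 3)) @ patterns + 0.01 * rng.normal(size=(20, 500))
blood_band = np.exp(-((np.arange(500) - 250) / 20.0) ** 2)
stain = substrates[0] + 0.3 * blood_band
print(np.argmax(extract_fluid_signal(stain, substrates)))      # peaks near channel 250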
Using DNA to track the origin of the largest ivory seizure since the 1989 trade ban.
Wasser, Samuel K; Mailand, Celia; Booth, Rebecca; Mutayoba, Benezeth; Kisamo, Emily; Clark, Bill; Stephens, Matthew
2007-03-06
The illegal ivory trade recently intensified to the highest levels ever reported. Policing this trafficking has been hampered by the inability to reliably determine geographic origin of contraband ivory. Ivory can be smuggled across multiple international borders and along numerous trade routes, making poaching hotspots and potential trade routes difficult to identify. This fluidity also makes it difficult to refute a country's denial of poaching problems. We extend an innovative DNA assignment method to determine the geographic origin(s) of large elephant ivory seizures. A Voronoi tessellation method is used that utilizes genetic similarities across tusks to simultaneously infer the origin of multiple samples that could have one or more common origin(s). We show that this joint analysis performs better than sample-by-sample methods in assigning sample clusters of known origin. The joint method is then used to infer the geographic origin of the largest ivory seizure since the 1989 ivory trade ban. Wildlife authorities initially suspected that this ivory came from multiple locations across forest and savanna Africa. However, we show that the ivory was entirely from savanna elephants, most probably originating from a narrow east-to-west band of southern Africa, centered on Zambia. These findings enabled law enforcement to focus their investigation to a smaller area and fewer trade routes and led to changes within the Zambian government to improve antipoaching efforts. Such outcomes demonstrate the potential of genetic analyses to help combat the expanding wildlife trade by identifying origin(s) of large seizures of contraband ivory. Broader applications to wildlife trade are discussed.
ERIC Educational Resources Information Center
Walker, Martyn A.
2015-01-01
This paper traces the origins and development of coal mining education and training in Britain from 1900 to the 1970s, by which time the coal industry had substantially declined. It looks at the progress from working-class self-help to national policy in support of education and training. The research makes use of college prospectuses and…
The Sanctuary Model of Trauma-Informed Organizational Change
ERIC Educational Resources Information Center
Bloom, Sandra L.; Sreedhar, Sarah Yanosy
2008-01-01
This article features the Sanctuary Model[R], a trauma-informed method for creating or changing an organizational culture. Although the model is based on trauma theory, its tenets have application in working with children and adults across a wide diagnostic spectrum. Originally developed in a short-term, acute inpatient psychiatric setting for…
Educational Methods for Deaf-Blind and Severely Handicapped Students, Volume I.
ERIC Educational Resources Information Center
Peak, Orel; And Others
The 17 papers were originally presented at a 1977 series of workshops for personnel serving the deaf-blind and severely handicapped and are organized into the following workshop topics: programing and program development, the senses, cognition, communication, and behavior management. The papers have the following titles and authors: "Education for…
User Selection of Purchased Information Services. Interim Technical Report (June 1975-January 1976).
ERIC Educational Resources Information Center
Hall, Homer J.
Interviews conducted in the first phase of a project to develop a method for user selection of purchased scientific and technical information services identified a number of relationships among different populations of users. Research scientists, engineers, and patent attorneys want convenient access to original data identified in the search.…
40 CFR 52.875 - Original identification of plan section.
Code of Federal Regulations, 2010 CFR
2010-07-01
... recommendations concerning the designation of Air Quality Maintenance Areas was submitted by letter from the State... Confidential Information, 2A-13 Registration and Permit System; Exemptions, 2A-14 Review of New or Altered... efficiency (TE) and vapor processing systems. Test methods which are developed by the state must be approved...
Using the "Rural Atelier" as an Educational Method in Landscape Studies
ERIC Educational Resources Information Center
Meijles, Erik; Van Hoven, Bettina
2010-01-01
Drawing on experiences from a project conducted in the "Drentsche Aa" area in the Netherlands, this article discusses the concept of the "rural atelier" as a form of problem-based learning. The rural atelier principle was used originally in rural development planning and described as such by Foorthuis (2005) and Elerie and…
ERIC Educational Resources Information Center
Bogo, Marion; Regehr, Cheryl; Logie, Carmen; Katz, Ellen; Mylopoulos, Maria; Regehr, Glenn
2011-01-01
The development of standardized, valid, and reliable methods for assessment of students' practice competence continues to be a challenge for social work educators. In this study, the Objective Structured Clinical Examination (OSCE), originally used in medicine to assess performance through simulated interviews, was adapted for social work to…
ERIC Educational Resources Information Center
Ivey, Allen E.; Daniels, Thomas
2016-01-01
Originating in 1966-68, Microcounseling was the first video-based listening program and its purposes and methods overlap with the communications field. This article presents history, research, and present applications in multiple fields nationally and internationally. Recent neuroscience and neurobiology research is presented with the important…
O'Connor, A M; Sargeant, J M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P
2016-11-01
The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement was first published in 2007 and again in 2014. The purpose of the original STROBE was to provide guidance for authors, reviewers, and editors to improve the comprehensiveness of reporting; however, STROBE has a unique focus on observational studies. Although much of the guidance provided by the original STROBE document is directly applicable, it was deemed useful to map those statements to veterinary concepts, provide veterinary examples, and highlight unique aspects of reporting in veterinary observational studies. Here, we present the examples and explanations for the checklist items included in the STROBE-Vet statement. Thus, this is a companion document to the STROBE-Vet statement methods and process document (JVIM_14575 "Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement" undergoing proofing), which describes the checklist and how it was developed. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
NASA Astrophysics Data System (ADS)
Bai, Rui; Tiejian, Li; Huang, Yuefei; Jiaye, Li; Wang, Guangqian; Yin, Dongqin
2015-12-01
The increasing resolution of Digital Elevation Models (DEMs) and the development of drainage network extraction algorithms make it possible to develop high-resolution drainage networks for large river basins. These vector networks contain massive numbers of river reaches with associated geographical features, including topological connections and topographical parameters. These features create challenges for efficient map display and data management. Of particular interest are the requirements of data management for multi-scale hydrological simulations using multi-resolution river networks. In this paper, a hierarchical pyramid method is proposed, which generates coarsened vector drainage networks from the originals iteratively. The method is based on the Horton-Strahler's (H-S) order schema. At each coarsening step, the river reaches with the lowest H-S order are pruned, and their related sub-basins are merged. At the same time, the topological connections and topographical parameters of each coarsened drainage network are inherited from the former level using formulas that are presented in this study. The method was applied to the original drainage networks of a watershed in the Huangfuchuan River basin extracted from a 1-m-resolution airborne LiDAR DEM and applied to the full Yangtze River basin in China, which was extracted from a 30-m-resolution ASTER GDEM. In addition, a map-display and parameter-query web service was published for the Mississippi River basin, and its data were extracted from the 30-m-resolution ASTER GDEM. The results presented in this study indicate that the developed method can effectively manage and display massive amounts of drainage network data and can facilitate multi-scale hydrological simulations.
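A minimal Python sketch of one coarsening step in this spirit (hypothetical data structures, not the authors' implementation): reaches of Horton-Strahler order 1 are pruned and their sub-basin areas are merged into the downstream reach; recomputation of the remaining orders is omitted for brevity.

def coarsen_once(reaches):
    """reaches: dict reach_id -> {'order': int, 'downstream': reach_id or None, 'area': float}."""
    kept = {rid: r.copy() for rid, r in reaches.items() if r['order'] > 1}
    for rid, r in reaches.items():
        if r['order'] == 1 and r['downstream'] in kept:
            kept[r['downstream']]['area'] += r['area']     # merge the pruned sub-basin downstream
    return kept

# Hypothetical 5-reach network: three order-1 headwaters feeding an order-2 main stem
network = {
    'a': {'order': 1, 'downstream': 'c', 'area': 2.0},
    'b': {'order': 1, 'downstream': 'c', 'area': 3.0},
    'c': {'order': 2, 'downstream': 'e', 'area': 1.0},
    'd': {'order': 1, 'downstream': 'e', 'area': 4.0},
    'e': {'order': 2, 'downstream': None, 'area': 1.5},
}
print(coarsen_once(network))   # only 'c' and 'e' remain, with the merged areas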
Takahashi-Omoe, Hiromi; Omoe, Katsuhiko; Okabe, Nobuhiko
2009-10-06
A quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has also been applied to infectious disease research; however, previous studies were insufficient in that they underestimated articles published in non-English or regional journals. Using a combination of Scopus and PubMed, the databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate for surveying a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded (SCI Infectious Disease Category) during 1998-2006. Subsequently, we applied the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. These journals published 14,156 original articles and reviews of Asian origin and 118,158 worldwide, more than those registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 worldwide). In the Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the region, and no noticeable increase in articles was revealed during the study period. China, India and Taiwan had relatively large numbers and a high increase rate of original articles among Asian countries. When the publication of original articles was adjusted for country population and gross domestic product (GDP), Singapore and Taiwan were the most productive. A survey of the 100 selected journals is more sensitive than the SCI Infectious Disease Category from the viewpoint of avoiding underestimating the number of infectious disease research articles of Asian origin. The survey method is applicable to grasping global trends in disease research, although it may require further development.
NASA Technical Reports Server (NTRS)
Klein, Vladislav
2002-01-01
The program objectives were defined in the original proposal entitled 'Program of Research in Flight Dynamics in the JIAFS at NASA Langley Research Center,' which originated on March 20, 1975, and in the yearly renewals of the research program dated December 1, 1998 to December 31, 2002. The program included three major topics: 1) Improvement of existing methods and development of new methods for flight and wind tunnel data analysis based on system identification methodology; 2) Application of these methods to flight and wind tunnel data obtained from advanced aircraft; 3) Modeling and control of aircraft. The principal investigator of the program was Dr. Vladislav Klein, Professor Emeritus at The George Washington University, DC. Seven Graduate Research Scholar Assistants (GRSA) participated in the program. The results of the research conducted during the four years of the total co-operative period were published in 2 NASA Technical Reports, 3 theses and 3 papers. The list of these publications is included.
Study and application of acoustic emission testing in fault diagnosis of low-speed heavy-duty gears.
Gao, Lixin; Zai, Fenlou; Su, Shanbin; Wang, Huaqing; Chen, Peng; Liu, Limei
2011-01-01
Most present studies on the acoustic emission signals of rotating machinery are experiment-oriented, while few of them involve on-spot applications. In this study, a method of redundant second generation wavelet transform based on the principle of interpolated subdivision was developed. With this method, subdivision was not needed during the decomposition. The lengths of approximation signals and detail signals were the same as those of original ones, so the data volume was twice that of original signals; besides, the data redundancy characteristic also guaranteed the excellent analysis effect of the method. The analysis of the acoustic emission data from the faults of on-spot low-speed heavy-duty gears validated the redundant second generation wavelet transform in the processing and denoising of acoustic emission signals. Furthermore, the analysis illustrated that the acoustic emission testing could be used in the fault diagnosis of on-spot low-speed heavy-duty gears and could be a significant supplement to vibration testing diagnosis.
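A simplified Python sketch of a redundant, interpolation-based lifting step (not the authors' exact scheme): each sample is predicted by linear interpolation of its neighbours, the prediction error forms the detail signal, and a smoothing update forms the approximation, with no subsampling so both outputs keep the original length. The test signal is hypothetical.

import numpy as np

def redundant_lifting_level(x, step=1):
    xp = np.pad(x, step, mode='edge')
    predicted = 0.5 * (xp[:-2 * step] + xp[2 * step:])        # interpolate each sample from its neighbours
    detail = x - predicted                                     # high-frequency content (prediction error)
    dp = np.pad(detail, step, mode='edge')
    approx = x + 0.25 * (dp[:-2 * step] + dp[2 * step:])       # smoothed approximation (update step)
    return approx, detail

# Hypothetical AE-like test signal: slow trend plus a short burst
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 3 * t) + (np.abs(t - 0.5) < 0.01) * 0.8
approx, detail = redundant_lifting_level(signal)
print(signal.shape, approx.shape, detail.shape)                # all lengths equal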
All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis.
Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L; Terés, Lluís; Baumann, Reinhard R
2016-09-21
We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates in ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices, such as inkjet printing, suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower compared to established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows the optimization of the manufacturing process, finally resulting in a yield improvement.
He, Xin; Wang, Geng Nan; Yang, Kun; Liu, Hui Zhi; Wu, Xia Jun; Wang, Jian Ping
2017-04-15
In this study, a magnetic graphene-based dispersive solid phase extraction method was developed and combined with high performance liquid chromatography to determine the residues of fluoroquinolone drugs in foods of animal origin. During the experiments, several parameters possibly influencing the extraction performance were optimized (amount of magnetic graphene, sample pH, extraction time and elution solution). This extraction method showed high adsorption capacities (>6800 ng) and high enrichment factors (68-79-fold) for seven fluoroquinolones. Furthermore, the adsorbent could be reused at least 40 times. The limits of detection were in the range of 0.05-0.3 ng/g, and the recoveries from standard-fortified blank samples (bovine milk, chicken muscle and egg) were in the range of 82.4-108.5%. Therefore, this method could be used as a simple and sensitive tool to determine the residues of fluoroquinolones in foods of animal origin. Copyright © 2016 Elsevier Ltd. All rights reserved.
A Method for Exploiting Redundancy to Accommodate Actuator Limits in Multivariable Systems
NASA Technical Reports Server (NTRS)
Litt, Jonathan; Roulette, Greg
1995-01-01
This paper introduces a new method for accommodating actuator saturation in a multivariable system with actuator redundancy. Actuator saturation can cause significant deterioration in control system performance because unmet demand may result in sluggish transients and oscillations in response to setpoint changes. To help compensate for this problem, a technique has been developed which takes advantage of redundancy in multivariable systems to redistribute the unmet control demand over the remaining useful effectors. This method is not a redesign procedure; rather, it modifies commands to the unlimited effectors to compensate for those that are limited, thereby exploiting the built-in redundancy. The original commands are modified by the increments due to unmet demand, but when a saturated effector comes off its limit, the incremental commands disappear and the original unmodified controller remains intact. This scheme provides a smooth transition between saturated and unsaturated modes as it divides up the unmet requirement over any available actuators. This way, if there is sufficient redundant control authority, performance can be maintained.
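A minimal Python sketch of the redistribution idea under an assumed linear effector model (not the NASA implementation): the control effect lost to saturation is reassigned to the unsaturated effectors through a minimum-norm pseudo-inverse correction, which vanishes once the saturated effector comes off its limit.

import numpy as np

def redistribute(u_cmd, u_min, u_max, B):
    """u_cmd: commanded actuator vector; B: control-effectiveness matrix (outputs x actuators)."""
    u_sat = np.clip(u_cmd, u_min, u_max)
    unmet = B @ (u_cmd - u_sat)                        # control effect lost to saturation
    free = np.isclose(u_sat, u_cmd)                    # actuators that still have authority
    if not free.any() or not np.any(unmet):
        return u_sat
    delta = np.zeros_like(u_cmd)
    delta[free] = np.linalg.pinv(B[:, free]) @ unmet   # minimum-norm make-up commands
    return np.clip(u_sat + delta, u_min, u_max)

# Hypothetical example: two redundant effectors driving one output, the first one saturates
B = np.array([[1.0, 1.0]])
u = redistribute(np.array([1.5, 0.2]), -1.0, 1.0, B)
print(u)    # roughly [1.0, 0.7]: the second effector picks up the unmet 0.5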
Fumière, O; Marien, A; Fernández Pierna, J A; Baeten, V; Berben, G
2010-08-01
At present, European legislation prohibits totally the use of processed animal proteins in feed for all farmed animals (Commission Regulation (EC) No. 1234/2003-extended feed ban). A softening of the feed ban for non-ruminants would nevertheless be considered if alternative methods could be used to gain more information concerning the species origin of processed animal proteins than that which can be provided by classical optical microscopy. This would allow control provisions such as the ban of feeding animals with proteins from the same species or intra-species recycling (Regulation (EC) No. 1774/2002). Two promising alternative methods, near-infrared microscopy (NIRM) and real-time polymerase chain reaction (PCR), were combined to authenticate, at the species level, the presence of animal particles. The paper describes the improvements of the real-time PCR method made to the DNA extraction protocol, allowing five PCR analyses to be performed with the DNA extracted from a single particle.
NASA Technical Reports Server (NTRS)
Cannone, Jaime J.; Barnes, Cindy L.; Achari, Aniruddha; Kundrot, Craig E.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
The Sparse Matrix approach for obtaining lead crystallization conditions has proven to be very fruitful for the crystallization of proteins and nucleic acids. Here we report a Sparse Matrix developed specifically for the crystallization of protein-DNA complexes. This method is rapid and economical, typically requiring 2.5 mg of complex to test 48 conditions. The method was originally developed to crystallize basic fibroblast growth factor (bFGF) complexed with DNA sequences identified through in vitro selection, or SELEX, methods. Two DNA aptamers that bind with approximately nanomolar affinity and inhibit the angiogenic properties of bFGF were selected for co-crystallization. The Sparse Matrix produced lead crystallization conditions for both bFGF-DNA complexes.
Development of Methods of Evaluating Abilities to Make Plans in New Group Work
NASA Astrophysics Data System (ADS)
Kiriyama, Satoshi
The ability to evaluate something vague, such as originality, can be regarded as one of the important elements that constitute the ability to make plans. The author has made use of cooperative activities in which every member undertakes a stage of a plan-do-check cycle in order to develop training methods and methods of evaluating this ability. The members of a CHECK team evaluated the activities of a PLAN team and a DO team. The author tried to grasp the abilities of the members of the CHECK team by analyzing the results of the evaluation. In addition, the author had some teachers evaluate a sample in order to study the accuracy of the criteria and extracted some remaining challenges.
Maghrabi, Mufeed; Al-Abdullah, Tariq; Khattari, Ziad
2018-03-24
The two heating rates method (originally developed for first-order glow peaks) was used for the first time to evaluate the activation energy (E) of glow peaks obeying mixed-order (MO) kinetics. The derived expression for E has an insignificant additional term (on the scale of a few meV) compared with the first-order case. Hence, the original expression for E from the two heating rates method can be used with excellent accuracy in the case of MO glow peaks. In addition, we derived a simple analytical expression for the MO parameter. The present procedure has the advantage that the MO parameter can now be evaluated using an analytical expression instead of the graphical relation between the geometrical factor and the MO parameter given by existing peak shape methods. The applicability of the derived expressions to real samples was demonstrated for the glow curve of a Li2B4O7:Mn single crystal. The obtained parameters compare very well with those obtained by glow curve fitting and with the available published data.
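For reference, the standard two-heating-rates relation for a first-order glow peak (the form that, according to the abstract, carries over to mixed-order peaks with only a meV-scale correction) can be written in LaTeX notation as

E = k\,\frac{T_{m1} T_{m2}}{T_{m1}-T_{m2}}\,\ln\!\left[\frac{\beta_1}{\beta_2}\left(\frac{T_{m2}}{T_{m1}}\right)^{2}\right],

where T_{m1} and T_{m2} are the glow-peak maximum temperatures recorded at heating rates \beta_1 and \beta_2, and k is the Boltzmann constant.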
The local lymph node assay (LLNA).
Rovida, Costanza; Ryan, Cindy; Cinelli, Serena; Basketter, David; Dearman, Rebecca; Kimber, Ian
2012-02-01
The murine local lymph node assay (LLNA) is a widely accepted method for assessing the skin sensitization potential of chemicals. Compared with other in vivo methods in guinea pig, the LLNA offers important advantages with respect to animal welfare, including a requirement for reduced animal numbers as well as reduced pain and trauma. In addition to hazard identification, the LLNA is used for determining the relative skin sensitizing potency of contact allergens as a pivotal contribution to the risk assessment process. The LLNA is the only in vivo method that has been subjected to a formal validation process. The original LLNA protocol is based on measurement of the proliferative activity of draining lymph node cells (LNC), as determined by incorporation of radiolabeled thymidine. Several variants to the original LLNA have been developed to eliminate the use of radioactive materials. One such alternative is considered here: the LLNA:BrdU-ELISA method, which uses 5-bromo-2-deoxyuridine (BrdU) in place of radiolabeled thymidine to measure LNC proliferation in draining nodes. © 2012 by John Wiley & Sons, Inc.
Automatic classification of fluorescence and optical diffusion spectroscopy data in neuro-oncology
NASA Astrophysics Data System (ADS)
Savelieva, T. A.; Loshchenov, V. B.; Goryajnov, S. A.; Potapov, A. A.
2018-04-01
The relevance of this work stems from the complexity of spectroscopic analysis of biological tissue, which arises from the overlap of the absorption spectra of biological molecules, the multiple scattering effect, and the in vivo measurement geometry. In neuro-oncology, the problem of tumor boundary delineation is especially acute and requires the development of new methods of intraoperative diagnosis. Methods of optical spectroscopy allow various diagnostically significant parameters to be detected non-invasively. 5-ALA-induced protoporphyrin IX is frequently used as a fluorescent tumor marker in neuro-oncology. At the same time, the concentration and oxygenation level of haemoglobin and the significant changes of light scattering in tumor tissues have high diagnostic value. This paper presents an original method for the simultaneous registration of backward diffuse reflectance and fluorescence spectra, which allows all the parameters listed above to be determined simultaneously. Clinical studies involving 47 patients with intracranial glial tumors of Grades II-IV were carried out in the N.N. Burdenko National Medical Research Center of Neurosurgery. To register the spectral dependences, the spectroscopic system LESA-01-BIOSPEC was used with a specially developed w-shaped diagnostic fiber-optic probe. An original algorithm for combined spectroscopic signal processing was developed. We created software and hardware that, compared with the methods currently used in neurosurgical practice, increased the sensitivity of intraoperative demarcation of intracranial tumors from 78% to 96% and the specificity from 60% to 82%. Analysis of different automatic classification techniques shows that, in our case, the most appropriate is the k-Nearest Neighbors algorithm with a cubic metric.
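A minimal Python sketch of the classification step (synthetic stand-in features; "cubic metric" is interpreted here as the Minkowski distance with p = 3, which is an assumption): a k-Nearest Neighbors model is trained on hypothetical spectroscopy-derived feature vectors.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Stand-in feature vectors, e.g. [PpIX fluorescence index, haemoglobin index, oxygenation, scattering]
normal = rng.normal([0.1, 1.0, 0.7, 1.0], 0.1, size=(60, 4))
tumor = rng.normal([0.6, 1.3, 0.5, 1.4], 0.1, size=(60, 4))
X = np.vstack([normal, tumor])
y = np.array([0] * 60 + [1] * 60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5, metric='minkowski', p=3)
knn.fit(X_tr, y_tr)
print("held-out accuracy:", knn.score(X_te, y_te))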
Digital image envelope: method and evaluation
NASA Astrophysics Data System (ADS)
Huang, H. K.; Cao, Fei; Zhou, Michael Z.; Mogel, Greg T.; Liu, Brent J.; Zhou, Xiaoqiang
2003-05-01
Health data security, characterized in terms of data privacy, authenticity, and integrity, is a vital issue when digital images and other patient information are transmitted through public networks in telehealth applications such as teleradiology. Mandates for ensuring health data security have been extensively discussed (for example, the Health Insurance Portability and Accountability Act, HIPAA), and health informatics guidelines addressing data security (such as the DICOM standard) continue to be published by organizing bodies in healthcare; however, no systematic method has been developed to ensure data security in medical imaging. Because data privacy and authenticity are often managed primarily with firewall and password protection, we have focused our research and development on data integrity. We have developed a systematic method of ensuring medical image data integrity across public networks using the concept of the digital envelope. When a medical image is generated, regardless of the modality, three processes are performed: the image signature is obtained, the DICOM image header is encrypted, and a digital envelope is formed by combining the signature and the encrypted header. The envelope is encrypted and embedded in the original image. This assures the security of both the image and the patient ID. The embedded image is encrypted again and transmitted across the network. The reverse process is performed at the receiving site. The result is two digital signatures, one from the original image before transmission and a second from the image after transmission. If the signatures are identical, there has been no alteration of the image. This paper concentrates on the method and evaluation of the digital image envelope.
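A greatly simplified Python sketch of the digital-envelope idea (not the authors' system): the pixel data are hashed to form a signature, the header is "encrypted" with a placeholder XOR keystream (a real system would use proper cryptography and DICOM-aware embedding), and the two are bundled; integrity is checked by re-hashing the received pixels.

import hashlib
import json

def xor_keystream(data: bytes, key: bytes) -> bytes:
    stream = hashlib.sha256(key).digest()
    out = bytearray()
    for i, b in enumerate(data):
        if i % len(stream) == 0 and i > 0:
            stream = hashlib.sha256(stream).digest()   # extend the keystream block by block
        out.append(b ^ stream[i % len(stream)])
    return bytes(out)

def make_envelope(pixels: bytes, header: dict, key: bytes) -> dict:
    signature = hashlib.sha256(pixels).hexdigest()     # image signature
    enc_header = xor_keystream(json.dumps(header).encode(), key)
    return {"signature": signature, "encrypted_header": enc_header.hex()}

# Hypothetical usage: verify integrity after transmission by re-hashing the received pixels
pixels = bytes(range(256)) * 100
envelope = make_envelope(pixels, {"PatientID": "ANON-001", "Modality": "CR"}, key=b"shared-secret")
received_ok = hashlib.sha256(pixels).hexdigest() == envelope["signature"]
print(received_ok)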
Bioforensics: Characterization of biological weapons agents by NanoSIMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weber, P K; Ghosal, S; Leighton, T J
2007-02-26
The anthrax attacks of Fall 2001 highlight the need to develop forensic methods based on multiple identifiers to determine the origin of biological weapons agents. Genetic typing methods (i.e., DNA and RNA-based) provide one attribution technology, but genetic information alone is not usually sufficient to determine the provenance of the material. Non-genetic identifiers, including elemental and isotopic signatures, provide complementary information that can be used to identify the means, geographic location and date of production. Under LDRD funding, we have successfully developed the techniques necessary to perform bioforensic characterization with the NanoSIMS at the individual spore level. We have developed methods for elemental and isotopic characterization at the single spore scale. We have developed methods for analyzing spore sections to map elemental abundance within spores. We have developed rapid focused ion beam (FIB) sectioning techniques for spores to preserve elemental and structural integrity. And we have developed a high-resolution depth profiling method to characterize the elemental distribution in individual spores without sectioning. We used these newly developed methods to study the controls on elemental abundances in spores, to characterize the elemental distribution in spores, and to study elemental uptake by spores. Our work under this LDRD project attracted FBI and DHS funding for applied purposes.
Multi-Optimisation Consensus Clustering
NASA Astrophysics Data System (ADS)
Li, Jian; Swift, Stephen; Liu, Xiaohui
Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
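For context, a generic consensus-clustering sketch in Python (this is not the MOCC algorithm itself; the data, run counts, and clusterer choices are illustrative): several k-means runs are combined into a co-association matrix, which is then clustered hierarchically to yield the consensus partition.

import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def consensus_clustering(X, n_clusters=3, n_runs=20, seed=0):
    n = X.shape[0]
    co_assoc = np.zeros((n, n))
    for r in range(n_runs):
        labels = KMeans(n_clusters=n_clusters, n_init=5, random_state=seed + r).fit_predict(X)
        co_assoc += (labels[:, None] == labels[None, :])       # +1 whenever two points share a cluster
    co_assoc /= n_runs
    distance = 1.0 - co_assoc                                  # low distance = frequently co-assigned
    Z = linkage(squareform(distance, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Hypothetical usage on three Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in ([0, 0], [3, 0], [0, 3])])
print(consensus_clustering(X))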
[Analysis on origin and evolution of mumps].
Zhao, Yan
2004-10-01
There was a preliminary recognition of mumps from the Qin-Han to the Sui-Tang dynasties, laying a foundation for scholarly development on this topic in later generations. The name of this disease was established in the Song-Jin-Yuan dynasties, with a gradual deepening of its principle-method-formula-medication system, a great advance in recognition compared with previous ages. In the Ming-Qing dynasties, the recognition became even more systematic, with certain breakthroughs in the principle-method-formula-medication system. In the modern age, these experiences have been inherited, developed, and integrated with modern biomedicine, so that the theory and clinical practice have become even more complete.
The costs of nurse turnover: part 1: an economic perspective.
Jones, Cheryl Bland
2004-12-01
Nurse turnover is costly for healthcare organizations. Administrators and nurse executives need a reliable estimate of nurse turnover costs and the origins of those costs if they are to develop effective measures of reducing nurse turnover and its costs. However, determining how to best capture and quantify nurse turnover costs can be challenging. Part 1 of this series conceptualizes nurse turnover via human capital theory and presents an update of a previously developed method for determining the costs of nurse turnover, the Nursing Turnover Cost Calculation Method. Part 2 (January 2005) presents a recent application of the methodology in an acute care hospital.
Hall, Jim W; Lempert, Robert J; Keller, Klaus; Hackbarth, Andrew; Mijere, Christophe; McInerney, David J
2012-10-01
This study compares two widely used approaches for robustness analysis of decision problems: the info-gap method originally developed by Ben-Haim and the robust decision making (RDM) approach originally developed by Lempert, Popper, and Bankes. The study uses each approach to evaluate alternative paths for climate-altering greenhouse gas emissions given the potential for nonlinear threshold responses in the climate system, significant uncertainty about such a threshold response and a variety of other key parameters, as well as the ability to learn about any threshold responses over time. Info-gap and RDM share many similarities. Both represent uncertainty as sets of multiple plausible futures, and both seek to identify robust strategies whose performance is insensitive to uncertainties. Yet they also exhibit important differences, as they arrange their analyses in different orders, treat losses and gains in different ways, and take different approaches to imprecise probabilistic information. The study finds that the two approaches reach similar but not identical policy recommendations and that their differing attributes raise important questions about their appropriate roles in decision support applications. The comparison not only improves understanding of these specific methods, it also suggests some broader insights into robustness approaches and a framework for comparing them. © 2012 RAND Corporation.
The Use of UV-Visible Reflectance Spectroscopy as an Objective Tool to Evaluate Pearl Quality
Agatonovic-Kustrin, Snezana; Morton, David W.
2012-01-01
Assessing the quality of pearls involves the use of various tools and methods, which are mainly visual and often quite subjective. Pearls are normally classified by origin and are then graded by luster, nacre thickness, surface quality, size, color and shape. The aim of this study was to investigate the capacity of Artificial Neural Networks (ANNs) to classify and estimate the quality of 27 different pearls from their UV-Visible spectra. Due to the opaque nature of pearls, spectroscopic measurements were performed using the diffuse reflectance UV-Visible technique. The spectra were acquired at two different locations on each pearl sample in order to assess surface homogeneity. The spectral data (inputs) were smoothed to reduce the noise, fed into ANNs and correlated to the pearls' quality/grading criteria (outputs). The developed ANNs were successful in predicting pearl type, mollusk growing species, possible luster and color enhancement, donor condition/type, recipient/host color, donor color, pearl luster, pearl color, and origin. The results of this study show that the developed UV-Vis spectroscopy-ANN method could be used as a more objective method of assessing pearl quality (grading) and may become a valuable tool for the pearl grading industry. PMID:22851919
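A minimal Python sketch of the smoothing-plus-ANN idea (synthetic stand-in spectra and labels, not the study's data): reflectance spectra are smoothed with a moving average and a small multilayer perceptron is trained to predict a grading label.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def smooth(spectra, window=7):
    kernel = np.ones(window) / window
    return np.apply_along_axis(lambda s: np.convolve(s, kernel, mode="same"), 1, spectra)

rng = np.random.default_rng(0)
wavelengths = np.linspace(250, 800, 300)
# Two hypothetical pearl classes with different reflectance shapes plus noise
class_a = 0.6 + 0.2 * np.exp(-((wavelengths - 450) / 60) ** 2)
class_b = 0.5 + 0.2 * np.exp(-((wavelengths - 600) / 60) ** 2)
X = np.vstack([class_a + 0.05 * rng.normal(size=(40, 300)),
               class_b + 0.05 * rng.normal(size=(40, 300))])
y = np.array([0] * 40 + [1] * 40)

X_tr, X_te, y_tr, y_te = train_test_split(smooth(X), y, test_size=0.25, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))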
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, Michael; Battista, Jerry; Chen, Jeff
Purpose: To develop a radiotherapy dose tracking and plan evaluation technique using cone-beam computed tomography (CBCT) images. Methods: We developed a patient-specific method of calibrating CBCT image sets for dose calculation. The planning CT was first registered with the CBCT using deformable image registration (DIR). A scatter plot was generated between the CT numbers of the planning CT and the CBCT for each slice. The CBCT calibration curve was obtained by least-squares fitting of the data and applied to each CBCT slice. The calibrated CBCT was then merged with the original planning CT to extend the small field of view of the CBCT. Finally, the treatment plan was copied to the merged CT for dose tracking and plan evaluation. The proposed patient-specific calibration method was also compared to two methods proposed in the literature. To evaluate the accuracy of each technique, 15 head-and-neck patients requiring plan adaptation were arbitrarily selected from our institution. The original plan was calculated on each method's data set, including a second planning CT acquired within 48 hours of the CBCT (serving as the gold standard). Clinically relevant dose metrics and 3D gamma analysis of dose distributions were compared between the different techniques. Results: Compared to the gold standard of using planning CTs, the patient-specific CBCT calibration method was shown to provide promising results, with gamma pass rates above 95% and average dose metric agreement within 2.5%. Conclusions: The patient-specific CBCT calibration method could potentially be used for on-line dose tracking and plan evaluation without requiring a re-planning CT session.
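A minimal Python sketch of the per-slice calibration idea (idealized synthetic data, not the authors' implementation): CT numbers are fitted as a linear function of the registered CBCT values by least squares, and the fit is applied to the CBCT slice.

import numpy as np

def calibrate_slice(cbct_slice, ct_slice):
    """Fit ct = a * cbct + b over all voxels of one registered slice pair, then apply it."""
    A = np.column_stack([cbct_slice.ravel(), np.ones(cbct_slice.size)])
    (a, b), *_ = np.linalg.lstsq(A, ct_slice.ravel(), rcond=None)
    return a * cbct_slice + b

# Hypothetical slice pair: the CBCT numbers are a scaled, offset, noisy version of the CT numbers
rng = np.random.default_rng(0)
ct = rng.integers(-1000, 1500, size=(64, 64)).astype(float)
cbct = 0.8 * ct - 120.0 + rng.normal(0, 20, size=ct.shape)
calibrated = calibrate_slice(cbct, ct)
print(float(np.abs(calibrated - ct).mean()))   # small residual after calibration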
Developing the concept of caring in nursing education
Salehian, Maryam; Heydari, Abbas; Moonaghi, Hossein Karimi; Aghebati, Nahid
2017-01-01
Background Caring is a value-based concept in the nursing field and in education. An exact understanding of caring in education, and developing this concept in nursing, will result in the evolution of the position of nursing science and the profession. Aim The aim of this study was to attempt to develop the concept of caring in nursing education. Methods This qualitative study was conducted in 2016 using directed content analysis. Participants were thirteen subjects (6 instructors and 7 senior and junior nursing students) selected using a purposeful sampling method. The research environment was the Faculty of Nursing and Midwifery in Mashhad. Data were collected through semi-structured interviews of thirty to ninety minutes, and sampling continued until data saturation. Interviews were conducted in Persian and were immediately transcribed and analyzed using MAXQDA 10 software. The text of the interviews was reviewed several times. First, open codes were extracted; after several reviews based on similarity in meaning, they were classified into subcategories, and finally, similar subcategories were placed in main classes based on meaning. Results The results of this study led to the identification of four themes: (1) ethical and religious commitment; (2) development of knowing and cultural sensitivity; (3) soft assertion; and (4) clear description of objectives, expectations, and educational rules for students. Conclusion The results of this study showed that the cultural and religious background of instructors affects their interaction with students. Instructors' commitment and adherence to values in interacting with students and other educational colleagues has an origin beyond ethical and humanistic considerations; it originates from their religious education and training. PMID:28713517
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, E; Lasio, G; Lee, M
Purpose: Only part of the treatment couch is reconstructed in CBCT due to the limited field of view (FOV). This often produces inaccurate results in delivered-dose evaluation with CBCT and more noise in the CBCT reconstruction. Full reconstruction of the couch at treatment setup can be used for more accurate exit-beam dosimetry. The goal of this study is to develop a method to reconstruct a full treatment couch using a pre-scanned couch image and rigid registration. Methods: A full couch (Exact Couch, Varian) model image was reconstructed by rigidly registering and combining two sets of partial CBCT images. The full couch model includes three parts: two side rails and a couch top. A patient CBCT was reconstructed with a reconstruction grid larger than the physical field of view to include the full couch. The image quality of the couch is degraded by data truncation, but is good enough to allow rigid registration of the couch. A composite CBCT image of the patient plus couch was generated from the original reconstruction by replacing the couch portion with the pre-acquired model couch, rigidly registered to the original scan. We evaluated the clinical usefulness of this method by comparing treatment plans generated on the original and on the modified scans. Results: The full couch model could be attached to a patient CBCT image set via rigid image registration. Plan DVHs showed 1-2% differences between plans with and without full couch modeling. Conclusion: The proposed method generated a full treatment couch CBCT model that can be successfully registered to the original patient image. The method was also shown to be useful in generating more accurate dose distributions, lowering the dose in the PTV and a few other critical organs by 1-2%. Part of this study is supported by NIH R01CA133539.
A weakly-compressible Cartesian grid approach for hydrodynamic flows
NASA Astrophysics Data System (ADS)
Bigay, P.; Oger, G.; Guilcher, P.-M.; Le Touzé, D.
2017-11-01
The present article proposes an original strategy for solving hydrodynamic flows. In the introduction, the motivations for this strategy are developed: the aim is to model viscous and turbulent flows involving complex moving geometries while avoiding meshing constraints. The proposed approach relies on a weakly-compressible formulation of the Navier-Stokes equations. Unlike most hydrodynamic CFD (Computational Fluid Dynamics) solvers, which are usually based on implicit incompressible formulations, a fully-explicit temporal scheme is used. A purely Cartesian grid is adopted for numerical accuracy and algorithmic simplicity. This choice allows an easy use of Adaptive Mesh Refinement (AMR) methods embedded within a massively parallel framework. Geometries are automatically immersed within the Cartesian grid with an AMR-compatible treatment. The method uses an Immersed Boundary Method (IBM) adapted to the weakly-compressible formalism and imposed smoothly through a regularization function, which stands as another originality of this work. All these features have been implemented within an in-house solver based on this WCCH (Weakly-Compressible Cartesian Hydrodynamic) method, which meets the above requirements while allowing the use of high-order (> 3) spatial schemes rarely used in existing hydrodynamic solvers. The details of this WCCH method are presented and validated in this article.
Hong, Sungwook; Park, Miseon
2018-01-19
The electrostatic dust print lifting method is commonly thought to recover only dry-origin footwear impressions. However, wet-origin footwear impressions can also be recovered using this method. As the amount of dust accumulated before deposition of the wet-origin footwear impression increased, the intensity of the footwear impression lifted with this method became stronger. Provided the footwear impression is not affected by moisture after it is made, a 28-h-old wet-origin footwear impression can be recovered using this method. The intensity of the lifted footwear impression did not decrease significantly as the number of sequential steps increased, as long as the shoe sole remained wet. However, when the moisture on the shoe sole was depleted, the intensity of the footwear impression decreased sharply. This method has the advantage of enhancing the footwear impression without being affected by footwear impressions deposited in the past. © 2018 American Academy of Forensic Sciences.
Gim, Yeonghyeon; Ko, Han Seo
2016-04-15
In this Letter, a three-dimensional (3D) optical correction method, which was verified by simulation, was developed to reconstruct droplet-based flow fields. In the simulation, a synthetic phantom was reconstructed using a simultaneous multiplicative algebraic reconstruction technique with three detectors positioned at the synthetic object (represented by the phantom), with offset angles of 30° relative to each other. Additionally, a projection matrix was developed using the ray tracing method. If the phantom is in liquid, the image of the phantom can be distorted since the light passes through a convex liquid-vapor interface. Because of the optical distortion effect, the projection matrix used to reconstruct a 3D field should be supplemented by the revision ray, instead of the original projection ray. The revision ray can be obtained from the refraction ray occurring on the surface of the liquid. As a result, the error on the reconstruction field of the phantom could be reduced using the developed optical correction method. In addition, the developed optical method was applied to a Taylor cone which was caused by the high voltage between the droplet and the substrate.
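The "revision ray" described above depends on refraction at the liquid-vapor interface. The snippet below is a generic vector form of Snell's law, not the authors' implementation; the interface normal, refractive indices, and ray conventions are illustrative assumptions.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract a unit ray direction d at a surface with unit normal n
    (pointing toward the incident side), for refractive indices n1 -> n2.
    Returns the refracted unit direction, or None for total internal reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    eta = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None                      # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

# example: a projection ray entering a water-like droplet (n ~ 1.33) from air at 30 degrees incidence
d_in = np.array([np.sin(np.radians(30.0)), 0.0, -np.cos(np.radians(30.0))])
normal = np.array([0.0, 0.0, 1.0])       # interface normal pointing up, toward the air side
d_out = refract(d_in, normal, 1.0, 1.33)
```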
NASA Astrophysics Data System (ADS)
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit relative to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and the surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) compare with that from the globally averaged methods typical of (3) for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and of surface and groundwater modeling.
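The computational-demand relation quoted above is easy to make concrete. The numbers below are purely illustrative and are not taken from the abstract:

```python
# Illustrative numbers only: wall-clock demand = run time x number of runs / parallel workers.
run_time_hours = 6.0        # one run of the demanding original model
model_runs = 100            # typical budget cited for frugal or surrogate analysis
parallel_workers = 20       # concurrent runs available on the cluster

demand_hours = run_time_hours * model_runs / parallel_workers
print(f"wall-clock demand: {demand_hours:.0f} hours")   # 30 hours in this example
```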
Sutter, Richard C; Verano, John W
2007-02-01
The purpose of this study is to test two competing models regarding the origins of Early Intermediate Period (AD 200-750) sacrificial victims from the Huacas de Moche site using the matrix correlation method. The first model posits the sacrificial victims represent local elites who lost competitions in ritual battles with one another, while the other model suggests the victims were nonlocal warriors captured during warfare with nearby polities. We estimate biodistances for sacrificial victims from Huaca de la Luna Plaza 3C (AD 300-550) with eight previously reported samples from the north coast of Peru using both the mean measure of divergence (MMD) and Mahalanobis' distance (d2). Hypothetical matrices are developed based upon the assumptions of each of the two competing models regarding the origins of Moche sacrificial victims. When the MMD matrix is compared to the two hypothetical matrices using a partial-Mantel test (Smouse et al.: Syst Zool 35 (1986) 627-632), the ritual combat model (i.e. local origins) has a low and nonsignificant correlation (r = 0.134, P = 0.163), while the nonlocal origins model is highly correlated and significant (r = 0.688, P = 0.001). Comparisons of the d2 results and the two hypothetical matrices also produced low and nonsignificant correlation for the ritual combat model (r = 0.210, P = 0.212), while producing a higher and statistically significant result with the nonlocal origins model (r = 0.676, P = 0.002). We suggest that the Moche sacrificial victims represent nonlocal warriors captured in territorial combat with nearby competing polities. Copyright 2006 Wiley-Liss, Inc.
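The study applies a partial-Mantel test that controls for a third matrix; the sketch below shows only the plain Mantel permutation logic for correlating a biodistance matrix with a hypothetical design matrix. Both inputs are assumed to be symmetric matrices of matching size, and the one-sided test direction is an assumption for illustration.

```python
import numpy as np

def mantel(a, b, n_perm=9999, seed=0):
    """Simple (non-partial) Mantel test: correlate the upper-triangle entries of two
    symmetric distance matrices and assess significance by permuting sample labels
    of the first matrix (one-sided, expecting a positive correlation)."""
    rng = np.random.default_rng(seed)
    idx = np.triu_indices_from(a, k=1)
    r_obs = np.corrcoef(a[idx], b[idx])[0, 1]
    count = 0
    n = a.shape[0]
    for _ in range(n_perm):
        p = rng.permutation(n)
        r = np.corrcoef(a[p][:, p][idx], b[idx])[0, 1]
        if r >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)
```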
Using rare earth elements for the identification of the geographic origin of food
NASA Astrophysics Data System (ADS)
Meisel, T.; Bandoniene, D.; Joebstl, D.
2009-04-01
The European Union defined regimes within the Protected Geographical Status (PGS) framework to protect the names of regional food specialities. Only food produced in a specific geographical area with a specific method of production or quality can carry a protected geographical indication (PGI) label. Styrian Pumpkin Seed Oil has been approved with this label, but as with many other high-priced regional specialities, fraud cannot be excluded or readily identified. The aim of this work is therefore to develop an analytical method for the control of the geographic origin of pumpkin seed oil and to test the method for other protected products. The development of such a method is not only of interest for scientists, but also of importance for consumers wanting to know the origin of food products and to be assured of their purity and quality. The group of rare earth elements (REE) in plants has a characteristic distribution pattern similar to upper crustal REE distributions. Since the REE concentrations in pumpkin seed oil are extremely low (ppt to low ppb), ICP-MS was the only tool sensitive enough to produce validated results. The carriers of the REE are most likely small particles distributed within the pumpkin seed oil. Unlike, e.g., olive oil, pumpkin seed oil is bottled and sold unfiltered, which makes this Styrian speciality an interesting sampling target. As pumpkin seed oils of different geographic origin show variable trace element and rare earth distribution patterns, it should be possible to trace the origin of these oils. In the current project, pumpkin seeds from different regions in Austria and from abroad were sampled. The trace element patterns in the oil extracted from these seeds were determined, and a preliminary classification with discriminant analysis was successfully performed on a statistical basis. In addition to the study of geographic origin, it was demonstrated that REE distribution patterns can also be used to identify adulteration of high-priced pumpkin seed oil with cheap, neutral-tasting refined oils. Interestingly, the variations of the REE patterns between oils from different regions are much more pronounced than those of their host soils. We therefore assume that microbiological processes in the rhizosphere control the REE uptake into the plant, and that regional variations in the microbiological composition of the soils, rather than the bulk mineral composition of the soils alone, cause the regional variations that make it possible to identify the geographic origin of pumpkin seeds and, as a consequence, of the pumpkin seed oil.
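A preliminary classification of REE patterns by discriminant analysis, as mentioned above, could be prototyped along the following lines; the data, region labels, and number of REE variables are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows are oil samples, columns are REE concentrations (e.g. La..Lu),
# labels are sampling regions; all values here are synthetic placeholders.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, scale=0.1, size=(20, 14)) for m in (0.5, 0.8, 1.1)])
y = np.repeat(["Styria", "RegionB", "RegionC"], 20)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)       # cross-validated classification accuracy
print(scores.mean())
```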
Bennett, Gordon D.; Patten, E.P.
1962-01-01
This report describes the theory and field procedures for determining the transmissibility and storage coefficients and the original hydrostatic head of each aquifer penetrated by a multiaquifer well. The procedure involves pumping the well in such a manner that the drawdown of water level is constant while the discharges of the different aquifers are measured by means of borehole flowmeters. The theory is developed by analogy to the heat-flow problem solved by Smith. The internal discharge between aquifers after the well is completed is analyzed as the first step. Pumping at constant drawdown constitutes the second step. Transmissibility and storage coefficients are determined by a method described by Jacob and Lohman, after the original internal discharge to or from the aquifer has been compensated for in the calculations. The original hydrostatic head of each aquifer is then determined by resubstituting the transmissibility and storage coefficients into the first step of the analysis. The method was tested on a well in Chester County, Pa., but the results were not entirely satisfactory, owing to the lack of sufficiently accurate methods of flow measurement and, probably, to the effects of entrance losses in the well. The determinations of the transmissibility coefficient and static head can be accepted as having order-of-magnitude significance, but the determinations of the storage coefficient, which is highly sensitive to experimental error, must be rejected. It is felt that better results may be achieved in the future, as more reliable devices for metering the flow become available and as more is learned concerning the nature of entrance losses. If accurate data can be obtained, recently developed techniques of digital or analog computation may permit determination of the response of each aquifer in the well to any form of pumping.
Food security politics and the Millennium Development Goals.
McMichael, Philip; Schneider, Mindi
2011-01-01
This article reviews proposals regarding the recent food crisis in the context of a broader, threshold debate on the future of agriculture and food security. While the MDGs have focused on eradicating extreme poverty and hunger, the food crisis pushed the hungry over the one billion mark. There is thus a renewed focus on agricultural development, which pivots on the salience of industrial agriculture (as a supply source) in addressing food security. The World Bank's new 'agriculture for development' initiative seeks to improve small-farmer productivity with new inputs, and their incorporation into global markets via value-chains originating in industrial agriculture. An alternative claim, originating in 'food sovereignty' politics, demanding small-farmer rights to develop bio-regionally specific agro-ecological methods and provision for local, rather than global, markets, resonates in the IAASTD report, which implies agribusiness as usual 'is no longer an option'. The basic divide is over whether agriculture is a servant of economic growth, or should be developed as a foundational source of social and ecological sustainability. We review and compare these different paradigmatic approaches to food security, and their political and ecological implications.
19 CFR 134.43 - Methods of marking specific articles.
Code of Federal Regulations, 2014 CFR
2014-04-01
... origin by cutting, die-sinking, engraving, stamping, or some other permanent method. The indelible... metal or plastic tag indelibly marked with the country of origin and permanently attached to the article... crafts must be indelibly marked with the country of origin by means of cutting, die-sinking, engraving...
19 CFR 134.43 - Methods of marking specific articles.
Code of Federal Regulations, 2013 CFR
2013-04-01
... origin by cutting, die-sinking, engraving, stamping, or some other permanent method. The indelible... metal or plastic tag indelibly marked with the country of origin and permanently attached to the article... crafts must be indelibly marked with the country of origin by means of cutting, die-sinking, engraving...
Sensitive, Rapid Detection of Bacterial Spores
NASA Technical Reports Server (NTRS)
Kern, Roger G.; Venkateswaran, Kasthuri; Chen, Fei; Pickett, Molly; Matsuyama, Asahi
2009-01-01
A method of sensitive detection of bacterial spores within delays of no more than a few hours has been developed to provide an alternative to a prior three-day NASA standard culture-based assay. A capability for relatively rapid detection of bacterial spores would be beneficial for many endeavors, a few examples being agriculture, medicine, public health, defense against biowarfare, water supply, sanitation, hygiene, and the food-packaging and medical-equipment industries. The method involves the use of a commercial rapid microbial detection system (RMDS) that utilizes a combination of membrane filtration, adenosine triphosphate (ATP) bioluminescence chemistry, and analysis of luminescence images detected by a charge-coupled-device camera. This RMDS has been demonstrated to be highly sensitive in enumerating microbes (it can detect as few as one colony-forming unit per sample) and has been found to yield data in excellent correlation with those of culture-based methods. What makes the present method necessary is that the specific RMDS and the original protocols for its use are not designed to discriminate between bacterial spores and other microbes. In this method, a heat-shock procedure is added prior to an incubation procedure that is specified in the original RMDS protocols. In this heat-shock procedure (which was also described in a prior NASA Tech Briefs article on enumerating spore-forming bacteria), a sample is exposed to a temperature of 80 °C for 15 minutes. Spores can survive the heat shock, but non-spore-forming bacteria, and spore-forming bacteria that are not in spore form, cannot. Therefore, any colonies that grow during incubation after the heat shock are deemed to have originated as spores.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yun, Yuxing; Fan, Jiwen; Xiao, Heng
Realistic modeling of cumulus convection at fine model resolutions (a few to a few tens of km) is problematic, since it requires cumulus schemes to adapt to higher resolution than that for which they were originally designed (~100 km). To solve this problem, we implement the spatial averaging method proposed in Xiao et al. (2015) and also propose a temporal averaging method for the large-scale convective available potential energy (CAPE) tendency in the Zhang-McFarlane (ZM) cumulus parameterization. The resolution adaptability of the original ZM scheme, the scheme with spatial averaging, and the scheme with both spatial and temporal averaging at 4-32 km resolution is assessed using the Weather Research and Forecasting (WRF) model, by comparing with Cloud Resolving Model (CRM) results. We find that the original ZM scheme has very poor resolution adaptability, with sub-grid convective transport and precipitation increasing significantly as the resolution increases. The spatial averaging method improves the resolution adaptability of the ZM scheme and better conserves the total transport of moist static energy and total precipitation. With the temporal averaging method, the resolution adaptability of the scheme is further improved, with sub-grid convective precipitation becoming smaller than resolved precipitation for resolutions higher than 8 km, consistent with the results from the CRM simulation. Both the spatial distribution and the time series of precipitation are improved with the spatial and temporal averaging methods. The results may be helpful for developing resolution adaptability for other cumulus parameterizations that are based on the quasi-equilibrium assumption.
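The combination of spatial and temporal averaging of the large-scale CAPE tendency can be pictured with the schematic below. It is not the WRF/ZM implementation; the box size, trailing-window time average, and array layout are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def averaged_cape_tendency(cape_tendency, box_km, dx_km, window_steps):
    """Schematic spatial box average (roughly box_km x box_km) followed by a running
    temporal mean over the previous window_steps time steps.
    cape_tendency has shape (time, ny, nx)."""
    npts = max(1, int(round(box_km / dx_km)))
    spatial = np.stack([uniform_filter(f, size=npts, mode="nearest")
                        for f in cape_tendency])
    out = np.empty_like(spatial)
    for t in range(spatial.shape[0]):
        t0 = max(0, t - window_steps + 1)
        out[t] = spatial[t0:t + 1].mean(axis=0)   # average over the trailing window
    return out
```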
van Brabant, A J; Hunt, S Y; Fangman, W L; Brewer, B J
1998-06-01
DNA fragments that contain an active origin of replication generate bubble-shaped replication intermediates with diverging forks. We describe two methods that use two-dimensional (2-D) agarose gel electrophoresis along with DNA sequence information to identify replication origins in natural and artificial Saccharomyces cerevisiae chromosomes. The first method uses 2-D gels of overlapping DNA fragments to locate an active chromosomal replication origin within a region known to confer autonomous replication on a plasmid. A variant form of 2-D gels can be used to determine the direction of fork movement, and the second method uses this technique to find restriction fragments that are replicated by diverging forks, indicating that a bidirectional replication origin is located between the two fragments. Either of these two methods can be applied to the analysis of any genomic region for which there is DNA sequence information or an adequate restriction map.
Balance of trade: export-import in family medicine.
Pust, Ronald E
2007-01-01
North American family physicians leaving for less-developed countries (LDCs) may not be aware of internationally validated diagnostic and treatment technologies originating in LDCs. Thus they may bring with them inappropriate models and methods of medical care. More useful "exports" are based in sharing our collaborative vocational perspective with dedicated indigenous generalist clinicians who serve their communities. More specifically, Western doctors abroad can promote local reanalyses of international evidence-based medicine (EBM) studies, efficient deployment of scarce clinical resources, and a family medicine/generalist career ladder, ultimately reversing the "brain drain" from LDCs. Balancing these exports, we should import the growing number of EBM best practices originating in World Health Organization and other LDC research that are applicable in developed nations. Many generalist colleagues, expatriate and indigenous, with long-term LDC experience stand ready to help us import these practices and perspectives.
The lost steps of infancy: symbolization, analytic process and the growth of the self.
Feldman, Brian
2002-07-01
In 'The Lost Steps' the Latin American novelist Alejo Carpentier describes the search by the protagonist for the origins of music among native peoples in the Amazon jungle. This metaphor can be utilized as a way of understanding the search for the pre-verbal origins of the self in analysis. The infant's experience of the tempo and rhythmicity of the mother/infant interaction and the bathing in words and sounds of the infant by the mother are at the core of the infant's development of the self. The infant observation method (Tavistock model) will be looked at as a way of developing empathy in the analyst to better understand infantile, pre-verbal states of mind. A case vignette from an adult analysis will be utilized to illustrate the theoretical concepts.
Stem Cells in Skeletal Tissue Engineering: Technologies and Models
Langhans, Mark T.; Yu, Shuting; Tuan, Rocky S.
2017-01-01
This review surveys the use of pluripotent and multipotent stem cells in skeletal tissue engineering. Specific emphasis is placed on evaluating the function and activities of these cells in the context of development in vivo, and on how technologies and methods of stem cell-based tissue engineering must draw inspiration from developmental biology. Information on the embryonic origin and in vivo differentiation of skeletal tissues is first reviewed, to shed light on the persistence and activities of adult stem cells that remain in skeletal tissues after embryogenesis. Next, the development and differentiation of pluripotent stem cells is discussed, and some of their advantages and disadvantages in the context of tissue engineering are presented. The final section highlights current use of multipotent adult mesenchymal stem cells, reviewing their origin, differentiation capacity, and potential applications to tissue engineering. PMID:26423296
Radiation protective structure alternatives for habitats of a lunar base research outpost
NASA Technical Reports Server (NTRS)
Bell, Fred J.; Foo, Lai T.; Mcgrew, William P.
1988-01-01
The solar and galactic cosmic radiation levels on the Moon pose a hazard to extended manned lunar missions. Lunar soil represents an available, economical material to be used for radiation shielding. Several alternatives have been suggested to use lunar soil to protect the inhabitants of a lunar base research outpost from radiation. The Universities Space Research Association has requested that a comparative analysis of the alternatives be performed, with the purpose of developing the most advantageous design. Eight alternatives have been analyzed, including an original design which was developed to satisfy the identified design criteria. The original design consists of a cylindrical module and airlock, partially buried in the lunar soil, at a depth sufficient to achieve adequate radiation shielding. The report includes descriptions of the alternatives considered, the method of analysis used, and the final design selected.
NASA Astrophysics Data System (ADS)
Lee, Taesam
2018-05-01
Multisite stochastic simulations of daily precipitation have been widely employed in hydrologic analyses for climate change assessment and agricultural model inputs. Recently, a copula model with a gamma marginal distribution has become one of the common approaches for simulating precipitation at multiple sites. Here, we tested the correlation structure of the copula modeling. The results indicate that there is a significant underestimation of the correlation in the simulated data compared to the observed data. Therefore, we proposed an indirect method for estimating the cross-correlations when simulating precipitation at multiple stations, using the full relationship between the correlation of the observed data and that of the normally transformed data. Although this indirect method offers certain improvements in preserving the cross-correlations between sites in the original domain, the method was not reliable in application. Therefore, we further improved a simulation-based method (SBM) that was originally developed to model multisite precipitation occurrence. The SBM preserved the cross-correlations of the original domain well, providing cross-correlations around 0.2 higher than the direct method and around 0.1 higher than the indirect method. The three models were applied to stations in the Nakdong River basin, and the SBM was the best alternative for reproducing the historical cross-correlations. The direct method significantly underestimates the correlations among the observed data, and the indirect method appeared to be unreliable.
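The indirect method rests on the relationship between correlations in the original (gamma) domain and in the normally transformed domain. The sketch below, with synthetic two-site data and illustrative gamma parameters, shows how a rank-based normal-score transform exposes the normal-domain correlation that a Gaussian copula would be fed; it is not the authors' code, and the parameter values are assumptions.

```python
import numpy as np
from scipy import stats

def normal_scores(x):
    """Rank-based transform of a sample to standard normal scores."""
    ranks = stats.rankdata(x)
    return stats.norm.ppf(ranks / (len(x) + 1.0))

# Hypothetical daily precipitation amounts at two sites with gamma-like marginals.
rng = np.random.default_rng(2)
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=5000)
site1 = stats.gamma.ppf(stats.norm.cdf(z[:, 0]), a=0.8, scale=6.0)
site2 = stats.gamma.ppf(stats.norm.cdf(z[:, 1]), a=0.8, scale=6.0)

r_original = np.corrcoef(site1, site2)[0, 1]                               # original-domain correlation
r_normal = np.corrcoef(normal_scores(site1), normal_scores(site2))[0, 1]   # normal-domain correlation
print(r_original, r_normal)   # the normal-domain value is the one a Gaussian copula should use
```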
Nielen, Michel W F; Rutgers, Paula; van Bennekom, Eric O; Lasaroms, Johan J P; van Rhijn, J A Hans
2004-03-05
The origin, i.e. natural occurrence or illegal treatment, of findings of 17alpha-boldenone (alpha-Bol) and 17beta-boldenone (beta-Bol) in urine and faeces of cattle is under debate within the European Union. A liquid chromatographic positive ion electrospray tandem mass spectrometric method is presented for the confirmatory analysis of 17beta-boldenone, 17alpha-boldenone and an important metabolite/precursor androsta-1,4-diene-3,17-dione (ADD), using deuterium-labelled 17beta-boldenone (beta-Bol-d3) as internal standard. Detailed sample preparation procedures were developed for a variety of sample matrices such as bovine urine, faeces, feed and skin swab samples. The method was validated as a quantitative confirmatory method according to the latest EU guidelines and shows good precision, linearity and accuracy data, and CCalpha and CCbeta values of 0.1-0.3 and 0.4-1.0 ng/ml, respectively. Currently, the method has been successfully applied to suspect urine samples for more than a year, and occasionally to faeces, feed and swab samples as well. Results obtained from untreated and treated animals are given and their impact on the debate about the origin of residues of 17beta-boldenone is critically discussed. Finally, preliminary data about the degree of conjugation of boldenone residues are presented and a simple procedure for discrimination between residues from abuse versus natural origin is proposed.
An efficient hybrid approach for multiobjective optimization of water distribution systems
NASA Astrophysics Data System (ADS)
Zheng, Feifei; Simpson, Angus R.; Zecchin, Aaron C.
2014-05-01
An efficient hybrid approach for the design of water distribution systems (WDSs) with multiple objectives is described in this paper. The objectives are the minimization of the network cost and maximization of the network resilience. A self-adaptive multiobjective differential evolution (SAMODE) algorithm has been developed, in which control parameters are automatically adapted by means of evolution instead of the presetting of fine-tuned parameter values. In the proposed method, a graph algorithm is first used to decompose a looped WDS into a shortest-distance tree (T) or forest, and chords (Ω). The original two-objective optimization problem is then approximated by a series of single-objective optimization problems of the T to be solved by nonlinear programming (NLP), thereby providing an approximate Pareto optimal front for the original whole network. Finally, the solutions at the approximate front are used to seed the SAMODE algorithm to find an improved front for the original entire network. The proposed approach is compared with two other conventional full-search optimization methods (the SAMODE algorithm and the NSGA-II) that seed the initial population with purely random solutions based on three case studies: a benchmark network and two real-world networks with multiple demand loading cases. Results show that (i) the proposed NLP-SAMODE method consistently generates better-quality Pareto fronts than the full-search methods with significantly improved efficiency; and (ii) the proposed SAMODE algorithm (no parameter tuning) exhibits better performance than the NSGA-II with calibrated parameter values in efficiently offering optimal fronts.
Information theoretic comparisons of original and transformed data from Landsat MSS and TM
NASA Technical Reports Server (NTRS)
Malila, W. A.
1985-01-01
The dispersion and concentration of signal values in transformed data from the Landsat-4 MSS and TM instruments are analyzed using a communications theory approach. Shannon's definition of entropy was used to quantify information, and the concept of mutual information was employed to develop a measure of the information contained in several subsets of variables. Several comparisons of information content are made on the basis of this measure, including: system design capacities; data volume occupied by agricultural data; and the information content of original bands and Tasseled Cap variables. A method for analyzing noise effects in MSS and TM data is proposed.
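Shannon entropy and mutual information for a pair of bands can be estimated from a joint histogram, as sketched below; the bin count and the synthetic "bands" are illustrative assumptions, not the Landsat processing used in the study.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p (zero cells are ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=64):
    """Mutual information in bits between two band images, from their joint histogram."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# illustrative use with two correlated synthetic "bands"
rng = np.random.default_rng(3)
band4 = rng.normal(size=(256, 256))
band5 = 0.8 * band4 + 0.2 * rng.normal(size=band4.shape)
print(mutual_information(band4, band5))
```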
Data processing for a cosmic ray experiment onboard the solar probes Helios 1 and 2: Experiment 6
NASA Technical Reports Server (NTRS)
Mueller-Mellin, R.; Green, G.; Iwers, B.; Kunow, H.; Wibberenz, G.; Fuckner, J.; Hempe, H.; Witte, M.
1982-01-01
The data processing system for the Helios experiment 6, measuring energetic charged particles of solar, planetary and galactic origin in the inner solar system, is described. The aim of this experiment is to extend knowledge on origin and propagation of cosmic rays. The different programs for data reduction, analysis, presentation, and scientific evaluation are described as well as hardware and software of the data processing equipment. A chronological presentation of the data processing operation is given. Procedures and methods for data analysis which were developed can be used with minor modifications for analysis of other space research experiments.
Kulinowski, Piotr; Woyna-Orlewicz, Krzysztof; Rappen, Gerd-Martin; Haznar-Garbacz, Dorota; Węglarz, Władysław P; Dorożyński, Przemysław P
2015-04-30
The motivation for the study was the lack of dedicated and effective research and development (R&D) in vitro methods for oral, generic, modified release formulations. The purpose of the research was to assess a multimodal in vitro methodology for further bioequivalence study risk minimization. The principal results of the study are as follows: (i) A pharmaceutically equivalent quetiapine fumarate extended release dosage form of Seroquel XR was developed using a quality by design/design of experiment (QbD/DoE) paradigm. (ii) The developed formulation was then compared with the originator using X-ray microtomography, magnetic resonance imaging and texture analysis. Despite similarity in the compendial dissolution test, the developed and original dosage forms differed in micro/meso structure and, consequently, in mechanical properties. (iii) These differences were found to be the key factors in the failure of the biorelevant dissolution test using the stress dissolution apparatus. The major conclusions are as follows: (i) Imaging methods allow assessment of internal features of the hydrating extended release matrix and, together with the stress dissolution test, allow the design of generic formulations to be rationalized at the in vitro level. (ii) The technological impact on formulation properties, e.g., on pore formation in hydrating matrices, cannot be overlooked when designing modified release dosage forms. Copyright © 2015 Elsevier B.V. All rights reserved.
White, Sarah J; Coniston, Devorah; Rogers, Rosannagh; Frith, Uta
2011-04-01
It is now widely accepted that individuals with autism have a Theory of Mind (ToM) or mentalizing deficit. This has traditionally been assessed with false-belief tasks and, more recently, with silent geometric animations, an on-line ToM task. In adults with milder forms of autism standard false-belief tests, originally devised for children, often prove insensitive, while the Frith-Happé animations have had rather better success at capturing the on-line ToM deficit in this population. However, analysis of participants' verbal descriptions of these animations, which span scenarios from "Random" to "Goal-Directed" and "ToM," is time consuming and subjective. In this study, we developed and established the feasibility of an objective method of response through a series of multiple-choice questions. Sixteen adults with autism and 15 typically developing adults took part, matched for age and intelligence. The adults with autism were less accurate as a group at categorizing the Frith-Happé animations by the presence or absence of mental and physical interactions. Furthermore, they were less able to select the correct emotions that are typically attributed to the triangles in the mental state animations. This new objective method for assessing the understanding of the animations succeeded in being as sensitive as the original subjective method in detecting the mentalizing difficulties in autism, as well as being quicker and easier to administer and analyze. Copyright © 2011, International Society for Autism Research, Wiley Periodicals, Inc.
Topography measurements and applications in ballistics and tool mark identifications*
Vorburger, T V; Song, J; Petraco, N
2016-01-01
The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440
Fire test method for graphite fiber reinforced plastics
NASA Technical Reports Server (NTRS)
Bowles, K. J.
1980-01-01
A potential problem in the use of graphite fiber reinforced resin matrix composites is the dispersal of graphite fibers during accidental fires. Airborne, electrically conductive fibers originating from the burning composites could enter and cause shorting in electrical equipment located in surrounding areas. A test method for assessing the burning characteristics of graphite fiber reinforced composites and the effectiveness of the composites in retaining the graphite fibers has been developed. The method utilizes a modified rate-of-heat-release apparatus. The equipment and the testing procedure are described. The application of the test method to the assessment of composite materials is illustrated for two resin matrix/graphite composite systems.
A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.
2015-01-01
A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods including a manual point and click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in literature.
Action methods in the classroom: creative strategies for nursing education.
McLaughlin, Dorcas E; Freed, Patricia E; Tadych, Rita A
2006-01-01
Nursing education recognizes the need for a framework of experiential learning that supports the development of professional roles. Action methods, originated by Jacob L. Moreno (1953), can be readily adapted to any nursing classroom to create the conditions under which students learn and practice professional nursing roles. While nurse faculty can learn to use action methods, they may not fully comprehend their theoretical underpinnings or may believe they are only used in therapy. This article explores Moreno's ideas related to psychodrama and sociodrama applied in classroom settings, and presents many examples and tips for classroom teachers who wish to incorporate action methods into their classes.
Non-axisymmetric local magnetostatic equilibrium
Candy, Jefferey M.; Belli, Emily A.
2015-03-24
In this study, we outline an approach to the problem of local equilibrium in non-axisymmetric configurations that adheres closely to Miller's original method for axisymmetric plasmas. Importantly, this method is novel in that it allows not only specification of 3D shape, but also explicit specification of the shear in the 3D shape. A spectrally-accurate method for solution of the resulting nonlinear partial differential equations is also developed. We verify the correctness of the spectral method, in the axisymmetric limit, through comparisons with an independent numerical solution. Some analytic results for the two-dimensional case are given, and the connection to Boozer coordinates is clarified.
Analysis method for Thomson scattering diagnostics in GAMMA 10/PDX.
Ohta, K; Yoshikawa, M; Yasuhara, R; Chikatsu, M; Shima, Y; Kohagura, J; Sakamoto, M; Nakasima, Y; Imai, T; Ichimura, M; Yamada, I; Funaba, H; Minami, T
2016-11-01
We have developed an analysis method to improve the accuracy of electron temperature measurements by employing a fitting technique for the raw Thomson scattering (TS) signals. Least-squares fitting of the raw TS signals reduced the error in the electron temperature measurement. We applied the analysis method to a multi-pass (MP) TS system. Because the interval between the MPTS signals is very short, it is difficult to analyze each Thomson scattering signal intensity separately from the raw signals. We therefore used the fitting method to recover the original TS signals from the measured raw MPTS signals and obtain the electron temperature in each pass.
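One way to picture the fitting step is a least-squares fit of overlapping pulses to the raw multi-pass signal, as in the sketch below; the Gaussian pulse shape, two-pass assumption, and time base are illustrative stand-ins rather than the GAMMA 10/PDX analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_pulses(t, a1, t1, a2, t2, w, b):
    """Two Gaussian pulses of common width w on a flat baseline b; a stand-in for
    closely spaced first- and second-pass scattering signals."""
    return (a1 * np.exp(-0.5 * ((t - t1) / w) ** 2)
            + a2 * np.exp(-0.5 * ((t - t2) / w) ** 2) + b)

# synthetic raw signal: two overlapping pulses plus noise
t = np.linspace(0.0, 100.0, 500)                       # time axis, arbitrary units
rng = np.random.default_rng(4)
raw = two_pulses(t, 1.0, 40.0, 0.6, 55.0, 5.0, 0.05) + rng.normal(0.0, 0.02, t.size)

p0 = [1.0, 38.0, 0.5, 57.0, 4.0, 0.0]                  # rough initial guesses
popt, pcov = curve_fit(two_pulses, t, raw, p0=p0)
amp1, amp2 = popt[0], popt[2]                          # separated signal intensities per pass
```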
A fast marching algorithm for the factored eikonal equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Treister, Eran, E-mail: erantreister@gmail.com; Haber, Eldad, E-mail: haber@math.ubc.ca; Department of Mathematics, The University of British Columbia, Vancouver, BC
The eikonal equation is instrumental in many applications in several fields ranging from computer vision to geoscience. This equation can be efficiently solved using the iterative Fast Sweeping (FS) methods and the direct Fast Marching (FM) methods. However, when used for a point source, the original eikonal equation is known to yield inaccurate numerical solutions because of a singularity at the source. In this case, the factored eikonal equation is often preferred and is known to yield a more accurate numerical solution. One application that requires the solution of the eikonal equation for point sources is travel time tomography. This inverse problem may be formulated using the eikonal equation as a forward problem. While this problem has been solved using FS in the past, the more recent choice for applying it involves FM methods because of the efficiency with which sensitivities can be obtained using them. However, while several FS methods are available for solving the factored equation, the FM method is available only for the original eikonal equation. In this paper we develop a Fast Marching algorithm for the factored eikonal equation, using both first and second order finite-difference schemes. Our algorithm follows the same lines as the original FM algorithm and requires the same computational effort. In addition, we show how to obtain sensitivities using this FM method and apply travel time tomography, formulated as an inverse factored eikonal equation. Numerical results in two and three dimensions show that our algorithm solves the factored eikonal equation efficiently, and demonstrate the achieved accuracy for computing the travel time. We also demonstrate a recovery of a 2D and 3D heterogeneous medium by travel time tomography using the eikonal equation for forward modeling and inversion by Gauss–Newton.
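For readers unfamiliar with the factorization, one standard way to write it (paraphrased, not quoted from the paper) is the following math sketch:

```latex
% Eikonal equation with slowness s(x) and a point source at x_0:
%   |\nabla T(x)| = s(x), \qquad T(x_0) = 0.
% Factored form: write T = T_0 \tau, with T_0(x) = s(x_0)\,|x - x_0| the analytic
% travel time in a homogeneous medium, so that the unknown \tau is smooth at the source:
\[
  \left|\, T_0(x)\,\nabla \tau(x) \;+\; \tau(x)\,\nabla T_0(x) \,\right| \;=\; s(x),
  \qquad \tau(x_0) = 1 .
\]
```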
A decade of improvements in Mimiviridae and Marseilleviridae isolation from amoeba.
Pagnier, Isabelle; Reteno, Dorine-Gaelle Ikanga; Saadi, Hanene; Boughalmi, Mondher; Gaia, Morgan; Slimani, Meriem; Ngounga, Tatsiana; Bekliz, Meriem; Colson, Philippe; Raoult, Didier; La Scola, Bernard
2013-01-01
Since the isolation of the first giant virus, the Mimivirus, by T.J. Rowbotham in a cooling tower in Bradford, UK, and after its characterisation by our group in 2003, we have continued to develop novel strategies to isolate additional strains. By first focusing on cooling towers using our original time-consuming procedure, we were able to isolate a new lineage of giant virus called Marseillevirus and a new Mimivirus strain called Mamavirus. In the following years, we have accumulated the world's largest unique collection of giant viruses by improving the use of antibiotic combinations to avoid bacterial contamination of amoeba, developing strategies of preliminary screening of samples by molecular methods, and using a high-throughput isolation method developed by our group. Based on the inoculation of nearly 7,000 samples, our collection currently contains 43 strains of Mimiviridae (14 in lineage A, 6 in lineage B, and 23 in lineage C) and 17 strains of Marseilleviridae isolated from various environments, including 3 of human origin. This study details the procedures used to build this collection and paves the way for the high-throughput isolation of new isolates to improve the record of giant virus distribution in the environment and the determination of their pangenome.
Some problems of control of dynamical conditions of technological vibrating machines
NASA Astrophysics Data System (ADS)
Kuznetsov, N. K.; Lapshin, V. L.; Eliseev, A. V.
2017-10-01
The possibility of controlling the dynamical condition of shakers designed for vibration treatment of parts interacting with granular media is discussed. The aim of this article is to develop the methodological basis of a technology for creating mathematical models of shake tables and to develop principles for the formation of vibrational fields, the estimation of their parameters, and the control of the structure of vibration fields. Approaches to building mathematical models that take into account unilateral constraints and the relationships between elements and the vibrating surface are developed. Methods for constructing mathematical models of linear mechanical oscillation systems are used, and small oscillations about the position of static equilibrium are considered. An original method of correcting vibration fields by introducing additional ties into the oscillating system is proposed. The additional ties are implemented in the form of a mass-inertial device that changes the inertial parameters of the working body of the vibration table by moving mass-inertial elements. A concept for monitoring the dynamic state of the vibration table based on original measuring devices is proposed, and possible changes in dynamic properties are estimated. The article is of interest for specialists in the field of vibration technology machines and equipment.
Cultural Adaptations of Behavioral Health Interventions: A Progress Report
Barrera, Manuel
2014-01-01
Objective To reduce health disparities, behavioral health interventions must reach subcultural groups and demonstrate effectiveness in improving their health behaviors and outcomes. One approach to developing such health interventions is to culturally adapt original evidence-based interventions. The goals of the paper are to (a) describe consensus on the stages involved in developing cultural adaptations, (b) identify common elements in cultural adaptations, (c) examine evidence on the effectiveness of culturally enhanced interventions for various health conditions, and (d) pose questions for future research. Method Influential literature from the past decade was examined to identify points of consensus. Results There is agreement that cultural adaptation can be organized into five stages: information gathering, preliminary design, preliminary testing, refinement, and final trial. With few exceptions, reviews of several health conditions (e.g., AIDS, asthma, diabetes) concluded that culturally enhanced interventions are more effective in improving health outcomes than usual care or other control conditions. Conclusion Progress has been made in establishing methods for conducting cultural adaptations and providing evidence of their effectiveness. Future research should include evaluations of cultural adaptations developed in stages, tests to determine the effectiveness of cultural adaptations relative to the original versions, and studies that advance our understanding of cultural constructs’ contributions to intervention engagement and efficacy. PMID:22289132
A New Analytic-Adaptive Model for EGS Assessment, Development and Management Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danko, George L
To increase understanding of the energy extraction capacity of Enhanced Geothermal System(s) (EGS), a numerical model development and application project is completed. The general objective of the project is to develop and apply a new, data-coupled Thermal-Hydrological-Mechanical-Chemical (T-H-M-C) model in which the four internal components can be freely selected from existing simulation software without merging and cross-combining a diverse set of computational codes. Eight tasks are completed during the project period. The results are reported in five publications, an MS thesis, twelve quarterly, and two annual reports to DOE. Two US patents have also been issued during the project period, with one patent application originated prior to the start of the project. The “Multiphase Physical Transport Modeling Method and Modeling System” (U.S. Patent 8,396,693 B2, 2013), a key element in the GHE sub-model solution, is successfully used for EGS studies. The “Geothermal Energy Extraction System and Method" invention (U.S. Patent 8,430,166 B2, 2013) originates from the time of project performance, describing a new fluid flow control solution. The new, coupled T-H-M-C numerical model will help in analyzing and designing new, efficient EGS systems.
Possible effect of biotechnology on plant gene pools in Turkey
Demir, Aynur
2015-01-01
The recent rapid developments in biotechnology have made great contributions to the study of plant gene pools. The application of in vitro methods in freeze storage and DNA protection techniques in fast production studies has led to major advances. From that aspect, biotechnology is an indispensable means for the protection of plant gene pools, which underpins sustainable agriculture and the development of species. Besides all the positive developments, one of the primary risks posed by the uncontrolled spreading of genetically modified organisms is the possibility for other non-target organisms to be negatively affected. Genes of plant origin should be given priority in this type of study by taking into consideration such negative effects, which may result in disruption of ecological balance and damage to plant gene pools. Turkey, due to its ecological conditions and history, has a very important position in terms of plant gene pools. This richness ought to be protected, without corrupting its natural quality and natural evolution process, in order to provide the sources of species that will be required for future sustainable agricultural applications. Thus, attention should be paid to the use of biotechnological methods, which play an important role especially in the protection and use of local and original plant gene pools. PMID:26019612
Wang, Wen-Chung; Lai, Yen-Chein
2017-03-14
Mature cystic teratomas are usually found in the ovaries. They are bilateral in 10 to 15% of cases, and multiple cystic teratomas may be present in one ovary. The aim of this study is to clarify whether the development of mature cystic teratomas of the ovaries in a single host is metachronous or due to autoimplantation or recurrence. We report a woman with bilateral mature cystic teratomas of the ovaries. DNA profiles of these teratomas were investigated via short tandem repeat (STR) analysis, and methylation statuses were determined via methylation-sensitive multiplex ligation-dependent probe amplification. The results showed that the cystic teratomas originated from different stages of oogonia or primary oocytes before failure of meiosis I at the germinal vesicle stage of female gametogenesis. Potentially relevant literature was searched in the PubMed database, and cases of bilateral or multiple mature cystic teratomas of the ovaries were analyzed. To date, there has been no reported case of multiple mature cystic teratomas in which the origin was clarified using molecular genetic methods. The results of this case study provide evidence of metachronous development of mature cystic teratomas of the ovaries and may serve as a reference in the management of patients following laparoscopic cystectomy.
Ernst Mach, George Sarton and the Empiry of Teaching Science Part I
NASA Astrophysics Data System (ADS)
Siemsen, Hayo
2012-04-01
George Sarton had a strong influence on modern history of science. The method he pursued throughout his life was the method he had discovered in Ernst Mach's Mechanics when he was a student in Ghent. Sarton was in fact throughout his life implementing a research program inspired by the epistemology of Mach. Sarton in turn inspired many others (James Conant, Thomas Kuhn, Gerald Holton, etc.). What were the origins of these ideas in Mach and what can this origin tell us about the history of science and science education nowadays? Which ideas proved to be successful and which ones need to be improved upon? The following article will elaborate the epistemological questions, which Darwin's "Origin" raised concerning human knowledge and scientific knowledge and which led Mach to adapt the concept of what is "empirical" in contrast to metaphysical a priori assumptions a second time after Galileo. On this basis Sarton proposed "genesis and development" as the major goal of Isis. Mach had elaborated this epistemology in La Connaissance et l'Erreur ( Knowledge and Error), which Sarton read in 1913 (Hiebert 1905/1976; de Mey 1984). Accordingly for Sarton, history becomes not only a subject of science, but a method of science education. Culture—and science as part of culture—is a result of a genetic process. History of science shapes and is shaped by science and science education in a reciprocal process. Its epistemology needs to be adapted to scientific facts and the philosophy of science. Sarton was well aware of the need to develop the history of science and the philosophy of science along the lines of this reciprocal process. It was a very fruitful basis, but a specific part of it, Sarton did not elaborate further, namely the psychology of science education. This proved to be a crucial missing element for all of science education in Sarton's succession, especially in the US. Looking again at the origins of the central questions in the thinking of Mach, which provided the basis and gave rise to Sarton's research program, will help in resolving current epistemic and methodological difficulties, contradictions and impasses in science education influenced by Sarton. The difficulties in science education will prevail as long as the omissions from their Machian origins are not systematically recovered and reintegrated.
Kaplan-Sandquist, Kimberly; LeBeau, Marc A; Miller, Mark L
2014-02-01
Chemical analysis of latent fingermarks, "touch chemistry," has the potential of providing intelligence or forensically relevant information. Matrix-assisted laser desorption ionization/time-of-flight mass spectrometry (MALDI/TOF MS) was used as an analytical platform for obtaining mass spectra and chemical images of target drugs and explosives in fingermark residues following conventional fingerprint development methods and MALDI matrix processing. There were two main purposes of this research: (1) to develop effective laboratory methods for detecting drugs and explosives in fingermark residues, and (2) to determine the feasibility of detecting drugs and explosives after casual contact with pills, powders, and residues. Further, synthetic latent print reference pads were evaluated as mimics of natural fingermark residue to determine whether the pads could be used for method development and quality control. The results suggest that artificial amino acid and sebaceous oil residue pads do not adequately simulate natural fingermark chemistry for MALDI/TOF MS analysis; however, the pads were useful for designing experiments and setting instrumental parameters. Based on the natural fingermark residue experiments, handling whole or broken pills did not transfer sufficient quantities of drugs to allow definitive detection. Transferring drugs or explosives in the form of powders and residues successfully prepared analytes for detection after contact with fingers and deposition of fingermark residue. One drawback of handling powders was that the analyte particles were easily spread beyond the original fingermark during development, whereas analyte particles were confined to the original fingermark when transfer residues were used. MALDI/TOF MS was able to detect procaine, pseudoephedrine, TNT, and RDX from contact residue under laboratory conditions with the integration of conventional fingerprint development methods and MALDI matrix. MALDI/TOF MS is a nondestructive technique that provides chemical information in both mass spectra and chemical images. Published by Elsevier Ireland Ltd.
[Classification and organization technologies in public health].
Filatov, V B; Zhiliaeva, E P; Kal'fa, Iu I
2000-01-01
The authors discuss the impact and main characteristics of organizational technologies in public health and the processes of their development and evaluation. They offer an original definition of the notion of "organization technologies", together with approaches to their classification. A system of logical bases that can be used for classification is offered. These bases include the level of organizational maturity and the stage of development of the organization technology, its assignment to a particular level of management, the type and direction of its influence, its mechanism of effect, its functional group, and its methods of development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, J; Lasio, G; Chen, S
2015-06-15
Purpose: To develop a CBCT HU correction method using a patient-specific HU to mass density conversion curve based on a novel image registration and organ mapping method for head-and-neck radiation therapy. Methods: There are three steps to generate a patient-specific CBCT HU to mass density conversion curve. First, we developed a novel robust image registration method based on sparseness analysis to register the planning CT (PCT) and the CBCT. Second, a novel organ mapping method was developed to transfer the organs at risk (OAR) contours from the PCT to the CBCT, and the corresponding mean HU values of each OAR were measured in both the PCT and CBCT volumes. Third, a set of PCT and CBCT HU to mass density conversion curves were created based on the mean HU values of the OARs and the corresponding mass density of each OAR in the PCT. Then, we compared our proposed conversion curve with the traditional Catphan phantom based CBCT HU to mass density calibration curve. Both curves were input into the treatment planning system (TPS) for dose calculation. Last, the PTV and OAR doses, DVHs, and dose distributions of the CBCT plans were compared to those of the original treatment plan. Results: One head-and-neck case, which contained a pair of PCT and CBCT images, was used. The dose differences between the PCT and CBCT plans using the proposed method are −1.33% for the mean PTV, 0.06% for PTV D95%, and −0.56% for the left neck. The dose differences between plans of PCT and CBCT corrected using the Catphan based method are −4.39% for mean PTV, 4.07% for PTV D95%, and −2.01% for the left neck. Conclusion: The proposed CBCT HU correction method achieves better agreement with the original treatment plan compared to the traditional Catphan based calibration method.
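As a rough illustration of the kind of patient-specific conversion described above, the sketch below (Python) builds a piecewise-linear CBCT HU to mass density mapping from per-OAR mean HU measurements. All HU and density values here are hypothetical stand-ins, and this is not the authors' implementation.

```python
import numpy as np

# Hypothetical per-OAR measurements: mean HU of each mapped organ in the CBCT
# and the mass density assigned to that organ from the planning CT calibration.
cbct_mean_hu = np.array([-950.0, -80.0, 0.0, 40.0, 300.0, 1200.0])  # air, fat, water, muscle, cartilage, bone (assumed)
mass_density = np.array([0.001, 0.92, 1.00, 1.05, 1.10, 1.60])      # g/cm^3, assumed values

# Sort by HU so the curve is monotonic in HU, then interpolate any CBCT voxel value.
order = np.argsort(cbct_mean_hu)
hu_pts, rho_pts = cbct_mean_hu[order], mass_density[order]

def cbct_hu_to_density(hu):
    """Patient-specific CBCT HU -> mass density via piecewise-linear interpolation."""
    return np.interp(hu, hu_pts, rho_pts)

cbct_volume = np.random.randint(-1000, 1500, size=(4, 4, 4)).astype(float)  # stand-in volume
density_volume = cbct_hu_to_density(cbct_volume)
```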
Numerical Modeling of Poroelastic-Fluid Systems Using High-Resolution Finite Volume Methods
NASA Astrophysics Data System (ADS)
Lemoine, Grady
Poroelasticity theory models the mechanics of porous, fluid-saturated, deformable solids. It was originally developed by Maurice Biot to model geophysical problems, such as seismic waves in oil reservoirs, but has also been applied to modeling living bone and other porous media. Poroelastic media often interact with fluids, such as in ocean bottom acoustics or propagation of waves from soft tissue into bone. This thesis describes the development and testing of high-resolution finite volume numerical methods, and simulation codes implementing these methods, for modeling systems of poroelastic media and fluids in two and three dimensions. These methods operate on both rectilinear grids and logically rectangular mapped grids. To allow the use of these methods, Biot's equations of poroelasticity are formulated as a first-order hyperbolic system with a source term; this source term is incorporated using operator splitting. Some modifications are required to the classical high-resolution finite volume method. Obtaining correct solutions at interfaces between poroelastic media and fluids requires a novel transverse propagation scheme and the removal of the classical second-order correction term at the interface, and in three dimensions a new wave limiting algorithm is also needed to correctly limit shear waves. The accuracy and convergence rates of the methods of this thesis are examined for a variety of analytical solutions, including simple plane waves, reflection and transmission of waves at an interface between different media, and scattering of acoustic waves by a poroelastic cylinder. Solutions are also computed for a variety of test problems from the computational poroelasticity literature, as well as some original test problems designed to mimic possible applications for the simulation code.
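The splitting strategy mentioned above can be written schematically as follows. This is only a generic statement of a first-order hyperbolic system with a source term handled by (e.g. Godunov-type) operator splitting, not the thesis' exact poroelastic system.

```latex
% Schematic only: q collects the poroelastic field variables (velocities,
% stresses, pore pressure); A, B are coefficient matrices and \psi the source term.
\[
  \frac{\partial q}{\partial t}
    + A(x,y)\,\frac{\partial q}{\partial x}
    + B(x,y)\,\frac{\partial q}{\partial y}
    = \psi(q),
\]
\[
  q^{*} = \mathcal{H}(\Delta t)\,q^{n}
  \quad \text{(homogeneous finite volume step)},
  \qquad
  q^{n+1} = q^{*} + \Delta t\,\psi(q^{*})
  \quad \text{(source term step).}
\]
```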
An inverse method for the aerodynamic design of three-dimensional aircraft engine nacelles
NASA Technical Reports Server (NTRS)
Bell, R. A.; Cedar, R. D.
1991-01-01
A fast, efficient and user friendly inverse design system for 3-D nacelles was developed. The system is a product of a 2-D inverse design method originally developed at NASA Langley and the CFL3D analysis code, which was also developed at NASA Langley and modified for nacelle analysis. The design system uses a predictor/corrector design approach in which an analysis code is used to calculate the flow field for an initial geometry; the geometry is then modified based on the difference between the calculated and target pressures. A detailed discussion of the design method, the process of linking it to the modified CFL3D solver, and its extension to 3-D is presented. This is followed by a number of examples of the use of the design system for the design of both axisymmetric and 3-D nacelles.
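A minimal sketch of a predictor/corrector inverse-design loop of this kind is given below (Python). The flow "solver" and the linear geometry-update rule are stand-ins invented for illustration; the actual system couples the modified CFL3D solver to the NASA Langley design method.

```python
import numpy as np

def run_flow_analysis(geometry):
    """Placeholder for the flow analysis (a CFL3D-like solver would go here):
    returns a surface pressure distribution for the current nacelle geometry."""
    # Hypothetical smooth, monotonic response used only so the sketch runs end to end.
    return 0.2 + 0.8 * geometry + 0.02 * np.sin(5.0 * geometry)

def predictor_corrector_design(geometry, target_cp, relax=0.3, tol=1e-4, max_iter=200):
    """Sketch of a predictor/corrector inverse-design loop: analyse the current
    shape, compare computed and target pressures, and nudge the geometry to
    shrink the mismatch."""
    for _ in range(max_iter):
        cp = run_flow_analysis(geometry)      # predictor: analyse current shape
        dcp = target_cp - cp                  # pressure mismatch
        if np.max(np.abs(dcp)) < tol:
            break
        geometry = geometry + relax * dcp     # corrector: hypothetical linear update rule
    return geometry

x = np.linspace(0.0, 1.0, 41)                                # stations along the nacelle
target = run_flow_analysis(0.1 + 0.02 * np.sin(np.pi * x))   # pressures of a "target" shape
designed = predictor_corrector_design(np.full_like(x, 0.1), target)
```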
Cu-catalyzed aerobic oxidative esterification of acetophenones with alcohols to α-ketoesters.
Xu, Xuezhao; Ding, Wen; Lin, Yuanguang; Song, Qiuling
2015-02-06
Copper-catalyzed aerobic oxidative esterification of acetophenones with alcohols using molecular oxygen has been developed to form a broad range of α-ketoesters in good yields. In addition to reporting scope and limitations of our new method, mechanism studies are reported that reveal that the carbonyl oxygen in the ester mainly originated from dioxygen.
The world of geography: Visualizing a knowledge domain with cartographic means
Skupin, André
2004-01-01
From an informed critique of existing methods to the development of original tools, cartographic engagement can provide a unique perspective on knowledge domain visualization. Along with a discussion of some principles underlying a cartographically informed visualization methodology, results of experiments involving several thousand conference abstracts will be sketched and their plausibility reflected on. PMID:14764896
ERIC Educational Resources Information Center
Donley, Audrey Bell
Using the historical, documentary analysis, and questionnaire methods of research, this study traces the development and evolution of cooperative office education in the secondary schools of the United States from 1900 through 1969. The study was organized under the following topical divisions: (1) Origin of Vocational Education, (2) Development…
The Development and Application of Novel Methods for the Solution of EMP Shielding Problems.
1981-02-01
Physical and mathematical modeling of pollutant emissions when burning peat
NASA Astrophysics Data System (ADS)
Vasilyev, A.; Lozhkin, V.; Tarkhov, D.; Lozhkina, O.; Timofeev, V.
2017-11-01
The article presents an original neural network model of CO dispersion around the experimentally simulated peat fire. It is a self-learning model considering both the measured CO concentrations in the smoke cloud and the refined coefficients of the main equation. The method is recommended for the development of air quality control and forecasting systems.
Least-squares Minimization Approaches to Interpret Total Magnetic Anomalies Due to Spheres
NASA Astrophysics Data System (ADS)
Abdelrahman, E. M.; El-Araby, T. M.; Soliman, K. S.; Essa, K. S.; Abo-Ezz, E. R.
2007-05-01
We have developed three different least-squares approaches to determine successively: the depth, magnetic angle, and amplitude coefficient of a buried sphere from a total magnetic anomaly. By defining the anomaly value at the origin and the nearest zero-anomaly distance from the origin on the profile, the problem of depth determination is transformed into the problem of finding a solution of a nonlinear equation of the form f(z)=0. Knowing the depth and applying the least-squares method, the magnetic angle and amplitude coefficient are determined using two simple linear equations. In this way, the depth, magnetic angle, and amplitude coefficient are determined individually from all observed total magnetic data. The method is applied to synthetic examples with and without random errors and tested on a field example from Senegal, West Africa. In all cases, the depth solutions are in good agreement with the actual ones.
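The two-stage structure, a one-dimensional depth determination followed by linear least squares for the magnetic angle and amplitude coefficient, can be sketched as below. The sphere kernel used here is the commonly quoted total-field expression, and the paper's f(z)=0 depth equation is replaced by a simple bounded search over the least-squares misfit, so this is an illustration of the idea rather than the paper's derivation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def sphere_kernels(x, z):
    """Kernels of the total-field anomaly of a sphere along a profile over its centre,
    written so the anomaly is linear in p = K*sin(theta) and q = K*cos(theta):
    T(x) = p*(2z^2 - x^2)/r^5 + q*(-3xz)/r^5, with r = sqrt(x^2 + z^2).
    This standard expression is assumed here; the paper's exact kernel may differ."""
    r5 = (x**2 + z**2) ** 2.5
    return (2 * z**2 - x**2) / r5, (-3 * x * z) / r5

def fit_angle_amplitude(x, t_obs, z):
    """Given the depth z, recover magnetic angle and amplitude by linear least squares."""
    a, b = sphere_kernels(x, z)
    G = np.column_stack([a, b])
    (p, q), *_ = np.linalg.lstsq(G, t_obs, rcond=None)
    return np.degrees(np.arctan2(p, q)), np.hypot(p, q)   # theta (deg), K

def misfit(z, x, t_obs):
    a, b = sphere_kernels(x, z)
    G = np.column_stack([a, b])
    coef, *_ = np.linalg.lstsq(G, t_obs, rcond=None)
    return np.sum((G @ coef - t_obs) ** 2)

# Synthetic profile: depth 10, theta 40 deg, K = 5000 (arbitrary units).
x = np.linspace(-60, 60, 121)
a, b = sphere_kernels(x, 10.0)
t_obs = 5000 * np.sin(np.radians(40)) * a + 5000 * np.cos(np.radians(40)) * b

# Stand-in for the paper's f(z) = 0 step: a bounded 1-D search over depth.
z_hat = minimize_scalar(misfit, bounds=(1, 50), args=(x, t_obs), method="bounded").x
theta_hat, K_hat = fit_angle_amplitude(x, t_obs, z_hat)
```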
Markeeva, M V; Mareev, O V; Nikolenko, V N; Mareev, G O; Danilova, T V; Fadeeva, E A; Fedorov, R V
The objective of the present work was to study the relationship between the dimensions of the ethmoidal labyrinth and the skull in subjects differing in nose shape, by means of factorial and correlation analysis and modern computer-assisted methods for three-dimensional reconstruction of the skull. We developed an original method for computed craniometry, using an original program that made it possible to determine the standard intravital craniometric characteristics of the human skull with a high degree of accuracy, based on the analysis of 200 computed tomograms of the head. It was shown that the lengths of the inferior turbinated bones and of the posterior edge of the orbital plate are of special relevance for practically all parameters of the ethmoidal labyrinth. Also, the width of the choanae is positively related to the height of the ethmoidal labyrinth.
NASA Astrophysics Data System (ADS)
Ni, Yong; He, Linghui; Khachaturyan, Armen G.
2010-07-01
A phase field method is proposed to determine the equilibrium fields of a magnetoelectroelastic multiferroic with arbitrarily distributed constitutive constants under applied loadings. This method is based on a generalized Eshelby equivalency principle, in which the elastic strain, electrostatic, and magnetostatic fields at equilibrium in the original heterogeneous system are exactly the same as those in an equivalent homogeneous magnetoelectroelastic coupled or uncoupled system with properly chosen distributed effective eigenstrain, polarization, and magnetization fields. Finding these effective fields fully solves the equilibrium elasticity, electrostatics, and magnetostatics in the original heterogeneous multiferroic. The paper formulates a variational principle proving that the effective fields are minimizers of an appropriate closed-form energy functional. The proposed phase field approach produces the energy-minimizing effective fields (thus solving the general multiferroic problem) as the result of an artificial relaxation process described by the Ginzburg-Landau-Khalatnikov kinetic equations.
Use of ground radar to detect reentering debris
NASA Technical Reports Server (NTRS)
Crews, J. L.
1985-01-01
The velocity of the particles is required to identify the type of particles producing the ionization trails. A method of approximating the velocity of a meteor from radar data was developed. The method requires the time between the spacings of the Fresnel interference fringes, the range to the ionization trail, and the wavelength of the radar system. The orbital mechanics of the problem are evaluated; if the particles originate from the shuttle, the orbital mechanics will substantiate the relative position of the particles with respect to the position of the shuttle. A program to determine spacecraft orbital decay due to perturbations is utilized for a preliminary evaluation of the orbital mechanics of the problem. Many assumptions concerning the size, shape, density, etc. of the particles are necessary for the preliminary evaluation. The results do not negate the possibility that the events observed by the radar are reentering particles originating from the shuttle.
Functional Carbon Quantum Dots: A Versatile Platform for Chemosensing and Biosensing.
Feng, Hui; Qian, Zhaosheng
2018-05-01
The carbon quantum dot has emerged as a promising new fluorescent nanomaterial due to its excellent optical properties, outstanding biocompatibility and accessible fabrication methods, and has shown great application potential in a variety of areas, especially chemosensing and biosensing. In this personal account, we give a brief overview of carbon quantum dots, from their origin and preparation methods to recent advances in understanding their fluorescence origin, and focus on the development of chemosensors and biosensors based on functional carbon quantum dots. Comprehensive advances from our group on functional carbon quantum dots as a versatile sensing platform are summarized, together with some typical examples from other groups. Biosensing applications of functional carbon quantum dots are highlighted, from selective assays of enzyme activity to fluorescent identification of cancer cells and bacteria. © 2018 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fast dictionary generation and searching for magnetic resonance fingerprinting.
Jun Xie; Mengye Lyu; Jian Zhang; Hui, Edward S; Wu, Ed X; Ze Wang
2017-07-01
A super-fast dictionary generation and searching (DGS) algorithm was developed for MR parameter quantification using magnetic resonance fingerprinting (MRF). MRF is a new technique for simultaneously quantifying multiple MR parameters using one temporally resolved MR scan, but it has a multiplicative computation complexity, resulting in a heavy burden of dictionary generation, storage, and retrieval that can easily become intractable even for state-of-the-art computers. Based on a retrospective analysis of the dictionary-matching objective function, a multi-scale, ZOOM-like DGS algorithm, dubbed MRF-ZOOM, was proposed. MRF-ZOOM is quasi-parameter-separable, so the multiplicative computation complexity is broken into an additive one. Evaluations showed that MRF-ZOOM was hundreds or thousands of times faster than the original MRF parameter quantification method, even without counting the dictionary generation time. Using real data, it yielded nearly the same results as the original method. MRF-ZOOM provides a super-fast solution for MR parameter quantification.
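A toy coarse-to-fine search in the spirit of MRF-ZOOM is sketched below (Python). The signal model is a made-up stand-in for a Bloch/EPG fingerprint simulation, and the sketch illustrates only the multi-scale zooming idea, not the quasi-parameter-separable objective exploited in the paper.

```python
import numpy as np

def toy_signal(t1, t2, tr=np.linspace(10, 3000, 200)):
    """Toy stand-in for an MRF sequence simulation: a smooth signature that
    depends on (T1, T2). A real implementation would use Bloch/EPG simulation."""
    return (1 - np.exp(-tr / t1)) * np.exp(-tr / (10 * t2))

def zoom_match(measured, t1_range=(100, 3000), t2_range=(10, 300), levels=4, grid=9):
    """Coarse-to-fine (ZOOM-like) search: at each level, evaluate a small grid of
    candidate (T1, T2) pairs, keep the best match, and shrink the search window
    around it, instead of building one huge precomputed dictionary."""
    m = measured / np.linalg.norm(measured)
    for _ in range(levels):
        t1s, t2s = np.linspace(*t1_range, grid), np.linspace(*t2_range, grid)
        best, best_score = None, -np.inf
        for t1 in t1s:
            for t2 in t2s:
                d = toy_signal(t1, t2)
                score = np.dot(m, d / np.linalg.norm(d))   # inner-product matching
                if score > best_score:
                    best, best_score = (t1, t2), score
        # shrink the windows around the current best estimate
        w1 = (t1_range[1] - t1_range[0]) / grid
        w2 = (t2_range[1] - t2_range[0]) / grid
        t1_range = (max(best[0] - w1, 1.0), best[0] + w1)
        t2_range = (max(best[1] - w2, 1.0), best[1] + w2)
    return best

measured = toy_signal(800, 80) + 0.01 * np.random.randn(200)
t1_hat, t2_hat = zoom_match(measured)
```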
Scientometric methods for identifying emerging technologies
Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T
2015-11-03
Provided is a method of generating a scientometric model that tracks the emergence of an identified technology from initial discovery (via original scientific and conference literature), through critical discoveries (via original scientific, conference literature and patents), transitioning through Technology Readiness Levels (TRLs) and ultimately on to commercial application. During the period of innovation and technology transfer, the impact of scholarly works, patents and on-line web news sources are identified. As trends develop, currency of citations, collaboration indicators, and on-line news patterns are identified. The combinations of four distinct and separate searchable on-line networked sources (i.e., scholarly publications and citation, worldwide patents, news archives, and on-line mapping networks) are assembled to become one collective network (a dataset for analysis of relations). This established network becomes the basis from which to quickly analyze the temporal flow of activity (searchable events) for the example subject domain.
Karabagias, Ioannis K; Karabournioti, Sofia
2018-05-03
Twenty-two honey samples, namely clover and citrus honeys, were collected from the greater Cairo area during the harvesting year 2014–2015. The main purpose of the present study was to characterize the aforementioned honey types and to investigate whether the use of easily assessable physicochemical parameters, including color attributes in combination with chemometrics, could differentiate honey floral origin. Parameters taken into account were: pH, electrical conductivity, ash, free acidity, lactonic acidity, total acidity, moisture content, total sugars (degrees Brix-°Bx), total dissolved solids and their ratio to total acidity, salinity, CIELAB color parameters, along with browning index values. Results showed that all honey samples analyzed met the European quality standards set for honey and had variations in the aforementioned physicochemical parameters depending on floral origin. Application of linear discriminant analysis showed that eight physicochemical parameters, including color, could classify Egyptian honeys according to floral origin (p < 0.05). Correct classification rate was 95.5% using the original method and 90.9% using the cross validation method. The discriminatory ability of the developed model was further validated using unknown honey samples. The overall correct classification rate was not affected. Specific physicochemical parameter analysis in combination with chemometrics has the potential to enhance the differences in floral honeys produced in a given geographical zone.
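The reported "original" and cross-validated classification rates correspond to resubstitution and cross-validated accuracy of a linear discriminant model. A minimal sketch with stand-in data is shown below (Python, scikit-learn); it is not the authors' dataset or code.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score, LeaveOneOut

rng = np.random.default_rng(0)
# Stand-in data: 22 honeys x 8 physicochemical/colour parameters, two floral origins.
X = np.vstack([rng.normal(0.0, 1.0, (11, 8)), rng.normal(1.5, 1.0, (11, 8))])
y = np.array(["clover"] * 11 + ["citrus"] * 11)

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
resub_rate = lda.score(X, y)                                    # "original method" rate
cv_rate = cross_val_score(lda, X, y, cv=LeaveOneOut()).mean()   # cross-validated rate
print(f"resubstitution: {resub_rate:.3f}, leave-one-out: {cv_rate:.3f}")
```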
NASA Astrophysics Data System (ADS)
Yuan, C. C.; Zhang, D.; Gan, Y.
2017-03-01
Engineering atomic force microscopy tips for reliable tip-enhanced Raman spectroscopy (TERS) and for the colloidal probe technique is becoming routine practice in many labs. In this 10-year update review, various new tip modification methods developed over the past decade are briefly reviewed to help researchers select the appropriate method. The perspective is placed in a larger context to discuss the opportunities and challenges in this area, including novel combinations of seemingly different methods, potential applications of some methods that were not originally intended for TERS tip fabrication, and the problems of high cost and poor reproducibility of tip fabrication.
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.
1990-01-01
An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.
Nishio, Kengo; Miyazaki, Takehide
2017-01-01
Polyhedral tilings are often used to represent structures such as atoms in materials, grains in crystals, foams, and galaxies in the universe. In a previous paper, we developed a theory to convert the way polyhedra are arranged to form a polyhedral tiling into a codeword (a series of numbers) from which the original structure can be recovered. That theory is based on the idea of forming a polyhedral tiling by gluing together polyhedra face to face. In this paper, we show that the codeword contains redundant digits not needed for recovering the original structure, and develop a theory to reduce the redundancy. For this purpose, instead of polyhedra, we regard the two-dimensional regions shared by faces of adjacent polyhedra as the building blocks of a polyhedral tiling. Using the present method, the same information is represented by a shorter codeword whose length is reduced by up to half of the original. Shorter codewords are easier to handle for both humans and computers, and thus more useful for describing polyhedral tilings. By generalizing the idea of assembling two-dimensional components to higher-dimensional polytopes, we develop a unified theory that represents polyhedral tilings and polytopes of different dimensions in the same light. PMID:28094254
Development of alternative versions of the Logical Memory subtest of the WMS-R for use in Brazil
Bolognani, Silvia Adriana Prado; Miranda, Monica Carolina; Martins, Marjorie; Rzezak, Patricia; Bueno, Orlando Francisco Amodeo; de Camargo, Candida Helena Pires; Pompeia, Sabine
2015-01-01
The logical memory test of the Wechsler Memory Scale is one of the most frequently used standardized tests for assessing verbal memory and consists of two separate short stories each containing 25 idea units. Problems with practice effects arise with re-testing a patient, as these stories may be remembered from previous assessments. Therefore, alternative versions of the test stimuli should be developed to minimize learning effects when repeated testing is required for longitudinal evaluations of patients. Objective: To present three alternative stories for each of the original stories frequently used in Brazil (Ana Soares and Roberto Mota) and to show their similarity in terms of content, structure and linguistic characteristics. Methods: The alternative stories were developed according to the following criteria: overall structure or thematic content (presentation of the character, conflict, aggravation or complements and resolution); specific structure (sex of the character, location and occupation, details of what happened); formal structure (number of words, characters, verbs and nouns); and readability. Results: The alternative stories and scoring criteria are presented in comparison to the original WMS stories (Brazilian version). Conclusion: The alternative stories presented here correspond well thematically and structurally to the Brazilian versions of the original stories. PMID:29213955
NASA Astrophysics Data System (ADS)
Nürnberg, F.; Kühn, B.; Rollmann, K.
2016-12-01
In over 100 years of quartz glass fabrication, the applications and the optical requirements for this type of optical material have changed significantly. Applications such as spectroscopy, UV flash lamps and the Apollo missions, as well as the growth in UV and IR applications, have directed quartz glass development towards new products, technologies and methods of measurement. The limits of the original measurement methods have been reached, and more sensitive measurements with precise resolution had to be found for transmission, purity, radiation resistance, absorption, thermal and mechanical stability, as well as optical properties like homogeneity, stress birefringence, striae and bubbles/inclusions. This article provides an overview of the development of measuring methods for quartz glass, discusses their limits and accuracy, and points out the parameters that are of high relevance for today's laser applications.
Korzh, Vladimir; Strähle, Uwe
2002-08-01
A hundred years ago, Dr. Marshall A. Barber proposed a new technique - the microinjection technique. He developed this method initially to clone bacteria and to confirm the germ theory of Koch and Pasteur. Later on, he refined his approach and was able to manipulate nuclei in protozoa and to implant bacteria into plant cells. Continuous improvement and adaptation of this method to new applications dramatically changed experimental embryology and cytology and led to the formation of several new scientific disciplines including animal cloning as one of its latest applications. Interestingly, microinjection originated as a method at the crossroad of bacteriology and plant biology, demonstrating once again the unforeseen impact that basic research in an unrelated field can have on the development of entirely different disciplines.
Qi, Luming; Liu, Honggao; Li, Jieqing; Li, Tao; Wang, Yuanzhong
2018-01-15
Origin traceability is an important step in controlling the nutritional and pharmacological quality of food products. The Boletus edulis mushroom is a well-known food resource worldwide, and its nutritional and medicinal properties vary drastically depending on geographical origin. In this study, three sensor systems (inductively coupled plasma atomic emission spectrophotometry (ICP-AES), ultraviolet-visible (UV-Vis) spectroscopy and Fourier transform mid-infrared spectroscopy (FT-MIR)) were applied, in combination with chemometrics, to the origin traceability of 192 mushroom samples (caps and stipes). The difference between cap and stipe was clearly illustrated based on each single sensor technique. Feature variables from the three instruments were used for origin traceability. Two supervised classification methods, partial least squares discriminant analysis (PLS-DA) and grid search support vector machine (GS-SVM), were applied to develop mathematical models. Two steps (internal cross-validation and external prediction for unknown samples) were used to evaluate the performance of a classification model. The results are satisfactory, with high accuracies ranging from 90.625% to 100%, and the models show excellent generalization ability with the optimal parameters. Based on the combination of the three sensor systems, our study provides a multi-sensory and comprehensive origin traceability of B. edulis mushrooms.
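A grid-search SVM of the kind referred to as GS-SVM, with internal cross-validation and an external prediction set, can be sketched as follows (Python, scikit-learn). The fused spectral features are simulated stand-ins, not the study's data.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(1)
# Stand-in fused spectral features: 192 samples x 30 variables, four origins.
X = rng.normal(size=(192, 30)) + np.repeat(np.arange(4), 48)[:, None] * 0.8
y = np.repeat(["origin_A", "origin_B", "origin_C", "origin_D"], 48)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

# Grid-search SVM (GS-SVM): scan C and gamma by internal cross-validation,
# then check generalisation on the held-out "unknown" samples.
param_grid = {"svc__C": [0.1, 1, 10, 100], "svc__gamma": ["scale", 0.01, 0.1]}
gs = GridSearchCV(make_pipeline(StandardScaler(), SVC(kernel="rbf")), param_grid, cv=5)
gs.fit(X_tr, y_tr)
print("best params:", gs.best_params_, "external accuracy:", gs.score(X_te, y_te))
```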
Qi, Luming; Liu, Honggao; Li, Jieqing; Li, Tao
2018-01-01
Origin traceability is an important step in controlling the nutritional and pharmacological quality of food products. The Boletus edulis mushroom is a well-known food resource worldwide, and its nutritional and medicinal properties vary drastically depending on geographical origin. In this study, three sensor systems (inductively coupled plasma atomic emission spectrophotometry (ICP-AES), ultraviolet-visible (UV-Vis) spectroscopy and Fourier transform mid-infrared spectroscopy (FT-MIR)) were applied, in combination with chemometrics, to the origin traceability of 184 mushroom samples (caps and stipes). The difference between cap and stipe was clearly illustrated based on each single sensor technique. Feature variables from the three instruments were used for origin traceability. Two supervised classification methods, partial least squares discriminant analysis (PLS-DA) and grid search support vector machine (GS-SVM), were applied to develop mathematical models. Two steps (internal cross-validation and external prediction for unknown samples) were used to evaluate the performance of a classification model. The results are satisfactory, with high accuracies ranging from 90.625% to 100%, and the models show excellent generalization ability with the optimal parameters. Based on the combination of the three sensor systems, our study provides a multi-sensory and comprehensive origin traceability of B. edulis mushrooms. PMID:29342969
Haematological validation of a computer-based bone marrow reporting system.
Nguyen, D T; Diamond, L W; Cavenagh, J D; Parameswaran, R; Amess, J A
1997-01-01
AIMS: To prove the safety and effectiveness of "Professor Belmonte", a knowledge-based system for bone marrow reporting, a formal evaluation of the reports generated by the system was performed. METHODS: Three haematologists (a consultant, a senior registrar, and a junior registrar), none of whom were involved in the development of the software, compared the unedited reports generated by Professor Belmonte with the original bone marrow reports in 785 unselected cases. Each haematologist independently graded the quality of Belmonte's reports using one of four categories: (a) better than the original report (more informative, containing useful information missing in the original report); (b) equivalent to the original report; (c) satisfactory, but missing information that should have been included; and (d) unsatisfactory. RESULTS: The consultant graded 64 reports as more informative than the original, 687 as equivalent to the original, 32 as satisfactory, and two as unsatisfactory. The senior registrar considered 29 reports to be better than the original, 739 to be equivalent to the original, 15 to be satisfactory, and two to be unsatisfactory. The junior registrar found that 88 reports were better than the original, 681 were equivalent to the original, 14 were satisfactory, and two were unsatisfactory. Each judge found two different reports to be unsatisfactory according to their criteria. All 785 reports generated by the computer system received at least two scores of satisfactory or better. CONCLUSIONS: In this representative study, Professor Belmonte generated bone marrow reports that proved to be as accurate as the original reports in a large university hospital. The haematology knowledge contained within the system, the reasoning process, and the function of the software are safe and effective for assisting haematologists in generating high quality bone marrow reports. PMID:9215118
Shen, Qing; Dong, Wei; Yang, Mei; Li, Linqiu; Cheung, Hon-Yeung; Zhang, Zhifeng
2013-08-14
A matrix solid-phase dispersion (MSPD) procedure with titanium dioxide (TiO2) nanoparticles (NP) as sorbent was developed for the selective extraction of phospholipids from almond samples, and matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF/MS) was employed for analysis. A remarkable increase in the signals of phospholipid accompanied by a decrease in those of triacylglycerols and diacylglycerols was observed in the relevant mass spectra. The proposed method was applied to five batches of almonds originating from four geographical areas, whereas principal component analysis (PCA) was utilized to normalize the relative amounts of the identified phospholipid species. The results indicated that the lipidomic fingerprint of almonds was successfully established by the negative ion mode spectrum, and the ratio of m/z 833.6 to 835.6 as well as m/z 821.6 could be introduced as potential markers for the differentiation of the tested almonds with different geographical origins. The whole method is of great promise for selective separation of phospholipids from nonphospholipids, especially the glycerides, and superior in fast screening and characterization of phospholipids in almond samples.
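A minimal sketch of the normalization-plus-PCA step and the reported marker ratio is shown below (Python, scikit-learn). The peak table is invented for illustration; only the m/z values named in the abstract are taken from the text.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Stand-in peak table: rows = almond samples, columns = selected phospholipid m/z peaks.
mz = np.array([821.6, 833.6, 835.6, 857.6, 885.6])
intensities = np.abs(rng.normal(loc=[5, 8, 4, 6, 3], scale=1.0, size=(20, 5)))

# Normalise each spectrum to its total intensity (relative amounts) before PCA.
rel = intensities / intensities.sum(axis=1, keepdims=True)
scores = PCA(n_components=2).fit_transform(rel)   # sample scores for origin grouping

# Candidate origin markers analogous to those reported: ratio of m/z 833.6 to 835.6.
ratio_833_835 = intensities[:, mz == 833.6].ravel() / intensities[:, mz == 835.6].ravel()
```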
Lu, Jia-Yang; Cheung, Michael Lok-Man; Huang, Bao-Tian; Wu, Li-Li; Xie, Wen-Jia; Chen, Zhi-Jian; Li, De-Rui; Xie, Liang-Xi
2015-01-01
To assess the performance of a simple optimisation method for improving target coverage and organ-at-risk (OAR) sparing in intensity-modulated radiotherapy (IMRT) for cervical oesophageal cancer. For 20 selected patients, clinically acceptable original IMRT plans (Original plans) were created, and two optimisation methods were adopted to improve the plans: 1) a base dose function (BDF)-based method, in which the treatment plans were re-optimised based on the original plans, and 2) a dose-controlling structure (DCS)-based method, in which the original plans were re-optimised by assigning additional constraints for hot and cold spots. The Original, BDF-based and DCS-based plans were compared with regard to target dose homogeneity, conformity, OAR sparing, planning time and monitor units (MUs). Dosimetric verifications were performed and delivery times were recorded for the BDF-based and DCS-based plans. The BDF-based plans provided significantly superior dose homogeneity and conformity compared with both the DCS-based and Original plans. The BDF-based method further reduced the doses delivered to the OARs by approximately 1-3%. The re-optimisation time was reduced by approximately 28%, but the MUs and delivery time were slightly increased. All verification tests were passed and no significant differences were found. The BDF-based method for the optimisation of IMRT for cervical oesophageal cancer can achieve significantly better dose distributions with better planning efficiency at the expense of slightly more MUs.
Xiao, Ya-Bing; Zhang, Man; Wen, Hua-Wei
2014-04-01
A method for the simultaneous determination of arsanilic acid (ASA), nitarsone (NIT) and roxarsone (ROX) residues in foods of animal origin was developed using accelerated solvent extraction coupled with liquid chromatography-atomic fluorescence spectrometry (ASE-LC-AFS). Ultrasound centrifugation extraction and accelerated solvent extraction were compared, and the accelerated solvent extraction conditions, namely the proportion of extraction solvent, the extraction temperature, the extraction time and the number of extraction cycles, were optimized, as were the LC-AFS operating conditions and the mobile phase. Under the optimal conditions, the calibration curves for ASA, NIT and ROX were linear over the concentration range of 0-2.0 mg/L, with correlation coefficients of 0.9992-0.9998. The detection limits of ASA, NIT and ROX were 2.4, 7.4 and 4.1 µg/L, respectively. The average recoveries of ASA, NIT and ROX from two samples spiked at three levels (0.5, 2 and 5 mg/kg) were in the ranges of 87.1%-93.2%, 85.2%-93.9% and 84.2%-93.7%, with RSDs of 1.4%-4.6%, 1.2%-4.2% and 1.1%-4.5%, respectively. The method is convenient and repeatable, and is feasible for the analysis of ASA, NIT and ROX in foods of animal origin.
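For reference, spike recoveries and relative standard deviations of the kind reported above are computed as in the following sketch (Python, with hypothetical replicate values).

```python
import numpy as np

spike_level = 2.0                                    # mg/kg added to the sample
measured = np.array([1.78, 1.85, 1.82, 1.90, 1.80])  # hypothetical replicate results, mg/kg
blank = 0.0                                          # analyte found in the unspiked sample

recovery = (measured - blank) / spike_level * 100.0        # % recovery per replicate
mean_recovery = recovery.mean()
rsd = measured.std(ddof=1) / measured.mean() * 100.0       # relative standard deviation, %
print(f"mean recovery {mean_recovery:.1f}%, RSD {rsd:.1f}%")
```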
The Complex Admixture History and Recent Southern Origins of Siberian Populations
Pugach, Irina; Matveev, Rostislav; Spitsyn, Viktor; Makarov, Sergey; Novgorodov, Innokentiy; Osakovsky, Vladimir; Stoneking, Mark; Pakendorf, Brigitte
2016-01-01
Although Siberia was inhabited by modern humans at an early stage, there is still debate over whether it remained habitable during the extreme cold of the Last Glacial Maximum or whether it was subsequently repopulated by peoples with recent shared ancestry. Previous studies of the genetic history of Siberian populations were hampered by the extensive admixture that appears to have taken place among these populations, because commonly used methods assume a tree-like population history and at most single admixture events. Here we analyze geogenetic maps and use other approaches to distinguish the effects of shared ancestry from prehistoric migrations and contact, and develop a new method based on the covariance of ancestry components, to investigate the potentially complex admixture history. We furthermore adapt a previously devised method of admixture dating for use with multiple events of gene flow, and apply these methods to whole-genome genotype data from over 500 individuals belonging to 20 different Siberian ethnolinguistic groups. The results of these analyses indicate that there have been multiple layers of admixture detectable in most of the Siberian populations, with considerable differences in the admixture histories of individual populations. Furthermore, most of the populations of Siberia included here, even those settled far to the north, appear to have a southern origin, with the northward expansions of different populations possibly being driven partly by the advent of pastoralism, especially reindeer domestication. These newly developed methods to analyze multiple admixture events should aid in the investigation of similarly complex population histories elsewhere. PMID:26993256
Eulerian formulation of the interacting particle representation model of homogeneous turbulence
Campos, Alejandro; Duraisamy, Karthik; Iaccarino, Gianluca
2016-10-21
The Interacting Particle Representation Model (IPRM) of homogeneous turbulence incorporates information about the morphology of turbulent structures within the confines of a one-point model. In the original formulation [Kassinos & Reynolds, Center for Turbulence Research: Annual Research Briefs, 31-51 (1996)], the IPRM was developed in a Lagrangian setting by evolving second moments of velocity conditional on a given gradient vector. In the present work, the IPRM is re-formulated in an Eulerian framework and evolution equations are developed for the marginal PDFs. Eulerian methods avoid the issues associated with statistical estimators used by Lagrangian approaches, such as slow convergence. A specific emphasis of this work is to use the IPRM to examine the long time evolution of homogeneous turbulence. We first describe the derivation of the marginal PDF in spherical coordinates, which reduces the number of independent variables and the cost associated with Eulerian simulations of PDF models. Next, a numerical method based on radial basis functions over a spherical domain is adapted to the IPRM. Finally, results obtained with the new Eulerian solution method are thoroughly analyzed. The sensitivity of the Eulerian simulations to parameters of the numerical scheme, such as the size of the time step and the shape parameter of the radial basis functions, is examined. A comparison between Eulerian and Lagrangian simulations is performed to discern the capabilities of each of the methods. Finally, a linear stability analysis based on the eigenvalues of the discrete differential operators is carried out for both the new Eulerian solution method and the original Lagrangian approach.
Mapping yeast origins of replication via single-stranded DNA detection.
Feng, Wenyi; Raghuraman, M K; Brewer, Bonita J
2007-02-01
Studies in the yeast Saccharomyces cerevisiae have provided a framework for understanding how eukaryotic cells replicate their chromosomal DNA to ensure faithful transmission of genetic information to their daughter cells. In particular, S. cerevisiae is the first eukaryote to have its origins of replication mapped on a genomic scale, by three independent groups using three different microarray-based approaches. Here we describe a new technique of origin mapping via detection of single-stranded DNA in yeast. This method not only identified the majority of previously discovered origins, but also detected new ones. We have also shown that this technique can identify origins in Schizosaccharomyces pombe, illustrating the utility of this method for origin mapping in other eukaryotes.
3D Tree Dimensionality Assessment Using Photogrammetry and Small Unmanned Aerial Vehicles.
Gatziolis, Demetrios; Lienard, Jean F; Vogs, Andre; Strigul, Nikolay S
2015-01-01
Detailed, precise, three-dimensional (3D) representations of individual trees are a prerequisite for an accurate assessment of tree competition, growth, and morphological plasticity. Until recently, our ability to measure the dimensionality, spatial arrangement, shape of trees, and shape of tree components with precision has been constrained by technological and logistical limitations and cost. Traditional methods of forest biometrics provide only partial measurements and are labor intensive. Active remote technologies such as LiDAR operated from airborne platforms provide only partial crown reconstructions. The use of terrestrial LiDAR is laborious, has portability limitations and high cost. In this work we capitalized on recent improvements in the capabilities and availability of small unmanned aerial vehicles (UAVs), light and inexpensive cameras, and developed an affordable method for obtaining precise and comprehensive 3D models of trees and small groups of trees. The method employs slow-moving UAVs that acquire images along predefined trajectories near and around targeted trees, and computer vision-based approaches that process the images to obtain detailed tree reconstructions. After we confirmed the potential of the methodology via simulation we evaluated several UAV platforms, strategies for image acquisition, and image processing algorithms. We present an original, step-by-step workflow which utilizes open source programs and original software. We anticipate that future development and applications of our method will improve our understanding of forest self-organization emerging from the competition among trees, and will lead to a refined generation of individual-tree-based forest models.
NASA Technical Reports Server (NTRS)
Greene, William H.
1990-01-01
A study was performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal of the study was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semi-analytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models. In several cases this fixed mode approach resulted in very poor approximations of the stress sensitivities. Almost all of the original modes were required for an accurate sensitivity and for small numbers of modes, the accuracy was extremely poor. To overcome this poor accuracy, two semi-analytical techniques were developed. The first technique accounts for the change in eigenvectors through approximate eigenvector derivatives. The second technique applies the mode acceleration method of transient analysis to the sensitivity calculations. Both result in accurate values of the stress sensitivities with a small number of modes and much lower computational costs than if the vibration modes were recalculated and then used in an overall finite difference method.
Lynagh, Timothy; Lynch, Joseph W.
2010-01-01
The ability to silence the electrical activity of defined neuronal populations in vivo is dramatically advancing our understanding of brain function. This technology may eventually be useful clinically for treating a variety of neuropathological disorders caused by excessive neuronal activity. Several neuronal silencing methods have been developed, with the bacterial light-activated halorhodopsin and the invertebrate allatostatin-activated G protein-coupled receptor proving the most successful to date. However, both techniques may be difficult to implement clinically due to their requirement for surgically implanted stimulus delivery methods and their use of nonhuman receptors. A third silencing method, an invertebrate glutamate-gated chloride channel receptor (GluClR) activated by ivermectin, solves the stimulus delivery problem as ivermectin is a safe, well tolerated drug that reaches the brain following systemic administration. However, the limitations of this method include poor functional expression, possibly due to the requirement to coexpress two different subunits in individual neurons, and the nonhuman origin of GluClR. Here, we describe the development of a modified human α1 glycine receptor as an improved ivermectin-gated silencing receptor. The crucial development was the identification of a mutation, A288G, which increased ivermectin sensitivity almost 100-fold, rendering it similar to that of GluClR. Glycine sensitivity was eliminated via the F207A mutation. Its large unitary conductance, homomeric expression, and human origin may render the F207A/A288G α1 glycine receptor an improved silencing receptor for neuroscientific and clinical purposes. As all known highly ivermectin-sensitive GluClRs contain an endogenous glycine residue at the corresponding location, this residue appears essential for exquisite ivermectin sensitivity. PMID:20308070
Automatic superposition of drug molecules based on their common receptor site
NASA Astrophysics Data System (ADS)
Kato, Yuichi; Inoue, Atsushi; Yamada, Miho; Tomioka, Nobuo; Itai, Akiko
1992-10-01
We have previously developed a new rational method for superposing molecules in terms of submolecular physical and chemical properties, rather than in terms of atom positions or chemical structures as has been done in conventional methods. The program was originally developed for interactive use on a three-dimensional graphic display, providing goodness-of-fit indices on molecular shape, hydrogen bonds, electrostatic interactions and others. Here, we report a new unbiased searching method for the best superposition of molecules, covering all superposing modes and conformational freedom, as an additional function of the program. The function is based on a novel least-squares method which superposes the expected positions and orientations of hydrogen bonding partners in the receptor that are deduced from both molecules. The method not only gives reliability and reproducibility to the result of the superposition, but also allows us to save labor and time. It is demonstrated that this method is very efficient for finding the correct superposing mode in systems where hydrogen bonds play important roles.
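The least-squares superposition of expected hydrogen-bonding partner positions can be illustrated with a standard rigid-body fit such as the Kabsch algorithm (Python sketch below). This is a generic least-squares superposition, not necessarily the exact procedure implemented in the program described above.

```python
import numpy as np

def kabsch_superpose(P, Q):
    """Least-squares rigid superposition of point set P onto Q (Kabsch algorithm).
    Here P and Q stand for the deduced receptor-site points (e.g. expected
    hydrogen-bond partner positions) derived from the two molecules."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against improper rotations
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    rmsd = np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1)))
    return R, t, rmsd

# Hypothetical dummy-atom positions for molecule A, and the same points in
# molecule B's frame after a known rotation and translation.
P = np.random.rand(6, 3)
ang = np.radians(30.0)
true_R = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ true_R.T + np.array([1.0, -0.5, 2.0])
R, t, rmsd = kabsch_superpose(P, Q)                   # recovers the transform, rmsd ~ 0
```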
Pires, Nuno D.; Bemer, Marian; Müller, Lena M.; Baroux, Célia; Spillane, Charles; Grossniklaus, Ueli
2016-01-01
Embryonic development requires a correct balancing of maternal and paternal genetic information. This balance is mediated by genomic imprinting, an epigenetic mechanism that leads to parent-of-origin-dependent gene expression. The parental conflict (or kinship) theory proposes that imprinting can evolve due to a conflict between maternal and paternal alleles over resource allocation during seed development. One assumption of this theory is that paternal alleles can regulate seed growth; however, paternal effects on seed size are often very low or non-existent. We demonstrate that there is a pool of cryptic genetic variation in the paternal control of Arabidopsis thaliana seed development. Such cryptic variation can be exposed in seeds that maternally inherit a medea mutation, suggesting that MEA acts as a maternal buffer of paternal effects. Genetic mapping using recombinant inbred lines, and a novel method for the mapping of parent-of-origin effects using whole-genome sequencing of segregant bulks, indicate that there are at least six loci with small, paternal effects on seed development. Together, our analyses reveal the existence of a pool of hidden genetic variation on the paternal control of seed development that is likely shaped by parental conflict. PMID:26811909
Comparison of Alternate and Original Items on the Montreal Cognitive Assessment
Lebedeva, Elena; Huang, Mei; Koski, Lisa
2016-01-01
Background: The Montreal Cognitive Assessment (MoCA) is a screening tool for mild cognitive impairment (MCI) in elderly individuals. We hypothesized that measurement error when using the new alternate MoCA versions to monitor change over time could be related to the use of items that are not of comparable difficulty to their corresponding originals of similar content. The objective of this study was to compare the difficulty of the alternate MoCA items to the original ones. Methods: Five selected items from alternate versions of the MoCA were included with items from the original MoCA administered adaptively to geriatric outpatients (N = 78). Rasch analysis was used to estimate the difficulty level of the items. Results: None of the five items from the alternate versions matched the difficulty level of their corresponding original items. Conclusions: This study demonstrates the potential benefits of a Rasch analysis-based approach for selecting items during the development of parallel forms. The results suggest that a better match of the items from different MoCA forms by their difficulty would result in higher sensitivity to changes in cognitive function over time. PMID:27076861
Biochips: non-conventional strategies for biosensing elements immobilization.
Marquette, Christophe A; Corgier, Benjamin P; Heyries, Kevin A; Blum, Loic J
2008-01-01
The present article draws a general picture of non-conventional methods for biomolecules immobilization. The technologies presented are based either on original solid supports or on innovative immobilization processes. Polydimethylsiloxane elastomer will be presented as a popular immobilization support within the biochip developer community. Electro-addressing of biomolecules at the surface of conducting biochips will appear to be an interesting alternative to immobilization processes based on surface functionalization. Finally, bead-assisted biomolecules immobilization will be presented as an open field of research for biochip developments.
History of Circuit Breakers and Expectations of Japanese Original Development
NASA Astrophysics Data System (ADS)
Yoshioka, Yoshio; Yoshinaga, Kiyoshi; Yanabu, Satoru
This paper studies the history of high voltage circuit breaker engineering. The methods of analysis are (1) to collect facts regarding its development, (2) to review the history in order to find essential factors, and (3) to identify its pros and cons from an engineering point of view. As electric power consumption increased, the circuit breaker concept gradually developed. Oil circuit breakers were developed first, in Europe, followed by air circuit breakers and vacuum circuit breakers. Finally, SF6 gas circuit breakers were developed together with gas-insulated switchgear; what comes next? The future research and development policy is also discussed.
Originality Detection Software in a Graduate Policy Course: A Mixed-Methods Evaluation of Plagiarism
ERIC Educational Resources Information Center
Dreuth Zeman, Laura; Steen, Julie A.; Metz Zeman, Natalie
2011-01-01
The authors used a mixed-methods approach to evaluate the use of Turnitin originality detection software in a graduate social work course. Qualitative analysis of student responses revealed positive and negative perceptions, including the time spent completing assignments and the tone of the class. Quantitative analysis of students' originality scores indicated a short-term…
Intensity compensation for on-line detection of defects on fruit
NASA Astrophysics Data System (ADS)
Wen, Zhiqing; Tao, Yang
1997-10-01
A machine-vision sorting system was developed that utilizes the difference in light reflectance of fruit surfaces to distinguish defective from good apples. To accommodate the spherical reflectance characteristics of fruit with curved surfaces such as apples, a spherical transform algorithm was developed that converts the original image to a non-radiant image without losing defective segments on the fruit. To prevent high-quality dark-colored fruit from being classified into the defective class and to increase the defect detection rate for light-colored fruit, an intensity compensation method using maximum propagation was used. Experimental results demonstrated the effectiveness of the method based on maximum propagation and the spherical transform for on-line detection of defects on apples.
Use of species-specific PCR for the identification of 10 sea cucumber species
NASA Astrophysics Data System (ADS)
Wen, Jing; Zeng, Ling
2014-11-01
We developed a species-specific PCR method to identify species among dehydrated products of 10 sea cucumber species. Ten reverse species-specific primers designed from the 16S rRNA gene, in combination with one forward universal primer, generated PCR fragments of ca. 270 bp length for each species. The specificity of the PCR assay was tested with DNA of samples of 21 sea cucumber species. Amplification was observed in specific species only. The species-specific PCR method we developed was successfully applied to authenticate species of commercial products of dehydrated sea cucumber, and was proven to be a useful, rapid, and low-cost technique to identify the origin of the sea cucumber product.
Three-dimensional inversion of multisource array electromagnetic data
NASA Astrophysics Data System (ADS)
Tartaras, Efthimios
Three-dimensional (3-D) inversion is increasingly important for the correct interpretation of geophysical data sets in complex environments. To this effect, several approximate solutions have been developed that allow the construction of relatively fast inversion schemes. One such method that is fast and provides satisfactory accuracy is the quasi-linear (QL) approximation. It has, however, the drawback that it is source-dependent and, therefore, impractical in situations where multiple transmitters in different positions are employed. I have, therefore, developed a localized form of the QL approximation that is source-independent. This so-called localized quasi-linear (LQL) approximation can have a scalar, a diagonal, or a full tensor form. Numerical examples of its comparison with the full integral equation solution, the Born approximation, and the original QL approximation are given. The objective behind developing this approximation is to use it in a fast 3-D inversion scheme appropriate for multisource array data such as those collected in airborne surveys, cross-well logging, and other similar geophysical applications. I have developed such an inversion scheme using the scalar and diagonal LQL approximation. It reduces the original nonlinear inverse electromagnetic (EM) problem to three linear inverse problems. The first of these problems is solved using a weighted regularized linear conjugate gradient method, whereas the last two are solved in the least squares sense. The algorithm I developed provides the option of obtaining either smooth or focused inversion images. I have applied the 3-D LQL inversion to synthetic 3-D EM data that simulate a helicopter-borne survey over different earth models. The results demonstrate the stability and efficiency of the method and show that the LQL approximation can be a practical solution to the problem of 3-D inversion of multisource array frequency-domain EM data. I have also applied the method to helicopter-borne EM data collected by INCO Exploration over the Voisey's Bay area in Labrador, Canada. The results of the 3-D inversion successfully delineate the shallow massive sulfides and show that the method can produce reasonable results even in areas of complex geology and large resistivity contrasts.
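The linear inverse problems produced by the LQL reduction are of the weighted, regularized least-squares type mentioned above. As a rough illustration of the weighted regularized linear conjugate-gradient stage (not the author's actual code; the sensitivity matrix, data, and regularization parameter below are synthetic placeholders), a minimal sketch in Python/NumPy:

```python
import numpy as np

def regularized_cg(A, d, W=None, lam=1e-2, n_iter=200, tol=1e-8):
    """Solve min ||W(Am - d)||^2 + lam*||m||^2 by conjugate gradients
    on the normal equations. A: (n_data, n_model) sensitivity matrix,
    d: data vector, W: optional data-weighting matrix (identity if None)."""
    n = A.shape[1]
    W = np.eye(A.shape[0]) if W is None else W
    WA, Wd = W @ A, W @ d

    def op(m):
        # Normal-equation operator: (A^T W^T W A + lam I) m
        return WA.T @ (WA @ m) + lam * m

    b = WA.T @ Wd
    m = np.zeros(n)
    r = b - op(m)
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Ap = op(p)
        alpha = rs / (p @ Ap)
        m += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return m

# Tiny synthetic example (hypothetical numbers, for illustration only)
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 20))
m_true = rng.normal(size=20)
d = A @ m_true + 0.01 * rng.normal(size=50)
m_est = regularized_cg(A, d, lam=1e-3)
```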
NASA Astrophysics Data System (ADS)
Wu, Fang-Xiang; Mu, Lei; Shi, Zhong-Ke
2010-01-01
Models of gene regulatory networks are often derived from the statistical thermodynamics principle or the Michaelis-Menten kinetics equation. As a result, the models contain rational reaction rates which are nonlinear in both parameters and states. It is challenging to estimate parameters that enter a model nonlinearly, although there are many traditional nonlinear parameter estimation methods such as the Gauss-Newton iteration method and its variants. In this article, we develop a two-step method to estimate the parameters in rational reaction rates of gene regulatory networks via weighted linear least squares. This method takes the special structure of rational reaction rates into consideration: in the rational reaction rates, the numerator and the denominator are linear in the parameters. By designing a special weight matrix for the linear least squares, parameters in the numerator and the denominator can be estimated by solving two linear least squares problems. The main advantage of the developed method is that it produces analytical solutions to the estimation of parameters in rational reaction rates, which is originally a nonlinear parameter estimation problem. The developed method is applied to a couple of gene regulatory networks. The simulation results show superior performance over the Gauss-Newton method.
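As a simplified illustration of the underlying idea, the Michaelis-Menten rate v = Vmax·S/(Km + S) can be rearranged to Vmax·S − Km·v = v·S, which is linear in (Vmax, Km) and solvable by (weighted) linear least squares. This is only a sketch of the general principle, not the authors' two-step weighting scheme; the data and weights below are synthetic placeholders.

```python
import numpy as np

# Synthetic Michaelis-Menten data (hypothetical values)
rng = np.random.default_rng(1)
S = np.linspace(0.1, 10.0, 30)          # substrate concentrations
Vmax_true, Km_true = 2.0, 1.5
v = Vmax_true * S / (Km_true + S) + 0.01 * rng.normal(size=S.size)

# Rearranged model: Vmax*S - Km*v = v*S  ->  linear in (Vmax, Km)
X = np.column_stack([S, -v])            # design matrix
y = v * S
w = np.ones_like(S)                     # placeholder; the paper designs a special weight matrix

# Weighted least squares: scale rows by sqrt(w)
Xw, yw = X * np.sqrt(w)[:, None], y * np.sqrt(w)
Vmax_est, Km_est = np.linalg.lstsq(Xw, yw, rcond=None)[0]
print(Vmax_est, Km_est)
```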
NASA Astrophysics Data System (ADS)
Howerton, William
This thesis presents a method for the integration of complex network control algorithms with localized agent specific algorithms for maneuvering and obstacle avoidance. This method allows for successful implementation of group and agent specific behaviors. It has proven to be robust and will work for a variety of vehicle platforms. Initially, a review and implementation of two specific algorithms will be detailed. The first, a modified Kuramoto model was developed by Xu [1] which utilizes tools from graph theory to efficiently perform the task of distributing agents. The second algorithm developed by Kim [2] is an effective method for wheeled robots to avoid local obstacles using a limit-cycle navigation method. The results of implementing these methods on a test-bed of wheeled robots will be presented. Control issues related to outside disturbances not anticipated in the original theory are then discussed. A novel method of using simulated agents to separate the task of distributing agents from agent specific velocity and heading commands has been developed and implemented to address these issues. This new method can be used to combine various behaviors and is not limited to a specific control algorithm.
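For reference, the standard (unmodified) Kuramoto model on a graph, which the modified formulation builds on, can be simulated in a few lines; the coupling matrix, gain, and frequencies below are placeholder assumptions, not the values from Xu's modified model.

```python
import numpy as np

def kuramoto_step(theta, omega, A, K, dt):
    """One Euler step of d(theta_i)/dt = omega_i + (K/N) * sum_j A_ij sin(theta_j - theta_i)."""
    n = theta.size
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + (K / n) * coupling)

# Example: 5 agents coupled on a ring graph (hypothetical parameters)
n = 5
A = np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)
theta = np.random.default_rng(2).uniform(0, 2 * np.pi, n)
omega = np.zeros(n)
for _ in range(1000):
    theta = kuramoto_step(theta, omega, A, K=1.0, dt=0.01)
```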
[Psychophysiological selection: status and prospects].
Gurovskiĭ, N N; Novikov, M A
1981-01-01
The major stages in the development of psychophysiological selection of cosmonauts in the USSR are discussed. The psychophysiological selection was originally based on the data of psychoneurological expertise of the flight personnel and achievements of aviation psychology in the USSR. This was followed by the development of psychophysiological research, using instrumentation and simulation flights. Further complication of flight programs and participation of non-pilot cosmonauts (engineers, scientists) necessitated detailed study of personality properties and application of personality tests. At the present stage in the development of psychophysiological selection great importance is attached to the biorhythmological selection and methods for studying man's capabilities to control his own emotional, behavioral and autonomic reactions as well as environmental parameters. The review also discusses in detail methods of group selection and problems of rational selection of space crews.
NASA Astrophysics Data System (ADS)
Sanchez, J. L.; Osipowicz, T.; Tang, S. M.; Tay, T. S.; Win, T. T.
1997-07-01
The trace element concentrations found in geological samples can shed light on the formation process. In the case of gemstones, which might be of artificial or natural origin, there is also considerable interest in the development of methods that provide identification of the origin of a sample. For rubies, trace element concentrations present in natural samples were shown previously to be significant indicators of the region of origin [S.M. Tang et al., Appl. Spectr. 42 (1988) 44, and 43 (1989) 219]. Here we report the results of micro-PIXE analyses of trace element (Ti, V, Cr, Fe, Cu and Ga) concentrations of a large set ( n = 130) of natural rough rubies from nine locations in Myanmar (Burma). The resulting concentrations are subjected to statistical analysis. Six of the nine groups form clusters when the data base is evaluated using tree clustering and principal component analysis.
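A minimal sketch of the statistical step described above, PCA followed by hierarchical (tree) clustering of trace-element concentrations, using scikit-learn and SciPy; the concentration matrix below is a random placeholder, not the measured ruby data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical trace-element matrix: rows = samples, columns = Ti, V, Cr, Fe, Cu, Ga
rng = np.random.default_rng(3)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(130, 6))

Xs = StandardScaler().fit_transform(np.log(X))   # log-transform and standardize concentrations
scores = PCA(n_components=2).fit_transform(Xs)   # principal component scores

Z = linkage(scores, method="ward")               # tree (hierarchical) clustering
labels = fcluster(Z, t=6, criterion="maxclust")  # cut the tree into 6 clusters
```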
Balog, Julia; Perenyi, Dora; Guallar-Hoyas, Cristina; Egri, Attila; Pringle, Steven D; Stead, Sara; Chevallier, Olivier P; Elliott, Chris T; Takats, Zoltan
2016-06-15
Increasingly abundant food fraud cases have brought food authenticity and safety into major focus. This study presents a fast and effective way to identify meat products using rapid evaporative ionization mass spectrometry (REIMS). The experimental setup was demonstrated to be able to record a mass spectrometric profile of meat specimens in a time frame of <5 s. A multivariate statistical algorithm was developed and successfully tested for the identification of animal tissue with different anatomical origin, breed, and species with 100% accuracy at species and 97% accuracy at breed level. Detection of the presence of meat originating from a different species (horse, cattle, and venison) has also been demonstrated with high accuracy using mixed patties with a 5% detection limit. REIMS technology was found to be a promising tool in food safety applications providing a reliable and simple method for the rapid characterization of food products.
Neelotpol, Sharmind; Hay, Alastair W M; Jolly, A Jim; Woolridge, Mike W
2016-08-31
The objective was to recruit South Asian pregnant women living in the UK into a clinicoepidemiological study for the collection of lifestyle survey data and antenatal blood, and to retain the women for the later collection of cord blood and meconium samples from their babies for biochemical analysis. A longitudinal study recruiting pregnant women of South Asian and Caucasian origin living in the UK. Recruitment of the participants and collection of clinical samples and survey data took place at two sites within a single UK Northern Hospital Trust. Pregnant women of South Asian origin (study group, n=98) and of Caucasian origin (comparison group, n=38) living in Leeds, UK. Among the participants approached, 81% agreed to take part in the study when a 'direct approach' method was followed. The retention rate of the participants was a remarkable 93.4%. The main challenges in recruiting the ethnic minority participants were their cultural and religious conservativeness, the language barrier, lack of interest, and a feeling of extra 'stress' in taking part in research. The chief investigator developed an innovative participant retention method, associated with the women's cultural and religious practices. The method proved useful in retaining the participants for about 5 months and in enabling successful collection of clinical samples from the same mother-baby pairs. The collection of clinical samples and lifestyle data exceeded the calculated sample size required to give the study sufficient power. The numbers of samples obtained were: maternal blood (n=171), cord blood (n=38), meconium (n=176), lifestyle questionnaire data (n=136) and postnatal records (n=136). Recruitment and retention of participants, according to the calculated sample size, ensured sufficient power and success for a clinicoepidemiological study. Results suggest that development of trust and confidence between the participant and the researcher is the key to the success of a clinical and epidemiological study involving ethnic minorities.
Improved memory loading techniques for the TSRV display system
NASA Technical Reports Server (NTRS)
Easley, W. C.; Lynn, W. A.; Mcluer, D. G.
1986-01-01
A recent upgrade of the TSRV research flight system at NASA Langley Research Center retained the original monochrome display system. However, the display memory loading equipment was replaced requiring design and development of new methods of performing this task. This paper describes the new techniques developed to load memory in the display system. An outdated paper tape method for loading the BOOTSTRAP control program was replaced by EPROM storage of the characters contained on the tape. Rather than move a tape past an optical reader, a counter was implemented which steps sequentially through EPROM addresses and presents the same data to the loader circuitry. A cumbersome cassette tape method for loading the applications software was replaced with a floppy disk method using a microprocessor terminal installed as part of the upgrade. The cassette memory image was transferred to disk and a specific software loader was written for the terminal which duplicates the function of the cassette loader.
A Generalized Fast Frequency Sweep Algorithm for Coupled Circuit-EM Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rockway, J D; Champagne, N J; Sharpe, R M
2004-01-14
Frequency domain techniques are popular for analyzing electromagnetics (EM) and coupled circuit-EM problems. These techniques, such as the method of moments (MoM) and the finite element method (FEM), are used to determine the response of the EM portion of the problem at a single frequency. Since only one frequency is solved at a time, it may take a long time to calculate the parameters for wideband devices. In this paper, a fast frequency sweep based on the Asymptotic Wave Expansion (AWE) method is developed and applied to generalized mixed circuit-EM problems. The AWE method, which was originally developed for lumped-load circuit simulations, has recently been shown to be effective at quasi-static and low frequency full-wave simulations. Here it is applied to a full-wave MoM solver, capable of solving for metals, dielectrics, and coupled circuit-EM problems.
Imaging light responses of foveal ganglion cells in the living macaque eye.
Yin, Lu; Masella, Benjamin; Dalkara, Deniz; Zhang, Jie; Flannery, John G; Schaffer, David V; Williams, David R; Merigan, William H
2014-05-07
The fovea dominates primate vision, and its anatomy and perceptual abilities are well studied, but its physiology has been little explored because of limitations of current physiological methods. In this study, we adapted a novel in vivo imaging method, originally developed in mouse retina, to explore foveal physiology in the macaque, which permits the repeated imaging of the functional response of many retinal ganglion cells (RGCs) simultaneously. A genetically encoded calcium indicator, G-CaMP5, was inserted into foveal RGCs, followed by calcium imaging of the displacement of foveal RGCs from their receptive fields, and their intensity-response functions. The spatial offset of foveal RGCs from their cone inputs makes this method especially appropriate for fovea by permitting imaging of RGC responses without excessive light adaptation of cones. This new method will permit the tracking of visual development, progression of retinal disease, or therapeutic interventions, such as insertion of visual prostheses.
New methods, algorithms, and software for rapid mapping of tree positions in coordinate forest plots
A. Dan Wilson
2000-01-01
The theories and methodologies for two new tree mapping methods, the Sequential-target method and the Plot-origin radial method, are described. The methods accommodate the use of any conventional distance measuring device and compass to collect horizontal distance and azimuth data between source or reference positions (origins) and target trees. Conversion equations...
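A minimal sketch of the coordinate conversion implied by the Plot-origin radial method: a horizontal distance and azimuth measured from a plot origin map to tree coordinates. The function name and the sample readings are illustrative assumptions, not the paper's equations verbatim.

```python
import math

def radial_to_xy(origin_x, origin_y, distance, azimuth_deg):
    """Convert a horizontal distance and azimuth (degrees, clockwise from north)
    measured from a plot origin into plot coordinates."""
    az = math.radians(azimuth_deg)
    return origin_x + distance * math.sin(az), origin_y + distance * math.cos(az)

# Example: a tree 12.4 m from the origin at azimuth 135 degrees (hypothetical readings)
x, y = radial_to_xy(0.0, 0.0, 12.4, 135.0)
```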
Lavranos, Giagkos; Manolakou, Panagiota; Katsiki, Evangelia; Angelopoulou, Roxani
2013-12-01
Anthropology has always been particularly interested in the origin of human life and the development towards adulthood. Although originally working with skeletal measurements and bio-morphological markers in modern populations, it has now entered the growing field of applied molecular biology. This relatively recent advance allows the detailed study of major events in human development and senescence. For instance, sperm DNA integrity and chromatin re-organization are crucial factors for fertilization and embryo development. Clinical researchers have developed improved methods for the evaluation of DNA integrity and protaminosis in sperm nuclei, such as the TUNEL and the CMA3 assays. DNA damage in spermatozoal nuclei is detected using the TUNEL assay which depends on the specific enzymatic reaction of TdT with the end strand breaks of DNA. Protaminosis in spermatozoal nucleus is evaluated using CMA3 assay, which is based on the in situ competition between CMA3 and protamines. Such measurements may provide useful data on human reproductive health, aiding the explanation of demographic differences across the world.
NASA Astrophysics Data System (ADS)
Shinogle-Decker, Heather; Martinez-Rivera, Noraida; O'Brien, John; Powell, Richard D.; Joshi, Vishwas N.; Connell, Samuel; Rosa-Molinar, Eduardo
2018-02-01
A new correlative Förster Resonance Energy Transfer (FRET) microscopy method using FluoroNanogold™, a fluorescent immunoprobe with a covalently attached Nanogold® particle (1.4nm Au), overcomes resolution limitations in determining distances within synaptic nanoscale architecture. FRET by acceptor photobleaching has long been used as a method to increase fluorescence resolution. The transfer of energy from a donor to an acceptor generally occurs between 10-100Å, which is the relative distance between the donor molecule and the acceptor molecule. For the correlative FRET microscopy method using FluoroNanogold™, we immuno-labeled GFP-tagged-HeLa-expressing Connexin 35 (Cx35) with anti-GFP and with anti-Cx35/36 antibodies, and then photo-bleached the Cx before processing the sample for electron microscopic imaging. Preliminary studies reveal the use of Alexa Fluor® 594 FluoroNanogold™ slightly increases FRET distance to 70Å, in contrast to the 62.5Å using AlexaFluor 594®. Preliminary studies also show that using a FluoroNanogold™ probe inhibits photobleaching. After one photobleaching session, Alexa Fluor 594® fluorescence dropped to 19% of its original fluorescence; in contrast, after one photobleaching session, Alexa Fluor 594® FluoroNanogold™ fluorescence dropped to 53% of its original intensity. This result confirms that Alexa Fluor 594® FluoroNanogold™ is a much better donor probe than is Alexa Fluor 594®. The new method (a) creates a double confirmation method in determining structure and orientation of synaptic architecture, (b) allows development of a two-dimensional in vitro model to be used for precise testing of multiple parameters, and (c) increases throughput. Future work will include development of FluoroNanogold™ probes with different sizes of gold for additional correlative microscopy studies.
Mapping Diffusion in a Living Cell via the Phasor Approach
Ranjit, Suman; Lanzano, Luca; Gratton, Enrico
2014-01-01
Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods enables us to measure the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel results in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new method of scanning with simultaneous construction of a fluorescent image of the cell. In this believed new method of modified raster scanning, as it acquires the image, the laser scans each individual line multiple times before moving to the next line. This continues until the entire area is scanned. This is different from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total time of data acquisition needed for this method is much shorter than the time required for traditional FCS analysis at each pixel. However, at a single pixel, the acquired intensity time sequence is short; requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method that was originally developed for analysis of FLIM images. Analysis using this method results in an estimation of the average diffusion coefficient of the fluorescent species at each pixel of an image, and thus, a detailed diffusion map of the cell can be created. PMID:25517145
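The phasor transform referred to above maps a time- (or lag-) resolved curve to a single point (g, s) via its first Fourier harmonic. A minimal sketch follows, with a synthetic exponential decay standing in for the measured correlation data; the time base and frequency are placeholder assumptions.

```python
import numpy as np

def phasor(t, intensity, omega):
    """First-harmonic phasor coordinates of a decay/correlation curve:
    g = int I(t) cos(wt) dt / int I(t) dt, and s likewise with sin."""
    norm = np.trapz(intensity, t)
    g = np.trapz(intensity * np.cos(omega * t), t) / norm
    s = np.trapz(intensity * np.sin(omega * t), t) / norm
    return g, s

# Synthetic single-exponential curve (hypothetical lifetime and time base)
t = np.linspace(0, 50e-9, 512)
tau = 4e-9
g, s = phasor(t, np.exp(-t / tau), omega=2 * np.pi * 80e6)
# For a single exponential, (g, s) falls on the universal semicircle of the phasor plot.
```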
A Multi-Variate Fit to the Chemical Composition of the Cosmic-Ray Spectrum
NASA Astrophysics Data System (ADS)
Eisch, Jonathan
Since the discovery of cosmic rays over a century ago, evidence of their origins has remained elusive. Deflected by galactic magnetic fields, the only direct evidence of their origin and propagation remains encoded in their energy distribution and chemical composition. Current models of galactic cosmic rays predict variations of the energy distribution of individual elements in an energy region around 3×10^15 eV known as the knee. This work presents a method to measure the energy distribution of individual elemental groups in the knee region and its application to a year of data from the IceCube detector. The method uses cosmic rays detected by both IceTop, the surface-array component, and the deep-ice component of IceCube during the 2009-2010 operation of the IC-59 detector. IceTop is used to measure the energy and the relative likelihood of the mass composition using the signal from the cosmic-ray induced extensive air shower reaching the surface. IceCube, 1.5 km below the surface, measures the energy of the high-energy bundle of muons created in the very first interactions after the cosmic ray enters the atmosphere. These event distributions are fit by a constrained model derived from detailed simulations of cosmic rays representing five chemical elements. The results of this analysis are evaluated in terms of the theoretical uncertainties in cosmic-ray interactions and seasonal variations in the atmosphere. The improvements in high-energy cosmic ray hadronic-interaction models informed by this analysis, combined with increased data from subsequent operation of the IceCube detector, could provide crucial limits on the origin of cosmic rays and their propagation through the galaxy. In the course of developing this method, a number of analysis and statistical techniques were developed to deal with the difficulties inherent in this type of measurement. These include a composition-sensitive air shower reconstruction technique, a method to model simulated event distributions with limited statistics, and a method to optimize and estimate the error on a regularized fit.
Parallelization of an Object-Oriented Unstructured Aeroacoustics Solver
NASA Technical Reports Server (NTRS)
Baggag, Abdelkader; Atkins, Harold; Oezturan, Can; Keyes, David
1999-01-01
A computational aeroacoustics code based on the discontinuous Galerkin method is ported to several parallel platforms using MPI. The discontinuous Galerkin method is a compact high-order method that retains its accuracy and robustness on non-smooth unstructured meshes. In its semi-discrete form, the discontinuous Galerkin method can be combined with explicit time marching methods making it well suited to time accurate computations. The compact nature of the discontinuous Galerkin method also makes it well suited for distributed memory parallel platforms. The original serial code was written using an object-oriented approach and was previously optimized for cache-based machines. The port to parallel platforms was achieved simply by treating partition boundaries as a type of boundary condition. Code modifications were minimal because boundary conditions were abstractions in the original program. Scalability results are presented for the SGI Origin, IBM SP2, and clusters of SGI and Sun workstations. Slightly superlinear speedup is achieved on a fixed-size problem on the Origin, due to cache effects.
Short text sentiment classification based on feature extension and ensemble classifier
NASA Astrophysics Data System (ADS)
Liu, Yang; Zhu, Xie
2018-05-01
With the rapid development of Internet social media, mining the emotional tendencies of short texts from the Internet to acquire useful information has attracted the attention of researchers. The commonly used approaches can be attributed to rule-based classification and statistical machine-learning classification methods. Although micro-blog sentiment analysis has made good progress, shortcomings remain: accuracy is not high enough, and the sentiment classification effect depends strongly on the features available in the short text. Aiming at the characteristics of Chinese short texts, such as little information, sparse features, and diverse expressions, this paper considers expanding the original text by mining related semantic information from comments, forwarding, and other related information. First, this paper uses Word2vec to compute word similarity to extend the feature words. It then uses an ensemble classifier composed of SVM, KNN, and HMM to analyze the sentiment of micro-blog short texts. The experimental results show that the proposed method can make good use of the comment and forwarding information to extend the original features. Compared with the traditional method, the accuracy, recall, and F1 value obtained by this method are all improved.
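A minimal sketch of the ensemble-classification step using scikit-learn; for brevity it combines only SVM and KNN in a majority-vote ensemble (the HMM component and the Word2vec feature extension are omitted), and the feature matrix is a random placeholder rather than real micro-blog features.

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Placeholder features (e.g., averaged word vectors of the extended text) and binary sentiment labels
rng = np.random.default_rng(4)
X = rng.normal(size=(400, 50))
y = rng.integers(0, 2, size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ensemble = VotingClassifier(
    estimators=[("svm", SVC(probability=True)), ("knn", KNeighborsClassifier(n_neighbors=5))],
    voting="soft",
)
ensemble.fit(X_tr, y_tr)
print(f1_score(y_te, ensemble.predict(X_te)))
```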
Statistical evidence for common ancestry: Application to primates.
Baum, David A; Ané, Cécile; Larget, Bret; Solís-Lemus, Claudia; Ho, Lam Si Tung; Boone, Peggy; Drummond, Chloe P; Bontrager, Martin; Hunter, Steven J; Saucier, William
2016-06-01
Since Darwin, biologists have come to recognize that the theory of descent from common ancestry (CA) is very well supported by diverse lines of evidence. However, while the qualitative evidence is overwhelming, we also need formal methods for quantifying the evidential support for CA over the alternative hypothesis of separate ancestry (SA). In this article, we explore a diversity of statistical methods using data from the primates. We focus on two alternatives to CA, species SA (the separate origin of each named species) and family SA (the separate origin of each family). We implemented statistical tests based on morphological, molecular, and biogeographic data and developed two new methods: one that tests for phylogenetic autocorrelation while correcting for variation due to confounding ecological traits and a method for examining whether fossil taxa have fewer derived differences than living taxa. We overwhelmingly rejected both species and family SA with infinitesimal P values. We compare these results with those from two companion papers, which also found tremendously strong support for the CA of all primates, and discuss future directions and general philosophical issues that pertain to statistical testing of historical hypotheses such as CA. © 2016 The Author(s). Evolution © 2016 The Society for the Study of Evolution.
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
NASA Astrophysics Data System (ADS)
Sayre, George Anthony
The purpose of this dissertation was to develop the C++ program Emergency Dose to calculate transport of radionuclides through indoor spaces using intermediate-fidelity physics that provides improved spatial heterogeneity over well-mixed models such as MELCOR and much lower computation times than CFD codes such as FLUENT. Modified potential flow theory (MPFT), which is an original formulation of potential flow theory with additions of turbulent jet and natural convection approximations, calculates spatially heterogeneous velocity fields that well-mixed models cannot predict. Other original contributions of MPFT are: (1) generation of high-fidelity boundary conditions relative to well-mixed-CFD coupling methods (conflation), (2) broadening of potential flow applications to arbitrary indoor spaces previously restricted to specific applications such as exhaust hood studies, and (3) great reduction of computation time relative to CFD codes without total loss of heterogeneity. Additionally, the Lagrangian transport module, which is discussed in Sections 1.3 and 2.4, showcases an ensemble-based formulation thought to be original to interior studies. Velocity and concentration transport benchmarks against analogous formulations in COMSOL produced favorable results, with discrepancies resulting from the tetrahedral meshing used in COMSOL outperforming the Cartesian method used by Emergency Dose. A performance comparison of the concentration transport modules against MELCOR showed that Emergency Dose held advantages over the well-mixed model, especially in scenarios with many interior partitions and varied source positions. A performance comparison of the velocity module against FLUENT showed that viscous drag provided the largest error between Emergency Dose and CFD velocity calculations, but that Emergency Dose's turbulent jets well approximated the corresponding CFD jets. Overall, Emergency Dose was found to provide a viable intermediate solution method for concentration transport with relatively low computation times.
ERIC Educational Resources Information Center
Onorato, P.
2011-01-01
An introduction to quantum mechanics based on the sum-over-paths (SOP) method originated by Richard P. Feynman and developed by E. F. Taylor and coworkers is presented. The Einstein-Brillouin-Keller (EBK) semiclassical quantization rules are obtained following the SOP approach for bounded systems, and a general approach to the calculation of…
Genetic Influences on Early Word Recognition Abilities and Disabilities: A Study of 7-Year-Old Twins
ERIC Educational Resources Information Center
Harlaar, Nicole; Spinath, Frank M.; Dale, Philip S.; Plomin, Robert
2005-01-01
Background: A fundamental issue for child psychology concerns the origins of individual differences in early reading development. Method: A measure of word recognition, the Test of Word Reading Efficiency (TOWRE), was administered by telephone to a representative population sample of 3,909 same-sex and opposite-sex pairs of 7-year-old twins.…
ERIC Educational Resources Information Center
Wolmer, Leo; Laor, Nathaniel; Dedeoglu, Ceyda; Siev, Joanna; Yazgan, Yanki
2005-01-01
Background: Child survivors of a catastrophic earthquake in Turkey were evaluated three and a half years after the event, and three years after a sub-group participated in a teacher-mediated intervention developed by the authors. The goal of this follow-up study was to determine the long-term effectiveness of the original intervention. Methods:…
ERIC Educational Resources Information Center
Lamb, Richard; Annetta, Leonard; Vallett, David
2015-01-01
Introduction: Creativity is the production of the new, original, unique, and divergent products and ideas mediated through lateral thinking. Evidence suggests that high levels of creativity and fluency are important in the continued development of student interest, efficacy and ultimately career impact in the sciences. Method: In this study, 559…
Unbiased nonorthogonal bases for tomographic reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sainz, Isabel; Klimov, Andrei B.; Roa, Luis
2010-05-15
We have developed a general method for constructing a set of nonorthogonal bases with equal separations between all different basis states in prime dimensions. We find that the corresponding biorthogonal counterparts are pairwise unbiased with respect to the components of the original bases. Using these bases, we derive an explicit expression for optimal tomography in nonorthogonal bases. A special two-dimensional case is analyzed separately.
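For reference, the unbiasedness condition invoked here can be written as follows; this is the standard statement of mutual unbiasedness between two orthonormal bases in dimension d, offered for orientation rather than as the paper's exact construction for nonorthogonal bases and their biorthogonal partners.

```latex
% Two orthonormal bases {|e_i>} and {|f_j>} of a d-dimensional space are mutually unbiased when
\[
  \bigl| \langle e_i \mid f_j \rangle \bigr|^{2} \;=\; \frac{1}{d},
  \qquad i, j = 1, \dots, d .
\]
```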
Watershed Models for Predicting Nitrogen Loads from Artificially Drained Lands
R. Wayne Skaggs; George M. Chescheir; Glenn Fernandez; Devendra M. Amatya
2003-01-01
Non-point sources of pollutants originate at the field scale but water quality problems usually occur at the watershed or basin scale. This paper describes a series of models developed for poorly drained watersheds. The models use DRAINMOD to predict hydrology at the field scale and a range of methods to predict channel hydraulics and nitrogen transport. In-stream...
Advanced Digital Forensic and Steganalysis Methods
2009-02-01
The technology is extended to images that have been cropped, scaled, printed, or subjected to other common processing operations. Technology applications include: (1) determining the origin of digital images; (2) matching an image to a camera.
Teleology and News: The Religious Roots of American Journalism, 1630-1730.
ERIC Educational Resources Information Center
Nord, David Paul
The nature and function of news in the public life of seventeenth-century New England and the legacy this conception of news left for the development of American newspaper journalism in the eighteenth century are explored in this paper. The paper argues that the origin of American news--its subject matter, style, and method of reporting--is deeply…
ERIC Educational Resources Information Center
Symons, Sarah L.; Colgoni, Andrew; Harvey, Chad T.
2017-01-01
We describe interim results of an ongoing longitudinal pedagogical study investigating the efficacy of the Honours Integrated Science Program (iSci). We describe the pedagogical methods we use to prompt research skill development in a model from instructor-dependence to independent original research. We also describe a tool we use to help students…
Fast and Reliable Thermodynamic Approach for Determining the Protonation State of the Asp Dyad.
Huang, Jinfeng; Sun, Bin; Yao, Yuan; Liu, Junjun
2017-09-25
The protonation state of the asp dyad is of major importance in revealing enzymatic mechanisms and developing drugs. However, it is hard to determine by calculating free energy changes between possible protonation states, because the free energy changes due to protein conformational flexibility are usually much larger than those originating from different locations of the protons. Sophisticated and computationally expensive methods such as free energy perturbation, thermodynamic integration (TI), and quantum mechanics/molecular mechanics are therefore usually used for this purpose. In the present study, we have developed a simple thermodynamic approach that effectively eliminates the free energy changes arising from protein conformational flexibility and estimates only the free energy changes originating from the locations of the protons, which provides a fast and reliable method for determining the protonation state of asp dyads. The test of this approach on a total of 15 asp dyad systems, including BACE-1 and HIV-1 protease, shows that the predictions from this approach are all consistent with experiments or with the computationally expensive TI calculations. It is clear that our thermodynamic approach could be used to rapidly and reliably determine the protonation state of the asp dyad.
Borries, Carola; Sandel, Aaron A; Koenig, Andreas; Fernandez-Duque, Eduardo; Kamilar, Jason M; Amoroso, Caroline R; Barton, Robert A; Bray, Joel; Di Fiore, Anthony; Gilby, Ian C; Gordon, Adam D; Mundry, Roger; Port, Markus; Powell, Lauren E; Pusey, Anne E; Spriggs, Amanda; Nunn, Charles L
2016-09-01
Recent decades have seen rapid development of new analytical methods to investigate patterns of interspecific variation. Yet these cutting-edge statistical analyses often rely on data of questionable origin, varying accuracy, and weak comparability, which seem to have reduced the reproducibility of studies. It is time to improve the transparency of comparative data while also making these improved data more widely available. We, the authors, met to discuss how transparency, usability, and reproducibility of comparative data can best be achieved. We propose four guiding principles: 1) data identification with explicit operational definitions and complete descriptions of methods; 2) inclusion of metadata that capture key characteristics of the data, such as sample size, geographic coordinates, and nutrient availability (for example, captive versus wild animals); 3) documentation of the original reference for each datum; and 4) facilitation of effective interactions with the data via user friendly and transparent interfaces. We urge reviewers, editors, publishers, database developers and users, funding agencies, researchers publishing their primary data, and those performing comparative analyses to embrace these standards to increase the transparency, usability, and reproducibility of comparative studies. © 2016 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danko, George L
To increase understanding of the energy extraction capacity of Enhanced Geothermal Systems (EGS), a numerical model development and application project was completed. The general objective of the project was to develop and apply a new, data-coupled Thermal-Hydrological-Mechanical-Chemical (T-H-M-C) model in which the four internal components can be freely selected from existing simulation software without merging and cross-combining a diverse set of computational codes. Eight tasks were completed during the project period. The results are reported in five publications, an MS thesis, twelve quarterly reports, and two annual reports to DOE. Two US patents have also been issued during the project period, with one patent application originating prior to the start of the project. The “Multiphase Physical Transport Modeling Method and Modeling System” (U.S. Patent 8,396,693 B2, 2013), a key element in the GHE sub-model solution, is successfully used for EGS studies. The “Geothermal Energy Extraction System and Method” invention (U.S. Patent 8,430,166 B2, 2013) originates from the time of project performance, describing a new fluid flow control solution. The new, coupled T-H-M-C numerical model will help in analyzing and designing new, efficient EGS systems.
An instrument to assess the statistical intensity of medical research papers.
Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu
2017-01-01
There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measure scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007-2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs, the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.
Hong, KyungPyo; Jeong, Eun-Kee; Wall, T. Scott; Drakos, Stavros G.; Kim, Daniel
2015-01-01
Purpose To develop and evaluate a wideband arrhythmia-insensitive-rapid (AIR) pulse sequence for cardiac T1 mapping without image artifacts induced by implantable-cardioverter-defibrillator (ICD). Methods We developed a wideband AIR pulse sequence by incorporating a saturation pulse with wide frequency bandwidth (8.9 kHz), in order to achieve uniform T1 weighting in the heart with ICD. We tested the performance of original and “wideband” AIR cardiac T1 mapping pulse sequences in phantom and human experiments at 1.5T. Results In 5 phantoms representing native myocardium and blood and post-contrast blood/tissue T1 values, compared with the control T1 values measured with an inversion-recovery pulse sequence without ICD, T1 values measured with original AIR with ICD were considerably lower (absolute percent error >29%), whereas T1 values measured with wideband AIR with ICD were similar (absolute percent error <5%). Similarly, in 11 human subjects, compared with the control T1 values measured with original AIR without ICD, T1 measured with original AIR with ICD was significantly lower (absolute percent error >10.1%), whereas T1 measured with wideband AIR with ICD was similar (absolute percent error <2.0%). Conclusion This study demonstrates the feasibility of a wideband pulse sequence for cardiac T1 mapping without significant image artifacts induced by ICD. PMID:25975192
Agent-Based Modeling of China's Rural-Urban Migration and Social Network Structure.
Fu, Zhaohao; Hao, Lingxin
2018-01-15
We analyze China's rural-urban migration and endogenous social network structures using agent-based modeling. The agents from census micro data are located in their rural origin with an empirical-estimated prior propensity to move. The population-scale social network is a hybrid one, combining observed family ties and locations of the origin with a parameter space calibrated from census, survey and aggregate data and sampled using a stepwise Latin Hypercube Sampling method. At monthly intervals, some agents migrate and these migratory acts change the social network by turning within-nonmigrant connections to between-migrant-nonmigrant connections, turning local connections to nonlocal connections, and adding among-migrant connections. In turn, the changing social network structure updates migratory propensities of those well-connected nonmigrants who become more likely to move. These two processes iterate over time. Using a core-periphery method developed from the k-core decomposition method, we identify and quantify the network structural changes and map these changes with the migration acceleration patterns. We conclude that network structural changes are essential for explaining migration acceleration observed in China during the 1995-2000 period.
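As a sketch of the core-periphery identification step built on k-core decomposition (using networkx; the toy random graph below is a placeholder, not the migration network from the study):

```python
import networkx as nx

# Hypothetical social network: nodes are agents, edges are family/locational ties
G = nx.erdos_renyi_graph(n=200, p=0.03, seed=5)

core_number = nx.core_number(G)                            # k-core index of every node
k_max = max(core_number.values())
core = [n for n, k in core_number.items() if k == k_max]   # innermost (densely connected) core
periphery = [n for n, k in core_number.items() if k <= 1]  # loosely connected periphery
```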
Muñoz, Eva; Muñoz, Gloria; Pineda, Laura; Serrahima, Eulalia; Centrich, Francesc
2012-01-01
A multiresidue method based on GC or LC and MS or MS/MS for the determination of 204 pesticides in diverse food matrixes of animal and plant origin is described. The method can include different stages of cleanup according to the chemical characteristics of each sample. Samples were extracted using accelerated solvent extraction. Those with a high fat content or that contained chlorophyll required further purification by gel permeation chromatography and/or SPE (ENVI-Carb). The methodology developed here was fully validated; the LOQs for the 204 pesticides are presented. The LOQ values lie between 0.01 to 0.02 mg/kg. However, in some cases, mainly in baby food, they were as low as 0.003 mg/kg, thereby meeting European Union requirements on maximum residue levels for pesticides, as outlined in European regulation 396/2005 and the Commission Directive 2003/13/EC. The procedure has been accredited for a wide scope of pesticides and matrixes by the Spanish Accreditation Body (ENAC) following ISO/IEC 17025:2005, as outlined in ENAC technical note NT-19.
Taher, Mohammad Ali; Pourmohammad, Fatemeh; Fazelirad, Hamid
2015-12-01
In the present work, an electrothermal atomic absorption spectrometric method has been developed for the determination of ultra-trace amounts of rhodium after adsorption of its 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol/tetraphenylborate ion associated complex at the surface of alumina. Several factors affecting the extraction efficiency such as the pH, type of eluent, sample and eluent flow rates, sorption capacity of alumina and sample volume were investigated and optimized. The relative standard deviation for eight measurements of 0.1 ng/mL of rhodium was ±6.3%. In this method, the detection limit was 0.003 ng/mL in the original solution. The sorption capacity of alumina and the linear range for Rh(III) were evaluated as 0.8 mg/g and 0.015-0.45 ng/mL in the original solution, respectively. The proposed method was successfully applied for the extraction and determination of rhodium content in some food and standard samples with high recovery values. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Approximate Dynamic Programming: Combining Regional and Local State Following Approximations.
Deptula, Patryk; Rosenfeld, Joel A; Kamalapurkar, Rushikesh; Dixon, Warren E
2018-06-01
An infinite-horizon optimal regulation problem for a control-affine deterministic system is solved online using a local state following (StaF) kernel and a regional model-based reinforcement learning (R-MBRL) method to approximate the value function. Unlike traditional methods such as R-MBRL that aim to approximate the value function over a large compact set, the StaF kernel approach aims to approximate the value function in a local neighborhood of the state that travels within a compact set. In this paper, the value function is approximated using a state-dependent convex combination of the StaF-based and the R-MBRL-based approximations. As the state enters a neighborhood containing the origin, the value function transitions from being approximated by the StaF approach to the R-MBRL approach. Semiglobal uniformly ultimately bounded (SGUUB) convergence of the system states to the origin is established using a Lyapunov-based analysis. Simulation results are provided for two, three, six, and ten-state dynamical systems to demonstrate the scalability and performance of the developed method.
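A minimal sketch of the state-dependent convex combination described above: the blended value estimate transitions from the local (StaF) approximation to the regional (R-MBRL) approximation as the state approaches the origin. The blending function and the two stand-in value functions are illustrative assumptions, not the paper's kernels or gains.

```python
import numpy as np

def blend_weight(x, radius=1.0):
    """Smoothly goes to 0 near the origin (favoring the regional R-MBRL estimate)
    and to 1 far from it (favoring the local StaF estimate)."""
    r = np.linalg.norm(x)
    return r**2 / (r**2 + radius**2)

def value_estimate(x, v_staf, v_rmbrl):
    lam = blend_weight(x)
    return lam * v_staf(x) + (1.0 - lam) * v_rmbrl(x)

# Placeholder value-function approximations, for illustration only
v_staf = lambda x: 0.9 * float(x @ x)
v_rmbrl = lambda x: 1.1 * float(x @ x)
print(value_estimate(np.array([0.1, 0.0]), v_staf, v_rmbrl))  # near the origin: mostly R-MBRL
print(value_estimate(np.array([3.0, 4.0]), v_staf, v_rmbrl))  # far from the origin: mostly StaF
```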
All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis
Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L.; Terés, Lluís; Baumann, Reinhard R.
2016-01-01
We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates in ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices such as inkjet printing suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices). In general, the manufacturing yield is much lower compared to the established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, finally resulting in a yield improvement. PMID:27649784
A bulk viscosity approach for shock capturing on unstructured grids
NASA Astrophysics Data System (ADS)
Shoeybi, Mohammad; Larsson, Nils Johan; Ham, Frank; Moin, Parviz
2008-11-01
The bulk viscosity approach for shock capturing (Cook and Cabot, JCP, 2005) augments the bulk part of the viscous stress tensor. The intention is to capture shock waves without dissipating turbulent structures. The present work extends and modifies this method for unstructured grids. We propose a method that properly scales the bulk viscosity with the grid spacing normal to the shock on unstructured grids, for which the shock is not necessarily aligned with the grid. The magnitude of the strain rate tensor used in the original formulation is replaced with the dilatation, which appears to be more appropriate in the vortical turbulent flow regions (Mani et al., 2008). The original form of the model is found to have an impact on dilatational motions away from the shock wave, which is eliminated by a proposed localization of the bulk viscosity. Finally, to allow for grid adaptation around shock waves, an explicit/implicit time advancement scheme has been developed that adaptively identifies the stiff regions. The full method has been verified with several test cases, including 2D shock-vorticity/entropy interaction, homogeneous isotropic turbulence, and turbulent flow over a cylinder.
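For orientation, the modification described above amounts to driving the artificial bulk viscosity with the dilatation rather than the strain-rate magnitude, scaled by the grid spacing normal to the shock. A schematic form of such a model follows; the coefficient and exponents are generic placeholders, not the exact expression used in the paper.

```latex
% Artificial bulk viscosity scaled by the grid spacing normal to the shock, \Delta_n,
% and driven by the dilatation \nabla\cdot\mathbf{u} rather than the strain-rate magnitude |S|:
\[
  \beta^{*} \;=\; C_{\beta}\,\rho\,\Delta_{n}^{\,2}\,\bigl|\nabla\cdot\mathbf{u}\bigr| ,
\]
% which augments only the isotropic (bulk) part of the viscous stress tensor.
```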
Lim, Dong Kyu; Mo, Changyeun; Lee, Dong-Kyu; Long, Nguyen Phuoc; Lim, Jongguk; Kwon, Sung Won
2018-01-01
The authenticity determination of white rice is crucial to prevent deceptive origin labeling and dishonest trading. However, a non-destructive and comprehensive method for rapidly discriminating the geographical origins of white rice between countries is still lacking. In the current study, we developed a volatile organic compound based geographical discrimination method using headspace solid-phase microextraction coupled to gas chromatography-mass spectrometry (HS-SPME/GC-MS) to discriminate rice samples from Korea and China. A partial least squares discriminant analysis (PLS-DA) model exhibited a good classification of white rice between Korea and China (accuracy = 0.958, goodness of fit = 0.937, goodness of prediction = 0.831, and permutation test p-value = 0.043). Combining the PLS-DA based feature selection with the differentially expressed features from the unpaired t-test and significance analysis of microarrays, 12 discriminatory biomarkers were found. Among them, hexanal and 1-hexanol have been previously known to be associated with the cultivation environment and storage conditions. Other hydrocarbon biomarkers are novel, and their impact on rice production and storage remains to be elucidated. In conclusion, our findings highlight the ability to rapidly discriminate white rice from Korea and China. The developed method may be useful for the authenticity and quality control of white rice. Copyright © 2017. Published by Elsevier B.V.
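A minimal sketch of a PLS-DA classifier of the kind described above, built from scikit-learn's PLS regression on binary class labels; the peak table here is a random placeholder, not the rice volatile data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Placeholder GC-MS peak table: rows = rice samples, columns = volatile features; 0 = Korea, 1 = China
rng = np.random.default_rng(6)
X = rng.normal(size=(120, 40))
y = rng.integers(0, 2, size=120)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
pls = PLSRegression(n_components=5).fit(scaler.transform(X_tr), y_tr)

# Threshold the continuous PLS prediction at 0.5 to obtain class labels (the PLS-DA step)
y_pred = (pls.predict(scaler.transform(X_te)).ravel() > 0.5).astype(int)
accuracy = (y_pred == y_te).mean()
```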
Sorbe, A; Chazel, M; Gay, E; Haenni, M; Madec, J-Y; Hendrikx, P
2011-06-01
Developing and calculating performance indicators makes it possible to follow the operation of an epidemiological surveillance network continuously. This is an internal evaluation method, implemented by the coordinators in collaboration with all the actors of the network. Its purpose is to detect weak points in order to optimize management. A method for the development of performance indicators for epidemiological surveillance networks was developed in 2004 and applied to several networks. Its implementation requires a thorough description of the network environment and all its activities to define priority indicators. Since this method is considered complex, our objective was to develop a simplified approach and apply it to an epidemiological surveillance network. We applied the initial method to a theoretical network model to obtain a list of generic indicators that can be adapted to any surveillance network. We obtained a list of 25 generic performance indicators, intended to be reformulated and described according to the specificities of each network. This list was used to develop performance indicators for RESAPATH, an epidemiological surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin in France. This application allowed us to validate the simplified method, its practical value, and its level of user acceptance. Its ease of use and speed of application compared with the initial method argue in favor of its use on a broader scale. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Bowen, James M.
The goal of this research was to investigate the physicochemical properties of weapons grade plutonium particles originating from the 1960 BOMARC incident for the purpose of predicting their fate in the environment and to address radiation protection and nuclear security concerns. Methods were developed to locate and isolate the particles in order to characterize them. Physical, chemical, and radiological characterization was performed using a variety of techniques. And finally, the particles were subjected to a sequential extraction procedure, a series of increasingly aggressive reagents, to simulate an accelerated environmental exposure. A link between the morphology of the particles and their partitioning amongst environmental mechanisms was established.
Smooth particle hydrodynamics: theory and application to the origin of the moon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benz, W.
1986-01-01
The origin of the moon is modeled by the so-called smooth particle hydrodynamics (SPH) method (Lucy 1977; Monaghan 1985), which substitutes a finite set of extended particles for the fluid; the hydrodynamics equations then reduce to the equations of motion of individual particles. These equations of motion differ from the standard gravitational N-body problem only insofar as pressure-gradient and viscosity terms have to be added to the gradient of the potential to obtain the forces between the particles. The numerical tools developed for "classical" N-body problems can therefore be readily applied to solve three-dimensional hydrodynamical problems. 12 refs., 1 fig.
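For reference, a standard SPH form of the particle equations of motion alluded to above, with pressure-gradient and artificial-viscosity terms added to the gravitational force; the symbols follow the usual SPH conventions rather than the specific discretization of the cited work.

```latex
\[
  \frac{d\mathbf{v}_i}{dt}
  \;=\;
  -\sum_{j} m_j \left( \frac{P_i}{\rho_i^{2}} + \frac{P_j}{\rho_j^{2}} + \Pi_{ij} \right)
  \nabla_i W_{ij}
  \;-\; \nabla_i \Phi_i ,
\]
% where W_{ij} is the smoothing kernel, \Pi_{ij} the artificial viscosity between particles i and j,
% and \Phi_i the gravitational potential at particle i.
```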
Reporter Concerns in 300 Mode-Related Incident Reports from NASA's Aviation Safety Reporting System
NASA Technical Reports Server (NTRS)
McGreevy, Michael W.
1996-01-01
A model has been developed which represents prominent reporter concerns expressed in the narratives of 300 mode-related incident reports from NASA's Aviation Safety Reporting System (ASRS). The model objectively quantifies the structure of concerns which persist across situations and reporters. These concerns are described and illustrated using verbatim sentences from the original narratives. Report accession numbers are included with each sentence so that concerns can be traced back to the original reports. The results also include an inventory of mode names mentioned in the narratives, and a comparison of individual and joint concerns. The method is based on a proximity-weighted co-occurrence metric and object-oriented complexity reduction.
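A proximity-weighted co-occurrence metric of the general kind mentioned can be sketched as follows: term pairs occurring within a window are scored with a weight that decays with word distance. The specific weighting, tokenization, and complexity-reduction steps used for the ASRS narratives are not reproduced here, and the two example sentences are invented.

    # Sketch of a proximity-weighted co-occurrence score over narrative text.
    from collections import defaultdict

    def cooccurrence(narratives, window=10):
        scores = defaultdict(float)
        for text in narratives:
            words = text.lower().split()
            for i, w1 in enumerate(words):
                for j in range(i + 1, min(i + 1 + window, len(words))):
                    pair = tuple(sorted((w1, words[j])))
                    scores[pair] += 1.0 / (j - i)   # closer pairs count more
        return scores

    scores = cooccurrence(["autopilot mode changed without crew awareness",
                           "crew did not notice the mode change"])
    print(sorted(scores.items(), key=lambda kv: -kv[1])[:5])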
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, that can handle concurrent hard real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
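One standard scheduling-theory test for hard real-time response specifications is fixed-priority response-time analysis; whether this is the exact analysis used in the paper is an assumption, but it illustrates the kind of check involved: a task meets its deadline if the fixed point of the interference recurrence stays below the deadline.

    # Fixed-priority response-time analysis (illustrative, not the paper's exact method).
    # tasks: list of (C, T, D) = (execution time, period, deadline), sorted by
    # decreasing priority; returns the worst-case response time or None on a miss.
    import math

    def response_time(tasks, i):
        C_i, T_i, D_i = tasks[i]
        R = C_i
        while True:
            R_next = C_i + sum(math.ceil(R / T) * C for C, T, D in tasks[:i])
            if R_next == R:
                return R
            if R_next > D_i:
                return None
            R = R_next

    tasks = [(1, 5, 5), (2, 10, 10), (3, 20, 20)]
    print([response_time(tasks, k) for k in range(len(tasks))])   # [1, 3, 7]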
Fire test method for graphite fiber reinforced plastics
NASA Technical Reports Server (NTRS)
Bowles, K. J.
1980-01-01
A potential problem in the use of graphite fiber reinforced resin matrix composites is the dispersal of graphite fibers during accidental fires. Airborne, electrically conductive fibers originating from the burning composites could enter and cause shorting in electrical equipment located in surrounding areas. A test method for assessing the burning characteristics of graphite fiber reinforced composites and the effectiveness of the composites in retaining the graphite fibers has been developed. The method utilizes a modified Ohio State University Rate of Heat Release apparatus. The equipment and the testing procedure are described. The application of the test method to the assessment of composite materials is illustrated for two resin matrix/graphite composite systems.
NASA Technical Reports Server (NTRS)
Yan, Jue; Shu, Chi-Wang; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
In this paper we review existing and develop new discontinuous Galerkin methods for solving time-dependent partial differential equations with higher order derivatives in one and multiple space dimensions. We review local discontinuous Galerkin methods for convection-diffusion equations involving second derivatives and for KdV-type equations involving third derivatives. We then develop new local discontinuous Galerkin methods for time-dependent bi-harmonic type equations involving fourth derivatives, and for partial differential equations involving fifth derivatives. For these new methods we present correct interface numerical fluxes and prove L^2 stability for general nonlinear problems. Preliminary numerical examples are shown to illustrate these methods. Finally, we present new results on a post-processing technique, originally designed for methods with good negative-order error estimates, applied to the local discontinuous Galerkin methods for equations with higher derivatives. Numerical experiments show that this technique works as well for the new higher derivative cases, effectively doubling the rate of convergence with negligible additional computational cost, for linear as well as some nonlinear problems, on a locally uniform mesh.
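The central device in local discontinuous Galerkin methods for higher derivatives, as reviewed above, is the rewriting of a high-order equation as a first-order system before discretization. A sketch for the time-dependent bi-harmonic case, in notation that is illustrative rather than the paper's own, is:

    \[
    u_t + u_{xxxx} = 0
    \quad\Longrightarrow\quad
    q = u_x, \qquad r = q_x, \qquad s = r_x, \qquad u_t + s_x = 0.
    \]

Each first-order equation is then discretized with discontinuous piecewise polynomials, and the interface values \(\hat{u}, \hat{q}, \hat{r}, \hat{s}\) (the numerical fluxes) are chosen, typically from alternating sides of each interface, so that a cell-by-cell energy argument yields the L^2 stability mentioned above.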
Spanish Cultural Adaptation of the Questionnaire Early Arthritis for Psoriatic Patients.
García-Gavín, J; Pérez-Pérez, L; Tinazzi, I; Vidal, D; McGonagle, D
2017-12-01
The Early Arthritis for Psoriatic patients (EARP) questionnaire is a screening tool for psoriatic arthritis. The original Italian version has good measurement properties, but the EARP required translation and adaptation for use in Spain. This article describes the cultural adaptation process as a step prior to validation. We used the principles of good practice for the cross-cultural adaptation of patient-reported outcome measures established by the International Society for Pharmacoeconomics and Outcomes Research. The steps in this process were preparation, forward translation, reconciliation, back-translation and review, harmonization, cognitive debriefing and review, and proofreading. During preparation, the developers of the original questionnaire were asked for their permission to adapt the EARP for use in Spain and to act as consultants during the process. The original questionnaire was translated into Spanish by native Spanish translators, who made slight changes that were approved by the questionnaire's developers. The Spanish version was then back-translated into Italian, and that version was reviewed to confirm equivalence with the original Italian text. The reconciled Spanish EARP was then tested for comprehensibility and interpretation in a group of 35 patients. All the patients answered all items without making additional comments. This cultural adaptation of the EARP questionnaire for Spanish populations is the first step towards its later use in routine clinical practice. The application of a cross-cultural adaptation method ensured equivalence between the original and Spanish versions of the EARP. The Spanish questionnaire will be validated in a second stage. Copyright © 2017 AEDV. Published by Elsevier España, S.L.U. All rights reserved.
García, M C; Marina, M L
2006-04-01
The undeclared addition of soybean proteins to milk products is forbidden and a method is needed for food control and enforcement. This paper reports the development of a chromatographic method for routine analysis enabling the detection of the addition of soybean proteins to dairy products. A perfusion chromatography column and a linear binary gradient of acetonitrile-water-0.1% (v/v) trifluoroacetic acid at a temperature of 60 degrees C were used. A very simple sample treatment consisting of mixing the sample with a suitable solvent (Milli-Q water or bicarbonate buffer (pH=11)) and centrifuging was used. The method enabled the separation of soybean proteins from milk proteins in less than 4 min (at a flow-rate of 3 ml/min). The method has been successfully applied to the detection of soybean proteins in milk, cheese, yogurt, and enteral formula. The correct quantitation of these vegetable proteins has also been possible in milk adulterated at origin with known sources of soybean proteins. The application of the method to samples adulterated at origin also leads to interesting conclusions as to the effect of the processing conditions used for the preparation of each dairy product on the determination of soybean proteins.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daley, P F
The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms, and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data, and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method detection limit estimates and trend plotting were performed with spreadsheets and statistics software. Moreover, the analytical method developed was very limited in compound coverage, and unable to closely mirror the standard analytical methods promulgated by the EPA. To address these deficiencies, during this award the original equipment was operated at the OU 2-GTS to further evaluate the use of columns, commercial standard blends and other components to broaden the compound coverage of the chromatography system. A second-generation ASAP was designed and built to replace the original system at the OU 2-GTS, and include provision for introduction of internal standard compounds and surrogates into each sample analyzed. An enhanced, LabVIEW based chromatogram analysis application was written, that manages and archives chemical standards information, and provides a basis for NIST traceability for all analyses. Within this same package, all compound calibration response curves are managed, and different report formats were incorporated, that simplify trend analysis. Test results focus on operation of the original system at the OU 1 Integrated Chemical and Flow Monitoring System, at the OU 1 Fire Drill Area remediation site.
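The peak-detection and peak-area data reduction described for the base system can be illustrated with a small sketch. The actual OLAS/ASAP software is LabVIEW-based; the Python version below is only an analogy on a synthetic chromatogram, with made-up retention times and an arbitrary integration window.

    # Illustrative chromatogram peak detection and area integration (synthetic data).
    import numpy as np
    from scipy.signal import find_peaks
    from scipy.integrate import trapezoid

    t = np.linspace(0, 10, 2000)                          # minutes
    trace = (np.exp(-0.5 * ((t - 3.2) / 0.05)**2) +
             0.4 * np.exp(-0.5 * ((t - 6.8) / 0.08)**2) +
             np.random.default_rng(1).normal(0, 0.005, t.size))

    peaks, _ = find_peaks(trace, height=0.05, prominence=0.05)
    for p in peaks:
        lo, hi = max(p - 80, 0), min(p + 80, t.size - 1)  # crude integration window
        area = trapezoid(trace[lo:hi], t[lo:hi])
        print(f"retention time = {t[p]:.2f} min, area = {area:.4f}")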
Kurushima, J. D.; Lipinski, M. J.; Gandolfi, B.; Froenicke, L.; Grahn, J. C.; Grahn, R. A.; Lyons, L. A.
2012-01-01
Both cat breeders and the lay public have interests in the origins of their pets, not only in the genetic identity of purebred individuals, but also the historical origins of common household cats. The cat fancy is a relatively new institution, with over 85% of its 40–50 breeds arising only in the past 75 years, primarily through selection on single-gene aesthetic traits. The short yet intense cat breed history poses a significant challenge to the development of a genetic marker-based breed identification strategy. Using different breed assignment strategies and methods, 477 cats representing 29 fancy breeds were analysed with 38 short tandem repeats, 148 intergenic and five phenotypic single nucleotide polymorphisms. Results suggest that the frequentist method of Paetkau (accuracy: single nucleotide polymorphisms = 0.78, short tandem repeats = 0.88) surpasses the Bayesian method of Rannala and Mountain (single nucleotide polymorphisms = 0.56, short tandem repeats = 0.83) for accurate assignment of individuals to the correct breed. Additionally, a post-assignment verification step with the five phenotypic single nucleotide polymorphisms accurately identified between 0.31 and 0.58 of the mis-assigned individuals, raising the sensitivity of assignment with the frequentist method to 0.89 and 0.92 for single nucleotide polymorphisms and short tandem repeats, respectively. This study provides a novel multi-step assignment strategy and suggests that, despite their short breed history and breed family groupings, a majority of cats can be assigned to their proper breed or population of origin, i.e. race. PMID:23171373
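A frequency-based ("frequentist") assignment of the kind attributed to Paetkau can be sketched as follows: an individual is assigned to the reference population whose allele frequencies give its multilocus genotype the highest likelihood under Hardy-Weinberg assumptions. The allele-frequency tables, the floor for unseen alleles, and the example genotype below are all invented for illustration.

    # Sketch of frequency-based breed assignment from allele frequencies.
    import math

    def genotype_loglik(genotype, freqs):
        """genotype: list of (a1, a2) per locus; freqs: list of {allele: frequency}."""
        ll = 0.0
        for (a1, a2), f in zip(genotype, freqs):
            p, q = f.get(a1, 1e-4), f.get(a2, 1e-4)   # small floor for unseen alleles
            ll += math.log(p * q * (2.0 if a1 != a2 else 1.0))
        return ll

    def assign(genotype, populations):
        return max(populations, key=lambda pop: genotype_loglik(genotype, populations[pop]))

    pops = {"Siamese": [{"A": 0.7, "B": 0.3}, {"C": 0.9, "D": 0.1}],
            "Persian": [{"A": 0.2, "B": 0.8}, {"C": 0.4, "D": 0.6}]}
    print(assign([("A", "A"), ("C", "D")], pops))   # -> Siamese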
NASA Astrophysics Data System (ADS)
Zuluaga, Jorge I.; Sánchez-Hernández, Oscar; Sucerquia, Mario; Ferrín, Ignacio
2018-06-01
With the advent of more and deeper sky surveys, the discovery of interstellar small objects entering the solar system has finally become possible. On 2017 October 19, using observations of the Pan-STARRS survey, a fast-moving object, now officially named 1I/2017 U1 (‘Oumuamua), was discovered on a heliocentric unbound trajectory, suggesting an interstellar origin. Assessing the provenance of interstellar small objects is key for understanding their distribution, spatial density, and the processes responsible for their ejection from planetary systems. However, their peculiar trajectories place a limit on the number of observations available to determine a precise orbit. As a result, when its position is propagated ∼10^5–10^6 years backward in time, small errors in orbital elements become large uncertainties in position in interstellar space. In this paper we present a general method for assigning probabilities to nearby stars of being the parent system of an observed interstellar object. We describe the method in detail and apply it to assess the origin of ‘Oumuamua. A preliminary list of potential progenitors and their corresponding probabilities is provided. In the future, when further information about the object and/or the nearby stars is refined, the probabilities computed with our method can be updated. We provide all the data and codes we developed for this purpose in the form of an open source C/C++/Python package, iWander, which is publicly available at http://github.com/seap-udea/iWander.
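The probability assignment relies on propagating the object and candidate stars backward in time and scoring close encounters; the full treatment, including the Galactic potential, is implemented in iWander. The sketch below is only a schematic Monte Carlo version under a straight-line-motion assumption, with an invented candidate star, velocity uncertainty, and encounter radius.

    # Schematic Monte Carlo encounter scoring (straight-line motion; all values invented).
    import numpy as np

    rng = np.random.default_rng(2)
    r_obj = np.array([0.0, 0.0, 0.0])              # current object position (pc)
    v_obj = np.array([-11.0, -22.5, -7.6])         # object velocity (km/s)
    sigma_v = 0.5                                  # 1-sigma velocity error (km/s)
    star_pos = np.array([28.1, 57.5, 19.4])        # candidate star position (pc)
    star_vel = np.array([0.0, 0.0, 0.0])           # candidate star velocity (km/s)

    KMS_TO_PC_PER_MYR = 1.0227                     # ~1 km/s expressed in pc/Myr
    times = np.linspace(0.0, -3.0, 300)            # look back 3 Myr
    hits, n_samples = 0, 2000
    for _ in range(n_samples):
        v = (v_obj + rng.normal(0.0, sigma_v, 3)) * KMS_TO_PC_PER_MYR
        traj = r_obj + np.outer(times, v)
        star = star_pos + np.outer(times, star_vel * KMS_TO_PC_PER_MYR)
        if np.min(np.linalg.norm(traj - star, axis=1)) < 2.0:   # 2 pc encounter radius
            hits += 1
    print("encounter probability ~", hits / n_samples)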
Development and origins of zebrafish ocular vasculature.
Kaufman, Rivka; Weiss, Omri; Sebbagh, Meyrav; Ravid, Revital; Gibbs-Bar, Liron; Yaniv, Karina; Inbal, Adi
2015-03-27
The developing eye receives blood supply from two vascular systems, the intraocular hyaloid system and the superficial choroidal vessels. In zebrafish, a highly stereotypic and simple set of vessels develops on the surface of the eye prior to development of choroidal vessels. The origins and formation of this so-called superficial system have not been described. We have analyzed the development of superficial vessels by time-lapse imaging and identified their origins by photoconversion experiments in kdrl:Kaede transgenic embryos. We show that the entire superficial system is derived from a venous origin, and surprisingly, we find that the hyaloid system has, in addition to its previously described arterial origin, a venous origin for specific vessels. Despite arising solely from a vein, one of the vessels in the superficial system, the nasal radial vessel (NRV), appears to acquire an arterial identity while growing over the nasal aspect of the eye and this happens in a blood flow-independent manner. Our results provide a thorough analysis of the early development and origins of zebrafish ocular vessels and establish the superficial vasculature as a model for studying vascular patterning in the context of the developing eye.
Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider
NASA Astrophysics Data System (ADS)
Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.
2010-03-01
In this paper, a critical modification to a polymer-based, affordable split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and the microinjection molding process. Specifically, the original and the modified design were numerically simulated using the commercial finite element method software ANSYS CFX to assist the redesign of the micromixers. The simulation results showed that both designs are capable of performing mixing, while the modified design has a much improved performance. Mixing experiments with two different fluids, carried out using the original and the modified mixers, again showed significantly improved mixing uniformity for the latter. The measured mixing coefficient was 0.11 for the original design and 0.065 for the improved design. The developed manufacturing process, based on ultraprecision machining and microinjection molding for device fabrication, has the advantages of high dimensional precision, low cost, and manufacturing flexibility.
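The mixing coefficient quoted (0.11 for the original design versus 0.065 for the modified one) is of the kind obtained from the spread of tracer intensity over the outlet cross section; whether the paper uses exactly the coefficient of variation shown below is an assumption, and the intensity arrays are synthetic.

    # Sketch: mixing coefficient as coefficient of variation of outlet intensity.
    import numpy as np

    def mixing_coefficient(intensity):
        return np.std(intensity) / np.mean(intensity)

    rng = np.random.default_rng(3)
    poorly_mixed = np.concatenate([rng.normal(1.0, 0.02, 500),
                                   rng.normal(0.5, 0.02, 500)])   # two unmixed streams
    well_mixed = rng.normal(0.75, 0.05, 1000)                     # uniform blend
    print(mixing_coefficient(poorly_mixed), mixing_coefficient(well_mixed))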
Longobardi, F; Casiello, G; Cortese, M; Perini, M; Camin, F; Catucci, L; Agostiano, A
2015-12-01
The aim of this study was to predict the geographic origin of lentils by using isotope ratio mass spectrometry (IRMS) in combination with chemometrics. Lentil samples from two origins, i.e. Italy and Canada, were analysed to obtain the stable isotope ratios δ(13)C, δ(15)N, δ(2)H, δ(18)O, and δ(34)S. A comparison between median values (U-test) highlighted statistically significant differences (p<0.05) between the lentils produced in the two geographic areas for all isotopic parameters except δ(15)N. Applying principal component analysis, grouping of samples was observed on the basis of origin but with overlapping zones; consequently, two supervised discriminant techniques, i.e. partial least squares discriminant analysis and the k-nearest neighbours algorithm, were used. Both models showed good performance, with external prediction abilities of about 93%, demonstrating the suitability of the methods developed. Subsequently, isotopic determinations were also performed on the protein and starch fractions and the relevant results are reported. Copyright © 2015 Elsevier Ltd. All rights reserved.
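The supervised step described (k-nearest neighbours on the five isotope ratios with external validation) can be sketched as follows; the isotope means are simulated around loosely plausible values and are not the study's data, and the neighbour count is an arbitrary choice.

    # Sketch: kNN origin classification on five isotope ratios (simulated data).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(4)
    X_italy = rng.normal([-26.0, 1.0, -60.0, 25.0, 5.0], 1.0, size=(60, 5))
    X_canada = rng.normal([-28.0, 1.5, -90.0, 18.0, 3.0], 1.0, size=(60, 5))
    X = np.vstack([X_italy, X_canada])
    y = np.array(["Italy"] * 60 + ["Canada"] * 60)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
    model.fit(X_tr, y_tr)
    print("external prediction ability:", model.score(X_te, y_te))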
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurzeja, R.; Werth, D.; Buckley, R.
The Atmospheric Technology Group at SRNL developed a new method to detect signals from Weapons of Mass Destruction (WMD) activities in a time series of chemical measurements at a downwind location. This method was tested with radioxenon measured in Russia and Japan after the 2013 underground test in North Korea. This LDRD calculated the uncertainty in the method with the measured data and also for a case with the signal reduced to 1/10 of its measured value. The research showed that the uncertainty in the calculated probability of origin from the NK test site was small enough to confirm the test. The method was also well-behaved for small signal strengths.
Lahouidak, Samah; Salghi, Rachid; Zougagh, Mohammed; Ríos, Angel
2018-03-06
A capillary electrophoresis method was developed for the determination of coumarin (COUM), ethyl vanillin (EVA), p-hydroxybenzaldehyde (PHB), p-hydroxybenzoic acid (PHBA), vanillin (VAN), vanillic acid (VANA) and vanillic alcohol (VOH) in vanilla products. The measured concentrations are compared to values obtained by a liquid chromatography (LC) method. Analytical results, method precision, and accuracy data are presented, and limits of detection for the method ranged from 2 to 5 μg/mL. The results obtained are used to monitor the composition of vanilla flavorings, as well as to confirm the natural or non-natural origin of vanilla, as demonstrated with four selected food samples containing this flavor. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The development of a peak-time criterion for designing controlled-release devices.
Simon, Laurent; Ospina, Juan
2016-08-25
This work consists of estimating dynamic characteristics for topically applied drugs when the magnitude of the flux increases to a maximum value, called peak flux, before declining to zero. This situation is typical of controlled-release systems with a finite donor or vehicle volume. Laplace transforms were applied to the governing equations and resulted in an expression for the flux in terms of the physical characteristics of the system. After approximating this function by a second-order model, three parameters of this reduced structure captured the essential features of the original process. Closed-form relationships were then developed for the peak flux and time-to-peak based on the empirical representation. Three case studies that involve mechanisms such as diffusion, partitioning, dissolution and elimination were selected to illustrate the procedure. The technique performed successfully, as shown by the ability of the second-order flux to match the prediction of the original transport equations. A main advantage of the proposed method is that it does not require a solution of the original partial differential equations. Less accurate results were noted for longer lag times. Copyright © 2016 Elsevier B.V. All rights reserved.
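A common second-order reduced model for a flux that rises to a peak and then decays is a difference of two exponentials; whether this is the exact form adopted in the paper is an assumption, but it shows how closed-form peak characteristics follow from such an empirical representation:

    \[
    J(t) = A\left(e^{-a t} - e^{-b t}\right), \quad b > a > 0,
    \qquad
    \left.\frac{dJ}{dt}\right|_{t_p} = 0
    \;\Longrightarrow\;
    t_p = \frac{\ln(b/a)}{b - a},
    \qquad
    J_{\max} = A\left(e^{-a t_p} - e^{-b t_p}\right).
    \]

The three parameters A, a, and b play the role of the reduced model's characteristics fitted to the full transport solution.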
[Self-Stigma of Depression Scale SSDS - Evaluation of the German Version].
Makowski, Anna Christin; Mnich, Eva E; von dem Knesebeck, Olaf
2017-05-12
Objectives A better understanding of self-stigma facilitates the development and evaluation of anti-stigma measures. In this study, the Self-Stigma of Depression Scale (SSDS) is applied for the first time in Germany. The focus lies on the feasibility and psychometric characteristics of the scale. Methods Data stem from a representative population survey in Germany (N = 2,013). The 16 items of the original SSDS are used to assess anticipated self-stigma in case of depression. Principal component analysis is applied to analyze the factor structure. Results The original version of the SSDS could not be replicated in the German sample. Instead of four, three factors emerged in the German version. They are similar to three subscales of the original SSDS: "social inadequacy", "help-seeking inhibition" and "self-blame". The internal reliability of the total scale as well as of the first two subscales is acceptable. Conclusion The SSDS is a multidimensional construct and can serve as an important instrument in research on self-stigma of depression in Germany. Further development of the German scale is recommended in order to gain greater insight into the nature of (anticipated) depression self-stigma. © Georg Thieme Verlag KG Stuttgart · New York.
Nykanen, David G; Forbes, Thomas J; Du, Wei; Divekar, Abhay A; Reeves, Jaxk H; Hagler, Donald J; Fagan, Thomas E; Pedra, Carlos A C; Fleming, Gregory A; Khan, Danyal M; Javois, Alexander J; Gruenstein, Daniel H; Qureshi, Shakeel A; Moore, Phillip M; Wax, David H
2016-02-01
We sought to develop a scoring system that predicts the risk of serious adverse events (SAEs) for individual pediatric patients undergoing cardiac catheterization procedures. Systematic assessment of the risk of SAEs in pediatric catheterization can be challenging in view of the wide variation in procedure and patient complexity as well as rapidly evolving technology. A 10-component scoring system was originally developed based on expert consensus and a review of the existing literature. Data from an international multi-institutional catheterization registry (CCISC) between 2008 and 2013 were used to validate this scoring system. In addition, we used multivariate methods to further refine the original risk score to improve its predictive power for SAEs. Univariate analysis confirmed the strong correlation of each of the 10 components of the original risk score with SAEs attributed to a pediatric cardiac catheterization (P < 0.001 for all variables). Multivariate analysis resulted in a modified risk score (CRISP) that corresponds to an increase in the area under the receiver operating characteristic curve (AUC) from 0.715 to 0.741. The CRISP score predicts the risk of occurrence of an SAE for individual patients undergoing pediatric cardiac catheterization procedures. © 2015 Wiley Periodicals, Inc.
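The discrimination figures quoted (AUC rising from 0.715 to 0.741) are areas under the receiver operating characteristic curve of the risk score against observed serious adverse events. A minimal sketch of that calculation on simulated scores and outcomes:

    # Sketch: AUC of a risk score against simulated adverse-event outcomes.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)
    n = 2000
    risk_score = rng.integers(1, 11, size=n)                   # e.g. a 10-point score
    p_sae = 1.0 / (1.0 + np.exp(-(0.35 * risk_score - 4.0)))   # higher score, higher risk
    sae = rng.random(n) < p_sae
    print("AUC:", roc_auc_score(sae, risk_score))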
NASA Astrophysics Data System (ADS)
Sumihara, K.
Based upon legitimate variational principles, a microscopic-macroscopic finite element formulation for linear dynamics is presented using the Hybrid Stress Finite Element Method. The microscopic application of the Geometric Perturbation introduced by Pian and the introduction of an infinitesimal limit core element (Baby Element) are consistently combined according to the flexible and inherent interpretation of the legitimate variational principles originally formulated by Pian and Tong. The conceptual development, based upon the Hybrid Finite Element Method, is extended to linear dynamics with the introduction of physically meaningful higher modes.
Multiconfigurational quantum propagation with trajectory-guided generalized coherent states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grigolo, Adriano, E-mail: agrigolo@ifi.unicamp.br; Aguiar, Marcus A. M. de, E-mail: aguiar@ifi.unicamp.br; Viscondi, Thiago F., E-mail: viscondi@if.usp.br
2016-03-07
A generalized version of the coupled coherent states method for coherent states of arbitrary Lie groups is developed. In contrast to the original formulation, which is restricted to frozen-Gaussian basis sets, the extended method is suitable for propagating quantum states of systems featuring diversified physical properties, such as spin degrees of freedom or particle indistinguishability. The approach is illustrated with simple models for interacting bosons trapped in double- and triple-well potentials, most adequately described in terms of SU(2) and SU(3) bosonic coherent states, respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.