Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...
Computer aided statistical process control for on-line instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meils, D.E.
1995-01-01
On-line chemical process instrumentation has historically been used for trending. Recent technological advances have improved the accuracy and reliability of on-line instrumentation. However, little attention has been given to validating and verifying on-line instrumentation. This paper presents two practical approaches for validating instrument performance by comparing an on-line instrument's response to that of either a portable instrument or a bench instrument. Because comparing the performance of two instruments to each other requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.
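The statistical core of such a two-instrument comparison can be sketched with paired differences, a bias estimate, limits of agreement, and a paired t-statistic. This is a generic paired-comparison sketch, not Lab Stats Pack®'s actual calculations:

```python
import numpy as np

def compare_instruments(a, b):
    """Paired comparison of two instruments measuring the same samples.

    Returns the mean difference (bias estimate), the 95% limits of
    agreement, and a paired t-statistic for the null of zero bias.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    d = a - b                                   # paired differences
    n = d.size
    bias = d.mean()
    sd = d.std(ddof=1)                          # sample std of the differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # limits of agreement
    t = bias / (sd / np.sqrt(n))                # paired t-statistic
    return bias, loa, t
```

A large t flags a systematic offset between the on-line and reference instruments; the limits of agreement can seed a control chart for ongoing verification.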
On the fractal characterization of Paretian Poisson processes
NASA Astrophysics Data System (ADS)
Eliazar, Iddo I.; Sokolov, Igor M.
2012-06-01
Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that, among Poisson processes defined on the positive half-line with maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper run counter to previous results asserting that the scale-invariance of Poisson processes, with respect to physical randomness-based measures of statistical heterogeneity, is characterized by exponential Poissonian intensities.
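A Paretian Poisson process is easy to simulate above a cutoff: with intensity λ(x) = cαx^(−α−1), the mean number of points above level l is c·l^(−α), and each such point is i.i.d. Pareto-distributed. A minimal sketch (the parameterization is an illustrative choice, not the paper's notation):

```python
import numpy as np

def paretian_poisson(c=100.0, alpha=1.5, cutoff=1.0, rng=None):
    """Sample the points of a Paretian Poisson process above a cutoff.

    Intensity lambda(x) = c*alpha*x**(-alpha-1) on the positive half-line;
    the mean number of points above `cutoff` is c*cutoff**(-alpha), and
    each such point is an i.i.d. Pareto draw above the cutoff. Because the
    intensity is integrable at infinity, the realization has a maximal point.
    """
    rng = np.random.default_rng(rng)
    n = rng.poisson(c * cutoff ** (-alpha))   # Poisson count above the cutoff
    u = rng.random(n)
    return cutoff * u ** (-1.0 / alpha)       # inverse-transform Pareto draws
```

Rescaling the cutoff rescales the points but leaves the power-law tail exponent α unchanged, which is the scale-invariance the paper characterizes.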
Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J
2018-05-01
This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element, mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, because the devices and systems of an automatic rotor line can fail, there is always a real probability of defective (incomplete) products entering the output stream. Continuous sampling control of product completeness, based on statistical methods, therefore remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A feature of continuous sampling control of multi-element product completeness during assembly is that the inspection is destructive: component parts cannot be returned to the process stream after sampling control, which reduces the actual productivity of the assembly equipment. The statistical procedures used for continuous sampling control of multi-element product completeness on automatic rotor lines must therefore employ sampling plans that ensure a minimum control sample size. Comparing the limiting values of the average output defect level for the continuous sampling plan (CSP) and the automated continuous sampling plan (ACSP) shows that ACSP-1 can provide lower limits on the average output defect level, and the average sample size under the ACSP-1 plan is also smaller than under the CSP-1 plan. Thus, applying statistical methods to assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, allows sampling control procedures to be automated and the required quality of assembled products to be achieved while minimizing sample size.
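The baseline CSP-1 plan referenced above (Dodge's continuous sampling plan) is simple to simulate: inspect 100% of units until i consecutive conforming units are seen, then inspect a fraction f, reverting to 100% inspection on any inspected defect. A minimal sketch of CSP-1, not the authors' ACSP-1 variant:

```python
import random

def csp1_inspect(stream, i=50, f=0.1, rng=None):
    """Simulate Dodge's CSP-1 continuous sampling plan over a unit stream.

    `stream` is an iterable of booleans (True = conforming). Start with
    100% inspection; after `i` consecutive conforming units switch to
    inspecting a random fraction `f`; any inspected defect reverts to
    100% inspection. Returns the indices of inspected units.
    """
    rng = rng or random.Random()
    inspected, run, full = [], 0, True
    for k, ok in enumerate(stream):
        if full:
            inspected.append(k)
            run = run + 1 if ok else 0
            if run >= i:
                full = False                 # clearance reached: go to sampling
        elif rng.random() < f:
            inspected.append(k)
            if not ok:
                full, run = True, 0          # defect found: back to 100%
    return inspected
```

The ratio of inspected units to stream length estimates the average fraction inspected, the quantity the paper minimizes; with destructive inspection, every inspected index is a unit lost from the output stream.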
Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li
2017-10-01
To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, the formula medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis technology. A multivariate statistical process control (MSPC) model was established from 5 normal production batches, and 2 test batches were monitored using PC scores, DModX, and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Applying the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction of Schisandrae Chinensis Fructus and reflect changes in material properties in real time. The established process monitoring method could serve as a reference for the application of process analytical technology to the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
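The Hotelling T2 chart named above is the Mahalanobis distance of each new observation's score vector from the training-batch mean. A generic MSPC sketch (not the paper's model; the PCA scores are assumed to be computed already):

```python
import numpy as np

def hotelling_t2(train_scores, new_scores):
    """Hotelling T^2 statistics for new PCA scores against a training set.

    `train_scores`: (n, k) score matrix from normal batches; `new_scores`:
    (m, k) scores to monitor. T^2 is the Mahalanobis distance of each new
    score vector from the training mean; points exceeding a control limit
    flag abnormal process variation.
    """
    mu = train_scores.mean(axis=0)
    cov = np.cov(train_scores, rowvar=False)
    inv = np.linalg.inv(np.atleast_2d(cov))
    d = new_scores - mu
    return np.einsum('ij,jk,ik->i', d, inv, d)   # row-wise d @ inv @ d^T
```

In practice the control limit comes from an F-distribution quantile scaled by n and k; batches whose T2 trajectory crosses it are investigated for faults such as a wrong solvent charge.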
Adaptive filtering in biological signal processing.
Iyer, V K; Ploysongsang, Y; Ramamoorthy, P A
1990-01-01
The high dependence of conventional optimal filtering methods on a priori knowledge of the signal and noise statistics renders them ineffective in dealing with signals whose statistics cannot be predetermined accurately. Adaptive filtering methods offer a better alternative, since a priori knowledge of statistics is less critical, real-time processing is possible, and the computations are less expensive for this approach. Adaptive filtering methods compute the filter coefficients "on-line", converging to the optimal values in the least-mean-square (LMS) error sense. Adaptive filtering is therefore apt for dealing with the "unknown" statistics situation and has been applied extensively in areas like communication, speech, radar, sonar, seismology, and biological signal processing and analysis for channel equalization, interference and echo canceling, line enhancement, signal detection, system identification, spectral analysis, beamforming, modeling, control, etc. In this review article adaptive filtering in the context of biological signals is reviewed. An intuitive approach to the underlying theory of adaptive filters and its applicability are presented. Applications of the principles in biological signal processing are discussed in a manner that brings out the key ideas involved. Current and potential future directions in adaptive biological signal processing are also discussed.
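The on-line coefficient update described above is the textbook LMS recursion, w ← w + μ·e[n]·x[n], where e[n] is the instantaneous error. A minimal sketch of the generic algorithm as reviewed in the article, not a specific biomedical implementation:

```python
import numpy as np

def lms_filter(x, d, taps=8, mu=0.05):
    """Least-mean-square adaptive FIR filter.

    x: reference input, d: desired signal. The weights are updated
    on-line, w <- w + mu * e[n] * x_window, converging toward the optimal
    (Wiener) solution in the LMS error sense. Returns (output, error).
    """
    w = np.zeros(taps)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(taps, len(x)):
        window = x[n - taps:n][::-1]   # most recent sample first
        y[n] = w @ window              # filter output
        e[n] = d[n] - y[n]             # instantaneous error
        w += mu * e[n] * window        # stochastic-gradient weight update
    return y, e
```

For noise canceling, x is the reference noise and d the corrupted biological signal; the error e then converges to the cleaned signal. Stability requires μ small relative to the input power times the number of taps.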
Bjorgan, Asgeir; Randeberg, Lise Lyngsnes
2015-01-01
Line-by-line, real-time processing can be convenient for some applications of line-scanning hyperspectral imaging technology. Some types of processing, like inverse modeling and spectral analysis, can be sensitive to noise. The MNF (minimum noise fraction) transform provides suitable denoising performance, but requires full image availability for the estimation of image and noise statistics. In this work, a modified algorithm is proposed. Incrementally updated statistics enable the algorithm to denoise the image line by line. The denoising performance has been compared to conventional MNF and found to be equal. With satisfactory denoising performance and a real-time implementation, the developed algorithm can denoise line-scanned hyperspectral images in real time. The elimination of waiting time before denoised data are available is an important step towards real-time visualization of processed hyperspectral data. The source code can be found at http://www.github.com/ntnu-bioopt/mnf. This includes an implementation of conventional MNF denoising. PMID:25654717
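The key enabling piece is an incremental update of the image statistics as each line arrives, so the transform never needs the full image. A generic Welford-style running mean/covariance sketch (the paper's MNF-specific noise estimation is in its linked repository, not reproduced here):

```python
import numpy as np

class RunningStats:
    """Incrementally-updated mean and covariance over image lines.

    Each call to `update` folds one line of pixels (n_pixels, n_bands)
    into the running statistics, so transforms that need image statistics
    can run line-by-line instead of waiting for the whole image.
    """
    def __init__(self, bands):
        self.n = 0
        self.mean = np.zeros(bands)
        self.m2 = np.zeros((bands, bands))  # sum of outer products of deviations

    def update(self, line):
        for x in np.atleast_2d(line):
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += np.outer(delta, x - self.mean)

    def covariance(self):
        return self.m2 / (self.n - 1)
```

This update is numerically stable and exact: after any number of lines, the running covariance matches the batch covariance of all pixels seen so far.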
NASA Astrophysics Data System (ADS)
Zhao, Libo; Xia, Yong; Hebibul, Rahman; Wang, Jiuhong; Zhou, Xiangyang; Hu, Yingjie; Li, Zhikang; Luo, Guoxi; Zhao, Yulong; Jiang, Zhuangde
2018-03-01
This paper presents an experimental study using image processing to investigate the width and width uniformity of sub-micrometer polyethylene oxide (PEO) lines fabricated by the near-field electrospinning (NFES) technique. An adaptive thresholding method was developed to determine the optimal gray values for accurately extracting the profiles of printed lines from original optical images, and its feasibility was demonstrated. The proposed thresholding method works by exploiting statistical properties of the image to eliminate halo-induced errors. The triangular method and the relative standard deviation (RSD) were introduced to calculate line width and width uniformity, respectively. Based on these image processing methods, the effects of process parameters including substrate speed (v), applied voltage (U), nozzle-to-collector distance (H), and syringe pump flow rate (Q) on the width and width uniformity of printed lines were discussed. The research results are helpful in promoting the NFES technique for fabricating high-resolution micro- and sub-micrometer lines, and also in optical image processing at the sub-micrometer level.
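Once a line image has been thresholded, the width and RSD-based uniformity measures reduce to per-row pixel counts. An illustrative post-thresholding step (the paper's adaptive threshold itself is not reproduced):

```python
import numpy as np

def width_uniformity(mask):
    """Per-row line width and width uniformity from a binarized line image.

    `mask` is a 2D boolean array with the printed line running vertically;
    each row's width is its count of foreground pixels, and uniformity is
    reported as the relative standard deviation (RSD, %) of those widths.
    """
    widths = mask.sum(axis=1).astype(float)
    widths = widths[widths > 0]          # ignore rows without the line
    rsd = 100.0 * widths.std(ddof=1) / widths.mean()
    return widths.mean(), rsd
```

A lower RSD means a more uniform printed line; sweeping v, U, H, and Q and plotting this statistic is the kind of parameter study the paper reports.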
Interpretation of Statistical Data: The Importance of Affective Expressions
ERIC Educational Resources Information Center
Queiroz, Tamires; Monteiro, Carlos; Carvalho, Liliane; François, Karen
2017-01-01
In recent years, research on teaching and learning of statistics emphasized that the interpretation of data is a complex process that involves cognitive and technical aspects. However, it is a human activity that involves also contextual and affective aspects. This view is in line with research on affectivity and cognition. While the affective…
Nongrayness Effects in Wolf-Rayet Wind Momentum Deposition
NASA Astrophysics Data System (ADS)
Onifer, A. J.; Gayley, K. G.
2004-05-01
Wolf-Rayet winds are characterized by their large momentum fluxes and optically thick winds. A simple analytic approach that helps to understand the most critical processes is the effectively gray approximation, but this has not been generalized to more realistic nongray opacities. We have developed a simplified theory for describing the interaction of the stellar flux with nongray wind opacity. We replace the detailed line list with a set of statistical parameters that are sensitive to the line strengths as well as the wavelength distribution of lines. We determine these statistical parameters for several real line lists, exploring the effects of temperature and density changes on the efficiency of momentum driving relative to gray opacity. We wish to acknowledge NSF grant AST-0098155.
A Prototype System for Retrieval of Gene Functional Information
Folk, Lillian C.; Patrick, Timothy B.; Pattison, James S.; Wolfinger, Russell D.; Mitchell, Joyce A.
2003-01-01
Microarrays allow researchers to gather data about the expression patterns of thousands of genes simultaneously. Statistical analysis can reveal which genes show statistically significant results. Making biological sense of those results requires the retrieval of functional information about the genes thus identified, typically a manual gene-by-gene retrieval of information from various on-line databases. For experiments generating thousands of genes of interest, retrieval of functional information can become a significant bottleneck. To address this issue, we are currently developing a prototype system to automate the process of retrieval of functional information from multiple on-line sources. PMID:14728346
Smolko, J R; Greisler, D S
2001-01-01
There is ongoing pressure for medical groups owned by not-for-profit health care systems or for-profit entrepreneurs to generate profit. The fading promise of superior strategy through health care integration has boards of directors clamoring for bottom-line performance. While prudent, sole focus on the bottom line through the lens of the profit-and-loss (P&L) statement provides incomplete information upon which to base executive decisions. The purpose of this paper is to suggest that placing statistical process control (SPC) charts in tandem with the P&L statement provides a more complete picture of medical group performance thereby optimizing decision making as executives deal with the whitewater issues surrounding physician practice ownership.
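The SPC companion to a P&L statement that the paper argues for is typically an individuals (XmR) chart over the monthly bottom-line series. A generic XmR construction, not the authors' charts:

```python
def xmr_limits(values):
    """Individuals (XmR) control chart limits for a bottom-line series.

    The center line is the mean; control limits are +/- 2.66 times the
    mean moving range, the standard individuals-chart constant. Points
    outside the limits signal special-cause variation, as opposed to the
    routine month-to-month noise a raw P&L comparison cannot distinguish.
    """
    mr = [abs(b - a) for a, b in zip(values, values[1:])]  # moving ranges
    center = sum(values) / len(values)
    mr_bar = sum(mr) / len(mr)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar
```

Plotting monthly net income against these limits shows whether a bad month is ordinary process variation or a genuine shift worth an executive response.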
Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K
1996-03-01
An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
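The standard-deviation routine described above can be sketched as a moving-block statistic over successive spectra: the blend is declared homogeneous once spectral variation within a window falls below a threshold. This is a common moving-block-SD approach consistent with the paper's description; the window and threshold values here are illustrative:

```python
import numpy as np

def blend_endpoint(spectra, window=5, threshold=1e-3):
    """Flag blend homogeneity from a time-ordered sequence of NIR spectra.

    For each time step, computes the mean (over wavelengths) of the
    standard deviation of the last `window` spectra; the blend is called
    homogeneous at the first step where this falls below `threshold`.
    Returns (per-step SD values, endpoint index or None).
    """
    spectra = np.asarray(spectra, float)
    sds = []
    for t in range(window, len(spectra) + 1):
        block = spectra[t - window:t]
        sds.append(block.std(axis=0, ddof=1).mean())
    endpoint = next((i + window for i, s in enumerate(sds) if s < threshold), None)
    return np.array(sds), endpoint
```

Run in real time, this gives the blender's SCADA layer a simple stopping rule instead of a fixed blend time.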
1980-12-01
distributions of Figs. 3 and 4 may be fitted quite accurately by broken straight lines. If we had plotted the differential distributions directly...collection process. These fluctuations are smoothed by replacing the actual differential distribution by the derivative of the fitted broken-line lognormal...for each interval T. The constants in the distribution for each broken section of the lognormal approximations are found by fitting lines to the curve
Batch Statistical Process Monitoring Approach to a Cocrystallization Process.
Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A
2015-12-01
Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, like the amount of solvent or amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
NASA Astrophysics Data System (ADS)
Wu, Chong; Liu, Liping; Wei, Ming; Xi, Baozhu; Yu, Minghui
2018-03-01
A modified hydrometeor classification algorithm (HCA) is developed in this study for Chinese polarimetric radars. This algorithm is based on the U.S. operational HCA, and a methodology of statistics-based optimization is proposed, including calibration checking, dataset selection, membership function modification, computation threshold modification, and effect verification. These procedures are applied to Zhuhai radar, the first operational polarimetric radar in South China. The systematic calibration bias is corrected; it is found that the reliability of radar measurements deteriorates when the signal-to-noise ratio is low, and that the correlation coefficient within the melting layer is usually lower than that of the U.S. WSR-88D radar. Through modification based on statistical analysis of polarimetric variables, an HCA localized for Zhuhai is obtained, and it performs well over a one-month test through comparison with sounding and surface observations. The algorithm is then utilized for analysis of a squall line process on 11 May 2014 and is found to provide reasonable details with respect to horizontal and vertical structures, and the HCA results, especially in the mixed rain-hail region, can reflect the life cycle of the squall line. In addition, the kinematic and microphysical processes of cloud evolution and the differences between radar-detected hail and surface observations are also analyzed. The results of this study provide evidence for the improvement of this HCA developed specifically for China.
On-line dimensional measurement of small components on the eyeglasses assembly line
NASA Astrophysics Data System (ADS)
Rosati, G.; Boschetti, G.; Biondi, A.; Rossi, A.
2009-03-01
Dimensional measurement of the subassemblies at the beginning of the assembly line is a very crucial process for the eyeglasses industry, since even small manufacturing errors of the components can lead to very visible defects on the final product. For this reason, all subcomponents of the eyeglass are verified before beginning the assembly process, either with 100% inspection or on a statistical basis. Inspection is usually performed by human operators, with high costs and a degree of repeatability which is not always satisfactory. This paper presents a novel on-line measuring system for dimensional verification of small metallic subassemblies for the eyeglasses industry. The machine vision system proposed, which was designed to be used at the beginning of the assembly line, could also be employed in Statistical Process Control (SPC) by the manufacturer of the subassemblies. The automated system proposed is based on artificial vision, and exploits two CCD cameras and an anthropomorphic robot to inspect and manipulate the subcomponents of the eyeglass. Each component is recognized by the first camera in a quite large workspace, picked up by the robot and placed in the small vision field of the second camera, which performs the measurement process. Finally, the part is palletized by the robot. The system can be easily taught by the operator by simply placing the template object in the vision field of the measurement camera (for dimensional data acquisition) and then instructing the robot via the Teaching Control Pendant within the vision field of the first camera (for pick-up transformation acquisition). The major problem we dealt with is that the shape and dimensions of the subassemblies can vary over a quite wide range, yet different positionings of the same component can look very similar to one another. For this reason, a specific shape recognition procedure was developed.
In the paper, the whole system is presented together with first experimental lab results.
Sparse approximation of currents for statistics on curves and surfaces.
Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas
2008-01-01
Computing, processing, and visualizing statistics on shapes like curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids the feature-based approach as well as the point-correspondence method. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registration efficiently, new numerical schemes are required to process groupwise statistics, because complexity increases as the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.
Diversity of Poissonian populations.
Eliazar, Iddo I; Sokolov, Igor M
2010-01-01
Populations represented by collections of points scattered randomly on the real line are ubiquitous in science and engineering. The statistical modeling of such populations leads naturally to Poissonian populations-Poisson processes on the real line with a distinguished maximal point. Poissonian populations are infinite objects underlying key issues in statistical physics, probability theory, and random fractals. Due to their infiniteness, measuring the diversity of Poissonian populations depends on the lower-bound cut-off applied. This research characterizes the classes of Poissonian populations whose diversities are invariant with respect to the cut-off level applied and establishes an elemental connection between these classes and extreme-value theory. The measures of diversity considered are variance and dispersion, Simpson's index and inverse participation ratio, Shannon's entropy and Rényi's entropy, and Gini's index.
NASA Astrophysics Data System (ADS)
Eliazar, Iddo
2017-05-01
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems
NASA Technical Reports Server (NTRS)
Esogbue, Augustine O.
1998-01-01
The principal objective of the research reported here is the re-design, analysis and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes capable of learning fuzzy control rules using process data and improving its control through on-line adaptation. The learned improvement is according to a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely-defined processes for which standard models and controls are either inefficient, impractical or cannot be derived, the state of the art prior to our work showed that procedures for deriving fuzzy control were mostly ad hoc heuristics. The learning ability of neural networks was exploited to systematically derive fuzzy control, permit on-line adaptation and, in the process, optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks, which were designed and tested using simulation software and simulated data followed by realistic industrial data, were reconfigured for application on several platforms as well as for the employment of improved algorithms. The statistical procedures of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA, graphical analysis of residuals, etc.). The computational advantage of dynamic programming-like methods of optimal control was used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness and interaction of the control rules were applied. Comparisons to other methods and controllers were made so as to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller. Additional modifications and explorations have been proposed for further study.
Some of these are in progress in our laboratory while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on line control of an array of complex process environments.
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
Intelligent form removal with character stroke preservation
NASA Astrophysics Data System (ADS)
Garris, Michael D.
1996-03-01
A new technique for intelligent form removal has been developed along with a new method for evaluating its impact on optical character recognition (OCR). All the dominant lines in the image are automatically detected using the Hough line transform and intelligently erased while simultaneously preserving overlapping character strokes by computing line width statistics and keying off of certain visual cues. This new method of form removal operates on loosely defined zones with no image deskewing. Any field in which the writer is provided a horizontal line to enter a response can be processed by this method. Several examples of processed fields are provided, including a comparison of results between the new method and a commercially available forms removal package. Even if this new form removal method did not improve character recognition accuracy, it would still be a significant improvement to the technology because the requirement of a priori knowledge of the form's geometric details has been greatly reduced. This relaxes the recognition system's dependence on rigid form design, printing, and reproduction by automatically detecting and removing some of the physical structures (lines) on the form. Using the National Institute of Standards and Technology (NIST) public domain form-based handprint recognition system, the technique was tested on a large number of fields containing randomly ordered handprinted lowercase alphabets, as these letters (especially those with descenders) frequently touch and extend through the line along which they are written. Preserving character strokes improves overall lowercase recognition performance by 3%, which is a net improvement, but a single performance number like this doesn't communicate how the recognition process was really influenced. There are expected to be trade-offs with the introduction of any new technique into a complex recognition system.
To understand both the improvements and the trade-offs, a new analysis was designed to compare the statistical distributions of individual confusion pairs between two systems. As OCR technology continues to improve, sophisticated analyses like this are necessary to reduce the errors remaining in complex recognition problems.
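The Hough line detection step mentioned above can be sketched in a few lines of NumPy: each foreground pixel votes for all (rho, theta) lines through it, and the accumulator peak gives the dominant line. This is the textbook Hough transform; the stroke-preservation heuristics based on line-width statistics are not reproduced here:

```python
import numpy as np

def hough_peak(mask, n_theta=180):
    """Locate the dominant straight line in a binary image via Hough voting.

    Each foreground pixel (x, y) votes for every discretized angle theta
    with rho = x*cos(theta) + y*sin(theta); the accumulator's peak gives
    the dominant line's (rho, theta). A horizontal line at row y peaks at
    theta = pi/2 with rho = y.
    """
    ys, xs = np.nonzero(mask)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*mask.shape)))      # rho range is [-diag, diag]
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for theta_idx, th in enumerate(thetas):
        rhos = np.round(xs * np.cos(th) + ys * np.sin(th)).astype(int) + diag
        np.add.at(acc, (rhos, theta_idx), 1)        # accumulate votes
    rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
    return rho_idx - diag, thetas[theta_idx]
```

In a form-removal pipeline, pixels voting for the peak line are candidates for erasure, and width statistics along that line decide which overlapping strokes to preserve.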
Radar error statistics for the space shuttle
NASA Technical Reports Server (NTRS)
Lear, W. M.
1979-01-01
C-band and S-band radar error statistics recommended for use with the ground-tracking programs that process space shuttle tracking data are presented. The statistics are divided into two parts: bias error statistics, using the subscript B, and high frequency error statistics, using the subscript q. Bias errors may be slowly varying to constant. High frequency random errors (noise) are rapidly varying and may or may not be correlated from sample to sample. Bias errors were mainly due to hardware defects and to errors in correction for atmospheric refraction effects. High frequency noise was mainly due to hardware and to atmospheric scintillation. Three types of atmospheric scintillation were identified: horizontal, vertical, and line of sight. This was the first time that horizontal and line of sight scintillations were identified.
Hall, Michelle G; Mattingley, Jason B; Dux, Paul E
2015-08-01
The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved.
A Statistics-Based Cracking Criterion of Resin-Bonded Silica Sand for Casting Process Simulation
NASA Astrophysics Data System (ADS)
Wang, Huimin; Lu, Yan; Ripplinger, Keith; Detwiler, Duane; Luo, Alan A.
2017-02-01
Cracking of sand molds/cores can result in many casting defects such as veining. A robust cracking criterion is needed in casting process simulation for predicting/controlling such defects. A cracking probability map, relating to fracture stress and effective volume, was proposed for resin-bonded silica sand based on Weibull statistics. Three-point bending test results of sand samples were used to generate the cracking map and set up a safety line for cracking criterion. Tensile test results confirmed the accuracy of the safety line for cracking prediction. A laboratory casting experiment was designed and carried out to predict cracking of a cup mold during aluminum casting. The stress-strain behavior and the effective volume of the cup molds were calculated using a finite element analysis code ProCAST®. Furthermore, an energy dispersive spectroscopy fractographic examination of the sand samples confirmed the binder cracking in resin-bonded silica sand.
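The route from three-point bend strengths to a Weibull cracking-probability map starts with estimating the Weibull modulus m and scale sigma0. A generic estimator using the standard linearization, not necessarily the authors' exact fitting procedure:

```python
import numpy as np

def weibull_fit(strengths):
    """Estimate Weibull modulus m and scale sigma0 from fracture stresses.

    Uses the standard linearization ln ln(1/(1-F)) = m*ln(sigma) - m*ln(sigma0)
    with the median-rank probability estimator F_i = (i - 0.3)/(n + 0.4).
    The slope of the fitted line is the Weibull modulus m.
    """
    s = np.sort(np.asarray(strengths, float))
    n = s.size
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank estimator
    y = np.log(-np.log(1.0 - F))
    x = np.log(s)
    m, b = np.polyfit(x, y, 1)                    # slope = Weibull modulus
    sigma0 = np.exp(-b / m)
    return m, sigma0
```

With m and sigma0 in hand, the cracking probability at a finite-element stress sigma and effective volume V follows from the weakest-link scaling P = 1 - exp(-(V/V0)*(sigma/sigma0)^m), which is what a safety line on the cracking map encodes.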
Infant Directed Speech Enhances Statistical Learning in Newborn Infants: An ERP Study
Teinonen, Tuomas; Tervaniemi, Mari; Huotilainen, Minna
2016-01-01
Statistical learning and the social contexts of language addressed to infants are hypothesized to play important roles in early language development. Previous behavioral work has found that the exaggerated prosodic contours of infant-directed speech (IDS) facilitate statistical learning in 8-month-old infants. Here we examined the neural processes involved in on-line statistical learning and investigated whether the use of IDS facilitates statistical learning in sleeping newborns. Event-related potentials (ERPs) were recorded while newborns were exposed to 12 pseudo-words, six spoken with exaggerated pitch contours of IDS and six spoken without exaggerated pitch contours (ADS), in ten alternating blocks. We examined whether ERP amplitudes for syllable position within a pseudo-word (word-initial vs. word-medial vs. word-final, indicating statistical word learning) and speech register (ADS vs. IDS) would interact. The ADS and IDS registers elicited similar ERP patterns for syllable position in an early 0–100 ms component but elicited different ERP effects in both the polarity and topographical distribution at 200–400 ms and 450–650 ms. These results provide the first evidence that the exaggerated pitch contours of IDS result in differences in brain activity linked to on-line statistical learning in sleeping newborns. PMID:27617967
From micro to mainframe. A practical approach to perinatal data processing.
Yeh, S Y; Lincoln, T
1985-04-01
A new, practical approach to perinatal data processing for a large obstetric population is described. This was done with a microcomputer for data entry and a mainframe computer for data reduction. The Screen Oriented Data Access (SODA) program was used to generate the data entry form and to input data into the Apple II Plus computer. Data were stored on diskettes and transmitted through a modem and telephone line to the IBM 370/168 computer. The Statistical Analysis System (SAS) program was used for statistical analyses and report generation. This approach was found to be most practical, flexible, and economical.
NASA Astrophysics Data System (ADS)
Chang, Anteng; Li, Huajun; Wang, Shuqing; Du, Junfeng
2017-08-01
Both wave-frequency (WF) and low-frequency (LF) components of mooring tension are in principle non-Gaussian due to nonlinearities in the dynamic system. This paper conducts a comprehensive investigation of applicable probability density functions (PDFs) of mooring tension amplitudes used to assess mooring-line fatigue damage via the spectral method. Short-term statistical characteristics of mooring-line tension responses are firstly investigated, in which the discrepancy arising from Gaussian approximation is revealed by comparing kurtosis and skewness coefficients. Several distribution functions based on present analytical spectral methods are selected to express the statistical distribution of the mooring-line tension amplitudes. Results indicate that the Gamma-type distribution and a linear combination of Dirlik and Tovo-Benasciutti formulas are suitable for separate WF and LF mooring tension components. A novel parametric method based on nonlinear transformations and stochastic optimization is then proposed to increase the effectiveness of mooring-line fatigue assessment due to non-Gaussian bimodal tension responses. Using time domain simulation as a benchmark, its accuracy is further validated using a numerical case study of a moored semi-submersible platform.
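The kurtosis/skewness comparison used to reveal the discrepancy from a Gaussian approximation can be illustrated with a toy example. The quadratic nonlinearity below is a generic stand-in for the dynamic system's nonlinearities, not the paper's mooring model:

```python
import numpy as np

def skew_kurt(x):
    """Sample skewness and excess kurtosis (a Gaussian gives ~0 and ~0)."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z**3)), float(np.mean(z**4) - 3.0)

rng = np.random.default_rng(1)
gauss = rng.normal(size=200_000)        # stand-in for a linearized response
nongauss = gauss + 0.2 * gauss**2       # quadratic nonlinearity adds skew

s_g, k_g = skew_kurt(gauss)             # both close to zero
s_n, k_n = skew_kurt(nongauss)          # clearly non-Gaussian
```

Large departures of these two coefficients from zero are exactly the signal that amplitude-distribution models beyond the Gaussian (e.g., Gamma-type fits) are needed.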
Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T
2011-04-18
Fluid bed granulation is a batch process, which is characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts, in which the average batch trajectory and upper and lower control limits are displayed, were developed. Next, these control charts were used to monitor 4 new test batches in real-time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots of deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate some granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to density and flowability, respectively, as Y-matrices, were developed.
The scores of the 4 test batches were used to examine the predictive ability of the model. Copyright © 2011 Elsevier B.V. All rights reserved.
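The score control chart idea above (an average batch trajectory with control limits estimated from reference batches, against which new batches are checked point by point) can be sketched minimally as follows. The sinusoidal trajectory, noise level, and imposed deviation are invented placeholders, not the granulation data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic score trajectories: 10 reference batches x 50 time points.
t = np.linspace(0.0, 1.0, 50)
reference = np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=(10, 50))

# Score control chart: average batch trajectory with +/- 3 sigma limits.
center = reference.mean(axis=0)
sigma = reference.std(axis=0, ddof=1)
upper, lower = center + 3 * sigma, center - 3 * sigma

# A new batch that deviates from the expected trajectory halfway through.
new_batch = np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=50)
new_batch[30:] += 0.5
out_of_control = (new_batch > upper) | (new_batch < lower)
```

The first time points flagged by `out_of_control` indicate when the deviation started, which is where contribution plots would then be computed.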
Effects of Nongray Opacity on Radiatively Driven Wolf-Rayet Winds
NASA Astrophysics Data System (ADS)
Onifer, A. J.; Gayley, K. G.
2002-05-01
Wolf-Rayet winds are characterized by their large momentum fluxes, and simulations of radiation driving have been increasingly successful in modeling these winds. Simple analytic approaches that help understand the most critical processes for copious momentum deposition already exist in the effectively gray approximation, but these have not been extended to more realistic nongray opacities. With this in mind, we have developed a simplified theory for describing the interaction of the stellar flux with nongray wind opacity. We replace the detailed line list with a set of statistical parameters that are sensitive not only to the strength but also the wavelength distribution of lines, incorporating as a free parameter the rate of photon frequency redistribution. We label the resulting flux-weighted opacity the statistical Sobolev-Rosseland (SSR) mean, and explore how changing these various statistical parameters affects the flux/opacity interaction. We wish to acknowledge NSF grant AST-0098155.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il
The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, but yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
Improving the performance of a filling line based on simulation
NASA Astrophysics Data System (ADS)
Jasiulewicz-Kaczmarek, M.; Bartkowiak, T.
2016-08-01
The paper describes a method of improving the performance of a filling line based on simulation. The study concerns a production line located in a manufacturing centre of an FMCG company. A discrete event simulation model was built using data provided by a maintenance data acquisition system. Two types of failures were identified in the system and were approximated using continuous statistical distributions. The model was validated taking into consideration line performance measures. A brief Pareto analysis of line failures was conducted to identify potential areas of improvement. Two improvement scenarios were proposed and tested via simulation. The outcomes of the simulations were the basis of a financial analysis. NPV and ROI values were calculated taking into account depreciation, profits, losses, the current CIT rate and inflation. A validated simulation model can be a useful tool in the maintenance decision-making process.
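The NPV and ROI comparison of improvement scenarios follows the standard discounted cash flow formulas. A minimal sketch with invented cash flows (not the paper's figures, and ignoring depreciation, tax, and inflation adjustments):

```python
def npv(rate, cashflows):
    """Net present value: cashflows[0] occurs now, the rest at yearly steps."""
    return sum(cf / (1.0 + rate) ** i for i, cf in enumerate(cashflows))

# Hypothetical scenario: 100k upfront improvement cost, 30k yearly savings.
investment = 100_000
flows = [-investment] + [30_000] * 5
value = npv(0.08, flows)                          # discounted at 8%
roi = (sum(flows[1:]) - investment) / investment  # simple undiscounted ROI
```

A scenario is worth pursuing when its NPV is positive at the chosen discount rate; here the five years of savings outweigh the upfront cost.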
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Klafter, Joseph
2008-05-01
Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations we coin 'Paretian Poisson processes'. This class is elemental in statistical physics—connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.
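Because a power-law intensity integrates to a finite mass above any positive cutoff, a Paretian Poisson process can be sampled directly: draw a Poisson count from the integrated intensity, then draw that many i.i.d. points from the induced Pareto law. A sketch with arbitrary parameter values (not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

def paretian_poisson(c, alpha, cutoff, rng):
    """Sample the points of a Poisson process with intensity
    lambda(x) = c * x**(-1 - alpha) that fall above `cutoff`.

    The mean count above the cutoff is c * cutoff**(-alpha) / alpha;
    given the count, points are i.i.d. Pareto(alpha) on [cutoff, inf).
    """
    mean = c * cutoff ** (-alpha) / alpha
    n = rng.poisson(mean)
    u = rng.uniform(size=n)
    return cutoff * u ** (-1.0 / alpha)   # inverse-CDF Pareto sampling

points = paretian_poisson(c=100.0, alpha=1.5, cutoff=1.0, rng=rng)
largest = points.max()                    # the process has a maximal point
```

Lowering the cutoff reveals ever more small points (the total population is infinite), while the largest point stays finite, which is the geometry behind the heavy-tailed statistics the abstract lists.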
Calandriello, Luigi; Goggiamani, Angela; Ienzi, Emanuela; Naldini, Silvia; Orsini, Dario
2013-01-01
The authors describe measures of accident-at-work and occupational disease outcomes in the Agriculture insurance sector, acquired through a statistical approach based on processing of data provided by the INAIL data bank. The incidence of accidents in Agriculture is compared with that of the main insurance sectors, using frequency indices of accident occurrence broken down by line of work and type of consequence. For occupational diseases, the authors describe complaints and compensations, comparing the analysis against general statistical data. The data define a worrying phenomenon.
Version pressure feedback mechanisms for speculative versioning caches
Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong
2013-03-12
Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, S.; Li, Y.; Liu, C.
2015-08-15
This paper presents a statistical theory for the initial onset of multipactor breakdown in coaxial transmission lines, taking both the nonuniform electric field and random electron emission velocity into account. A general numerical method is first developed to construct the joint probability density function based on the approximate equation of the electron trajectory. The nonstationary dynamics of the multipactor process on both surfaces of coaxial lines are modelled based on the probability of various impacts and their corresponding secondary emission. The resonant assumption of the classical theory on the independent double-sided and single-sided impacts is replaced by the consideration of their interaction. As a result, the time evolutions of the electron population for exponential growth and absorption on both inner and outer conductor, in response to the applied voltage above and below the multipactor breakdown level, are obtained to investigate the exact mechanism of multipactor discharge in coaxial lines. Furthermore, the multipactor threshold predictions of the presented model are compared with experimental results using measured secondary emission yield of the tested samples, which shows reasonable agreement. Finally, the detailed impact scenario reveals that single-surface multipactor is more likely to occur with a higher outer to inner conductor radius ratio.
NASA Astrophysics Data System (ADS)
Trump, Jonathan R.; Hsu, Alexander D.; Fang, Jerome J.; Faber, S. M.; Koo, David C.; Kocevski, Dale D.
2013-02-01
We present the first quantified, statistical map of broad-line active galactic nucleus (AGN) frequency with host galaxy color and stellar mass in nearby (0.01 < z < 0.11) galaxies. Aperture photometry and z-band concentration measurements from the Sloan Digital Sky Survey are used to disentangle AGN and galaxy emission, resulting in estimates of uncontaminated galaxy rest-frame color, luminosity, and stellar mass. Broad-line AGNs are distributed throughout the blue cloud and green valley at a given stellar mass, and are much rarer in quiescent (red sequence) galaxies. This is in contrast to the published host galaxy properties of weaker narrow-line AGNs, indicating that broad-line AGNs occur during a different phase in galaxy evolution. More luminous broad-line AGNs have bluer host galaxies, even at fixed mass, suggesting that the same processes that fuel nuclear activity also efficiently form stars. The data favor processes that simultaneously fuel both star formation activity and rapid supermassive black hole accretion. If AGNs cause feedback on their host galaxies in the nearby universe, the evidence of galaxy-wide quenching must be delayed until after the broad-line AGN phase.
Amorphous silicon photovoltaic manufacturing technology, phase 2A
NASA Astrophysics Data System (ADS)
Duran, G.; Mackamul, K.; Metcalf, D.
1995-01-01
Utility Power Group (UPG), and its lower-tier subcontractor, Advanced Photovoltaic Systems, Inc. (APS), have conducted efforts in developing their manufacturing lines. UPG has focused on the automation of encapsulation and termination processes developed in Phase 1. APS has focused on completion of the encapsulation and module design tasks, while continuing the process and quality control and automation projects. The goal is to produce 55 watt (stabilized) EP50 modules in a new facility. In the APS Trenton EUREKA manufacturing facility, APS has: (1) Developed high throughput lamination procedures; (2) Optimized existing module designs; (3) Developed new module designs for architectural applications; (4) Developed enhanced deposition parameter control; (5) Designed equipment required to manufacture new EUREKA modules developed during Phase II; (6) Improved uniformity of thin-film materials deposition; and (7) Improved the stabilized power output of the APS EP50 EUREKA module to 55 watts. In the APS Fairfield EUREKA manufacturing facility, APS has: (1) Introduced the new products developed under Phase 1 into the APS Fairfield EUREKA module production line; (2) Increased the extent of automation in the production line; (3) Introduced Statistical Process Control to the module production line; and (4) Transferred progress made in the APS Trenton facility into the APS Fairfield facility.
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
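The planner/scheduler pattern described above (events placed on a single time line and drained from a queue in order) can be caricatured with a day-by-day Monte Carlo walk. The event names and probabilities are invented for illustration and are not IMM values:

```python
import heapq
import random

random.seed(7)

# Hypothetical medical events with per-day occurrence probabilities.
daily_prob = {"headache": 0.05, "back_pain": 0.02}

def simulate_mission(days, daily_prob, rng):
    """One Monte Carlo instance: walk the time line day by day, queueing
    each event that fires; popping the heap preserves chronological order."""
    queue = []
    for day in range(days):
        for event, p in daily_prob.items():
            if rng.random() < p:
                heapq.heappush(queue, (day, event))
    return [heapq.heappop(queue) for _ in range(len(queue))]

# Many instances yield the outcome statistics; a dynamic PRA would
# additionally let earlier events change later probabilities.
counts = [len(simulate_mission(180, daily_prob, random)) for _ in range(500)]
mean_events = sum(counts) / len(counts)
```

The dynamic aspect enters where this sketch stops: in a dPRA, handling an event at its queue position would modify the probabilities (or resources) governing the rest of the time line.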
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A
2018-02-01
Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
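A Shewhart c-chart of the kind compared in the study can be sketched in a few lines; the monthly counts below are fabricated for illustration (and note that in practice the outbreak month would usually be excluded when estimating the limits):

```python
import numpy as np

def c_chart_limits(counts):
    """Shewhart c-chart for event counts per period: center line at the
    mean, with 3-sigma Poisson limits (lower limit floored at zero)."""
    counts = np.asarray(counts, dtype=float)
    center = counts.mean()
    half_width = 3.0 * np.sqrt(center)
    return center, max(center - half_width, 0.0), center + half_width

# Fabricated monthly infection counts with one outbreak month.
monthly = [2, 3, 1, 2, 4, 2, 3, 2, 12, 3, 2, 1]
center, lcl, ucl = c_chart_limits(monthly)
signals = [m for m in monthly if m > ucl or m < lcl]
```

The Poisson assumption behind these limits is exactly what seasonal variation and autocorrelation violate, which is why the anomaly/breakout detection charts performed better in those settings.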
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ladd-Lively, Jennifer L
2014-01-01
The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix, where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.
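The PCA-via-SVD monitoring idea (build a base-case model from normal operation, then flag samples whose residual leaves the normal correlation structure) can be sketched as follows. The synthetic "streams" stand in for the component mole balances and are not the NUCP data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "normal operation" data: two latent factors driving six
# measured streams (stand-ins for component mole balances).
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.05 * rng.normal(size=(200, 6))

# Base-case model: mean-center, then singular value decomposition.
mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2].T                          # retained loadings (6 x 2)

def q_residual(x):
    """Squared prediction error of one sample against the PCA model."""
    r = (x - mu) - P @ (P.T @ (x - mu))
    return float(r @ r)

# Detection limit set empirically from normal-operation residuals.
normal_q = np.array([q_residual(x) for x in X])
threshold = np.percentile(normal_q, 99)

# A simulated "diversion": one stream shifted off the normal correlations.
fault = X[0].copy()
fault[3] += 1.0
detected = q_residual(fault) > threshold
```

A diversion is detectable precisely because removing material from one stream breaks the mole-balance correlations that the base-case model encodes.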
Analysis of Variance in Statistical Image Processing
NASA Astrophysics Data System (ADS)
Kurz, Ludwik; Hafed Benteftifa, M.
1997-04-01
A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
Fast neutron-gamma discrimination on neutron emission profile measurement on JT-60U.
Ishii, K; Shinohara, K; Ishikawa, M; Baba, M; Isobe, M; Okamoto, A; Kitajima, S; Sasao, M
2010-10-01
A digital signal processing (DSP) system is applied to stilbene scintillation detectors of the multichannel neutron emission profile monitor in JT-60U. Automatic analysis of the neutron-γ pulse shape discrimination is a key issue to diminish the processing time in the DSP system, and it has been applied using the two-dimensional (2D) map. Linear discriminant function is used to determine the dividing line between neutron events and γ-ray events on a 2D map. In order to verify the validity of the dividing line determination, the pulse shape discrimination quality is evaluated. As a result, the γ-ray contamination in most of the beam heating phase was negligible compared with the statistical error with 10 ms time resolution.
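A linear discriminant dividing line on a 2D pulse-shape map can be computed as below; the (total charge, tail-charge fraction) clusters are synthetic stand-ins for stilbene pulse-shape data, with invented means and spreads:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic 2D pulse-shape map: (total charge, tail-charge fraction).
gammas = rng.normal([1.0, 0.10], [0.2, 0.02], size=(500, 2))
neutrons = rng.normal([1.0, 0.25], [0.2, 0.03], size=(500, 2))

# Fisher's linear discriminant: w ~ Sw^-1 (m_n - m_g).
m_g, m_n = gammas.mean(axis=0), neutrons.mean(axis=0)
Sw = np.cov(gammas.T) + np.cov(neutrons.T)
w = np.linalg.solve(Sw, m_n - m_g)
b = w @ (m_g + m_n) / 2.0            # dividing line: w . x = b

pred_n = (neutrons @ w > b).mean()   # fraction of neutrons above the line
pred_g = (gammas @ w <= b).mean()    # fraction of gammas below the line
```

The two fractions are a simple stand-in for the discrimination-quality check the abstract describes: events of each species falling on the wrong side of the line quantify the contamination.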
STILTS -- Starlink Tables Infrastructure Library Tool Set
NASA Astrophysics Data System (ADS)
Taylor, Mark
STILTS is a set of command-line tools for processing tabular data. It has been designed for, but is not restricted to, use on astronomical data such as source catalogues. It contains both generic (format-independent) table processing tools and tools for processing VOTable documents. Facilities offered include crossmatching, format conversion, format validation, column calculation and rearrangement, row selection, sorting, plotting, statistical calculations and metadata display. Calculations on cell data can be performed using a powerful and extensible expression language. The package is written in pure Java and based on STIL, the Starlink Tables Infrastructure Library. This gives it high portability, support for many data formats (including FITS, VOTable, text-based formats and SQL databases), extensibility and scalability. Where possible the tools are written to accept streamed data so the size of tables which can be processed is not limited by available memory. As well as the tutorial and reference information in this document, detailed on-line help is available from the tools themselves. STILTS is available under the GNU General Public Licence.
Cracking the Language Code: Neural Mechanisms Underlying Speech Parsing
McNealy, Kristin; Mazziotta, John C.; Dapretto, Mirella
2013-01-01
Word segmentation, detecting word boundaries in continuous speech, is a critical aspect of language learning. Previous research in infants and adults demonstrated that a stream of speech can be readily segmented based solely on the statistical and speech cues afforded by the input. Using functional magnetic resonance imaging (fMRI), the neural substrate of word segmentation was examined on-line as participants listened to three streams of concatenated syllables, containing either statistical regularities alone, statistical regularities and speech cues, or no cues. Despite the participants’ inability to explicitly detect differences between the speech streams, neural activity differed significantly across conditions, with left-lateralized signal increases in temporal cortices observed only when participants listened to streams containing statistical regularities, particularly the stream containing speech cues. In a second fMRI study, designed to verify that word segmentation had implicitly taken place, participants listened to trisyllabic combinations that occurred with different frequencies in the streams of speech they just heard (“words,” 45 times; “partwords,” 15 times; “nonwords,” once). Reliably greater activity in left inferior and middle frontal gyri was observed when comparing words with partwords and, to a lesser extent, when comparing partwords with nonwords. Activity in these regions, taken to index the implicit detection of word boundaries, was positively correlated with participants’ rapid auditory processing skills. These findings provide a neural signature of on-line word segmentation in the mature brain and an initial model with which to study developmental changes in the neural architecture involved in processing speech cues during language learning. PMID:16855090
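The statistical regularities that drive segmentation are transitional probabilities between adjacent syllables: high within a word, low across word boundaries. A sketch with an invented syllable inventory (the four "words" below are placeholders, not the study's stimuli):

```python
from collections import Counter
import random

random.seed(0)

# Hypothetical inventory of four trisyllabic "words".
words = [("tu", "pi", "ro"), ("go", "la", "bu"),
         ("bi", "da", "ku"), ("pa", "do", "ti")]

# Concatenate words in random order with no pauses, as in a speech stream.
stream = []
for _ in range(300):
    stream.extend(random.choice(words))

# Transitional probability P(next | current) over adjacent syllables.
pairs = Counter(zip(stream, stream[1:]))
firsts = Counter(stream[:-1])
tp = {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

within = tp[("tu", "pi")]            # within-word transition: always 1.0
across = tp.get(("ro", "go"), 0.0)   # across a word boundary: about 1/4
```

The drop in transitional probability at word boundaries is the only segmentation cue in the statistics-only stream, while the speech-cue stream adds prosodic information on top of it.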
Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A
2017-08-07
A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, being a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on the process continuity was also evaluated using Hotelling's T² and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
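A Hotelling's T² chart with an empirical control limit, plus a crude per-variable contribution analysis, can be sketched as follows. The six variables are anonymous stand-ins for the logged granulator/dryer signals, and the imposed 10-sigma step is deliberately gross:

```python
import numpy as np

rng = np.random.default_rng(11)

# Reference runs under normal operating conditions: 500 samples of six
# logged variables (anonymous stand-ins for granulator/dryer signals).
noc = rng.normal(size=(500, 6))
mu = noc.mean(axis=0)
S_inv = np.linalg.inv(np.cov(noc.T))

def hotelling_t2(x):
    """Hotelling's T^2 distance of one sample from the reference centre."""
    d = x - mu
    return float(d @ S_inv @ d)

# Empirical 99% control limit taken from the reference data themselves.
limit = np.percentile([hotelling_t2(x) for x in noc], 99)

# Imposed disturbance: a gross 10-sigma step on variable 2.
x = noc[0].copy()
x[2] += 10.0
alarm = hotelling_t2(x) > limit
contribution = np.abs(x - mu)       # crude per-variable contribution plot
worst = int(np.argmax(contribution))
```

The `contribution` vector plays the role of the contribution plots in the abstract: once the chart alarms, it points back at which logged variable drove the excursion.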
Continuous monitoring of trace gas species in incineration processes can serve two purposes: (i) monitoring precursors of polychlorinated dibenzodioxin and polychlorinated dibenzofuran (PCDD/F) or other indicator species in the raw gas will enable use of their on-line signals for...
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
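The "known sequential probability ratio tests" used for validation and on-line monitoring are Wald's SPRT. A minimal Gaussian mean-shift version follows; the unit shift, unit variance, and error rates are arbitrary illustrative choices, not the patent's values:

```python
import math
import random

random.seed(2)

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.001, beta=0.001):
    """Wald's SPRT on Gaussian residuals: H0 mean mu0 vs H1 mean mu1.
    Returns the decision ('H0', 'H1', or 'undecided') and samples used."""
    upper = math.log((1 - beta) / alpha)   # accept H1 above this
    lower = math.log(beta / (1 - alpha))   # accept H0 below this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(samples)

healthy = [random.gauss(0.0, 1.0) for _ in range(200)]
faulty = [random.gauss(1.0, 1.0) for _ in range(200)]
d0, n0 = sprt(healthy)   # should settle on H0
d1, n1 = sprt(faulty)    # should settle on H1
```

In the monitoring context, the "samples" would be the residuals between measured process output and the SQNA model's prediction, so a persistent shift signals process change.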
Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper demonstrates a case study on simulation, modelling and analysis of the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main place of the research study. Simulation techniques using Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out their significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. The simulation process showed that the data used are appropriate and meet the criteria for two-sided assembly line problems.
New Directions in the NOAO Observing Proposal System
NASA Astrophysics Data System (ADS)
Gasson, David; Bell, Dave
For the past eight years NOAO has been refining its on-line observing proposal system. Virtually all related processes are now handled electronically. Members of the astronomical community can submit proposals through email, web form, or via the Gemini Phase I Tool. NOAO staff can use the system to do administrative tasks, scheduling, and compilation of various statistics. In addition, all information relevant to the TAC process is made available on-line, including the proposals themselves (in HTML, PDF and PostScript) and technical comments. Grades and TAC comments are entered and edited through web forms, and can be sorted and filtered according to specified criteria. Current developments include a move away from proprietary solutions, toward open standards such as SQL (in the form of the MySQL relational database system), Perl, PHP and XML.
Online performance evaluation of RAID 5 using CPU utilization
NASA Astrophysics Data System (ADS)
Jin, Hai; Yang, Hua; Zhang, Jiangling
1998-09-01
Redundant arrays of independent disks (RAID) technology is an efficient way to relieve the bottleneck between CPU processing ability and the I/O subsystem. From the system point of view, the most important on-line performance metric is CPU utilization. This paper first presents a way to calculate the CPU utilization of a system connected to a RAID level 5 subsystem using a statistical averaging method. The simulation results for CPU utilization of a system connected to a RAID level 5 subsystem show that using multiple disks as an array to access data in parallel is an efficient way to enhance the on-line performance of a disk storage system. Using high-end disk drives to compose the disk array is the key to enhancing the on-line performance of the system.
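The utilization argument can be sketched with a toy model (an illustration only, not the paper's statistical averaging method): if each job alternates a CPU burst with an I/O transfer striped across N disks, the transfer time shrinks roughly as 1/N and the CPU busy fraction rises accordingly. All names and timings below are hypothetical.

```python
# Toy sketch: CPU utilization of a system whose jobs alternate a CPU burst
# with an I/O transfer striped across n_disks disks, so the transfer time
# scales roughly as 1/n_disks (plus a fixed per-request overhead).
def cpu_utilization(cpu_burst_ms, io_transfer_ms, n_disks, io_overhead_ms=1.0):
    """Fraction of wall-clock time the CPU is busy over one CPU+I/O cycle."""
    io_time = io_overhead_ms + io_transfer_ms / n_disks  # parallel striping
    return cpu_burst_ms / (cpu_burst_ms + io_time)

for n in (1, 2, 4, 8):
    print(n, round(cpu_utilization(5.0, 20.0, n), 3))
```

Under these assumptions, utilization rises monotonically with the number of disks, mirroring the paper's qualitative conclusion.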
Momentum deposition on Wolf-Rayet winds: Nonisotropic diffusion with effective gray opacity
NASA Technical Reports Server (NTRS)
Gayley, Kenneth G.; Owocki, Stanley P.; Cranmer, Steven R.
1995-01-01
We derive the velocity and mass-loss rate of a steady state Wolf-Rayet (WR) wind, using a nonisotropic diffusion approximation applied to the transfer between strongly overlapping spectral lines. Following the approach of Friend & Castor (1983), the line list is assumed to approximate a statistically parameterized Poisson distribution in frequency, so that photon transport is controlled by an angle-dependent, effectively gray opacity. We show the nonisotropic diffusion approximation yields good agreement with more accurate numerical treatments of the radiative transfer, while providing analytic insight into wind driving by multiple scattering. We illustrate, in particular, that multiple radiative momentum deposition does not require that photons be repeatedly reflected across substantial distances within the spherical envelope, but indeed is greatest when photons undergo a nearly local diffusion, e.g., through scattering by many lines closely spaced in frequency. Our results reiterate the view that the so-called 'momentum problem' of Wolf-Rayet winds is better characterized as an 'opacity problem' of simply identifying enough lines. One way of increasing the number of thick lines in Wolf-Rayet winds is to transfer opacity from saturated to unsaturated lines, yielding a steeper opacity distribution than that found in OB winds. We discuss the implications of this perspective for extending our approach to W-R wind models that incorporate a more fundamental treatment of the ionization and excitation processes that determine the line opacity. In particular, we argue that developing statistical descriptions of the lines to allow an improved effective opacity for the line ensemble would offer several advantages for deriving such more fundamental W-R wind models.
Momentum deposition on Wolf-Rayet winds: Nonisotropic diffusion with effective gray opacity
NASA Astrophysics Data System (ADS)
Gayley, Kenneth G.; Owocki, Stanley P.; Cranmer, Steven R.
1995-03-01
We derive the velocity and mass-loss rate of a steady state Wolf-Rayet (WR) wind, using a nonisotropic diffusion approximation applied to the transfer between strongly overlapping spectral lines. Following the approach of Friend & Castor (1983), the line list is assumed to approximate a statistically parameterized Poisson distribution in frequency, so that photon transport is controlled by an angle-dependent, effectively gray opacity. We show the nonisotropic diffusion approximation yields good agreement with more accurate numerical treatments of the radiative transfer, while providing analytic insight into wind driving by multiple scattering. We illustrate, in particular, that multiple radiative momentum deposition does not require that photons be repeatedly reflected across substantial distances within the spherical envelope, but indeed is greatest when photons undergo a nearly local diffusion, e.g., through scattering by many lines closely spaced in frequency. Our results reiterate the view that the so-called 'momentum problem' of Wolf-Rayet winds is better characterized as an 'opacity problem' of simply identifying enough lines. One way of increasing the number of thick lines in Wolf-Rayet winds is to transfer opacity from saturated to unsaturated lines, yielding a steeper opacity distribution than that found in OB winds. We discuss the implications of this perspective for extending our approach to W-R wind models that incorporate a more fundamental treatment of the ionization and excitation processes that determine the line opacity. In particular, we argue that developing statistical descriptions of the lines to allow an improved effective opacity for the line ensemble would offer several advantages for deriving such more fundamental W-R wind models.
The Müller-Lyer Illusion in a Computational Model of Biological Object Recognition
Zeman, Astrid; Obst, Oliver; Brooks, Kevin R.; Rich, Anina N.
2013-01-01
Studying illusions provides insight into the way the brain processes information. The Müller-Lyer Illusion (MLI) is a classical geometrical illusion of size, in which perceived line length is decreased by arrowheads and increased by arrowtails. Many theories have been put forward to explain the MLI, such as misapplied size constancy scaling, the statistics of image-source relationships and the filtering properties of signal processing in primary visual areas. Artificial models of the ventral visual processing stream allow us to isolate factors hypothesised to cause the illusion and test how these affect classification performance. We trained a feed-forward feature hierarchical model, HMAX, to perform a dual category line length judgment task (short versus long) with over 90% accuracy. We then tested the system in its ability to judge relative line lengths for images in a control set versus images that induce the MLI in humans. Results from the computational model show an overall illusory effect similar to that experienced by human subjects. No natural images were used for training, implying that misapplied size constancy and image-source statistics are not necessary factors for generating the illusion. A post-hoc analysis of response weights within a representative trained network ruled out the possibility that the illusion is caused by a reliance on information at low spatial frequencies. Our results suggest that the MLI can be produced using only feed-forward, neurophysiological connections. PMID:23457510
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cucchiara, A.; Prochaska, J. X.; Zhu, G.
2013-08-20
In 2006, Prochter et al. reported a statistically significant enhancement of very strong Mg II absorption systems intervening the sight lines to gamma-ray bursts (GRBs) relative to the incidence of such absorption along quasar sight lines. This counterintuitive result has inspired a diverse set of astrophysical explanations (e.g., dust, gravitational lensing) but none of these has obviously resolved the puzzle. Using the largest set of GRB afterglow spectra available, we reexamine the purported enhancement. In an independent sample of GRB spectra with a survey path three times larger than Prochter et al., we measure the incidence per unit redshift of ≥1 Å rest-frame equivalent width Mg II absorbers at z ≈ 1 to be l(z) = 0.18 ± 0.06. This is fully consistent with current estimates for the incidence of such absorbers along quasar sight lines. Therefore, we do not confirm the original enhancement and suggest those results suffered from a statistical fluke. Signatures of the original result do remain in our full sample (l(z) shows an ≈1.5 enhancement over l(z)_QSO), but the statistical significance now lies at ≈90% c.l. Restricting our analysis to the subset of high-resolution spectra of GRB afterglows (which overlaps substantially with Prochter et al.), we still reproduce a statistically significant enhancement of Mg II absorption. The reason for this excess, if real, is still unclear since there is no connection between the rapid afterglow follow-up process with echelle (or echellette) spectrographs and the detectability of strong Mg II doublets. Only a larger sample of such high-resolution data will shed some light on this matter.
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello
2016-01-01
The purpose of this study was to retrospectively evaluate the results of quality controls for a Helical TomoTherapy Hi-Art treatment system, based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was generally in control, an out-of-control situation occurred at the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was nevertheless acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
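The two chart types named in the abstract can be sketched as follows. This is a generic textbook construction, not the clinic's implementation; the 3-sigma limits and EWMA weight are illustrative defaults.

```python
# Sketch of an individuals (X) control chart and an EWMA trace for a series
# of daily output measurements. Sigma is estimated from successive moving
# ranges (d2 = 1.128 for subgroups of size 2), the standard approach for
# individuals charts.
import statistics

def x_chart_limits(values, k=3.0):
    """Center line and +/- k*sigma limits for an individuals chart."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma = statistics.mean(moving_ranges) / 1.128
    center = statistics.mean(values)
    return center - k * sigma, center, center + k * sigma

def ewma(values, lam=0.2):
    """Exponentially weighted moving average trace, seeded at the first value."""
    z, trace = values[0], []
    for v in values:
        z = lam * v + (1 - lam) * z
        trace.append(z)
    return trace
```

Daily outputs falling outside the X-chart limits, or an EWMA trace drifting past its (tighter) limits, would flag the kind of out-of-control condition described above.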
Brouckaert, D; Uyttersprot, J-S; Broeckx, W; De Beer, T
2017-06-08
The industrial production of liquid detergent compositions entails a delicate balance of ingredients and process steps. In order to assure high quality and productivity in the manufacturing line, process analytical technology tools such as Raman spectroscopy are to be implemented. Marked chemical specificity, negligible water interference and high robustness are ascribed to this process analytical technique. Previously, at-line calibration models were developed to determine, from Raman spectra, the concentration levels of the main ingredients of the liquid detergent under study. A strategy is now proposed to transfer such at-line regression models to an in-line set-up, allowing real-time dosing control of the liquid detergent composition under production. To mimic in-line manufacturing conditions, liquid detergent compositions are created in a five-liter vessel with an overhead mixer. Raman spectra are continuously acquired by pumping the detergent under production via plastic tubing towards a Raman superhead probe, which is incorporated into a metal frame with a sapphire window facing the detergent fluid. Two at-line partial least squares (PLS) models, predicting the concentrations of surfactant 1 and polymer 2 in the examined liquid detergent composition, are targeted for transfer. A univariate slope/bias correction (SBC) is investigated, next to three well-acknowledged multivariate transformation methods: direct, piecewise and double-window piecewise direct standardization. Transfer is considered successful when the magnitude of the validation set's root mean square error of prediction (RMSEP) is similar to or smaller than the corresponding at-line prediction error. The transferred model offering the most promising outcome is further subjected to an exhaustive statistical evaluation, in order to appraise the applicability of the suggested calibration transfer method. Interval hypothesis tests are thereby performed for method comparison.
It is illustrated that the investigated transfer approach yields satisfactory results, provided that the original at-line calibration model is thoroughly validated. Both SBC transfer models return lower RMSEP values than their corresponding original models. The surfactant 1 assay met all relevant evaluation criteria, demonstrating successful transfer to the in-line set-up. The in-line quantification of polymer 2 levels in the liquid detergent composition could not be statistically validated, due to the poorer performance of the at-line model. Copyright © 2017 Elsevier B.V. All rights reserved.
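Univariate slope/bias correction, the simplest of the transfer methods compared above, can be sketched in a few lines: the at-line model's predictions on in-line spectra are linearly re-mapped onto the reference values. The function names and data are illustrative, not taken from the paper.

```python
# Minimal sketch of univariate slope/bias correction (SBC): fit a least-
# squares line mapping the at-line model's predictions to the reference
# values, then apply that line to correct future in-line predictions.
def fit_sbc(predicted, reference):
    """Return (slope, bias) of the least-squares map predicted -> reference."""
    n = len(predicted)
    mx = sum(predicted) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in predicted)
    sxy = sum((x - mx) * (y - my) for x, y in zip(predicted, reference))
    slope = sxy / sxx
    return slope, my - slope * mx

def apply_sbc(predicted, slope, bias):
    return [slope * x + bias for x in predicted]

def rmsep(predicted, reference):
    """Root mean square error of prediction, the transfer-success criterion."""
    n = len(predicted)
    return (sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n) ** 0.5
```

In the paper's terms, the transfer would be judged successful when the RMSEP of the SBC-corrected predictions is similar to or smaller than the at-line prediction error.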
Learning Scene Categories from High Resolution Satellite Image for Aerial Video Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheriyadat, Anil M
2011-01-01
Automatic scene categorization can benefit various aerial video processing applications. This paper addresses the problem of predicting the scene category from aerial video frames using a prior model learned from satellite imagery. We show that local and global features, in the form of line statistics and 2-D power spectrum parameters respectively, can characterize the aerial scene well. The line feature statistics and spatial frequency parameters are useful cues to distinguish between different urban scene categories. We learn the scene prediction model from high-resolution satellite imagery and test the model on the Columbus Surrogate Unmanned Aerial Vehicle (CSUAV) dataset collected by a high-altitude wide-area UAV sensor platform. We compare the proposed features with the popular Scale Invariant Feature Transform (SIFT) features. Our experimental results show that the proposed approach outperforms the SIFT model when the training and testing are conducted on disparate data sources.
Development of online NIR urine analyzing system based on AOTF
NASA Astrophysics Data System (ADS)
Wan, Feng; Sun, Zhendong; Li, Xiaoxia
2006-09-01
In this paper, some key techniques in the development of an on-line NIR urine analyzing system based on an AOTF (Acousto-Optic Tunable Filter) are introduced. Problems in designing the optical system, including collimation of the incident light and the working distance (the shortest distance for separating the incident and diffracted light), are analyzed and investigated. A DDS (Direct Digital Synthesizer) controlled by a microprocessor is used to realize the wavelength scan. The experimental results show that this NIR urine analyzing system based on AOTF has a 10000-4000 cm-1 wavelength range and a 0.3 ms wavelength switching time. Compared with conventional Fourier transform NIR spectrophotometers for analyzing multiple components in urine, this system features low cost, small volume and an on-line measurement function. Unscrambler software (multivariate statistical software by CAMO Inc., Norway) was selected for processing the data. This system can realize on-line quantitative analysis of protein, urea and creatinine in urine.
Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.
Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida
2017-03-30
Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component, while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T² statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
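The monitoring statistic at the heart of this comparison, Hotelling's T², can be sketched generically from latent-variable scores (e.g., the predictive scores of a PLS or OPLS model). This is a standard textbook computation, not the authors' exact implementation.

```python
# Hedged sketch: Hotelling's T^2 for each observation, computed from a
# scores matrix (n observations x a latent components). Observations whose
# T^2 exceeds a control limit derived from nominal batches would be flagged
# as process disturbances.
import numpy as np

def hotelling_t2(scores):
    """T^2 value for each row of the scores matrix."""
    centered = scores - scores.mean(axis=0)
    cov = np.cov(scores, rowvar=False)            # a x a score covariance
    inv = np.linalg.inv(np.atleast_2d(cov))
    # quadratic form t_i' * inv * t_i for every observation i
    return np.einsum('ij,jk,ik->i', centered, inv, centered)
```

The abstract's point is that OPLS concentrates the nominal trajectory in the predictive component, so the covariance entering this quadratic form, and hence the control limits, is estimated more cleanly than with plain PLS.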
On the structure and phase transitions of power-law Poissonian ensembles
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Oshanin, Gleb
2012-10-01
Power-law Poissonian ensembles are Poisson processes that are defined on the positive half-line, and that are governed by power-law intensities. Power-law Poissonian ensembles are stochastic objects of fundamental significance; they uniquely display an array of fractal features and they uniquely generate a span of important applications. In this paper we apply three different methods—oligarchic analysis, Lorenzian analysis and heterogeneity analysis—to explore power-law Poissonian ensembles. The amalgamation of these analyses, combined with the topology of power-law Poissonian ensembles, establishes a detailed and multi-faceted picture of the statistical structure and the statistical phase transitions of these elemental ensembles.
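A power-law Poissonian ensemble of the kind analyzed above is easy to simulate: since the mean measure of (x, ∞) under the intensity λ(x) = c·x^(−1−α) is (c/α)·x^(−α), which is finite, the ensemble has a maximal point, and its points can be generated from the arrival times Γ_k of a unit-rate Poisson process. The parameter values below are illustrative.

```python
# Illustrative simulation of a Poisson process on the positive half-line
# governed by the power-law intensity lambda(x) = c * x**(-1 - alpha).
# Points are obtained as x_k = ((c/alpha) / Gamma_k)**(1/alpha), where
# Gamma_k are cumulative sums of unit-mean exponentials; since Gamma_k is
# increasing, the points come out in decreasing order and x_1 is the maximum.
import random

def power_law_ensemble(c=1.0, alpha=1.5, n_points=1000, seed=42):
    rng = random.Random(seed)
    points, gamma = [], 0.0
    for _ in range(n_points):
        gamma += rng.expovariate(1.0)   # unit-rate Poisson arrival times
        points.append(((c / alpha) / gamma) ** (1.0 / alpha))
    return points                        # decreasing; points[0] is maximal

pts = power_law_ensemble()
```

Samples generated this way can be fed directly into Lorenz-curve or heterogeneity-measure computations of the sort the paper applies.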
Effective Vaccine Communication during the Disneyland Measles Outbreak
Broniatowski, David Andre; Hilyard, Karen M.; Dredze, Mark
2016-01-01
Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4,686) collected during the 2014–2015 Disneyland measles outbreak for content including statistics, stories, or opinions containing bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line opinions, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. PMID:27179915
Effective vaccine communication during the Disneyland measles outbreak.
Broniatowski, David A; Hilyard, Karen M; Dredze, Mark
2016-06-14
Vaccine refusal rates have increased in recent years, highlighting the need for effective risk communication, especially over social media. Fuzzy-trace theory predicts that individuals encode bottom-line meaning ("gist") and statistical information ("verbatim") in parallel and that articles expressing a clear gist will be most compelling. We coded news articles (n=4581) collected during the 2014-2015 Disneyland measles outbreak for content including statistics, stories, or bottom-line gists regarding vaccines and vaccine-preventable illnesses. We measured the extent to which articles were compelling by how frequently they were shared on Facebook. The most widely shared articles expressed bottom-line gists, although articles containing statistics were also more likely to be shared than articles lacking statistics. Stories had limited impact on Facebook shares. Results support Fuzzy Trace Theory's predictions regarding the distinct yet parallel impact of categorical gist and statistical verbatim information on public health communication. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Smith, A. D.; Vaziri, S.; Rodriguez, S.; Östling, M.; Lemme, M. C.
2015-06-01
A chip-to-wafer-scale, CMOS-compatible method of graphene device fabrication has been established, which can be integrated into the back end of the line (BEOL) of conventional semiconductor process flows. In this paper, we present experimental results on graphene field effect transistors (GFETs) fabricated using this wafer-scalable method. The carrier mobilities in these transistors reach up to several hundred cm2 V-1 s-1. Further, these devices exhibit current saturation regions similar to graphene devices fabricated using mechanical exfoliation. The overall performance of the GFETs cannot yet compete with record values reported for devices based on mechanically exfoliated material. Nevertheless, this large-scale approach is an important step towards reliability and variability studies, as well as optimization of device aspects such as electrical contacts and dielectric interfaces, with statistically relevant numbers of devices. It is also an important milestone towards introducing graphene into wafer-scale process lines.
Detecting Inspection Objects of Power Line from Cable Inspection Robot LiDAR Data
Qin, Xinyan; Wu, Gongping; Fan, Fei
2018-01-01
Power lines are extending to complex environments (e.g., lakes and forests), and the distribution of power lines in a tower is becoming complicated (e.g., multi-loop and multi-bundle). Additionally, power line inspection is becoming heavier and more difficult. Advanced LiDAR technology is increasingly being used to solve these difficulties. Based on precise cable inspection robot (CIR) LiDAR data and the distinctive position and orientation system (POS) data, we propose a novel methodology to detect inspection objects surrounding power lines. The proposed method mainly includes four steps: firstly, the original point cloud is divided into single-span data as a processing unit; secondly, the optimal elevation threshold is constructed to remove ground points without the existing filtering algorithm, improving data processing efficiency and extraction accuracy; thirdly, a single power line and its surrounding data can be respectively extracted by a structured partition based on a POS data (SPPD) algorithm from “layer” to “block” according to power line distribution; finally, a partition recognition method is proposed based on the distribution characteristics of inspection objects, highlighting the feature information and improving the recognition effect. The local neighborhood statistics and the 3D region growing method are used to recognize different inspection objects surrounding power lines in a partition. Three datasets were collected by two CIR LIDAR systems in our study. The experimental results demonstrate that an average 90.6% accuracy and average 98.2% precision at the point cloud level can be achieved. The successful extraction indicates that the proposed method is feasible and promising. Our study can be used to obtain precise dimensions of fittings for modeling, as well as automatic detection and location of security risks, so as to improve the intelligence level of power line inspection. PMID:29690560
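The second step of the pipeline above, removing ground points with an elevation threshold rather than a general filtering algorithm, can be sketched as follows. The threshold rule (a fixed offset above the lowest return in the span) is an assumption for illustration, not the authors' exact formula.

```python
# Hedged sketch of elevation-threshold ground removal for one span of
# corridor LiDAR data: discard all returns within `offset` meters of the
# lowest elevation in the span, keeping conductors, towers and vegetation
# crowns for the later structured-partition and region-growing steps.
def remove_ground(points, offset=2.0):
    """points: iterable of (x, y, z) tuples; returns the non-ground subset."""
    z_min = min(p[2] for p in points)
    return [p for p in points if p[2] > z_min + offset]
```

Because power-line spans are processed one at a time, a single per-span threshold is often enough, which is what makes this cheaper than a full ground-filtering algorithm.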
Detecting Inspection Objects of Power Line from Cable Inspection Robot LiDAR Data.
Qin, Xinyan; Wu, Gongping; Lei, Jin; Fan, Fei; Ye, Xuhui
2018-04-22
Power lines are extending to complex environments (e.g., lakes and forests), and the distribution of power lines in a tower is becoming complicated (e.g., multi-loop and multi-bundle). Additionally, power line inspection is becoming heavier and more difficult. Advanced LiDAR technology is increasingly being used to solve these difficulties. Based on precise cable inspection robot (CIR) LiDAR data and the distinctive position and orientation system (POS) data, we propose a novel methodology to detect inspection objects surrounding power lines. The proposed method mainly includes four steps: firstly, the original point cloud is divided into single-span data as a processing unit; secondly, the optimal elevation threshold is constructed to remove ground points without the existing filtering algorithm, improving data processing efficiency and extraction accuracy; thirdly, a single power line and its surrounding data can be respectively extracted by a structured partition based on a POS data (SPPD) algorithm from "layer" to "block" according to power line distribution; finally, a partition recognition method is proposed based on the distribution characteristics of inspection objects, highlighting the feature information and improving the recognition effect. The local neighborhood statistics and the 3D region growing method are used to recognize different inspection objects surrounding power lines in a partition. Three datasets were collected by two CIR LIDAR systems in our study. The experimental results demonstrate that an average 90.6% accuracy and average 98.2% precision at the point cloud level can be achieved. The successful extraction indicates that the proposed method is feasible and promising. Our study can be used to obtain precise dimensions of fittings for modeling, as well as automatic detection and location of security risks, so as to improve the intelligence level of power line inspection.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
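The sequential probability ratio test (SPRT) used for on-line validation above is a classical algorithm and can be sketched generically for Gaussian model residuals: decide between H0 (residual mean 0, model healthy) and H1 (mean shifted by m, fault). Wald's threshold approximations are standard; all parameter values here are illustrative, not the patent's.

```python
# Generic SPRT sketch for Gaussian residuals of known variance: accumulate
# the log-likelihood ratio of H1 (mean m) vs H0 (mean 0) and stop at Wald's
# approximate thresholds for false-alarm rate alpha and miss rate beta.
import math

def sprt(residuals, m=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)    # cross above -> declare fault
    lower = math.log(beta / (1 - alpha))    # cross below -> declare normal
    llr = 0.0
    for i, r in enumerate(residuals, start=1):
        llr += (m / sigma ** 2) * (r - m / 2.0)  # Gaussian LLR increment
        if llr >= upper:
            return 'fault', i
        if llr <= lower:
            return 'normal', i
    return 'undecided', len(residuals)
```

Run against the residuals of the trained neuro-analytic model, such a test flags a fault after only as many samples as the evidence requires, which is what makes it attractive for on-line monitoring.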
Utah Virtual Lab: JAVA interactivity for teaching science and statistics on line.
Malloy, T E; Jensen, G C
2001-05-01
The Utah on-line Virtual Lab is a JAVA program run dynamically off a database. It is embedded in StatCenter (www.psych.utah.edu/learn/statsampler.html), an on-line collection of tools and text for teaching and learning statistics. Instructors author a statistical virtual reality that simulates theories and data in a specific research focus area by defining independent, predictor, and dependent variables and the relations among them. Students work in an on-line virtual environment to discover the principles of this simulated reality: They go to a library, read theoretical overviews and scientific puzzles, and then go to a lab, design a study, collect and analyze data, and write a report. Each student's design and data analysis decisions are computer-graded and recorded in a database; the written research report can be read by the instructor or by other students in peer groups simulating scientific conventions.
ATS-6 - Preliminary results from the 13/18-GHz COMSAT Propagation Experiment
NASA Technical Reports Server (NTRS)
Hyde, G.
1975-01-01
The 13/18-GHz COMSAT Propagation Experiment (CPE) is reviewed, the data acquisition and processing are discussed, and samples of preliminary results are presented. The need for measurements of both hydrometeor-induced attenuation statistics and diversity effectiveness is brought out. The facilities of the experiment - the CPE dual-frequency and diversity site locations, the CPE ground transmit terminals, the CPE transponder on Applications Technology Satellite-6 (ATS-6), and the CPE receive and data acquisition system - are briefly examined. The on-line preprocessing of the received signal is reviewed, followed by a discussion of the off-line processing of this database to remove signal fluctuations not due to hydrometeors. Finally, samples of the results of first-level analysis of the resultant data for the 18-GHz diversity site near Boston, Mass., and for the dual-frequency 13/18-GHz site near Detroit, Mich., are presented and discussed.
Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points
ERIC Educational Resources Information Center
Ekol, George
2015-01-01
This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…
Alberio, Tiziana; Pieroni, Luisa; Ronci, Maurizio; Banfi, Cristina; Bongarzone, Italia; Bottoni, Patrizia; Brioschi, Maura; Caterino, Marianna; Chinello, Clizia; Cormio, Antonella; Cozzolino, Flora; Cunsolo, Vincenzo; Fontana, Simona; Garavaglia, Barbara; Giusti, Laura; Greco, Viviana; Lucacchini, Antonio; Maffioli, Elisa; Magni, Fulvio; Monteleone, Francesca; Monti, Maria; Monti, Valentina; Musicco, Clara; Petrosillo, Giuseppe; Porcelli, Vito; Saletti, Rosaria; Scatena, Roberto; Soggiu, Alessio; Tedeschi, Gabriella; Zilocchi, Mara; Roncada, Paola; Urbani, Andrea; Fasano, Mauro
2017-12-01
The Mitochondrial Human Proteome Project aims at understanding the function of the mitochondrial proteome and its crosstalk with the proteome of other organelles. Being able to choose a suitable and validated enrichment protocol of functional mitochondria, based on the specific needs of the downstream proteomics analysis, would greatly help the researchers in the field. Mitochondrial fractions from ten model cell lines were prepared using three enrichment protocols and analyzed on seven different LC-MS/MS platforms. All data were processed using neXtProt as reference database. The data are available for the Human Proteome Project purposes through the ProteomeXchange Consortium with the identifier PXD007053. The processed data sets were analyzed using a suite of R routines to perform a statistical analysis and to retrieve subcellular and submitochondrial localizations. Although the overall number of identified total and mitochondrial proteins was not significantly dependent on the enrichment protocol, specific line to line differences were observed. Moreover, the protein lists were mapped to a network representing the functional mitochondrial proteome, encompassing mitochondrial proteins and their first interactors. More than 80% of the identified proteins resulted in nodes of this network but with a different ability in coisolating mitochondria-associated structures for each enrichment protocol/cell line pair.
NASA Astrophysics Data System (ADS)
Polito, V.; Testa, P.; De Pontieu, B.; Allred, J. C.
2017-12-01
The observation of the high temperature (above 10 MK) Fe XXI 1354.1 A line with the Interface Region Imaging Spectrograph (IRIS) has provided significant insights into the chromospheric evaporation process in flares. In particular, the line is often observed to be completely blueshifted, in contrast to previous observations at lower spatial and spectral resolution, and in agreement with predictions from theoretical models. Interestingly, the line is also observed to be mostly symmetric and with a large excess above the thermal width. One popular interpretation for the excess broadening is given by assuming a superposition of flows from different loop strands. In this work, we perform a statistical analysis of Fe XXI line profiles observed by IRIS during the impulsive phase of flares and compare our results with hydrodynamic simulations of multi-thread flare loops performed with the 1D RADYN code. Our results indicate that the multi-thread models cannot easily reproduce the symmetry of the line and that some other physical process might need to be invoked in order to explain the observed profiles.
1989-06-01
letters on one line and several letters on the next line, there is no accurate way to credit these extra letters for statistical analysis. The decimal and...contains the descriptive statistics of the objective refractive error components of infantrymen. Figures 8-11 show the frequency distributions for sphere...equivalents. Nonspectacle wearers Table 12 contains the descriptive statistics for nonspectacle wearers. Based on these refractive error data, about 30
Low-cost and high-speed optical mark reader based on an intelligent line camera
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin
2003-08-01
Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR, and that data integrity is checked and the data validated before processing. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
In-line monitoring of pellet coating thickness growth by means of visual imaging.
Oman Kadunc, Nika; Sibanc, Rok; Dreu, Rok; Likar, Boštjan; Tomaževič, Dejan
2014-08-15
Coating thickness is the most important attribute of coated pharmaceutical pellets as it directly affects release profiles and stability of the drug. Quality control of the coating process of pharmaceutical pellets is thus of utmost importance for assuring the desired end product characteristics. A visual imaging technique is presented and examined as a process analytic technology (PAT) tool for noninvasive continuous in-line and real time monitoring of coating thickness of pharmaceutical pellets during the coating process. Images of pellets were acquired during the coating process through an observation window of a Wurster coating apparatus. Image analysis methods were developed for fast and accurate determination of pellets' coating thickness during a coating process. The accuracy of the results for pellet coating thickness growth obtained in real time was evaluated through comparison with an off-line reference method and a good agreement was found. Information about the inter-pellet coating uniformity was gained from further statistical analysis of the measured pellet size distributions. Accuracy and performance analysis of the proposed method showed that visual imaging is feasible as a PAT tool for in-line and real time monitoring of the coating process of pharmaceutical pellets. Copyright © 2014 Elsevier B.V. All rights reserved.
Perspectives for on-line analysis of bauxite by neutron irradiation
NASA Astrophysics Data System (ADS)
Beurton, Gabriel; Ledru, Bertrand; Letourneur, Philippe
1995-03-01
The interest in bauxite as a major source of alumina results in a strong demand for on-line instrumentation suitable for sorting, blending, and processing operations at the bauxite mine and for monitoring instrumentation in the Bayer process. The results of laboratory experiments based on neutron interactions with bauxite are described. The technique was chosen in order to overcome the problem of spatial heterogeneity in bulk mineral analysis. The evaluated elements contributed to approximately 99.5% of the sample weight. In addition, the measurements provide valuable information on physical parameters such as density, hygrometry, and material flow. Using a pulsed generator, the analysis system offers potential for on-line measurements (borehole logging or conveyor belt). An overall description of the experimental set-up is given. The experimental data include measurements of natural radioactivity, delayed radioactivity induced by activation, and prompt gamma rays following neutron reaction. In situ applications of neutron interactions provide continuous analysis and produce results which are more statistically significant. The key factors contributing to advances in industrial applications are the development of high count rate gamma spectroscopy and computational tools to design measurement systems and interpret their results.
The impact of feedback from galaxy formation on the Lyman α transmitted flux
NASA Astrophysics Data System (ADS)
Viel, Matteo; Schaye, Joop; Booth, C. M.
2013-02-01
The forest of Lyman α absorption lines seen in the spectra of distant quasars has become an important probe of the distribution of matter in the Universe. We use large, hydrodynamical simulations from the OverWhelmingly Large Simulations (OWLS) project to investigate the effect of feedback from galaxy formation on the probability distribution function and the power spectrum of the Lyman α transmitted flux. While metal-line cooling is unimportant, both galactic outflows from massive galaxies driven by active galactic nuclei and winds from low-mass galaxies driven by supernovae have a substantial impact on the flux statistics. At redshift z = 2.25, the effects on the flux statistics are of a similar magnitude as the statistical uncertainties of published data sets. The changes in the flux statistics are not due to differences in the temperature-density relation of the photoionized gas. Instead, they are caused by changes in the density distribution and in the fraction of hot, collisionally ionized gas. It may be possible to disentangle astrophysical and cosmological effects by taking advantage of the fact that they induce different redshift dependencies. In particular, the magnitude of the feedback effects appears to decrease rapidly with increasing redshift. Analyses of Lyman α forest data from surveys that are currently in progress, such as the Baryon Oscillation Spectroscopic Survey of the Sloan Digital Sky Survey-III (BOSS/SDSS-III) and X-Shooter/Very Large Telescope (VLT), must take galactic winds into account.
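The flux statistics discussed above, the probability distribution function and the power spectrum of the transmitted flux, can be sketched on mock data. The lognormal optical-depth field and all parameter values below are illustrative assumptions, not quantities from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock line-of-sight optical-depth field (a lognormal toy model; a real
# analysis would use simulated or observed quasar spectra).
n_pix = 4096
tau = rng.lognormal(mean=0.0, sigma=1.0, size=n_pix)
flux = np.exp(-tau)                      # transmitted flux F = exp(-tau)

# Flux probability distribution function (PDF), normalized as a density.
bins = np.linspace(0.0, 1.0, 21)
pdf, edges = np.histogram(flux, bins=bins, density=True)

# 1D flux power spectrum of the fluctuation field delta_F = F/<F> - 1.
delta = flux / flux.mean() - 1.0
dk = np.fft.rfft(delta)
power = np.abs(dk) ** 2 / n_pix          # P(k) up to a length normalization

print(pdf.sum() * np.diff(edges)[0])     # density integrates to ~1
```

Feedback effects of the kind measured in the paper would appear as shifts of `pdf` and `power` between simulation runs with different wind prescriptions.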
Data Warehousing at the Marine Corps Institute
2003-09-01
applications exists for several reasons. It allows for data to be extracted from many sources, be "cleaned", and stored into one large data facility ...exists. Key individuals at MCI, or the so called "knowledge workers", will be educated, and try to brainstorm possible data relationships that can...They include querying and reporting, On-Line Analytical Processing (OLAP) and statistical analysis, and data mining. 1. Queries and Reports The
The Complete Redistribution Approximation in Optically Thick Line-Driven Winds
NASA Astrophysics Data System (ADS)
Gayley, K. G.; Onifer, A. J.
2001-05-01
Wolf-Rayet winds are thought to exhibit large momentum fluxes, which has in part been explained by ionization stratification in the wind. However, it is the cause of the high mass loss, not the high momentum flux, that remains largely a mystery, because standard models fail to achieve sufficient acceleration near the surface where the mass-loss rate is set. We consider a radiative transfer approximation that allows the dynamics of optically thick Wolf-Rayet winds to be modeled without detailed treatment of the radiation field, called the complete redistribution approximation. In it, it is assumed that thermalization processes cause the photon frequencies to be completely randomized over the course of propagating through the wind, which allows the radiation field to be treated statistically rather than in detail. Thus the approach is similar to the statistical treatment of the line list used in the celebrated CAK approach. The results differ from the effectively gray treatment in that the radiation field is influenced by the line distribution, and the role of gaps in the line distribution is enhanced. The ramifications for the driving of large mass-loss rates are explored.
NASA Astrophysics Data System (ADS)
Zhao, Wenyu; Zhang, Haiyi; Ji, Yuefeng; Xu, Daxiong
2004-05-01
Based on the proposed polarization mode dispersion (PMD) compensation simulation model and a statistical analysis method (Monte Carlo), the initialization of the critical parameters of two typical optical-domain PMD compensators, one with a fixed compensation differential group delay (DGD) and one with a variable compensation DGD, is investigated in detail by numerical methods. In the simulation, the line PMD values are chosen as 3 ps, 4 ps, and 5 ps, and 1000 runs are used for each case to obtain a statistical evaluation of the compensated systems. The simulation results show that for systems whose PMD value is known in advance, the value of the fixed DGD compensator should be set to 1.5-1.6 times the line PMD value in order to reach the optimum performance; for the second kind of PMD compensator, the lower limit of the DGD range should be 1.5-1.6 times the line PMD, provided that the upper limit is set to 3 times the line PMD, if no effective measures are taken to resolve the problem of local minima in the optimization process. Another conclusion that can be drawn from the simulation is that, although the second PMD compensator achieves higher PMD compensation performance, it requires more feedback iterations to locate the optimum DGD value in a real PMD compensation implementation, which places stronger requirements on the adjustable DGD device: not only a wider adjustable range but also a faster adjusting speed for real-time PMD equalization.
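The Maxwellian statistics behind the 1.5-1.6x compensator rule of thumb can be illustrated with a small Monte-Carlo sketch. The Maxwellian model for first-order DGD is standard, but the specific numbers below are assumptions for illustration, not the paper's simulation:

```python
import numpy as np

rng = np.random.default_rng(42)

mean_pmd_ps = 4.0        # one of the line PMD values considered (4 ps)
n_runs = 1000            # sample count matching the abstract's Monte-Carlo size

# Instantaneous DGD of a fiber link follows a Maxwellian distribution.
# Realize it as the norm of a 3D Gaussian vector, with the scale chosen so
# that the mean equals the line PMD: mean = 2 * a * sqrt(2 / pi).
a = mean_pmd_ps * np.sqrt(np.pi / 8.0)
dgd = np.linalg.norm(rng.normal(0.0, a, size=(n_runs, 3)), axis=1)

# Fraction of runs whose instantaneous DGD exceeds a fixed-compensator
# setting of 1.5x the line PMD (the recommended operating point).
setting = 1.5 * mean_pmd_ps
exceed = np.mean(dgd > setting)
print(f"mean DGD = {dgd.mean():.2f} ps, P(DGD > {setting} ps) = {exceed:.3f}")
```

The tail probability above the 1.5x setting is what a fixed compensator cannot cover, which is why the variable-DGD compensator in the paper extends its range up to 3x the line PMD.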
Spectral Analysis of B Stars: An Application of Bayesian Statistics
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2012-12-01
To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
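A minimal sketch of the grid-based Bayesian constraint described above, using a toy one-parameter "grid of synthetic spectra" in place of TLUSTY models. All names, the line model, and the parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy grid of "synthetic spectra": one absorption line whose depth is a
# monotonic function of a single stellar parameter T (a stand-in for the
# grid axes such as effective temperature; not the paper's models).
wave = np.linspace(-5, 5, 200)
T_grid = np.linspace(10000, 30000, 81)

def model(T):
    depth = 0.8 * (30000 - T) / 20000          # hotter -> weaker line (toy)
    return 1.0 - depth * np.exp(-0.5 * wave**2)

# "Observed" spectrum: true T = 22000 K plus Gaussian noise.
sigma = 0.02
obs = model(22000.0) + rng.normal(0.0, sigma, wave.size)

# Bayesian update over the grid: posterior ~ exp(-chi^2 / 2) x flat prior.
chi2 = np.array([np.sum((obs - model(T))**2) / sigma**2 for T in T_grid])
log_post = -0.5 * (chi2 - chi2.min())          # shift for numerical stability
post = np.exp(log_post)
post /= post.sum()

T_mean = np.sum(T_grid * post)                 # posterior mean estimate
print(f"posterior mean T = {T_mean:.0f} K")
```

Using all pixels of the line profile in the likelihood, rather than a few measured equivalent widths, is what tightens the posterior; the paper's method does this simultaneously over many lines and several parameters.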
Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision as a fast, low-cost, noncontact, and online monitoring technology has been an important tool to inspect product quality, particularly on a large-scale assembly production line. However, the current industrial vision system is far from satisfactory in the intelligent perception of complex grain images, comprising a large number of local homogeneous fragmentations or patches without distinct foreground and background. We attempt to solve this problem based on the statistical modeling of spatial structures of grain images. We present a physical explanation in advance to indicate that the spatial structures of the complex grain images are subject to a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on sparse multikernel–least squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distribution. The proposed method is applied on the assembly line of a food-processing enterprise to classify (or identify) automatically the production quality of rice. The experiments on the real application case, compared with the commonly used methods, illustrate the validity of our method. PMID:26986726
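The Weibull characterization of spatial structure can be sketched as follows. The synthetic image, the finite-difference stand-in for Gaussian-derivative filtering, and the linearized fitting method are simplifying assumptions, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "grain image": smoothed noise as a stand-in for a rice image.
img = rng.random((128, 128))
kernel = np.ones(5) / 5.0
img = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
img = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, img)

# Gaussian-derivative-like response: finite-difference gradient magnitude.
gy, gx = np.gradient(img)
resp = np.hypot(gx, gy).ravel()
resp = resp[resp > 1e-6]

# Weibull fit by linearized CDF regression (a "Weibull plot"):
# ln(-ln(1 - F)) = k*ln(x) - k*ln(lam), so the slope is the shape k.
x = np.sort(resp)
F = (np.arange(1, x.size + 1) - 0.5) / x.size
k, intercept = np.polyfit(np.log(x), np.log(-np.log(1.0 - F)), 1)
lam = np.exp(-intercept / k)
print(f"Weibull shape k = {k:.2f}, scale lam = {lam:.4f}")
```

The fitted shape and scale parameters are the kind of per-image descriptors that could then feed a downstream classifier such as the support vector machine used in the paper.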
An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chien, T. T.
1972-01-01
An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with a capability to utilize its full potential in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system, and (3) a compensation system. It is shown that the sufficient statistic for the partially observable process in the detection and identification systems is the posterior measure of the state of degradation, conditioned on the measurement history.
An intervention to decrease patient identification band errors in a children's hospital.
Hain, Paul D; Joers, B; Rush, M; Slayton, J; Throop, P; Hoagg, S; Allen, L; Grantham, J; Deshpande, J K
2010-06-01
Patient misidentification continues to be a quality and safety issue. There is a paucity of US data describing interventions to reduce identification band error rates. Monroe Carell Jr Children's Hospital at Vanderbilt. Percentage of patients with defective identification bands. Web-based surveys were sent, asking hospital personnel to anonymously identify perceived barriers to reaching zero defects with identification bands. Corrective action plans were created and implemented with ideas from leadership, front-line staff and the online survey. Data from unannounced audits of patient identification bands were plotted on statistical process control charts and shared monthly with staff. All hospital personnel were expected to "stop the line" if there were any patient identification questions. The first audit showed a defect rate of 20.4%. The original mean defect rate was 6.5%. After interventions and education, the new mean defect rate was 2.6%. (a) The initial rate of patient identification band errors in the hospital was higher than expected. (b) The action resulting in most significant improvement was staff awareness of the problem, with clear expectations to immediately stop the line if a patient identification error was present. (c) Staff surveys are an excellent source of suggestions for combating patient identification issues. (d) Continued audit and data collection is necessary for sustainable staff focus and continued improvement. (e) Statistical process control charts are both an effective method to track results and an easily understood tool for sharing data with staff.
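Statistical process control charts of the kind used for the monthly audits can be sketched with a p-chart. The audit counts below are hypothetical, chosen so the first audit mirrors the reported 20.4% defect rate; the paper reports only summary rates (20.4% first audit, 6.5% original mean, 2.6% after intervention):

```python
import numpy as np

# Hypothetical monthly audits: patients sampled and ID-band defects found.
n_sampled = np.array([50, 50, 50, 50, 50, 50])
defects   = np.array([10,  4,  3,  2,  1,  1])
p = defects / n_sampled

# p-chart: center line at the pooled defect rate, 3-sigma binomial limits.
p_bar = defects.sum() / n_sampled.sum()
sigma = np.sqrt(p_bar * (1 - p_bar) / n_sampled)
ucl = p_bar + 3 * sigma
lcl = np.clip(p_bar - 3 * sigma, 0, None)

for i, (pi, u) in enumerate(zip(p, ucl)):
    flag = "OUT" if (pi > u or pi < lcl[i]) else "in control"
    print(f"month {i+1}: p = {pi:.3f}  UCL = {u:.3f}  -> {flag}")
```

A point above the upper control limit (as in the first hypothetical audit) signals a special cause; a sustained run below the center line after an intervention is the evidence of improvement the chart makes visible to staff.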
Shin, Kang-Jae; Kim, Hong-San; O, Jehoon; Kwon, Hyun-Jin; Yang, Hun-Mu
2018-05-12
There is no standardized approach to the greater occipital nerve (GON) block technique for treating occipital neuralgia. The aim of the present study was to validate the previously-suggested guidelines for conventional injection techniques and to provide navigational guidelines for safe GON block. The GON, lesser occipital nerve (LON) and occipital artery (OA) were carefully dissected in the occipital region of embalmed cadavers. Using a 3D digitizer, the GON, LON, and OA were observed on the two reference lines. The distances between the landmarks were recorded and statistically analyzed. On the superior nuchal line, the mean distances between the external occipital protuberance (EOP) and the most medial branch of the GON was 33.5 mm. The mean distance between the EOP and the most medial branch of the OA was 37.4 mm. On the EOP-mastoid process (MP) line, the GON was on the medial third and the LON the lateral third of the EOP-MP line. The safe injection points on the EOP-MP line are about 3 cm from the EOP, 1 cm inferior parallel to the EOP-MP line, and about 3 cm away from the MP. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
Process property studies of melt blown thermoplastic polyurethane polymers
NASA Astrophysics Data System (ADS)
Lee, Youn Eung
The primary goal of this research was to determine optimum processing conditions to produce commercially acceptable melt blown (MB) thermoplastic polyurethane (TPU) webs. The 6-inch MB line and the 20-inch wide Accurate Products MB pilot line at the Textiles and Nonwovens Development Center (TANDEC), The University of Tennessee, Knoxville, were utilized for this study. The MB TPU trials were performed in four phases: Phase 1 focused on the envelope of the MB operating conditions for different TPU polymers; Phase 2 focused on the production of commercially acceptable MB TPU webs; Phase 3 focused on the optimization of the processing conditions of MB TPU webs and the determination of the significant relationships between processing parameters and web properties utilizing statistical analyses. Based on the first three phases, a more extensive study of fiber and web formation in the MB TPU process was made, and a multiple linear regression model relating MB TPU processing conditions to web properties was developed in Phase 4. In conclusion, the basic MB process was fundamentally valid for the MB TPU process; however, the MB process was more complicated for TPU than for PP, because the web structures and properties of MB TPUs are very sensitive to MB process conditions. Furthermore, different TPU grades responded very differently to MB processing and exhibited different web structures and properties. In Phase 3 and Phase 4, small fiber diameters of less than 5 μm were produced from TPU237, TPU245 and TPU280 pellets, and the mechanical strengths of MB TPU webs, including the tensile strength, tear strength, abrasion resistance and tensile elongation, were notably good. In addition, the statistical model revealed useful trends relating processing parameters to the properties of MB TPU webs. Die and air temperature showed multicollinearity problems, and fiber diameter was notably affected by air flow rate, throughput and die/air temperature.
It was also shown that most of the MB TPU web properties including mechanical strength, air permeability and fiber diameters were affected by air velocity and die temperature.
Real-time radionuclide identification in γ-emitter mixtures based on spiking neural network.
Bobin, C; Bichler, O; Lourenço, V; Thiam, C; Thévenin, M
2016-03-01
Portal radiation monitors dedicated to the prevention of illegal traffic of nuclear materials at international borders need to deliver as fast as possible a radionuclide identification of a potential radiological threat. Spectrometry techniques applied to identify the radionuclides contributing to γ-emitter mixtures are usually performed using off-line spectrum analysis. As an alternative to these usual methods, a real-time processing based on an artificial neural network and Bayes' rule is proposed for fast radionuclide identification. The validation of this real-time approach was carried out using γ-emitter spectra ((241)Am, (133)Ba, (207)Bi, (60)Co, (137)Cs) obtained with a high-efficiency well-type NaI(Tl). The first tests showed that the proposed algorithm enables a fast identification of each γ-emitting radionuclide using the information given by the whole spectrum. Based on an iterative process, the on-line analysis only needs low-statistics spectra without energy calibration to identify the nature of a radiological threat. Copyright © 2015 Elsevier Ltd. All rights reserved.
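The iterative, event-by-event Bayesian identification described above can be sketched as follows. The channel templates are illustrative toys, not measured NaI(Tl) responses, and a real system would obtain them from calibration spectra:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy per-channel response templates for three candidate gamma emitters:
# probability of a count landing in each of 8 coarse channels (shapes are
# invented for illustration only).
templates = {
    "Cs-137": np.array([1, 2, 3, 10, 60, 15, 5, 4], dtype=float),
    "Co-60":  np.array([1, 2, 3,  4,  5, 10, 40, 35], dtype=float),
    "Am-241": np.array([55, 25, 8, 5, 3, 2, 1, 1], dtype=float),
}
for name in templates:
    templates[name] /= templates[name].sum()

names = list(templates)
log_post = np.log(np.full(len(names), 1.0 / len(names)))  # uniform prior

# Stream of detected events drawn from the Cs-137 template. Identification
# converges event by event on low-statistics data, with no need to wait
# for a full spectrum.
events = rng.choice(8, size=200, p=templates["Cs-137"])
for ch in events:
    log_post += np.log([templates[n][ch] for n in names])  # Bayes update
    log_post -= log_post.max()                             # keep stable

post = np.exp(log_post)
post /= post.sum()
print(dict(zip(names, np.round(post, 3))))
```

The paper replaces the fixed templates with outputs of a spiking neural network, but the Bayes-rule accumulation over individual events is the same idea.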
Besachio, David A; Khaleel, Ziyad; Shah, Lubdha M
2015-12-01
Posterior odontoid process inclination has been demonstrated as a factor associated with Chiari malformation Type I (CM-I) in the pediatric population; however, no studies to date have examined this measurement in the adult CM-I population. The purpose of this study was to evaluate craniocervical junction (CCJ) measurements in adult CM-I versus a control group. The odontoid retroflexion, odontoid retroversion, odontoid height, posterior basion to C-2 line measured to the dural margin (pB-C2 line), posterior basion to C-2 line measured to the dorsal odontoid cortical margin (pB-C2* line), and clivus-canal angle measurements were retrospectively analyzed in adult patients with CM-I using MRI. These measurements were compared with normative values established from CT scans of the cervical spine in adults without CM-I. A statistically significant difference was found between 55 adults with CM-I and 150 sex-matched controls (125 used for analysis) in the mean clivus-canal angle and the mean pB-C2 line. These data suggest that there are sex-specific differences with respect to measurements at the CCJ between men and women, with women showing a more posteriorly inclined odontoid process. There were also differences between the CM-I and control groups: a more acute clivus-canal angle was associated with CM-I in the adult population. These CCJ findings could have an influence on presurgical planning.
Poisson-event-based analysis of cell proliferation.
Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul
2015-05-01
A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean inter-mitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
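The nonhomogeneous Poisson model with an exponentially increasing event rate can be illustrated with Ogata thinning. The rate parameters below are assumptions for illustration, not the fitted values from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a nonhomogeneous Poisson process with exponentially increasing
# rate lambda(t) = lam0 * exp(t / tau) over 48 h by Ogata thinning.
lam0, tau, t_end = 0.5, 24.0, 48.0          # illustrative parameters
lam_max = lam0 * np.exp(t_end / tau)        # rate bound on [0, t_end]

t, events = 0.0, []
while True:
    t += rng.exponential(1.0 / lam_max)     # candidate from bounding process
    if t > t_end:
        break
    if rng.random() < lam0 * np.exp(t / tau) / lam_max:
        events.append(t)                    # accept with prob lambda(t)/lam_max

events = np.array(events)
gaps = np.diff(events)
first, second = gaps[: gaps.size // 2], gaps[gaps.size // 2 :]
print(f"{events.size} events; mean gap {first.mean():.2f} h -> {second.mean():.2f} h")
```

The shrinking interevent times in the second half of the record are the signature of the exponentially increasing rate; fitting observed interevent-time statistics against this model is how such a rate law can be inferred from data.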
Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P
2013-01-01
We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
Kriete, A; Schäffer, R; Harms, H; Aus, H M
1987-06-01
Nuclei of the cells from the thyroid gland were analyzed in a transmission electron microscope by direct TV scanning and on-line image processing. The method uses the advantages of a visual-perception model to detect structures in noisy and low-contrast images. The features analyzed include area, a form factor and texture parameters from the second derivative stage. Three tumor-free thyroid tissues, three follicular adenomas, three follicular carcinomas and three papillary carcinomas were studied. The computer-aided cytophotometric method showed that the most significant differences were the statistics of the chromatin texture features of homogeneity and regularity. These findings document the possibility of an automated differentiation of tumors at the ultrastructural level.
NASA Technical Reports Server (NTRS)
Masuoka, E.
1985-01-01
Systematic noise is present in Airborne Imaging Spectrometer (AIS) data collected on October 26, 1983 and May 5, 1984 in grating position 0 (1.2 to 1.5 microns). In the October data set the noise occurs as 135 scan lines of low DNs every 270 scan lines. The noise is particularly bad in bands nine through thirty, restricting effective analysis to at best ten of the 32 bands. In the May data the regions of severe noise have been eliminated, but systematic noise is present with three frequencies (3, 106 and 200 scan lines) in all thirty-two bands. The periodic nature of the noise in both data sets suggests that it could be removed as part of routine processing. This is necessary before classification routines or statistical analyses are used with these data.
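Because the noise is periodic along the scan-line direction, it can in principle be removed with a frequency-domain notch, as the abstract suggests. The following sketch uses a synthetic band with an assumed period of 8 lines, not actual AIS data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic band: smooth scene plus periodic scan-line noise (period 8
# lines), mimicking systematic along-track noise.
n_lines, n_samples = 256, 64
scene = np.outer(np.linspace(10, 20, n_lines), np.ones(n_samples))
noise = 2.0 * np.sin(2 * np.pi * np.arange(n_lines) / 8.0)[:, None]
img = scene + noise + rng.normal(0, 0.1, (n_lines, n_samples))

# Locate the dominant periodic component in the detrended mean line profile.
lines = np.arange(n_lines)
profile = img.mean(axis=1)
trend = np.polyval(np.polyfit(lines, profile, 1), lines)
spec = np.fft.rfft(profile - trend)
k = np.argmax(np.abs(spec[1:])) + 1          # expect 256 / 8 = 32 cycles

# Notch the same frequency in every column of the band.
F = np.fft.rfft(img, axis=0)
F[k] = 0.0
clean = np.fft.irfft(F, n=n_lines, axis=0)

print(f"noise std before: {np.std(img - scene):.2f}, after: {np.std(clean - scene):.2f}")
```

A routine-processing version would search for all three reported periodicities (3, 106 and 200 scan lines) and notch each, band by band.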
A comprehensive study on pavement edge line implementation.
DOT National Transportation Integrated Search
2014-04-01
The previous 2011 study, Safety Improvement from Edge Lines on Rural Two-Lane Highways, analyzed the crash data of three years before and one year after edge line implementation by using the latest safety analysis statistical method. It concl...
NASA Astrophysics Data System (ADS)
Bingley, L.; Angelopoulos, V.; Zhang, X. J.; Sibeck, D. G.; Halford, A. J.
2017-12-01
While many advances have been made in the understanding of particle acceleration processes in the radiation belts, many questions regarding the loss processes remain. One such loss process is the resonant interaction between relativistic electrons and Electromagnetic Ion Cyclotron (EMIC) waves. This study examines statistically the association of equatorial pitch-angle distributions of > 1 MeV particles measured on Van Allen Probes and in-situ EMIC wave observations measured on Van Allen Probes and THEMIS during a unique three-month period of line-of-apsides conjunctions between the two missions. We find a large sample of EMIC wave events associated with widening of the particle loss cone. The availability of multiple spacecraft enables the review of the spatial and temporal extent of EMIC waves that result in changes in particle pitch-angle distributions, as well as a quantitative look at background plasma and magnetic field conditions. We compare our results with expectations from diffusion theory. We are thus able to assess more directly than previous studies the role of EMIC waves in particle scattering.
Rapid Statistical Learning Supporting Word Extraction From Continuous Speech.
Batterink, Laura J
2017-07-01
The identification of words in continuous speech, known as speech segmentation, is a critical early step in language acquisition. This process is partially supported by statistical learning, the ability to extract patterns from the environment. Given that speech segmentation represents a potential bottleneck for language acquisition, patterns in speech may be extracted very rapidly, without extensive exposure. This hypothesis was examined by exposing participants to continuous speech streams composed of novel repeating nonsense words. Learning was measured on-line using a reaction time task. After merely one exposure to an embedded novel word, learners demonstrated significant learning effects, as revealed by faster responses to predictable than to unpredictable syllables. These results demonstrate that learners gained sensitivity to the statistical structure of unfamiliar speech on a very rapid timescale. This ability may play an essential role in early stages of language acquisition, allowing learners to rapidly identify word candidates and "break in" to an unfamiliar language.
NASA Astrophysics Data System (ADS)
Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey
2018-04-01
Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
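A minimal sketch of kernel-PCA-based monitoring with a squared-prediction-error (SPE) control limit, in the spirit of the approach described above. The two-variable data, the kernel width, and the fault sample are illustrative assumptions; the paper's full method (dynamic modeling, moving windows, per-anode subsystems) is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(9)

def rbf(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# "Normal operation" window: two correlated variables standing in for
# anode-current-derived measurements (synthetic, not cell data).
n, n_pc = 200, 5
x1 = rng.normal(0, 1, n)
X = np.column_stack([x1, 0.8 * x1 + 0.2 * rng.normal(0, 1, n)])

# Kernel PCA on the training window: double-center K, eigendecompose.
K = rbf(X, X)
J = np.eye(n) - 1.0 / n                   # centering matrix I - 11'/n
Kc = J @ K @ J
lam, V = np.linalg.eigh(Kc)
lam, V = lam[::-1][:n_pc], V[:, ::-1][:, :n_pc]

def spe(x):
    # Squared prediction error (feature-space residual) of one sample.
    k = rbf(x[None, :], X)[0]
    kc = k - k.mean() - K.mean(axis=0) + K.mean()   # centered test kernel row
    t = (kc @ V) / np.sqrt(lam)                     # scores on retained PCs
    k_self = 1.0 - 2.0 * k.mean() + K.mean()        # centered k(x, x)
    return k_self - np.sum(t**2)

spe_train = np.array([spe(x) for x in X])
limit = np.quantile(spe_train, 0.99)      # empirical 99% control limit

fault = np.array([4.0, -4.0])             # sample breaking the correlation
print(f"SPE limit = {limit:.3f}, fault SPE = {spe(fault):.3f}")
```

In the paper's setting, one such monitor per discretized subsystem is what localizes a fault to the vicinity of a particular anode.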
Line transport in turbulent atmosphere
NASA Astrophysics Data System (ADS)
Nikoghossian, Artur
We consider the spectral line transfer in turbulent atmospheres with a spatially correlated velocity field. Both finite and semi-infinite media are treated. In finding the observed intensities we first deal with the problem of determining the mean intensity of radiation emerging from the medium for a fixed value of the turbulent velocity at its boundary. The new approach proposed for solving this problem is based on the invariant imbedding technique, which yields the solution of the proper problems for a family of media of different optical thicknesses and allows tackling different kinds of inhomogeneous problems. The dependence of the line profile, integral intensity, and line width on the mean correlation length and the average value of the hydrodynamic velocity is studied. It is shown that the transition from a micro-turbulent regime to a macro-turbulent one occurs within a comparatively narrow range of variation in the correlation length. The diffuse reflection of line radiation from a one-dimensional semi-infinite turbulent atmosphere is examined. In addition to the observed spectral line profile, statistical averages describing the diffusion process in the atmosphere (mean number of scattering events, average time spent by a diffusing photon in the medium) are determined. The dependence of these quantities on the average hydrodynamic velocity and the correlation coefficient is studied.
Line Transport in Turbulent Atmospheres
NASA Astrophysics Data System (ADS)
Nikoghossian, A. G.
2017-07-01
The spectral line transfer in turbulent atmospheres with a spatially correlated velocity field is examined. Both finite and semi-infinite media are treated. In finding the observed intensities we first deal with the problem of determining the mean intensity of radiation emerging from the medium for a fixed value of the turbulent velocity at its boundary. A new approach proposed for solving this problem is based on the invariant imbedding technique, which yields the solution of the proper problems for a family of media of different optical thicknesses and allows tackling different kinds of inhomogeneous problems. The dependence of the line profile, integral intensity, and line width on the mean correlation length and the average value of the hydrodynamic velocity is studied. It is shown that the transition from a micro-turbulent regime to a macro-turbulent one occurs within a comparatively narrow range of variation in the correlation length. Ambartsumian's principle of invariance is used to solve the problem of diffuse reflection of line radiation from a one-dimensional semi-infinite turbulent atmosphere. In addition to the observed spectral line profile, statistical averages describing the diffusion process in the atmosphere (mean number of scattering events, average time spent by a diffusing photon in the medium) are determined. The dependence of these quantities on the average hydrodynamic velocity and the correlation coefficient is studied.
How many spectral lines are statistically significant?
NASA Astrophysics Data System (ADS)
Freund, J.
When experimental line spectra are fitted with least squares techniques one frequently does not know whether n or n + 1 lines may be fitted safely. This paper shows how an F-test can be applied in order to determine the statistical significance of including an extra line into the fitting routine.
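For nested least-squares fits that are linear in the line amplitudes, the F-test reduces to comparing residual sums of squares. Below is a sketch on invented data (fixed-width Gaussian line shapes at assumed candidate positions; the positions, widths, and noise level are illustrative, not from the paper):

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of the linear least-squares fit y ~ X b."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def f_extra_line(X_n, X_n1, y):
    """F statistic for adding one line to an n-line model:
    ((RSS_n - RSS_n1) / 1) / (RSS_n1 / (N - p))."""
    n_obs, p = y.size, X_n1.shape[1]
    rss_n, rss_n1 = rss(X_n, y), rss(X_n1, y)
    return (rss_n - rss_n1) / (rss_n1 / (n_obs - p))

x = np.linspace(-5.0, 5.0, 201)
gauss = lambda c: np.exp(-0.5 * ((x - c) / 0.4) ** 2)   # fixed-width line profile
rng = np.random.default_rng(1)
y = 2.0 * gauss(-1.0) + 0.05 * rng.normal(size=x.size)  # spectrum with one real line

X1 = np.column_stack([np.ones_like(x), gauss(-1.0)])    # n-line model (n = 1)
X2 = np.column_stack([X1, gauss(2.0)])                  # n+1 lines: spurious candidate
F = f_extra_line(X1, X2, y)
# small F: the extra line is not statistically significant at this position
```

The computed F is then compared with the critical value of the F(1, N - p) distribution at the chosen significance level; only if it exceeds that value is the (n + 1)-th line retained.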
Magnetospheric space plasma investigations
NASA Technical Reports Server (NTRS)
Comfort, Richard H.; Horwitz, James L.
1994-01-01
A time-dependent semi-kinetic model that includes self-collisions, ion-neutral collisions and chemistry was developed. Light ion outflow in the polar cap transition region was modeled and compared with observations. A model study of wave heating of O+ ions in the topside transition region was carried out using a code that performs local calculations including ion-neutral and Coulomb self-collisions as well as production and loss of O+. Another project is a statistical study of hydrogen spin curve characteristics in the polar cap. A statistical study of the latitudinal distribution of core plasmas along the L=4.6 field line using DE-1/RIMS data was completed. A short paper on dual-spacecraft estimates of ion temperature profiles and heat flows in the plasmasphere-ionosphere system was prepared. An automated processing code was used to process RIMS data from 1981 to 1984.
Six Sigma methods applied to cryogenic coolers assembly line
NASA Astrophysics Data System (ADS)
Ventre, Jean-Marc; Germain-Lacour, Michel; Martin, Jean-Yves; Cauquil, Jean-Marc; Benschop, Tonny; Griot, René
2009-05-01
Six Sigma methods have been applied to the manufacturing process of a rotary Stirling cooler, the RM2. The project is named NoVa, as the main goal of the Six Sigma approach is to reduce variability (No Variability). The project has been based on the DMAIC guideline, following five stages: Define, Measure, Analyse, Improve, Control. The objective has been set on the rate of coolers passing the performance test at the first attempt, with a goal value of 95%. A team has been gathered involving people and skills acting on the RM2 manufacturing line. Measurement System Analysis (MSA) has been applied to the test bench, and results after the R&R gage study show that measurement is one of the root causes of variability in the RM2 process. Two more root causes have been identified by the team after process mapping analysis: the regenerator filling factor and the cleaning procedure. Causes of measurement variability have been identified and eradicated, as shown by new results from the R&R gage. Experimental results show that the regenerator filling factor impacts process variability and affects yield. An improved process has been established after a new calibration process for the test bench, a new filling procedure for the regenerator, and an additional cleaning stage were implemented. The objective of 95% of coolers passing the performance test at the first attempt has been reached and maintained for a significant period. The RM2 manufacturing process is now managed according to Statistical Process Control based on control charts. Improvements in process capability have enabled the introduction of a sample testing procedure before delivery.
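The link between reduced variability and first-pass yield can be illustrated with a process capability index. The specification limits and measurement values below are invented for illustration and are not taken from the RM2 project:

```python
import numpy as np

def cpk(samples, lsl, usl):
    """Process capability index: distance from the process mean to the
    nearest specification limit, in units of 3 standard deviations."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

rng = np.random.default_rng(2)
before = rng.normal(100.0, 6.0, size=200)  # high variability before the project
after = rng.normal(100.0, 2.0, size=200)   # reduced variability ("No Variability")

cpk_before = cpk(before, 85.0, 115.0)
cpk_after = cpk(after, 85.0, 115.0)        # higher Cpk -> fewer first-pass failures
```

A Cpk well above 1 is what justifies moving from 100% testing to the sample testing procedure mentioned in the abstract: the process itself, tracked on control charts, guarantees the yield.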
NASA Technical Reports Server (NTRS)
Vilhu, Osmi; Linsky, Jeffrey L.
1990-01-01
Mean coronal temperatures of some active G-K stars were derived from Rev1-processed Einstein Observatory IPC spectra. The combined X-ray and transition-region emission line data are in rough agreement with static coronal loop models. Although the sample is too small to draw statistically significant conclusions, it suggests that the mean coronal temperature depends linearly on the inverse Rossby number, with saturation at short rotation periods.
Development of new on-line statistical program for the Korean Society for Radiation Oncology
Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho
2015-01-01
Purpose To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods The statistical program is a web-based program. The directory was placed in a sub-folder of the KOSRO homepage and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP is the scripting language. Each ID and password are controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Drop-down menus are used extensively for the convenience of the user and the consistency of data analysis. Results Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input of each hospital. Backups of data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. The intuitive screen and consistent input structure are expected to promote data entry by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684
Development of new on-line statistical program for the Korean Society for Radiation Oncology.
Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Shin, Kyung Hwan; Choi, Eun Kyung; Cho, Kwan Ho
2015-06-01
To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. The statistical program is a web-based program. The directory was placed in a sub-folder of the KOSRO homepage and its web address is http://www.kosro.or.kr/asda. The server operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP is the scripting language. Each ID and password are controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Drop-down menus are used extensively for the convenience of the user and the consistency of data analysis. Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input of each hospital. Backups of data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. The intuitive screen and consistent input structure are expected to promote data entry by member hospitals, and annual statistics should be a cornerstone of advances in radiation oncology.
Hot Ion Flows in the Distant Magnetotail: ARTEMIS Observations From Lunar Orbit to ˜-200 RE
NASA Astrophysics Data System (ADS)
Artemyev, A. V.; Angelopoulos, V.; Runov, A.; Vasko, I. Y.
2017-10-01
Plasma energization in Earth's magnetotail is supported by acceleration processes in (and around) magnetic reconnection regions. Hot plasma flows and strong electromagnetic waves, generated by magnetic energy release during reconnection, transport the energy necessary for current system intensification and particle acceleration in the inner magnetosphere. Earth's magnetotail configuration includes two main reconnection regions (X lines): the near-Earth X line, which has been well studied by several multispacecraft missions, and the distant X line, which has been much less investigated. In this paper, we utilize the unique data set gathered by the two ARTEMIS spacecraft in 2010 at radial distances between lunar orbit and ˜200 RE (Earth radii). We identify an X line at around ˜80 RE and collect statistics on hot plasma flows observed around and beyond this distance. Ion spectra within these flows are well fitted by a power law with an exponential tail starting above an energy ɛ0˜ 2-5 keV. Assuming that these spectra originate at the distant X line, we examine the characteristics of the acceleration at the distant tail reconnection region.
Karimi, Mohammad H; Asemani, Davud
2014-05-01
Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as wavelet transform, filtering, morphology and contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identifying large defects such as Spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as Pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
1982-02-15
function of the doping density at 300 and 77 K for classical Boltzmann statistics or the depletion approximation (solid line) and for approximate Fermi-Dirac statistics (equation (19), dotted line). This comparison demonstrates that the deviation from Boltzmann statistics is quite noticeable... tunneling Schottky barriers cannot be obtained at these doping levels. The dotted lines are obtained when Boltzmann statistics are used in the Al Ga...
Difference to Inference: teaching logical and statistical reasoning through on-line interactivity.
Malloy, T E
2001-05-01
Difference to Inference is an on-line Java program that simulates theory testing and falsification through research design and data collection in a game format. The program, based on cognitive and epistemological principles, is designed to support learning of the thinking skills underlying deductive and inductive logic and statistical reasoning. Difference to Inference has database connectivity so that game scores can be counted as part of course grades.
Anhøj, Jacob; Olesen, Anne Vingaard
2014-01-01
A run chart is a line graph of a measure plotted over time with the median as a horizontal line. The main purpose of the run chart is to identify process improvement or degradation, which may be detected by statistical tests for non-random patterns in the data sequence. We studied the sensitivity to shifts and linear drifts in simulated processes using the shift, crossings and trend rules for detecting non-random variation in run charts. The shift and crossings rules are effective in detecting shifts and drifts in process centre over time while keeping the false signal rate constant around 5% and independent of the number of data points in the chart. The trend rule is virtually useless for detection of linear drift over time, the purpose it was intended for.
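The shift and crossings rules can be sketched in a few lines. This is one common formulation; the exact critical values and the handling of points on the median vary between authors, so treat the cut-offs below as assumptions rather than the study's own definitions:

```python
import numpy as np

def run_chart_signals(y):
    """Shift and crossings tests for non-random variation around the median."""
    y = np.asarray(y, dtype=float)
    side = np.sign(y - np.median(y))
    side = side[side != 0]                 # points exactly on the median are ignored
    n = side.size
    longest, current = 1, 1                # longest run on one side of the median
    for a, b in zip(side, side[1:]):
        current = current + 1 if a == b else 1
        longest = max(longest, current)
    crossings = int((side[:-1] != side[1:]).sum())
    shift = longest >= round(np.log2(n) + 3)          # unusually long run
    # unusually few median crossings vs. binomial(n - 1, 0.5), normal approx.
    few_cross = crossings < (n - 1) / 2 - 1.645 * np.sqrt((n - 1) / 4.0)
    return bool(shift), bool(few_cross)

stable = [0, 1] * 10                       # alternating: looks random
shifted = [0] * 10 + [5] * 10              # step change in process centre
```

With these inputs the shifted sequence triggers both rules while the alternating one triggers neither, matching the paper's point that both rules respond to a shift in process centre while the false signal rate stays low.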
Assessing the significance of pedobarographic signals using random field theory.
Pataky, Todd C
2008-08-07
Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
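The contrast between a Bonferroni threshold and an RFT-style height threshold can be sketched numerically. The example below keeps only the leading 2-D Euler characteristic term for a Gaussian field, and the image size and smoothness are invented, so this is an illustration of the principle rather than a full RFT implementation:

```python
import math
from statistics import NormalDist

def bonferroni_z(n_tests, alpha=0.05):
    """One-sided Bonferroni height threshold for n_tests pixels."""
    return NormalDist().inv_cdf(1.0 - alpha / n_tests)

def rft_z_2d(resels, alpha=0.05):
    """Height threshold from the leading term of the 2-D Gaussian-field
    expected Euler characteristic:
    E[EC](z) = R * 4 ln 2 * (2 pi)^(-3/2) * z * exp(-z^2 / 2)."""
    def expected_ec(z):
        return resels * 4.0 * math.log(2.0) * (2.0 * math.pi) ** -1.5 \
            * z * math.exp(-z * z / 2.0)
    lo, hi = 1.0, 10.0                     # E[EC] is decreasing on this range
    for _ in range(60):                    # bisection for E[EC](z) = alpha
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if expected_ec(mid) > alpha else (lo, mid)
    return 0.5 * (lo + hi)

z_bonf = bonferroni_z(64 * 64)             # hypothetical 64 x 64 pressure image
z_rft = rft_z_2d((64 / 8.0) ** 2)          # smoothness FWHM = 8 pixels -> 64 resels
# z_rft < z_bonf: RFT relaxes the threshold by exploiting spatial smoothness
```

The gap between the two thresholds grows with field smoothness, which is exactly why the Bonferroni correction is described above as valid but overly conservative for smooth pedobarographic fields.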
Monitoring of antisolvent crystallization of sodium scutellarein by combined FBRM-PVM-NIR.
Liu, Xuesong; Sun, Di; Wang, Feng; Wu, Yongjiang; Chen, Yong; Wang, Longhu
2011-06-01
Antisolvent crystallization can be used as an alternative to cooling or evaporation for the separation and purification of solid product in the pharmaceutical industry. To improve the process understanding of antisolvent crystallization, the use of in-line tools is vital. In this study, the process analytical technology (PAT) tools including focused beam reflectance measurement (FBRM), particle video microscope (PVM), and near-infrared spectroscopy (NIRS) were utilized to monitor antisolvent crystallization of sodium scutellarein. FBRM was used to monitor chord count and chord length distribution of sodium scutellarein particles in the crystallizer, and PVM, as an in-line video camera, provided pictures imaging particle shape and dimension. In addition, a quantitative model of PLS was established by in-line NIRS to detect the concentration of sodium scutellarein in the solvent and good calibration statistics were obtained (r(2) = 0.976) with the residual predictive deviation value of 11.3. The discussion over sensitivities, strengths, and weaknesses of the PAT tools may be helpful in selection of suitable PAT techniques. These in-line techniques eliminate the need for sample preparation and offer a time-saving approach to understand and monitor antisolvent crystallization process. Copyright © 2011 Wiley-Liss, Inc.
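The calibration statistics quoted above (r² and the residual predictive deviation, RPD) can be computed from reference and predicted concentrations as follows. The data here are synthetic placeholders, not the sodium scutellarein measurements:

```python
import numpy as np

def calibration_stats(y_ref, y_pred):
    """r^2 and residual predictive deviation (RPD = SD of reference / SEP)."""
    y_ref = np.asarray(y_ref, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_ref - y_pred
    sep = resid.std(ddof=1)                              # standard error of prediction
    r2 = 1.0 - (resid ** 2).sum() / ((y_ref - y_ref.mean()) ** 2).sum()
    return r2, y_ref.std(ddof=1) / sep

rng = np.random.default_rng(4)
y_ref = np.linspace(0.5, 5.0, 30)                        # reference concentrations
y_pred = y_ref + rng.normal(0.0, 0.1, size=y_ref.size)   # model predictions
r2, rpd = calibration_stats(y_ref, y_pred)
```

An RPD above roughly 10, as reported for the PLS model in the study, indicates a calibration precise enough for quantitative in-line monitoring rather than mere trending.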
An Analysis of LANDSAT-4 Thematic Mapper Geometric Properties
NASA Technical Reports Server (NTRS)
Walker, R. E.; Zobrist, A. L.; Bryant, N. A.; Gokhman, B.; Friedman, S. Z.; Logan, T. L.
1984-01-01
LANDSAT Thematic Mapper P-data of Washington, D.C., Harrisburg, PA, and Salton Sea, CA are analyzed to determine the magnitudes and causes of error in the geometric conformity of the data to known Earth surface geometry. Several tests of data geometry are performed. Intraband and interband correlation and registration are investigated, exclusive of map-based ground truth. The magnitudes and statistical trends of pixel offsets between a single band's mirror scans (due to processing procedures) are computed, and the inter-band integrity of registration is analyzed. A line-to-line correlation analysis is included.
Universal Capacitance Model for Real-Time Biomass in Cell Culture.
Konakovsky, Viktor; Yagtu, Ali Civan; Clemens, Christoph; Müller, Markus Michael; Berger, Martina; Schlatter, Stefan; Herwig, Christoph
2015-09-02
Capacitance probes have the potential to revolutionize bioprocess control due to their safe and robust use and ability to detect even the smallest capacitors in the form of biological cells. Several techniques have evolved to model biomass statistically; however, there are problems with model transfer between cell lines and process conditions. For linear models, errors of transferred models in the declining phase of the culture are around +100% or worse, causing unnecessary delays with test runs during bioprocess development. The goal of this work was to develop one single universal model which can be adapted by considering a potentially mechanistic factor to estimate biomass in yet untested clones and scales. The novelty of this work is a methodology for selecting sensitive frequencies to build a statistical model which can be shared among fermentations with an error between 9% and 38% (mean error around 20%) for the whole process, including the declining phase. A simple linear factor was found to be responsible for the transferability of biomass models between cell lines, indicating a link to their phenotype or physiology.
Watersheds in disordered media
NASA Astrophysics Data System (ADS)
Andrade, José, Jr.; Araújo, Nuno; Herrmann, Hans; Schrenk, Julian
2015-02-01
What is the best way to divide a rugged landscape? Since ancient times, watersheds separating adjacent water systems that flow, for example, toward different seas, have been used to delimit boundaries. Interestingly, serious and even tense border disputes between countries have relied on the subtle geometrical properties of these tortuous lines. For instance, slight and even anthropogenic modifications of landscapes can produce large changes in a watershed, and the effects can be highly nonlocal. Although the watershed concept arises naturally in geomorphology, where it plays a fundamental role in water management, landslide, and flood prevention, it also has important applications in seemingly unrelated fields such as image processing and medicine. Despite the far-reaching consequences of the scaling properties on watershed-related hydrological and political issues, it was only recently that a more profound and revealing connection has been disclosed between the concept of watershed and statistical physics of disordered systems. This review initially surveys the origin and definition of a watershed line in a geomorphological framework to subsequently introduce its basic geometrical and physical properties. Results on statistical properties of watersheds obtained from artificial model landscapes generated with long-range correlations are presented and shown to be in good qualitative and quantitative agreement with real landscapes.
Toyota's inspection system for vehicular emissions at assembly lines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tanaka, T.; Nakano, H.; Usami, I.
1976-01-01
In order that all Toyota production vehicles may satisfy the emission requirements and be free from possible defects such as catalytic converter damage, a system called ECAS, which makes it possible to assure satisfactory basic emission performance levels, has been developed and put into actual use at assembly lines. This system consists of the following four tests: Idle Test, Functional Test, Short Cycle Test and Steady State Inspection Test. By using this system, all operations from vehicle setup on a chassis dynamometer to statistical analysis of the data (measurement, judgement of the obtained data, type-out of the results, indication of action to be taken, data filing and statistical treatment of the data) are processed automatically and controlled by the computer. In the Short Cycle Test the up-stream emissions of the vehicle, tracing Toyota's unique short cyclic mode on a chassis dynamometer, are continuously measured. Based on the emission levels during each mode and the total emission level obtained from the above test, we can diagnose not only the emission control systems of a vehicle and its engine conditions, such as valve clearance maladjustment and carburetor defects, but also the emission characteristics of the vehicle.
Li, Wen-Long; Qu, Hai-Bin
2016-10-01
In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology was introduced. The main steps of the technique include: ① in-line collection of the process spectra of the different process steps; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology in chemical and biological medicines were reviewed briefly. Through a comprehensive introduction to our feasibility research on the monitoring of traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several important problems of practical application that need urgent solutions are proposed, and the application prospects of NIRS-based process trajectory technology are fully discussed. Copyright© by the Chinese Pharmaceutical Association.
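Step ② (unfolding of the 3-D process spectra) is commonly done batch-wise, one row per batch, before the trajectory limits of step ③ are built. A minimal numpy sketch with invented dimensions (not the dimensions of any data set in the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
# hypothetical calibration set: 20 normal batches, 50 time points, 100 wavelengths
spectra = rng.normal(size=(20, 50, 100))

# batch-wise unfolding: time and wavelength are concatenated into the columns,
# giving a 2-D matrix suitable for building MSPC models
X = spectra.reshape(20, 50 * 100)

# normal process trajectory and +/- 3 sigma limits at each (time, wavelength) slot
mean_traj = X.mean(axis=0)
limits = 3.0 * X.std(axis=0, ddof=1)

# monitoring a new batch: fraction of slots outside the normal region
new_batch = rng.normal(size=(50, 100)).reshape(-1)
out_frac = (np.abs(new_batch - mean_traj) > limits).mean()
```

In practice the unfolded matrix would feed a PCA-based MSPC model rather than raw per-slot limits, but the unfolding itself is the same.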
Surveying Low-Mass Star Formation with the Submillimeter Array
NASA Astrophysics Data System (ADS)
Dunham, Michael
2018-01-01
Large astronomical surveys yield important statistical information that can’t be derived from single-object and small-number surveys. In this talk I will review two recent surveys in low-mass star formation undertaken by the Submillimeter Array (SMA): a millimeter continuum survey of disks surrounding variably accreting young stars, and a complete continuum and molecular line survey of all protostars in the nearby Perseus Molecular Cloud. I will highlight several new insights into the processes by which low-mass stars gain their mass that have resulted from the statistical power of these surveys.
Adaptation to stimulus statistics in the perception and neural representation of auditory space.
Dahmen, Johannes C; Keating, Peter; Nodal, Fernando R; Schulz, Andreas L; King, Andrew J
2010-06-24
Sensory systems are known to adapt their coding strategies to the statistics of their environment, but little is still known about the perceptual implications of such adjustments. We investigated how auditory spatial processing adapts to stimulus statistics by presenting human listeners and anesthetized ferrets with noise sequences in which interaural level differences (ILD) rapidly fluctuated according to a Gaussian distribution. The mean of the distribution biased the perceived laterality of a subsequent stimulus, whereas the distribution's variance changed the listeners' spatial sensitivity. The responses of neurons in the inferior colliculus changed in line with these perceptual phenomena. Their ILD preference adjusted to match the stimulus distribution mean, resulting in large shifts in rate-ILD functions, while their gain adapted to the stimulus variance, producing pronounced changes in neural sensitivity. Our findings suggest that processing of auditory space is geared toward emphasizing relative spatial differences rather than the accurate representation of absolute position.
Sensitivity to the Sampling Process Emerges From the Principle of Efficiency.
Jara-Ettinger, Julian; Sun, Felix; Schulz, Laura; Tenenbaum, Joshua B
2018-05-01
Humans can seamlessly infer other people's preferences, based on what they do. Broadly, two types of accounts have been proposed to explain different aspects of this ability. The first account focuses on spatial information: Agents' efficient navigation in space reveals what they like. The second account focuses on statistical information: Uncommon choices reveal stronger preferences. Together, these two lines of research suggest that we have two distinct capacities for inferring preferences. Here we propose that this is not the case, and that spatial-based and statistical-based preference inferences can be explained by the assumption that agents are efficient alone. We show that people's sensitivity to spatial and statistical information when they infer preferences is best predicted by a computational model of the principle of efficiency, and that this model outperforms dual-system models, even when the latter are fit to participant judgments. Our results suggest that, as adults, a unified understanding of agency under the principle of efficiency underlies our ability to infer preferences. Copyright © 2018 Cognitive Science Society, Inc.
Intact implicit statistical learning in borderline personality disorder.
Unoka, Zsolt; Vizin, Gabriella; Bjelik, Anna; Radics, Dóra; Nemeth, Dezso; Janacsek, Karolina
2017-09-01
Wide-spread neuropsychological deficits have been identified in borderline personality disorder (BPD). Previous research found impairments in decision making, declarative memory, working memory and executive functions; however, no studies have focused on implicit learning in BPD yet. The aim of our study was to investigate implicit statistical learning by comparing learning performance of 19 BPD patients and 19 healthy, age-, education- and gender-matched controls on a probabilistic sequence learning task. Moreover, we also tested whether participants retain the acquired knowledge after a delay period. To this end, participants were retested on a shorter version of the same task 24h after the learning phase. We found intact implicit statistical learning as well as retention of the acquired knowledge in this personality disorder. BPD patients seem to be able to extract and represent regularities implicitly, which is in line with the notion that implicit learning is less susceptible to illness compared to the more explicit processes. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
On-line consolidation of thermoplastic composites
NASA Astrophysics Data System (ADS)
Shih, Po-Jen
An on-line consolidation system, which includes a computer-controlled filament winding machine and a consolidation head assembly, has been designed and constructed to fabricate composite parts from thermoplastic towpregs. A statistical approach was used to determine the significant processing parameters and their effect on the mechanical and physical properties of composite cylinders fabricated by on-line consolidation. A central composite experimental design was used to select the processing conditions for manufacturing the composite cylinders. The thickness, density, void content, degree of crystallinity and interlaminar shear strength (ILSS) were measured for each composite cylinder. Micrographs showed that complete intimate contact and uniform fiber-matrix distribution were achieved. The degree of crystallinity of the cylinders was found to be in the range of 25-30%. Under optimum processing conditions, an ILSS of 58 MPa and a void content of <1% were achieved for APC-2 (PEEK/carbon fiber) composite cylinders. An in-situ measurement system which uses a slip ring assembly and a computer data acquisition system was developed to obtain temperature data during winding. Composite cylinders were manufactured with eight K-type thermocouples installed in various locations inside the cylinder. The temperature distribution inside the composite cylinder during winding was measured for different processing conditions. ABAQUS finite element models of the different processes that occur during on-line consolidation were constructed. The first model was used to determine the convective heat transfer coefficient for the hot-air heat source. A convective heat transfer coefficient of 260 W/m²·K was obtained by matching the calculated temperature history to the in-situ measurement data. To predict the temperature distribution during winding, an ABAQUS winding simulation model was developed.
The winding speed was modeled by incrementally moving the convective boundary conditions around the outer surface of the composite cylinder. A towpreg heating model was constructed to predict the temperature distribution on the cross section of the incoming towpreg. For the process-induced thermal stresses analysis, a thermoelastic finite element model was constructed. Using the temperature history obtained from thermal analysis as the initial conditions, the thermal stresses during winding and cooling were investigated.
Santos, Juliana Lane Paixão Dos; Samapundo, Simbarashe; Biyikli, Ayse; Van Impe, Jan; Akkermans, Simen; Höfte, Monica; Abatih, Emmanuel Nji; Sant'Ana, Anderson S; Devlieghere, Frank
2018-05-19
Heat-resistant moulds (HRMs) are well known for their ability to survive pasteurization and spoil high-acid food products, which is of great concern for processors of fruit-based products worldwide. Whilst the majority of the studies on HRMs over the last decades have addressed their inactivation, few data are currently available regarding their contamination levels in fruit and fruit-based products. Thus, this study aimed to quantify and identify heat-resistant fungal ascospores from samples collected throughout the processing of pasteurized high-acid fruit products. In addition, an assessment on the effect of processing on the contamination levels of HRMs in these products was carried out. A total of 332 samples from 111 batches were analyzed from three processing plants (=three processing lines): strawberry puree (n = 88, Belgium), concentrated orange juice (n = 90, Brazil) and apple puree (n = 154, the Netherlands). HRMs were detected in 96.4% (107/111) of the batches and 59.3% (197/332) of the analyzed samples. HRMs were present in 90.9% of the samples from the strawberry puree processing line (1-215 ascospores/100 g), 46.7% of the samples from the orange juice processing line (1-200 ascospores/100 g) and 48.7% of samples from the apple puree processing line (1-84 ascospores/100 g). Despite the high occurrence, the majority (76.8%, 255/332) of the samples were either not contaminated or presented low levels of HRMs (<10 ascospores/100 g). For both strawberry puree and concentrated orange juice, processing had no statistically significant effect on the levels of HRMs (p > 0.05). On the contrary, a significant reduction (p < 0.05) in HRMs levels was observed during the processing of apple puree. Twelve species were identified belonging to four genera - Byssochlamys, Aspergillus with Neosartorya-type ascospores, Talaromyces and Rasamsonia. N. fumigata (23.6%), N. fischeri (19.1%) and B. nivea (5.5%) were the predominant species in pasteurized products. 
The quantitative data (contamination levels of HRMs) were fitted to exponential distributions and will ultimately serve as input to spoilage risk assessment models, allowing better control of the spoilage of heat-treated fruit products caused by heat-resistant moulds. Copyright © 2018 Elsevier B.V. All rights reserved.
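The exponential-fitting step mentioned above can be sketched in a few lines: for an exponential model, the maximum-likelihood rate is simply the reciprocal of the sample mean. The ascospore loads below are simulated for illustration only, not the study's data:

```python
import random
import statistics

def fit_exponential_rate(counts):
    """Maximum-likelihood estimate of the exponential rate: lambda_hat = 1 / sample mean."""
    return 1.0 / statistics.mean(counts)

# Hypothetical ascospore loads per 100 g (illustrative simulation, not the study's data)
random.seed(42)
true_rate = 0.2  # i.e. a mean load of 5 ascospores/100 g
loads = [random.expovariate(true_rate) for _ in range(5000)]
rate_hat = fit_exponential_rate(loads)
```

With enough samples the estimated rate recovers the generating rate, which is what makes the fitted distribution usable as a risk-model input.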
ATLASGAL -- A molecular view of an unbiased sample of massive star forming clumps
NASA Astrophysics Data System (ADS)
Figura, Charles; Urquhart, James; Wyrowski, Friedrich; Giannetti, Andrea; Kim, Wonju
2018-01-01
Massive stars play an important role in many areas of astrophysics, from regulating star formation to driving the evolution of their host galaxy. Study of these stars is made difficult by their short evolutionary timescales, small populations and greater distances, and further complicated because they reach the main sequence while still shrouded in their natal clumps. As a result, many aspects of their formation are still poorly understood. We have assembled a large and statistically representative collection of massive star-forming environments that span all evolutionary stages of development by correlating mid-infrared and dust continuum surveys. We have conducted follow-up single-pointing observations toward a sample of approximately 600 of these clumps with the Mopra telescope using an 8 GHz bandwidth that spans some 27 molecular and mm-radio recombination line transitions. These lines trace a wide range of interstellar conditions with varying thermal, chemical, and kinematic properties. Many of these lines exhibit hyperfine structure, allowing more detailed measurements of the clump environment (e.g. rotation temperatures and column densities). From these twenty-seven lines, we have identified thirteen line intensity ratios that strongly trace the evolutionary state of these clumps. We have investigated individual molecular and mm-radio recombination lines, contrasting these with radio and sub-mm continuum observations. We present a summary of the results of the statistical analysis of the sample, and compare them with previous similar studies to test their utility as chemical clocks of the evolutionary processes.
NASA Astrophysics Data System (ADS)
Fauziah, D.; Mardiyana; Saputro, D. R. S.
2018-05-01
Assessment is an integral part of the learning process. The process and the result should be aligned with respect to measuring the ability of learners. Authentic assessment refers to a form of assessment that measures the competence of attitudes, knowledge, and skills. In practice, many teachers, including mathematics teachers who have implemented the 2013 curriculum, find it confusing and difficult to master the use of authentic assessment instruments. Therefore, it is necessary to design an authentic assessment instrument with an interactive mini-project media that teachers can adopt in their assessment. This research is developmental research, following the 4D development model, which consists of four stages: define, design, develop and disseminate. The purpose of the research is to create a valid interactive mini-project media for statistics material in junior high school. Expert judgment rated the instrument valid, with scores of 3.1 for the construction aspect, 3.2 for the presentation aspect, 3.25 for the content aspect, and 2.9 for the didactic aspect. The research produced interactive mini-project media for statistics material using Adobe Flash, which can help teachers and students achieve the learning objectives.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and follow-up decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real-time data mining, with representative cases of astronomical applications and event detection in high-volume data streams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive follow-up measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled follow-up decisions from incomplete data.
These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
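The "semi-supervised novelty detection" idea can be illustrated with a minimal sketch: flag events that deviate strongly from the running statistics of past data, while suppressing detections that match known false-alarm classes. The detector below is an illustrative toy (a running z-score via Welford's on-line update), not the pipeline actually used at Parkes or the VLBA:

```python
import math

class NoveltyDetector:
    """Toy semi-supervised novelty detector: flags samples far from the
    running statistics of past data, unless they carry a label from a
    known false-alarm class."""
    def __init__(self, threshold=4.0, known_false_alarms=()):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.threshold = threshold
        self.known_false_alarms = set(known_false_alarms)

    def update(self, x):
        # Welford's on-line update of mean and (unnormalised) variance
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def is_novel(self, x, label=None):
        if label in self.known_false_alarms:
            return False          # known interference, not a discovery
        if self.n < 10:
            return False          # not enough history yet
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) / std > self.threshold

# Build up statistics from a quiet data stream (invented numbers)
det = NoveltyDetector(threshold=4.0, known_false_alarms={"rfi"})
for v in [1.0, 1.2, 0.9, 1.1, 1.0, 0.95, 1.05, 1.1, 0.9, 1.0, 1.02]:
    det.update(v)
```

A strong outlier is then flagged as novel unless it carries a known false-alarm label such as the hypothetical "rfi" class above.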
High-resolution EEG techniques for brain-computer interface applications.
Cincotti, Febo; Mattia, Donatella; Aloise, Fabio; Bufalari, Simona; Astolfi, Laura; De Vico Fallani, Fabrizio; Tocci, Andrea; Bianchi, Luigi; Marciani, Maria Grazia; Gao, Shangkai; Millan, Jose; Babiloni, Fabio
2008-01-15
High-resolution electroencephalographic (HREEG) techniques allow estimation of cortical activity based on non-invasive scalp potential measurements, using appropriate models of volume conduction and of neuroelectrical sources. In this study we propose an application of this body of technologies, originally developed to obtain functional images of the brain's electrical activity, in the context of brain-computer interfaces (BCI). Our working hypothesis predicted that, since HREEG pre-processing removes spatial correlation introduced by current conduction in the head structures, by providing the BCI with waveforms that are mostly due to the unmixed activity of a small cortical region, a more reliable classification would be obtained, at least when the activity to detect has a limited generator, which is the case in motor related tasks. HREEG techniques employed in this study rely on (i) individual head models derived from anatomical magnetic resonance images, (ii) a distributed source model, composed of a layer of current dipoles, geometrically constrained to the cortical mantle, (iii) a depth-weighted minimum L2-norm constraint and Tikhonov regularization for linear inverse problem solution and (iv) estimation of electrical activity in cortical regions of interest corresponding to relevant Brodmann areas. Six subjects were trained to learn self modulation of sensorimotor EEG rhythms, related to the imagination of limb movements. Off-line EEG data were used to estimate waveforms of cortical activity (cortical current density, CCD) on selected regions of interest. CCD waveforms were fed into the BCI computational pipeline as an alternative to raw EEG signals; spectral features were evaluated through statistical tests (r² analysis) to quantify their reliability for BCI control. These results were compared, within subjects, to analogous results obtained without HREEG techniques.
The processing procedure was designed in such a way that computations could be split into a setup phase (which includes most of the computational burden) and the actual EEG processing phase, which was limited to a single matrix multiplication. This separation made the procedure suitable for on-line use, and a pilot experiment was performed. Results show that lateralization of electrical activity, which is expected to be contralateral to the imagined movement, is more evident on the estimated CCDs than in the scalp potentials. CCDs produce a pattern of relevant spectral features that is more spatially focused, and has a higher statistical significance (EEG: 0.20+/-0.114 S.D.; CCD: 0.55+/-0.16 S.D.; p = 10⁻⁵). A pilot experiment showed that a trained subject could utilize voluntary modulation of estimated CCDs for accurate (eight targets) on-line control of a cursor. This study showed that it is practically feasible to utilize HREEG techniques for on-line operation of a BCI system; off-line analysis suggests that accuracy of BCI control is enhanced by the proposed method.
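The setup/on-line split described above can be sketched as follows: a Tikhonov-regularised minimum-norm inverse operator K = Lᵀ(LLᵀ + λI)⁻¹ is computed once in the setup phase, and each incoming EEG sample then costs a single matrix multiplication. The leadfield numbers below are arbitrary illustrations, not a real head model:

```python
def matmul(a, b):
    """Plain nested-list matrix multiplication."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def transpose(a):
    return [list(r) for r in zip(*a)]

def inv2(m):
    """Closed-form inverse of a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def setup_inverse_operator(leadfield, lam):
    """Setup phase (done once): K = L^T (L L^T + lam*I)^(-1)."""
    lt = transpose(leadfield)
    gram = matmul(leadfield, lt)
    for i in range(len(gram)):
        gram[i][i] += lam            # Tikhonov regularisation
    return matmul(lt, inv2(gram))

# Toy leadfield: 2 scalp channels, 3 cortical sources (illustrative numbers)
L = [[1.0, 0.5, 0.2],
     [0.3, 1.0, 0.4]]
K = setup_inverse_operator(L, lam=0.1)

# On-line phase: each new EEG sample vector costs one matrix multiplication
eeg_sample = [[0.8], [0.6]]
ccd = matmul(K, eeg_sample)
```

As the regularisation parameter tends to zero, forward-projecting the estimated sources reproduces the measured channels, which is a quick sanity check of the operator.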
Mesoscopic Fluctuations for the Thinned Circular Unitary Ensemble
NASA Astrophysics Data System (ADS)
Berggren, Tomas; Duits, Maurice
2017-09-01
In this paper we study the asymptotic behavior of mesoscopic fluctuations for the thinned Circular Unitary Ensemble. The effect of thinning is that the eigenvalues start to decorrelate. The decorrelation is stronger on the larger scales than on the smaller scales. We investigate this behavior by studying mesoscopic linear statistics. There are two regimes depending on the scale parameter and the thinning parameter. In one regime we obtain a CLT of a classical type and in the other regime we retrieve the CLT for CUE. The two regimes are separated by a critical line. On the critical line the limiting fluctuations are no longer Gaussian, but described by infinitely divisible laws. We argue that this transition phenomenon is universal by showing that the same transition and the same limiting laws appear for fluctuations of the thinned sine process in a growing box. The proofs are based on a Riemann-Hilbert problem for integrable operators.
Serio, Livia Maria; Palumbo, Davide; De Filippis, Luigi Alberto Ciro; Galietti, Umberto; Ludovico, Antonio Domenico
2016-02-23
A study of the Friction Stir Welding (FSW) process was carried out in order to evaluate the influence of process parameters on the mechanical properties of aluminum plates (AA5754-H111). The process was monitored during each test by means of infrared cameras in order to correlate temperature information with possible changes in the mechanical properties of the joints. In particular, two process parameters were considered for tests: the welding tool rotation speed and the welding tool traverse speed. The quality of joints was evaluated by means of destructive and non-destructive tests. In this regard, the presence of defects and the ultimate tensile strength (UTS) were investigated for each combination of the process parameters. A statistical analysis was carried out to assess the correlation between the thermal behavior of joints and the process parameters, also proving the capability of Infrared Thermography for on-line monitoring of the quality of joints.
Experiments with recursive estimation in astronomical image processing
NASA Technical Reports Server (NTRS)
Busko, I.
1992-01-01
Recursive estimation concepts have been applied to image enhancement problems since the 1970s. However, very few applications in the particular area of astronomical image processing are known. These concepts were derived, for 2-dimensional images, from the well-known theory of Kalman filtering in one dimension. The historic reasons for applying these techniques to digital images are related to the images' scanned nature, in which the temporal output of a scanner device can be processed on-line by techniques borrowed directly from 1-dimensional recursive signal analysis. However, recursive estimation has particular properties that make it attractive even in modern days, when large computer memories make the full scanned image available to the processor at any given time. One particularly important aspect is the ability of recursive techniques to deal with non-stationary phenomena, that is, phenomena whose statistical properties vary in time (or with position in a 2-D image). Many image processing methods make underlying stationarity assumptions, either for the stochastic field being imaged, for the imaging system properties, or both. They will underperform, or even fail, when applied to images that deviate significantly from stationarity. Recursive methods, on the contrary, make it feasible to perform adaptive processing, that is, to process the image by a processor whose properties are tuned to the image's local statistical properties. Recursive estimation can be used to build estimates of images degraded by such phenomena as noise and blur. We show examples of recursive adaptive processing of astronomical images, using several local statistical properties to drive the adaptive processor, such as average signal intensity, signal-to-noise ratio and the autocorrelation function. Software was developed under IRAF, and as such will be made available to interested users.
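A minimal sketch of such a recursive estimator, in the spirit of a scalar Kalman filter: the process-noise term q lets the estimate track non-stationary signal levels, while the gain adapts to the assumed measurement-noise variance r. All parameter values are illustrative assumptions, not the paper's implementation:

```python
import random

class RecursiveEstimator:
    """Scalar recursive (Kalman-style) estimator with adaptive gain.
    q models slow drift of the underlying signal (non-stationarity),
    r is the assumed measurement-noise variance."""
    def __init__(self, q=1e-3, r=1.0):
        self.x = None   # current estimate
        self.p = 1.0    # estimate variance
        self.q = q
        self.r = r

    def update(self, z):
        if self.x is None:
            self.x = z
            return self.x
        self.p += self.q                  # predict: allow the level to drift
        k = self.p / (self.p + self.r)    # adaptive gain
        self.x += k * (z - self.x)        # correct toward the new sample
        self.p *= (1.0 - k)
        return self.x

# Noisy "scanline" with constant true level 10 (invented data)
random.seed(0)
est = RecursiveEstimator()
track = [est.update(10.0 + random.gauss(0.0, 1.0)) for _ in range(300)]
```

The estimate converges toward the true level while the gain settles to a small steady-state value, which is the 1-D analogue of processing a scanned image pixel stream on-line.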
Daltrozzo, Jerome; Conway, Christopher M.
2014-01-01
Statistical-sequential learning (SL) is the ability to process patterns of environmental stimuli, such as spoken language, music, or one’s motor actions, that unfold in time. The underlying neurocognitive mechanisms of SL and the associated cognitive representations are still not well understood as reflected by the heterogeneity of the reviewed cognitive models. The purpose of this review is: (1) to provide a general overview of the primary models and theories of SL, (2) to describe the empirical research – with a focus on the event-related potential (ERP) literature – in support of these models while also highlighting the current limitations of this research, and (3) to present a set of new lines of ERP research to overcome these limitations. The review is articulated around three descriptive dimensions in relation to SL: the level of abstractness of the representations learned through SL, the effect of the level of attention and consciousness on SL, and the developmental trajectory of SL across the life-span. We conclude with a new tentative model that takes into account these three dimensions and also point to several promising new lines of SL research. PMID:24994975
Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin
2014-06-05
In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves' dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) a resampling-based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time, so that fast-growing cell lines are considered overly sensitive compared to slowly growing ones. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics.
Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Time independent summary statistics may aid the understanding of drugs' action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies.
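Step 2 of the workflow, isotonic regression of a monotone dose-response curve, reduces to the classical pool-adjacent-violators algorithm (PAVA). A self-contained sketch fitting the best non-increasing curve to toy viability fractions (invented numbers, not the NCI60 data):

```python
def pava_decreasing(y, w=None):
    """Pool-adjacent-violators algorithm: least-squares fit of the best
    non-increasing sequence -- a monotone dose-response curve."""
    w = w or [1.0] * len(y)
    vals, wts, cnts = [], [], []
    for yi, wi in zip(y, w):
        vals.append(yi); wts.append(wi); cnts.append(1)
        # merge adjacent blocks while monotonicity (non-increasing) is violated
        while len(vals) > 1 and vals[-2] < vals[-1]:
            v = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / (wts[-2] + wts[-1])
            wts[-2:] = [wts[-2] + wts[-1]]
            cnts[-2:] = [cnts[-2] + cnts[-1]]
            vals[-2:] = [v]
    out = []
    for v, c in zip(vals, cnts):
        out.extend([v] * c)
    return out

# Toy viability fractions at increasing doses (illustrative numbers)
curve = pava_decreasing([1.0, 0.9, 0.95, 0.4, 0.5, 0.1])
```

Local violations of monotonicity (0.9 followed by 0.95, 0.4 followed by 0.5) are pooled into flat segments, yielding a valid non-increasing dose-response curve.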
Brouckaert, D; Uyttersprot, J-S; Broeckx, W; De Beer, T
2018-03-01
Calibration transfer or standardisation aims at creating a uniform spectral response on different spectroscopic instruments or under varying conditions, without requiring a full recalibration for each situation. In the current study, this strategy is applied to construct at-line multivariate calibration models and consequently employ them in-line in a continuous industrial production line, using the same spectrometer. Firstly, quantitative multivariate models are constructed at-line at laboratory scale for predicting the concentration of two main ingredients in hard surface cleaners. By regressing the Raman spectra of a set of small-scale calibration samples against their reference concentration values, partial least squares (PLS) models are developed to quantify the surfactant levels in the liquid detergent compositions under investigation. After evaluating the models' performance with a set of independent validation samples, a univariate slope/bias correction is applied in view of transporting these at-line calibration models to an in-line manufacturing set-up. This standardisation technique allows a fast and easy transfer of the PLS regression models, by simply correcting the model predictions on the in-line set-up, without adjusting anything to the original multivariate calibration models. An extensive statistical analysis is performed in order to assess the predictive quality of the transferred regression models. Before and after transfer, the R² and RMSEP of both models are compared to evaluate whether their magnitudes are similar. T-tests are then performed to investigate whether the slope and intercept of the transferred regression line are not statistically different from 1 and 0, respectively. Furthermore, it is checked whether a significant bias is present. F-tests are also performed to assess the linearity of the transfer regression line and to investigate the statistical coincidence of the transfer and validation regression lines.
Finally, a paired t-test is performed to compare the original at-line model to the slope/bias corrected in-line model, using interval hypotheses. It is shown that the calibration models of Surfactant 1 and Surfactant 2 yield satisfactory in-line predictions after slope/bias correction. While Surfactant 1 passes seven out of eight statistical tests, the recommended validation parameters are 100% successful for Surfactant 2. It is hence concluded that the proposed strategy for transferring at-line calibration models to an in-line industrial environment via a univariate slope/bias correction of the predicted values offers a successful standardisation approach. Copyright © 2017 Elsevier B.V. All rights reserved.
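The univariate slope/bias correction itself amounts to an ordinary least-squares line between predicted and reference values, applied to subsequent predictions. A minimal sketch with invented concentration numbers (not the study's surfactant data):

```python
def fit_slope_bias(predicted, reference):
    """Least-squares fit of reference = slope * predicted + bias."""
    n = len(predicted)
    mx = sum(predicted) / n
    my = sum(reference) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(predicted, reference))
    sxx = sum((x - mx) ** 2 for x in predicted)
    slope = sxy / sxx
    return slope, my - slope * mx

def correct(y_pred, slope, bias):
    """Apply the transfer correction to a new in-line prediction."""
    return slope * y_pred + bias

# Invented transfer samples: the in-line predictions read 0.1 units high
slope, bias = fit_slope_bias([2.1, 3.1, 4.1], [2.0, 3.0, 4.0])
corrected = correct(5.1, slope, bias)
```

The original multivariate model is untouched; only its predictions are passed through the fitted line, which is what makes the transfer fast and easy.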
NASA Astrophysics Data System (ADS)
Yi, Yong; Chen, Zhengying; Wang, Liming
2018-05-01
Corona-originated discharge on DC transmission lines is the main source of the radiated electromagnetic interference (EMI) field in the vicinity of the lines. A joint time-frequency analysis technique was proposed to extract the radiated EMI current (excitation current) of DC corona from statistical measurements of the corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of current waveform parameters on aluminum conductor steel-reinforced (ACSR) conductors. Based on the measured results, the peak, root-mean-square and average values of the 0.5 MHz radiated EMI current, with 9 kHz and 200 Hz bandwidths, were calculated with the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated from the radiated EMI current, and a wire-to-plate platform was built to validate the RI computation results. The causes of the deviation between computations and measurements were analyzed in detail.
Slope and Line of Best Fit: A Transfer of Knowledge Case Study
ERIC Educational Resources Information Center
Nagle, Courtney; Casey, Stephanie; Moore-Russo, Deborah
2017-01-01
This paper brings together research on slope from mathematics education and research on line of best fit from statistics education by considering what knowledge of slope students transfer to a novel task involving determining the placement of an informal line of best fit. This study focuses on two students who transitioned from placing inaccurate…
Search for Long Period Solar Normal Modes in Ambient Seismic Noise
NASA Astrophysics Data System (ADS)
Caton, R.; Pavlis, G. L.
2016-12-01
We search for evidence of solar free oscillations (normal modes) in long period seismic data through multitaper spectral analysis of array stacks. This analysis is similar to that of Thomson & Vernon (2015), who used data from the quietest single stations of the global seismic network. Our approach is to use stacks of large arrays of noisier stations to reduce noise. Arrays have the added advantage of permitting the use of nonparametric statistics (jackknife errors) to provide objective error estimates. We used data from the Transportable Array, the broadband borehole array at Pinyon Flat, and the 3D broadband array in Homestake Mine in Lead, SD. The Homestake Mine array has 15 STS-2 sensors deployed in the mine that are extremely quiet at long periods due to stable temperatures and stable piers anchored to hard rock. The length of time series used ranged from 50 days to 85 days. We processed the data by low-pass filtering with a corner frequency of 10 mHz, followed by an autoregressive prewhitening filter and median stack. We elected to use the median instead of the mean in order to get a more robust stack. We then used G. Prieto's mtspec library to compute multitaper spectrum estimates on the data. We produce delete-one jackknife error estimates of the uncertainty at each frequency by computing median stacks of all data with one station removed. The results from the TA data show tentative evidence for several lines between 290 μHz and 400 μHz, including a recurring line near 379 μHz. This 379 μHz line is near the Earth mode 0T2 and the solar mode 5g5, suggesting that 5g5 could be coupling into the Earth mode. Current results suggest more statistically significant lines may be present in Pinyon Flat data, but additional processing of the data is underway to confirm this observation.
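The stacking and error-estimation steps can be sketched compactly: the median stack is robust to a single noisy station, and the delete-one jackknife gives an objective error bar at each sample (or frequency bin). Toy station traces are used for illustration:

```python
import statistics

def median_stack(traces):
    """Median across stations at each sample -- robust to an outlier station."""
    return [statistics.median(col) for col in zip(*traces)]

def jackknife_std(traces, stat=median_stack):
    """Delete-one jackknife error estimate of the stacked trace."""
    n = len(traces)
    reps = [stat(traces[:i] + traces[i + 1:]) for i in range(n)]
    errs = []
    for samples in zip(*reps):
        m = sum(samples) / n
        var = (n - 1) / n * sum((s - m) ** 2 for s in samples)
        errs.append(var ** 0.5)
    return errs

# Four toy station traces; one station is an outlier at the first sample
stack = median_stack([[1, 2], [1, 2], [1, 2], [10, 2]])
errs = jackknife_std([[1], [2], [3], [100]])
```

The median stack ignores the outlier station entirely, while the jackknife spread reflects how much the stack depends on any single station.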
NASA Astrophysics Data System (ADS)
Ramgraber, M.; Schirmer, M.
2017-12-01
As computational power grows and wireless sensor networks find their way into common practice, it becomes increasingly feasible to pursue on-line numerical groundwater modelling. The reconciliation of model predictions with sensor measurements often necessitates the application of Sequential Monte Carlo (SMC) techniques, most prominently represented by the Ensemble Kalman Filter. In the pursuit of on-line predictions it seems advantageous to transcend the scope of pure data assimilation and incorporate on-line parameter calibration as well. Unfortunately, the interplay between shifting model parameters and transient states is non-trivial. Several recent publications (e.g. Chopin et al., 2013, Kantas et al., 2015) in the field of statistics discuss potential algorithms addressing this issue. However, most of these are computationally intractable for on-line application. In this study, we investigate to what extent compromises between mathematical rigour and computational restrictions can be made within the framework of on-line numerical modelling of groundwater. Preliminary studies are conducted in a synthetic setting, with the goal of transferring the conclusions drawn into application in a real-world setting. To this end, a wireless sensor network has been established in the valley aquifer around Fehraltorf, characterized by a highly dynamic groundwater system and located about 20 km to the East of Zürich, Switzerland. By providing continuous probabilistic estimates of the state and parameter distribution, a steady base for branched-off predictive scenario modelling could be established, providing water authorities with advanced tools for assessing the impact of groundwater management practices. Chopin, N., Jacob, P.E. and Papaspiliopoulos, O. (2013): SMC2: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (3), p. 397-426. 
Kantas, N., Doucet, A., Singh, S.S., Maciejowski, J., and Chopin, N. (2015): On Particle Methods for Parameter Estimation in State-Space Models. Statistical Science, 30 (3), p. 328.-351.
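A minimal bootstrap particle filter with artificial parameter jitter illustrates the kind of compromise discussed above: fully rigorous joint state/parameter SMC is computationally intractable on-line, but a small random perturbation of the parameter keeps the ensemble alive. The linear-drift model and every number below are invented for illustration and are not the Fehraltorf groundwater model:

```python
import math
import random

random.seed(1)

def particle_filter(observations, n=500, obs_std=0.5, jitter=0.02):
    """Bootstrap particle filter jointly tracking a state x and a nominally
    static drift parameter a. The artificial jitter on a is the pragmatic
    compromise that keeps on-line joint state/parameter SMC tractable
    (cf. the degeneracy issues discussed by Kantas et al.)."""
    parts = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(n)]
    for z in observations:
        # propagate: x' = x + a + process noise; perturb the parameter slightly
        parts = [(x + a + random.gauss(0.0, 0.1), a + random.gauss(0.0, jitter))
                 for x, a in parts]
        # weight by the Gaussian observation likelihood, then resample
        wts = [math.exp(-0.5 * ((z - x) / obs_std) ** 2) for x, _ in parts]
        parts = random.choices(parts, weights=wts, k=n)
    mean_x = sum(x for x, _ in parts) / n
    mean_a = sum(a for _, a in parts) / n
    return mean_x, mean_a

# Synthetic observations rising with true drift a = 0.3 per step
obs = [0.3 * t + random.gauss(0.0, 0.1) for t in range(1, 41)]
x_hat, a_hat = particle_filter(obs)
```

After forty assimilation steps the ensemble concentrates near the true drift, giving the continuous probabilistic state and parameter estimates that branched-off scenario modelling would build on.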
GENOPT 2016: Design of a generalization-based challenge in global optimization
NASA Astrophysics Data System (ADS)
Battiti, Roberto; Sergeyev, Yaroslav; Brunato, Mauro; Kvasov, Dmitri
2016-10-01
While comparing results on benchmark functions is a widely used practice to demonstrate the competitiveness of global optimization algorithms, fixed benchmarks can lead to a negative data mining process. To avoid this negative effect, the GENOPT contest benchmarks can be used which are based on randomized function generators, designed for scientific experiments, with fixed statistical characteristics but individual variation of the generated instances. The generators are available to participants for off-line tests and online tuning schemes, but the final competition is based on random seeds communicated in the last phase through a cooperative process. A brief presentation and discussion of the methods and results obtained in the framework of the GENOPT contest are given in this contribution.
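A seeded function generator of the kind described can be sketched in a few lines: the statistical characteristics (here, a quadratic bowl with minimum value zero and a uniformly drawn optimum) are fixed, while the concrete instance varies with the seed. This is a simplified illustration, not the contest's actual generator:

```python
import random

def make_benchmark(seed, dim=2):
    """Random quadratic test function with fixed statistical characteristics
    (minimum value 0, optimum uniform in [-5, 5]^dim, axis scales in
    [0.5, 2.0]) but instance-level variation controlled by the seed."""
    rng = random.Random(seed)
    optimum = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    scales = [rng.uniform(0.5, 2.0) for _ in range(dim)]
    def f(x):
        return sum(s * (xi - oi) ** 2 for s, xi, oi in zip(scales, x, optimum))
    return f, optimum

# Participants can tune off-line on seeds of their choice; the final
# competition instance is fixed only by the communicated seed.
f, opt = make_benchmark(seed=2016)
```

Because instances are reproducible from the seed yet differ across seeds, algorithms cannot overfit a fixed benchmark while results remain verifiable.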
NASA Astrophysics Data System (ADS)
Dombeck, J. P.; Cattell, C. A.; Prasad, N.; Sakher, A.; Hanson, E.; McFadden, J. P.; Strangeway, R. J.
2016-12-01
Field-aligned currents (FACs) provide a fundamental driver and means of Magnetosphere-Ionosphere (M-I) coupling. These currents must be supported by local physics along the entire field line, generally through quasi-static potential structures, but also through the time evolution of those structures and currents, which produces Alfvén waves and Alfvénic electron acceleration. In regions of upward current, precipitating auroral electrons are accelerated earthward. These processes can result in ion outflow, changes in ionospheric conductivity, and affect the particle distributions on the field line, affecting the M-I coupling processes supporting the individual FACs and potentially the entire FAC system. The FAST mission was well suited to study both the FACs and the electron auroral acceleration processes. We present the results of the comparisons between meso- and small-scale FACs determined from FAST using the method of Peria, et al., 2000, and our FAST auroral acceleration mechanism study when such identification is possible for the entire ~13-year FAST mission. We also present the latest results of the electron energy (and number) flux ionospheric input based on acceleration mechanism (and FAC characteristics) from our FAST auroral acceleration mechanism study.
Microtensile bond strength of different acrylic teeth to high-impact denture base resins.
Colebeck, Amanda C; Monaco, Edward A; Pusateri, Christopher R; Davis, Elaine L
2015-01-01
This study evaluated the effect of denture base acrylic, denture tooth composition, and ridge-lap surface treatment on the microtensile bond strength (μTBS) of three commercially available denture teeth and two injection denture processing systems. Sixteen experimental groups were formed (n = 3), according to denture tooth surface treatment (no treatment or surface treatment recommended by the manufacturer), denture base processing technique and acrylic (SR-Ivocap-Ivocap Plus or Success-Lucitone 199), and tooth type-composition at bonding interface (BlueLine DCL-PMMA, Portrait IPN-PMMA, Phonares II-PMMA, Phonares II-NHC). Rectangular bar specimens with a 1 mm² cross sectional area were fabricated and subsequently thermocycled at 10,000 cycles between 5°C and 55°C with a 15-second dwell time. Selected specimens underwent μTBS testing in a universal testing machine with a 1 kN load cell at 0.5 mm/min crosshead speed. Data were analyzed statistically by two- and three-way ANOVA and Tukey post hoc test (α = 0.05). Mean μTBS ranged between 56.2 ± 5.6 and 60.8 ± 5.0 N/mm² for the Ivocap Plus specimens and 13.3 ± 5.12 to 60.1 ± 6.0 N/mm² for the Lucitone 199 specimens. Among the Ivocap specimens, BlueLine DCL and Phonares II NHC had significantly higher μTBS than Portrait IPN to Ivocap Plus acrylic. There were no statistically significant differences among Blueline, Phonares II PMMA, and Phonares II NHC, or between Phonares II PMMA and Portrait IPN. Within the Lucitone 199 specimens, there was a significantly higher μTBS for BlueLine DCL and Phonares II NHC denture teeth with the manufacturer-recommended surface treatment when compared to control surface. BlueLine, Portrait, and Phonares II PMMA groups achieved significantly higher mean μTBS than the Phonares II NHC group. There were no statistically significant differences among BlueLine, Portrait, and Phonares II PMMA groups.
When evaluating the μTBS of PMMA and NHC denture teeth to base resins, a stronger bond was achieved using materials produced by the same manufacturer. Within the Luctione 199 specimens, the Phonares II NHC group demonstrated significantly lower bond strength than other specimens, suggesting that gross ridge-lap reduction of NHC denture teeth is not recommended if a base acrylic by a different manufacturer from the tooth is going to be used. © 2014 American College of Prosthodontists.
NASA Technical Reports Server (NTRS)
Gardner, Adrian
2010-01-01
National Aeronautics and Space Administration (NASA) weather and atmospheric environmental organizations are insatiable consumers of geophysical, hydrometeorological, and solar weather statistics. The expanding array of internetworked sensors producing targeted physical measurements has generated an almost factorial explosion of near real-time inputs to topical statistical datasets. Normalizing and value-based parsing of such statistical datasets in support of time-constrained weather and environmental alerts and warnings is essential, even with dedicated high-performance computational capabilities. What are the optimal indicators for advanced decision making? How do we recognize the line between sufficient statistical sampling and excessive, mission-destructive sampling? How do we ensure that the normalization and parsing process, when interpolated through numerical models, yields accurate and actionable alerts and warnings? This presentation will address the integrated means and methods to achieve the desired outputs for NASA and consumers of its data.
NASA Astrophysics Data System (ADS)
Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang
2017-10-01
Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for key equipment in a smart substation, which requires on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line, based on cyber-physical correlation analysis. Exploiting the electrical and physical properties of a substation operating in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuation from that caused by an EVT anomaly. Characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing changes in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method thus enables accurate on-line monitoring of the metering performance of an EVT without a standard voltage transformer.
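The separation step can be sketched as follows: under three-phase symmetry, the primary fluctuation appears as a common-mode component across the three phase channels, which PCA captures in the dominant principal component; what remains in the residual is candidate EVT drift. The sketch below is a minimal illustration of that idea, not the paper's algorithm, and the simulated signals are hypothetical:

```python
import numpy as np

def evt_residual_stats(measurements):
    """Separate common-mode (primary) fluctuation from per-channel
    EVT anomalies via PCA on three-phase measurements.

    measurements: (n_samples, 3) array of per-phase secondary readings.
    Returns the per-channel residual RMS after removing the first
    principal component (assumed to carry the primary fluctuation).
    """
    X = measurements - measurements.mean(axis=0)
    cov = np.cov(X, rowvar=False)            # 3x3 covariance
    vals, vecs = np.linalg.eigh(cov)         # ascending eigenvalues
    pc1 = vecs[:, -1]                        # dominant direction
    common = X @ np.outer(pc1, pc1)          # projection onto PC1
    residual = X - common                    # candidate EVT drift
    return np.sqrt((residual ** 2).mean(axis=0))
```

A drifting channel then shows up as an elevated residual RMS relative to the healthy phases, which is the kind of characteristic statistic the paper tracks over time.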
GOMOS serendipitous data products
NASA Astrophysics Data System (ADS)
Fussen, D.; Gomos Team
The GOMOS experiment on board ENVISAT has measured more than 200,000 star occultations through the Earth's limb since March 2002. With 4 spectrometers, the wavelength coverage of 245 nm to 942 nm makes it possible to monitor ozone, H2O, NO2, NO3, BrO, OClO, air, aerosols, O2, and temperature profiles. During the commissioning phase, GOMOS turned out to be a successful remote sounder of the Earth's atmosphere between 10 and 120 km. In addition, intensive statistical processing of a large data set (about 5000 occultations) has produced high-quality transmittance spectra. A preliminary investigation revealed extremely interesting spectral signatures in the GOMOS spectra. Keeping in mind that all possible instrument artefacts should be carefully checked, we nevertheless obtained the following results, which may become unexpected GOMOS data products in the near future: the excited oxygen "green line" (O(1S)->O(3P)) at 557.7 nm has been clearly identified and will be inverted; the D2 sodium absorption at 589.1 nm is easily recognized in the mesosphere, and inversion of the slant-path optical thickness (about 0.0025) has produced the first GOMOS Na vertical profile, in close agreement with the local climatological lidar data of Fort Collins; a few possible emission or absorption lines are under investigation and need more statistical tests, although a spectral signature at 280 nm and h ≈ 103 km may be attributed to a mesospheric Mg+ layer; and a group of as yet unidentified stratospheric emission lines between 390 and 400 nm has been detected. Interestingly, the same lines seem to have also been observed by the SALOMON balloon-borne experiment operated in night-time conditions.
NASA Astrophysics Data System (ADS)
Hamadeh, Emad; Gunther, Norman G.; Niemann, Darrell; Rahman, Mahmud
2006-06-01
Random fluctuations in fabrication process outcomes, such as gate line edge roughness (LER), give rise to corresponding fluctuations in the characteristics of scaled-down MOS devices. A thermodynamic-variational model is presented to study the effects of LER on the threshold voltage and capacitance of sub-50 nm MOS devices. Conceptually, we treat the geometric definition of the MOS devices on a die as a collection of gates. In turn, each of these gates has an area, A, and a perimeter, P, defined by nominally straight lines subject to random process outcomes producing roughness. We treat roughness as deviations from straightness consisting of both a transverse amplitude and a longitudinal wavelength, each having a lognormal distribution. We obtain closed-form expressions for the variance of the threshold voltage (Vth) and the device capacitance (C) at the Onset of Strong Inversion (OSI) for a small device. Using our variational model, we characterized device electrical properties such as σVth and σC in terms of the statistical parameters of the roughness amplitude and spatial frequency, i.e., inverse roughness wavelength. We then verified our model with numerical analysis of Vth roll-off for small devices and of σVth due to dopant fluctuation. Our model was also benchmarked against TCAD simulations of σVth as a function of LER. We then extended our analysis to predict variations in σVth and σC versus average LER spatial frequency and amplitude, and oxide thickness. Given the intuitive expectation that LER of very short wavelengths must also have small amplitude, we have investigated the case in which the amplitude mean is inversely related to the frequency mean, and compare it with the situation in which the amplitude and frequency means are unrelated. Given also that the gate perimeter may have a different LER signature on each side, we have extended our analysis to the cases in which the statistical difference in LER between gate sides is moderate and in which it is significantly large.
Saltus, R.W.; Kulander, Christopher S.; Potter, Christopher J.
2002-01-01
We have digitized, modified, and analyzed seismic interpretation maps of 12 subsurface stratigraphic horizons spanning portions of the National Petroleum Reserve in Alaska (NPRA). These original maps were prepared by Tetra Tech, Inc., based on about 15,000 miles of seismic data collected from 1974 to 1981. We have also digitized interpreted faults and seismic velocities from Tetra Tech maps. The seismic surfaces were digitized as two-way travel time horizons and converted to depth using Tetra Tech seismic velocities. The depth surfaces were then modified by long-wavelength corrections based on recent USGS seismic re-interpretation along regional seismic lines. We have developed and executed an algorithm to identify and calculate statistics on the area, volume, height, and depth of closed structures based on these seismic horizons. These closure statistics are tabulated and have been used as input to oil and gas assessment calculations for the region. Directories accompanying this report contain basic digitized data, processed data, maps, tabulations of closure statistics, and software relating to this project.
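As an illustration of the closure-statistics step, the sketch below computes the area, volume, and height of closure for a single structure on a gridded depth horizon, given its spill-point depth. This is a toy stand-in under stated assumptions (the report's algorithm also identifies structures and spill points automatically; the function name and inputs here are hypothetical):

```python
import numpy as np

def closure_stats(depth, spill, cell_area=1.0):
    """Toy version of the closure-statistics step: given a gridded
    horizon (depth, positive down) and the spill-point depth of one
    structure, report the area, rock volume, and height of closure."""
    closed = depth < spill                  # cells shallower than spill
    height = spill - depth[closed]          # closure above spill point
    return {
        "area": closed.sum() * cell_area,
        "volume": height.sum() * cell_area, # trapezoidal fill volume
        "height": height.max() if closed.any() else 0.0,
        "crest_depth": depth.min(),
    }
```

Tabulating such statistics per structure, as the report does, yields the inputs used in the oil and gas assessment calculations.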
NASA Astrophysics Data System (ADS)
Bommier, Véronique
2016-06-01
Context. We discuss the case of lines formed by scattering, which comprises both coherent and incoherent scattering. Both processes contribute to form the line profiles in the so-called second solar spectrum, which is the spectrum of the linear polarization of such lines observed close to the solar limb. However, most of the lines cannot be simply modeled with a two-level or two-term atom model, and we present a generalized formalism for this purpose. Aims: The aim is to obtain a formalism able to describe scattering in line centers (resonant scattering or incoherent scattering) and in far wings (Rayleigh/Raman scattering or coherent scattering) for a multilevel and multiline atom. Methods: The method is designed to overcome the Markov approximation, which is often made in descriptions of the atom-photon interaction. The method was already presented in the first two papers of this series, but the final equations of those papers were for a two-level atom. Results: We present here the final equations generalized to the multilevel and multiline atom. We describe the main steps of the theoretical development and, in particular, how we performed the series development to overcome the Markov approximation. Conclusions: The statistical equilibrium equations for the atomic density matrix and the radiative transfer equation coefficients are obtained with line profiles. Doppler redistribution is also taken into account, because we show that the statistical equilibrium equations must be solved for each atomic velocity class.
Adams, Jean V.; Slaght, Karen; Boogaard, Michael A.
2016-01-01
The authors developed a package, LW1949, for use with the statistical software R to automatically carry out the manual steps of Litchfield and Wilcoxon's method of evaluating dose–effect experiments. The LW1949 package consistently finds the best-fitting dose–effect relation by minimizing the chi-squared statistic between the observed and expected numbers of affected individuals, and it substantially speeds up the line-fitting process and other calculations that Litchfield and Wilcoxon originally carried out by hand. Environ Toxicol Chem 2016;9999:1–4. Published 2016 Wiley Periodicals Inc. on behalf of SETAC. This article is a US Government work and, as such, is in the public domain in the United States of America.
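The automated step can be sketched as a search for the probit line (on log10 dose) that minimizes the Pearson chi-squared between observed and expected affected counts. This is a simplified stand-in for the package's procedure (LW1949 itself is written in R and follows Litchfield and Wilcoxon's rules more closely; the grid resolution and data below are hypothetical):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fit_probit_chisq(doses, n, affected):
    """Grid-search the probit line p = Phi(a + b*log10(dose)) that
    minimizes Pearson chi-squared between observed and expected
    affected counts.  Returns (chi-squared, intercept, slope)."""
    best = None
    for bi in range(1, 100):                  # slope grid 0.1 .. 9.9
        b = bi * 0.1
        for ai in range(-100, 100):           # intercept grid -10.0 .. 9.9
            a = ai * 0.1
            chisq = 0.0
            for d, ni, yi in zip(doses, n, affected):
                p = norm_cdf(a + b * math.log10(d))
                p = min(max(p, 1e-9), 1.0 - 1e-9)
                e = ni * p                    # expected affected
                chisq += (yi - e) ** 2 / (e * (1.0 - p))
            if best is None or chisq < best[0]:
                best = (chisq, a, b)
    return best
```

The fitted line then gives effective doses directly, e.g. the ED50 at log10(dose) = -a/b.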
NASA Astrophysics Data System (ADS)
Tokuyama, Sekito; Oka, Tomoharu; Takekawa, Shunya; Yamada, Masaya; Iwata, Yuhei; Tsujimoto, Shiho
2017-01-01
High-velocity compact clouds (HVCCs) are one population of peculiar clouds detected in the Central Molecular Zone (CMZ) of our Galaxy. They have compact appearances (< 5 pc) and large velocity widths (> 50 km s-1). Several explanations for the origin of HVCCs have been proposed, e.g., a series of supernova (SN) explosions (Oka et al. 1999) or a gravitational kick by a point-like gravitational source (Oka et al. 2016). To investigate the statistical properties of HVCCs, a complete list of them is acutely needed. However, the previous list is not complete, since the identification procedure included both automated processes and manual selection (Nagai 2008). Here we developed an automated procedure to identify HVCCs in spectral line data.
Yoshida, S; Arakawa, F; Higuchi, F; Ishibashi, Y; Goto, M; Sugita, Y; Nomura, Y; Niino, D; Shimizu, K; Aoki, R; Hashikawa, K; Kimura, Y; Yasuda, K; Tashiro, K; Kuhara, S; Nagata, K; Ohshima, K
2012-01-01
Objectives The main histological change in rheumatoid arthritis (RA) is the villous proliferation of synovial lining cells, an important source of cytokines and chemokines, which are associated with inflammation. The aim of this study was to evaluate gene expression in the microdissected synovial lining cells of RA patients, using those of osteoarthritis (OA) patients as the control. Methods Samples were obtained during total joint replacement from 11 RA and five OA patients. Total RNA from the synovial lining cells was derived from selected specimens by laser microdissection (LMD) for subsequent cDNA microarray analysis. In addition, the expression of significant genes was confirmed immunohistochemically. Results The 14 519 genes detected by cDNA microarray were used to compare gene expression levels in synovial lining cells from RA with those from OA patients. Cluster analysis indicated that RA cells, including low- and high-expression subgroups, and OA cells were sorted into two main clusters. The molecular activity of RA was statistically consistent with its clinical and histological activity. Expression levels of signal transducer and activator of transcription 1 (STAT1), interferon regulatory factor 1 (IRF1), and the chemokines CXCL9, CXCL10, and CCL5 were statistically significantly higher in the synovium of RA than in that of OA. Immunohistochemically, the lining synovium of RA, but not that of OA, clearly expressed STAT1, IRF1, and these chemokines, consistent with the microarray analysis combined with LMD. Conclusions Our findings indicate an important role for lining synovial cells in the inflammatory and proliferative processes of RA. Further understanding of the local signalling in structural components is important in rheumatology. PMID:22401175
Schlösser, Magnus; Seitz, Hendrik; Rupp, Simone; Herwig, Philipp; Alecu, Catalin Gabriel; Sturm, Michael; Bornschein, Beate
2013-03-05
Highly accurate, in-line, and real-time composition measurements of gases are mandatory in many processing applications. The quantitative analysis of mixtures of hydrogen isotopologues (H2, D2, T2, HD, HT, and DT) is of high importance in such fields as DT fusion, neutrino mass measurements using tritium β-decay, and photonuclear experiments where HD targets are used. Raman spectroscopy is a favorable method for these tasks. In this publication we present a method for the in-line calibration of Raman systems for the nonradioactive hydrogen isotopologues. It is based on precise volumetric gas mixing of the homonuclear species H2/D2 and a controlled catalytic production of the heteronuclear species HD. Systematic effects, such as spurious exchange reactions with wall materials, are carefully accounted for during the procedure. A detailed discussion of statistical and systematic uncertainties is presented, which finally yields a calibration accuracy of better than 0.4%.
The GalICS Project: Virtual Galaxies from Cosmological N-body Simulations
NASA Astrophysics Data System (ADS)
Guiderdoni, B.
The GalICS project develops extensive semi-analytic post-processing of large cosmological simulations to describe hierarchical galaxy formation. The multiwavelength statistical properties of high-redshift and local galaxies are predicted within the large-scale structures. The fake catalogs and mock images that are generated from the outputs are used for the analysis and preparation of deep surveys. The whole set of results is now available in an on-line database that can be easily queried. The GalICS project represents a first step towards a 'Virtual Observatory of virtual galaxies'.
Warriner, David Roy; Bayley, Martin; Shi, Yubing; Lawford, Patricia Victoria; Narracott, Andrew; Fenner, John
2017-11-21
This study combined themes in cardiovascular modelling, clinical cardiology, and e-learning to create an on-line environment that would assist undergraduate medical students in understanding key physiological and pathophysiological processes in the cardiovascular system. An interactive on-line environment was developed incorporating a lumped-parameter mathematical model of the human cardiovascular system. The model outputs were used to characterise the progression of key disease processes and allowed students to classify disease severity, with the aim of improving their understanding of abnormal physiology in a clinical context. Access to the on-line environment was offered to students at all stages of undergraduate training as an adjunct to routine lectures and tutorials in cardiac pathophysiology. Student feedback was collected on this novel on-line material in the course of routine audits of teaching delivery. Medical students, irrespective of their stage of undergraduate training, reported that they found the models and the environment interesting and a positive experience. After exposure to the environment, there was a statistically significant improvement in student performance on a series of 6 questions on cardiovascular medicine, with 33% and 22% increases in the number of questions answered correctly (p < 0.0001 and p < 0.001, respectively). Considerable improvement was found in students' knowledge and understanding during assessment after exposure to the e-learning environment. Opportunities exist for development of similar environments in other fields of medicine, refinement of the existing environment, and further engagement with student cohorts. This work combines some exciting and developing fields in medical education, but routine adoption of these types of tool will be possible only with the engagement of all stakeholders, from educationalists, clinicians, and modellers to, most importantly, medical students.
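The kind of lumped-parameter model underlying such an environment can be illustrated with the simplest example, a two-element Windkessel, in which arterial pressure P obeys C dP/dt = Q(t) - P/R for inflow Q, peripheral resistance R, and arterial compliance C. The sketch below is only a toy stand-in (the published environment uses a more detailed closed-loop model, and every parameter value here is illustrative, not taken from the study):

```python
import math

def windkessel(R=1.0, C=1.2, heart_rate=60, beats=10, dt=1e-3):
    """Two-element Windkessel integrated with forward Euler.
    R: peripheral resistance (mmHg*s/mL), C: compliance (mL/mmHg).
    Inflow is a half-sinusoid during systole (first 30% of each beat).
    Returns the pressure trace (mmHg) over the final beat."""
    period = 60.0 / heart_rate
    n = int(beats * period / dt)
    p, trace = 80.0, []
    for k in range(n):
        t = (k * dt) % period
        if t < 0.3 * period:
            q = 400.0 * math.sin(math.pi * t / (0.3 * period))  # ejection
        else:
            q = 0.0                                             # diastole
        p += dt * (q / C - p / (R * C))   # C dP/dt = Q - P/R
        trace.append(p)
    return trace[-int(period / dt):]
```

Lowering C (a stiffer artery) or raising R in such a model shifts the pressure waveform in the direction of the disease states students were asked to classify.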
Woods, J
2001-01-01
The third-generation cardiac institute will build on the successes of the past in structuring the service line, reorganizing to assimilate specialist interests, and repositioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC, and Change Acceleration processes. Data indicate that Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next-generation cardiac institute.
NASA Astrophysics Data System (ADS)
Liu, Yingyi; Zhou, Lijuan; Liu, Yuanqing; Yuan, Haiwen; Ji, Liang
2017-11-01
Audible noise is closely related to corona current on a high-voltage direct current (HVDC) transmission line. In this paper, we simultaneously measured a large number of audible noise and corona current waveforms on the largest outdoor HVDC corona cage in the world. By analyzing the experimental data, statistical regularities relating the corona current spectrum and the audible noise spectrum were obtained. Furthermore, the generation mechanism of audible noise was analyzed theoretically, and a mathematical expression relating the audible noise spectrum to the corona current spectrum, applicable to all measuring points in the surrounding space, was established based on electro-acoustic conversion theory. Finally, combined with the obtained mathematical relation, the underlying reasons for the statistical regularities observed in the measured corona current and audible noise data were explained. The results of this paper not only present the statistical association between the corona current spectrum and the audible noise spectrum on an HVDC transmission line, but also reveal the inherent reasons for these associations.
Derivation of the open-circuit voltage of organic solar cells
NASA Astrophysics Data System (ADS)
Staple, Douglas B.; Oliver, Patricia A. K.; Hill, Ian G.
2014-05-01
Organic photovoltaic cells have improved in efficiency from 1% two decades ago to over 10% today. Continued improvement necessitates a theoretical understanding of the factors determining efficiency. Organic photovoltaic efficiency can be parameterized in terms of open-circuit voltage, short-circuit current, and fill factor. Here we present a theory that explains the dependencies of open-circuit voltage on semiconductor energy levels, light intensity, solar cell and light-source temperatures, charge-carrier recombination, and external fluorescence efficiency. The present theory also explains why recombination at the donor-acceptor heterointerface is a dominant process in heterojunction-based cells. Furthermore, the Carnot efficiency appears, highlighting the connection to basic thermodynamics. The theory presented here is consistent with and builds on the experimental and theoretical observations already in the literature. Crucially, the present theory can be straightforwardly derived in a line-by-line fashion using standard tools from statistical physics.
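For orientation, the textbook detailed-balance form of these dependencies (a standard result, not the paper's own derivation) writes the open-circuit voltage as

$$ V_{oc} = \frac{k_B T}{q}\,\ln\!\left(\frac{J_{sc}}{J_0} + 1\right), \qquad V_{oc} = V_{oc}^{\mathrm{rad}} + \frac{k_B T}{q}\,\ln \eta_{\mathrm{ext}}, $$

where $J_0$ is the reverse saturation current density, $V_{oc}^{\mathrm{rad}}$ the radiative-limit value, and $\eta_{\mathrm{ext}}$ the external fluorescence efficiency. Since $\eta_{\mathrm{ext}} \le 1$, nonradiative recombination always lowers $V_{oc}$; the energy levels enter through $J_0$, the light intensity through $J_{sc}$, and the temperatures through the $k_B T/q$ prefactor, matching the dependencies listed in the abstract.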
Real-time line-width measurements: a new feature for reticle inspection systems
NASA Astrophysics Data System (ADS)
Eran, Yair; Greenberg, Gad; Joseph, Amnon; Lustig, Cornel; Mizrahi, Eyal
1997-07-01
The significance of line width control in mask production has grown as defect sizes have shrunk. Two conventional methods are used for controlling line width dimensions in the manufacture of masks for sub-micron devices: critical dimension (CD) measurement and the detection of edge defects. Achieving reliable and accurate control of line width errors is one of the most challenging tasks in mask production. Neither of the two methods cited above guarantees the detection of line width errors with good sensitivity over the whole mask area. This stems from the fact that CD measurement provides only statistical data on the mask features, whereas the edge defect detection method checks defects on each edge by itself and does not supply information on the combined result of error detection on two adjacent edges. For example, a combination of a small edge defect together with a CD non-uniformity, both within the allowed tolerance, may yield a significant line width error that will not be detected using the conventional methods (see figure 1). A new approach for the detection of line width errors which overcomes this difficulty is presented. Based on this approach, a new sensitive line width error detector was developed and added to Orbot's RT-8000 die-to-database reticle inspection system. This innovative detector operates continuously during the mask inspection process and inspects the entire area of the reticle for line width errors. The detection is based on a comparison of line width measurements taken on both the design database and the scanned image of the reticle. In section 2, the motivation for developing this new detector is presented. The section covers an analysis of various defect types which are difficult to detect using conventional edge detection methods or, alternatively, CD measurements.
In section 3, the basic concept of the new approach is introduced, together with a description of the new detector and its characteristics. In section 4, the calibration process undertaken to achieve reliable and repeatable line width measurements is presented. A description of the experiments conducted to evaluate the sensitivity of the new detector is given in section 5, followed by a report of the results of this evaluation. The conclusions are presented in section 6.
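The principle behind the detector can be illustrated in a few lines: widths measured at corresponding locations on the design database and on the scanned image are compared directly, so a small edge defect and a CD drift that are each within spec can still be flagged when their combined width error exceeds tolerance. This is only a toy illustration of that comparison, with hypothetical numbers, not Orbot's implementation:

```python
def line_width_errors(db_widths, scan_widths, tol):
    """Flag locations whose measured line width on the scanned image
    deviates from the design-database width by more than tol, i.e.
    the combined effect of both edges, not each edge separately."""
    return [i for i, (d, s) in enumerate(zip(db_widths, scan_widths))
            if abs(d - s) > tol]
```

In the figure-1 scenario, an edge defect and a CD non-uniformity that each pass their individual checks combine into a width error that this comparison catches.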
The S-Process Branching Point at 205Pb
NASA Astrophysics Data System (ADS)
Tonchev, Anton; Tsoneva, N.; Bhatia, C.; Arnold, C. W.; Goriely, S.; Hammond, S. L.; Kelley, J. H.; Kwan, E.; Lenske, H.; Piekarewicz, J.; Raut, R.; Rusev, G.; Shizuma, T.; Tornow, W.
2017-09-01
Accurate neutron-capture cross sections for radioactive nuclei near the line of beta stability are crucial for understanding s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure because the measurements require both highly radioactive samples and intense neutron sources. We consider photon scattering using monoenergetic, 100% linearly polarized photon beams to obtain the photoabsorption cross section of 206Pb below the neutron separation energy. This observable becomes an essential ingredient in Hauser-Feshbach statistical model calculations of capture cross sections on 205Pb. The newly obtained photoabsorption information is also used to estimate the Maxwellian-averaged radiative cross section of 205Pb(n,γ)206Pb at 30 keV. The astrophysical impact of this measurement on s-process nucleosynthesis will be discussed. This work was performed under the auspices of the US DOE by LLNL under Contract DE-AC52-07NA27344.
Pandey, Anil K; Bisht, Chandan S; Sharma, Param D; ArunRaj, Sreedharan Thankarajan; Taywade, Sameer; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-11-01
99mTc-methylene diphosphonate (99mTc-MDP) bone scintigraphy images have a limited number of counts per pixel. A noise filtering method based on the local statistics of the image produces better results than a linear filter; however, the mask size has a significant effect on image quality. In this study, we identified the optimal mask size that yields a good smooth bone scan image. Forty-four bone scan images were processed using mask sizes of 3, 5, 7, 9, 11, 13, and 15 pixels. The input and processed images were reviewed in two steps. In the first step, the images were inspected, and the mask sizes that produced images with a significant loss of clinical detail in comparison with the input image were excluded. In the second step, the image quality of the 40 sets of images (each set comprising the input image and its corresponding three processed images with 3, 5, and 7-pixel masks) was assessed by two nuclear medicine physicians, who selected one good smooth image from each set. Image quality was also assessed quantitatively with a line profile. Fisher's exact test was used to find statistically significant differences between image quality processed with the 5- and 7-pixel masks at a 5% cut-off. A statistically significant difference was found between the image quality processed with the 5- and 7-pixel masks (P=0.00528). The identified optimal mask size to produce a good smooth image was 7 pixels. The best mask size for the Jong-Sen Lee filter was thus 7×7 pixels, which yielded 99mTc-MDP bone scan images with the highest acceptable smoothness.
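A local-statistics filter of this family can be sketched as follows: within each mask, the output pixel is pulled toward the local mean by a gain that depends on how much the local variance exceeds the assumed noise variance, so flat regions are smoothed strongly while edges are preserved. This is a generic Lee-type sketch with a rough Poisson noise estimate, not the exact implementation evaluated in the study:

```python
import numpy as np

def lee_filter(img, mask=7, noise_var=None):
    """Local-statistics (Lee-type) smoothing with a square mask.
    noise_var defaults to the image mean, a crude estimate of
    Poisson counting noise in a low-count scintigraphy image."""
    if noise_var is None:
        noise_var = img.mean()
    r = mask // 2
    pad = np.pad(img.astype(float), r, mode='reflect')
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = pad[i:i + mask, j:j + mask]      # local window
            m, v = w.mean(), w.var()
            gain = max(v - noise_var, 0.0) / v if v > 0 else 0.0
            out[i, j] = m + gain * (img[i, j] - m)
    return out
```

With a larger mask, the local statistics are estimated over more pixels, giving smoother output at the cost of clinical detail, which is exactly the trade-off the study's mask-size comparison probes.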
Kurek, Marta; Żądzińska, Elżbieta; Sitek, Aneta; Borowska-Strugińska, Beata; Rosset, Iwona; Lorkiewicz, Wiesław
2016-01-01
The neonatal line is usually the first accentuated incremental line visible in the enamel. The prenatal environment significantly contributes to the width of the neonatal line, influencing the pace at which the newborn's organism reaches post-delivery homeostasis. Studies of the enamel of the earliest-developing deciduous teeth can provide insight into the prenatal development and perinatal conditions of children of past human populations, thus serving as an additional source for considering the prenatal and perinatal factors that modify growth processes. The aim of this study was to examine whether the neonatal line, reflecting the conditions of the prenatal and perinatal environment, differed between the Neolithic, mediaeval, and modern populations from the Kujawy region in north-central Poland. The material consisted of longitudinally ground sections of 57 human deciduous incisors obtained from children aged 1.0-7.5 years, representing three archaeological series from the Brześć Kujawski site. All teeth were sectioned in the labio-lingual plane using a diamond blade (Buehler IsoMet 1000). Final specimens were observed with a Delta Optical Evolution 300 microscope at 10× and 40× magnifications. For each tooth, linear measurements of the neonatal line width were performed on its labial surface at three levels from the cemento-enamel junction. No significant difference was found in mean neonatal line width by tooth type or archaeological site, although the thickest neonatal lines characterised children from the Neolithic series. In all analysed series, the neonatal line width varied with the child's age at death. Spearman's rank correlation between the child's age at death and the neonatal line width was statistically significant.
A clear increase in the width of the neonatal line was thus observed along with a decrease in the child's age at death. Copyright © 2015 Elsevier GmbH. All rights reserved.
NASA Technical Reports Server (NTRS)
Begni, G.; BOISSIN; Desachy, M. J.; PERBOS
1984-01-01
The geometric accuracy of LANDSAT TM raw data of Toulouse (France), raw data of Mississippi, and preprocessed data of Mississippi was examined using a CDC computer. Analog images were restituted on the VIZIR SEP device. The methods used for line-to-line and band-to-band registration are based on automatic correlation techniques and are widely used in automated image-to-image registration at CNES. Causes of intraband and interband misregistration are identified, and statistics are given for both line-to-line and band-to-band misregistration.
Automating approximate Bayesian computation by local linear regression.
Thornton, Kevin R
2009-07-07
In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: (1) the code is standalone and fully documented; (2) the program will automatically process multiple data sets and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data or the analysis of multiple data sets; (3) the program implements two different transformation methods for the regression step; (4) analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails; (5) the program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and is therefore a general tool for processing the results from any simulation; (6) the code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local linear regression.
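The regression-adjustment step that the package automates can be sketched as: accept the simulations whose summaries fall closest to the observed summaries, fit a local linear regression of parameter on summaries among the accepted draws, and project each accepted draw onto the observed summaries. A minimal sketch of that idea (not ABCreg's actual code, which is in C++ and also supports parameter transformations):

```python
import numpy as np

def abc_reg(theta_sims, stats_sims, stats_obs, tol=0.1):
    """ABC with local linear regression adjustment.
    theta_sims: (n,) simulated parameter draws from the prior
    stats_sims: (n, k) their summary statistics
    stats_obs:  (k,) observed summaries
    tol: fraction of simulations to accept
    Returns the regression-adjusted posterior sample."""
    d = np.linalg.norm(stats_sims - stats_obs, axis=1)
    keep = d <= np.quantile(d, tol)          # accept closest fraction
    S = stats_sims[keep] - stats_obs         # centre on observation
    X = np.column_stack([np.ones(S.shape[0]), S])
    beta, *_ = np.linalg.lstsq(X, theta_sims[keep], rcond=None)
    # project accepted draws onto s = s_obs
    return theta_sims[keep] - S @ beta[1:]
```

The adjustment sharpens the simple rejection sample because each accepted draw is corrected for how far its summaries sit from the observed ones.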
Volcano plots in analyzing differential expressions with mRNA microarrays.
Li, Wentian
2012-12-01
A volcano plot displays an unstandardized signal (e.g. log fold change) against a noise-adjusted/standardized signal (e.g. the t-statistic or -log10(p-value) from the t-test). We review the basic and interactive use of the volcano plot and its crucial role in understanding the regularized t-statistic. The joint filtering gene selection criterion based on regularized statistics has a curved discriminant line in the volcano plot, as compared to the two perpendicular lines of the "double filtering" criterion. This review attempts to provide a unifying framework for discussions of alternative measures of differential expression, improved methods for estimating variance, and the visual display of microarray analysis results. We also discuss the possibility of applying volcano plots to fields beyond microarrays.
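The two axes and the "double filtering" rule can be sketched directly: the x axis is the raw log fold change, the y axis a t-statistic, and a gene is selected only when it passes both cut-offs, giving the two perpendicular discriminant lines; an offset s0 added to the denominator regularizes the t-statistic. A minimal sketch with hypothetical cut-off values:

```python
import numpy as np

def volcano_select(group_a, group_b, fc_cut=1.0, t_cut=2.0, s0=0.0):
    """Compute volcano-plot axes and a double-filtering selection.
    group_a, group_b: (genes, replicates) matrices, assumed already
    on the log2 scale.  s0 > 0 regularizes the t-statistic.
    Returns (log2 fold change, t-statistic, boolean mask)."""
    ma, mb = group_a.mean(1), group_b.mean(1)
    va, vb = group_a.var(1, ddof=1), group_b.var(1, ddof=1)
    lfc = mb - ma                                  # x axis: raw signal
    se = np.sqrt(va / group_a.shape[1] + vb / group_b.shape[1])
    t = lfc / (se + s0)                            # y axis: standardized
    mask = (np.abs(lfc) > fc_cut) & (np.abs(t) > t_cut)
    return lfc, t, mask
```

Plotting |t| (or -log10 p) against lfc and shading the mask reproduces the familiar volcano shape, with the selected genes in the two upper corners.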
Prediction of ppm level electrical failure by using physical variation analysis
NASA Astrophysics Data System (ADS)
Hou, Hsin-Ming; Kung, Ji-Fu; Hsu, Y.-B.; Yamazaki, Y.; Maruyama, Kotaro; Toyoshima, Yuya; Chen, Chu-en
2016-03-01
The quality of patterns printed on a wafer may be attributed to factors such as process window control, pattern fidelity, overlay performance, and metrology. Each of these factors plays an important role in making the process more effective by ensuring that certain design- and process-specific parameters are kept within acceptable variation. As chip size and pattern density increase, in-line, real-time capture of in-chip weak patterns/defects per million opportunities (WP-DPMO) plays an increasingly significant role in product yield for high-density memory. However, current in-line inspection tools focus on single-layer defect inspection and cannot effectively and efficiently catch multi-layer weak patterns/defects, even with voltage contrast and/or special test structure designs [1]-[2]. In general, multi-layer weak patterns/defects easily escape in-line inspection, so product dysfunction goes unnoticed until time-consuming off-line final PFA/EFA is performed. To monitor potential multi-layer weak patterns in-line and in real time, we quantify the bridge electrical metric between contact and gate electrodes as a CD physical metric via big data from large-field-of-view (FOV: 8k x 16k at a 3 nm pixel, equivalent to a 34 um x 34 um main field) e-beam image contours compared to the layout GDS database (D2DB), as shown in Fig. 1. Hadoop-based distributed parallel computing is implemented to improve the performance of the big data architecture (Fig. 2). The state-of-the-art in-line, real-time capture of in-chip potential multi-layer weak patterns is thereby demonstrated in several case studies [3]. Manufacturing sources of variation can then be partitioned into systematic and random variations by applying statistical techniques on this big data infrastructure.
After big data handling, the in-chip CD and AA variations are distinguished by their spatial correlation distance: local variations (LV) show no correlation, whereas global variations (GV) have a very large correlation distance [7]-[9]. This is the first validation of the spatial distribution derived from the affordable bias-contour big data infrastructure. Statistical techniques are then applied to identify the variation sources. GV arise from systematic issues, which can be compensated by adaptive LT conditions or OPC correction; LV arise from random issues, considered intrinsic problems such as structure, material, or tool capability. In this study, we find that at the advanced technology node, SRAM contact CD local variation (LV) dominates the total variation, at about 70%. It often plays a significant role in the in-line, real-time WP-DPMO capture of product yield loss; the wafer edge is the worst region within the wafer distribution and raises serious reliability concerns. The major root cause of variation is a photoresist-material-induced burr defect (LV); the second is a GV-enhanced wafer-edge short opportunity, attributed to three factors: first, deliberate wafer-edge CD enlargement for yield improvement, as shown in Fig. 10; second, overlay/AA shifts due to tool capability in handling incoming-wafer warpage and the optical-periphery layout-dependent working pitch, as shown in Fig. 9(1); and last, wafer-edge burr enhanced by the larger photoresist (PR) spin centrifugal force at the wafer edge. After implementing KPIs such as the GV-related AA/CD indexes (Figs. 9(1) and 10) and the LV-related burr index (Fig. 11), we construct a parts-per-million (PPM) level short-probability model via multivariable regression, canonical correlation analysis, and a logistic transformation.
The model predicts PPM-level electrical failure from in-line, real-time physical variation analysis. To achieve Total Quality Management (TQM), adaptive Statistical Process Control (SPC) charts can be implemented to catch PPM-level product malfunction in-line and in real time at the manufacturing stage. Applied to early-stage monitoring of, for example, incoming raw material and photoresist (PR), the LV-related burr-KPI SPC charts can be a powerful quality inspection vehicle. In summary, this paper demonstrates state-of-the-art in-line, real-time capture of in-chip potential multi-layer physical weak patterns, effectively and efficiently associated with PPM-level product dysfunction.
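The final modeling step described above, regressing a rare failure outcome on physical KPIs through a logistic transformation, can be sketched as follows. This is a hypothetical illustration on simulated data: the KPI names, coefficients, and base rate are invented, and the fit uses a plain Newton-Raphson (IRLS) logistic regression rather than the paper's full pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical KPI values per die (names are illustrative only):
cd_lv = rng.normal(0, 1, n)     # local CD-variation index
burr  = rng.normal(0, 1, n)     # PR-burr index
aa_gv = rng.normal(0, 1, n)     # global overlay/AA index
X = np.column_stack([cd_lv, burr, aa_gv])

# Assumed "true" short probability in a rare-event regime.
logit = -6 + 0.8*cd_lv + 1.2*burr + 0.5*aa_gv
y = (rng.random(n) < 1/(1 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, iters=30):
    """Maximum-likelihood logistic regression via Newton-Raphson (IRLS)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1/(1 + np.exp(-Xb @ w))
        H = Xb.T @ (Xb * (p*(1-p))[:, None]) + 1e-8*np.eye(Xb.shape[1])
        w = w + np.linalg.solve(H, Xb.T @ (y - p))
    return w

w = fit_logistic(X, y)

def ppm(x):
    """Predicted short probability in parts per million for KPI vector x."""
    z = w[0] + x @ w[1:]
    return 1e6 / (1 + np.exp(-z))

# Nominal die versus a hypothetical wafer-edge die with elevated KPIs.
print(round(float(ppm(np.zeros(3)))), round(float(ppm(np.array([1.5, 2.0, 1.0])))))
```

The fitted coefficients recover the assumed KPI effects, and the logistic transformation maps any KPI combination to a defect rate on the PPM scale.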
Detection of nonlinear transfer functions by the use of Gaussian statistics
NASA Technical Reports Server (NTRS)
Sheppard, J. G.
1972-01-01
The possibility of using on-line signal statistics to detect electronic equipment nonlinearities is discussed. The results of an investigation using Gaussian statistics are presented, and a nonlinearity test that uses ratios of the moments of a Gaussian random variable is developed and discussed. An outline for further investigation is presented.
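The moment-ratio idea can be demonstrated numerically: for a zero-mean Gaussian, E[x^4]/E[x^2]^2 = 3, and a nonlinearity in the signal path shifts this ratio. A minimal sketch, assuming a simple saturating (clipping) nonlinearity as the fault model:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
x = rng.normal(0, 1, n)          # Gaussian test signal into the device

def moment_ratio(sig):
    """E[s^4] / E[s^2]^2 -- equals 3 for any zero-mean Gaussian."""
    sig = sig - sig.mean()
    return np.mean(sig**4) / np.mean(sig**2)**2

linear_out = 2.5 * x                     # ideal linear channel: ratio stays ~3
clipped_out = np.clip(2.5 * x, -4, 4)    # saturating channel: tails flattened

print(round(moment_ratio(linear_out), 2))
print(round(moment_ratio(clipped_out), 2))
```

Any linear gain leaves the ratio at 3, so a significant departure from 3 flags a nonlinearity without needing a reference waveform.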
NASA Astrophysics Data System (ADS)
Wu, Y.; Shen, B. W.; Cheung, S.
2016-12-01
Recent advances in high-resolution global hurricane simulations and visualizations have collectively suggested the importance of both downscaling and upscaling processes in the formation and intensification of TCs. To reveal multiscale processes from massive volumes of global data spanning multiple years, a scalable Parallel Ensemble Empirical Mode Decomposition (PEEMD) method has been developed for the analysis. In this study, the PEEMD is applied to 10 years (2004-2013) of ERA-Interim global 0.75° resolution reanalysis data to explore the role of downscaling processes in tropical cyclogenesis associated with African Easterly Waves (AEWs). Using the PEEMD, raw data are decomposed into oscillatory Intrinsic Mode Functions (IMFs), which represent atmospheric systems of various length scales, and a trend mode, which represents a non-oscillatory large-scale environmental flow. Among the oscillatory modes, results suggest that the third mode (IMF3) is statistically correlated with TC/AEW-scale systems; IMF3 and the trend mode are therefore analyzed in detail. Our 10-year analysis shows that more than 50% of AEW-associated hurricanes reveal an association between storm formation and significant downscaling shear transfer from the larger-scale trend mode to the smaller-scale IMF3. Future work will apply the PEEMD to higher-resolution datasets to explore the role of the upscaling processes provided by the convection (or TC) in the development of the TC (or AEW). Figure caption: The tendency of horizontal wind shear for the total winds (black line), IMF3 (blue line), and trend mode (red line), and SLP (black dotted line) along the storm track of Helene (2006).
Stimulated Electronic X-Ray Raman Scattering
NASA Astrophysics Data System (ADS)
Weninger, Clemens; Purvis, Michael; Ryan, Duncan; London, Richard A.; Bozek, John D.; Bostedt, Christoph; Graf, Alexander; Brown, Gregory; Rocca, Jorge J.; Rohringer, Nina
2013-12-01
We demonstrate strong stimulated inelastic x-ray scattering by resonantly exciting a dense gas target of neon with femtosecond, high-intensity x-ray pulses from an x-ray free-electron laser (XFEL). A small number of lower energy XFEL seed photons drive an avalanche of stimulated resonant inelastic x-ray scattering processes that amplify the Raman scattering signal by several orders of magnitude until it reaches saturation. Despite the large overall spectral width, the internal spiky structure of the XFEL spectrum determines the energy resolution of the scattering process in a statistical sense. This is demonstrated by observing a stochastic line shift of the inelastically scattered x-ray radiation. In conjunction with statistical methods, XFELs can be used for stimulated resonant inelastic x-ray scattering, with spectral resolution smaller than the natural width of the core-excited, intermediate state.
Automatic parquet block sorting using real-time spectral classification
NASA Astrophysics Data System (ADS)
Astrom, Anders; Astrand, Erik; Johansson, Magnus
1999-03-01
This paper presents a real-time spectral classification system based on the PGP spectrograph and a smart image sensor. The PGP is a spectrograph which extracts the spectral information from a scene and projects the information on an image sensor, which is a method often referred to as Imaging Spectroscopy. The classification is based on linear models and categorizes a number of pixels along a line. Previous systems adopting this method have used standard sensors, which often resulted in poor performance. The new system, however, is based on a patented near-sensor classification method, which exploits analogue features on the smart image sensor. The method reduces the enormous amount of data to be processed at an early stage, thus making true real-time spectral classification possible. The system has been evaluated on hardwood parquet boards showing very good results. The color defects considered in the experiments were blue stain, white sapwood, yellow decay and red decay. In addition to these four defect classes, a reference class was used to indicate correct surface color. The system calculates a statistical measure for each parquet block, giving the pixel defect percentage. The patented method makes it possible to run at very high speeds with a high spectral discrimination ability. Using a powerful illuminator, the system can run with a line frequency exceeding 2000 line/s. This opens up the possibility to maintain high production speed and still measure with good resolution.
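The linear per-pixel classification described above can be sketched in software. This is a hedged illustration only: the reference spectra, class names, and noise level are invented stand-ins for the system's calibrated classes, and the decision rule is a generic maximum-correlation linear classifier rather than the patented near-sensor implementation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical reference spectra (32 spectral bands per pixel).
bands = np.linspace(0, 1, 32)
classes = {
    "sound_wood":    np.exp(-((bands - 0.70) ** 2) / 0.05),
    "blue_stain":    np.exp(-((bands - 0.45) ** 2) / 0.05),
    "white_sapwood": np.full(32, 0.8),
}
names = list(classes)
R = np.array([classes[c] / np.linalg.norm(classes[c]) for c in names])

def classify_line(pixels):
    """Assign each pixel spectrum to the best-matching reference
    (maximum normalized correlation -- a linear decision rule)."""
    P = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    return np.argmax(P @ R.T, axis=1)

# One scan line of 512 pixels drawn from the 'blue_stain' class plus noise.
line = classes["blue_stain"] + rng.normal(0, 0.05, (512, 32))
labels = classify_line(line)
defect_pct = 100 * float(np.mean(labels != names.index("sound_wood")))
print(names[labels[0]], round(defect_pct, 1))
```

Accumulating `defect_pct` over all scan lines of a block yields the per-block pixel-defect percentage the system reports.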
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, sustained time-series record. Because the time-varying Signal-to-Noise Ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through Change Point Detection (CPD) on the seismic record. In this method, an anomaly signal (seismic event) is treated as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event point changes when the seismic event occurs. In other words, SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter that down-weights older data, which makes it robust for non-stationary signal processing.
With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
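The change-point scoring idea behind SDAR can be sketched compactly. This is a simplified, first-order version written for illustration (the discounting parameter, the synthetic event, and the reduction to AR(1) are assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic record: unit-variance noise with a damped oscillatory
# "event" (sharp onset) injected at sample 600.
n, t0 = 1000, 600
x = rng.normal(0, 1, n)
k = np.arange(60)
x[t0:t0+60] += 6.0 * np.exp(-k/15) * np.cos(0.6*k)

def sdar_scores(x, r=0.02):
    """Simplified SDAR(1): discounted online estimates of the mean,
    AR(1) coefficient, and innovation variance; the anomaly score is
    -log p(x_t | past) under the current model."""
    mu, c0, c1, var = 0.0, 1.0, 0.0, 1.0
    scores = np.zeros(len(x))
    for t in range(1, len(x)):
        a = c1 / max(c0, 1e-12)               # AR(1) coefficient
        pred = mu + a * (x[t-1] - mu)
        err = x[t] - pred
        scores[t] = 0.5*np.log(2*np.pi*var) + err**2 / (2*var)
        # Discounted updates: older data are down-weighted by (1 - r).
        mu = (1-r)*mu + r*x[t]
        c0 = (1-r)*c0 + r*(x[t]-mu)**2
        c1 = (1-r)*c1 + r*(x[t]-mu)*(x[t-1]-mu)
        var = (1-r)*var + r*err**2
    return scores

scores = sdar_scores(x)
peak = int(np.argmax(scores))
print("change point detected near sample", peak)
```

Because the score uses only model likelihood, not waveform amplitude or polarization, it illustrates the anti-noise and real-time properties claimed above; the discounting rate `r` controls how quickly the model forgets older statistics.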
Preliminary observations of the SELENE Gamma Ray Spectrometer
NASA Astrophysics Data System (ADS)
Forni, O.; Diez, B.; Gasnault, O.; Munoz, B.; D'Uston, C.; Reedy, R. C.; Hasebe, N.
2008-09-01
Introduction: We analyze the spectra measured by the Gamma Ray Spectrometer (GRS) on board the SELENE satellite [1]. SELENE was inserted into lunar orbit on 4 Oct. 2007. After passing a health check and a function check, the GRS was shifted to nominal observation on 21 Dec. 2007. The spectra consist of various lines of interest (O, Mg, Al, Si, Ti, Ca, Fe, K, Th, U, and possibly H) superposed on a continuum. The energies of the gamma rays identify the nuclides responsible for the gamma ray emission, and their intensities relate to their abundances. Data collected through 17 Feb. 2008 are studied here, corresponding to an accumulation time (Fig. 1) sufficiently long to allow preliminary mapping. Analysis of the global gamma ray spectrum: In order to obtain spectra with counting statistics sufficient for peak analysis, we accumulate all observations. The identification of lines is performed on this global lunar spectrum (Fig. 2). Fit of individual lines: The gamma ray lines that arise from the decay of long-lived radioactive species are among the easiest to analyze. So far the abundances of two species have been studied thanks to such lines: potassium (1461 keV) and thorium (2614 keV). Secondary neutrons from cosmic ray interactions also produce gamma rays when reacting with the planetary material, through scattering or absorption reactions. However, these lines need substantial corrections before an interpretation in terms of abundance can be performed. Lines have been examined with different techniques. The simplest method consists of summing the spectra in a window containing the line of interest; the continuum is adjusted with a polynomial and removed. Such a method was used for the gamma ray spectra collected by Lunar Prospector [2]. This method is especially robust for isolated lines, such as those of K and Th mentioned above, or for lines with very low statistics.
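The window-sum method just described, summing counts in a window around the line and subtracting a polynomial continuum, can be sketched on a synthetic spectrum. All numbers here (bin width, continuum shape, line amplitude) are invented for illustration, with only the 1461 keV K line energy taken from the text:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic spectrum: smooth continuum plus a gamma line at 1461 keV (K).
e = np.arange(1200, 1700, 2.0)                 # energy bins, keV
continuum = 500 * np.exp(-e / 800)
line = 300 * np.exp(-0.5 * ((e - 1461) / 8) ** 2)
counts = rng.poisson(continuum + line).astype(float)

# Window around the line; fit the continuum on the flanks only, where
# the line contributes essentially nothing, then subtract it.
win = (e > 1411) & (e < 1511)
flank = win & (np.abs(e - 1461) > 35)
coef = np.polyfit(e[flank], counts[flank], 2)
net = counts[win] - np.polyval(coef, e[win])

# Net line area: sum of continuum-subtracted counts near the line.
area = net[np.abs(e[win] - 1461) <= 35].sum()
print(round(float(area)))
```

The recovered area scales with the line intensity, which is what is converted to elemental abundance in the mapping step.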
The second method consists of fitting the lines by summing a quadratic continuum with Gaussian lines and exponential tails. We presently fit the spectra using a program developed at CESR, Aquarius. Afterwards, the areas associated with the parameters of these ideal lines are calculated. This method is well-adapted for interfering lines, such as U, Al, and H around 2210 keV, but it requires good statistics. These two methods were used to analyze the Mars Odyssey gamma-ray spectra [3]. Prettyman et al. [4] applied a third method in which theoretical spectra are simulated and matched against the observations. Below we propose a fourth approach based on statistical analyses. Mapping of elemental abundances: Data returned by the spacecraft are time-tagged records acquired with a resolution of 17 seconds. The angular distance covered by the spacecraft during this interval corresponds to about 1° at the surface. However, the true resolution of the instrument is lower because gamma rays come from all directions onto the spacecraft. The resolution is therefore set by the field of view of the instrument, which depends on the spacecraft altitude and the geometry of the instrument. The full width at half maximum of the instrumental response has been estimated by the SELENE GRS team to be 130 km at 1 MeV. We have tiled the data in agreement with the best resolution we could obtain, depending on the intensity of a given line. The thorium line at 2614 keV was thus mapped at a resolution of 3° with the first method described above (sum over 2550-2640 keV). This map was then smoothed with a 5° filter (152 km radius) to approximate the response function of the instrument. Finally, the counting rate was converted into abundance (Fig. 3), using the compositions at landing sites and in the highlands, as did Gillis et al. [5]. Statistical analysis: We have also analysed the data with various multivariate techniques, one of them being Independent Component Analysis (ICA) [6, 7].
ICA defines a generative model for the observed multivariate data, which is typically given as a large database of samples. In the model, the data variables are assumed to be linear mixtures of some unknown latent variables, and the mixing system is also unknown. The latent variables are assumed non-Gaussian and mutually independent, and they are called the independent components of the observed data. These independent components, also called sources or factors, can be found by ICA. This is done by maximising a non-Gaussianity criterion of the sources. As in [8], we have used the JADE algorithm developed and described in [9] for our analysis, which we focused on the energy range from 750 to 3000 keV. We identify at least three meaningful components. The first one is correlated with the thorium map (Fig. 4); the corresponding correlation-coefficient spectrum exhibits the lines of thorium, potassium, and uranium (Fig. 5). The second component (Fig. 6) is clearly correlated with iron, as shown in its corresponding spectrum (Fig. 5). A third component, identified at lower resolution, seems to be partly correlated with the altitude of the spacecraft (not shown). Further improvements in the data reduction, such as corrections for altitude, cosmic ray, and neutron current variations, should allow a better interpretation of the data. Acknowledgement. The SELENE GRS team members are: N. Hasebe, O. Okudaira, N. Yamashita, S. Kobayashi, Y. Karouji, M. Hareyama, S. Kodaira, S. Komatsu, K. Hayatsu, K. Iwabuchi, S. Nemoto, E. Shibamura, M.-N. Kobayashi, R.C. Reedy, K.J. Kim, C. d'Uston, S. Maurice, O. Gasnault, O. Forni, B. Diez. References. [1] Hasebe, N. et al. (2008) Earth, Planets and Space, 60, 299-312. [2] Lawrence, D.J. et al. (1999) Geophys. Res. Lett., 26 (17), 2681-2684. [3] Evans, L.E. et al. (2006) J. Geophys. Res., 111, E03S04. [4] Prettyman, T.H. et al. (2006) J. Geophys. Res., 111, E12007. [5] Gillis, J.J. et al. (2004) Geo. et Cosmo. Acta, 68 (18), 3791-3805.
[6] Comon P. (1994) Signal Processing, 36, 287-314. [7] Hyvärinen, A. and E. Oja (2000) Neural Networks, 13(4-5), 411-430. [8] Forni O. et al. (2005) LPSC, 36, 1623 [9] Cardoso, J.-F. (1997) IEEE Letters on Signal Processing, 4, 112-114.
NASA Astrophysics Data System (ADS)
Protassov, R.; van Dyk, D.; Connors, A.; Kashyap, V.; Siemiginowska, A.
2000-12-01
We examine the x-ray spectrum of the afterglow of GRB 970508, analyzed for Fe line emission by Piro et al. (1999, ApJL, 514, L73). This is a difficult and extremely important measurement: the detection of x-ray afterglows from γ-ray bursts is at best a tricky business, relying on near-real-time satellite response to unpredictable events and a great deal of luck in catching a burst bright enough for a useful spectral analysis. Detecting a clear atomic (or cyclotron) line in the generally smooth and featureless afterglow (or burst) emission not only gives one of the few very specific keys to the physics local to the emission region, but also provides clues about, or confirmation of, its distance (via redshift). Unfortunately, neither the likelihood ratio test nor the related F-statistic commonly used to detect spectral lines adheres to its nominal chi-square or F distribution. Thus we begin by calibrating the F-statistic used in Piro et al. (1999) via a simulation study. The simulation study relies on a completely specified source model, i.e., we perform Monte Carlo simulations with all model parameters fixed (so-called "parametric bootstrapping"). Second, we employ the method of posterior predictive p-values to calibrate an LRT statistic while accounting for the uncertainty in the parameters of the source model. Our analysis reveals evidence for the Fe K line.
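The parametric-bootstrap calibration can be illustrated with a toy line-detection problem. This sketch is not the Piro et al. model: the flat 50-bin Poisson continuum, the single-bin "line" alternative, and the observed statistic value are all invented to show why the nominal chi-square reference distribution understates the p-value when the line location is searched over:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Toy null spectrum: 50 energy bins, flat continuum of 20 expected
# counts per bin, and NO emission line (a fully specified null model).
n_bins, mu0 = 50, 20.0

def lrt_stat(counts):
    """2 * log-likelihood ratio for 'continuum + a line in the single
    best bin' versus 'continuum only' (toy line-detection statistic)."""
    gains = (stats.poisson.logpmf(counts, np.maximum(counts, mu0))
             - stats.poisson.logpmf(counts, mu0))
    return 2.0 * gains.max()

# Parametric bootstrap: simulate the fully specified null many times
# and use the empirical distribution of the statistic for calibration.
sims = np.array([lrt_stat(rng.poisson(mu0, n_bins)) for _ in range(2000)])

t_obs = 9.0                                       # "observed" value, illustrative
p_boot = float(np.mean(sims >= t_obs))            # calibrated p-value
p_chi2 = float(1 - stats.chi2.cdf(t_obs, df=1))   # nominal chi-square p-value
print(round(p_boot, 3), "vs nominal", round(p_chi2, 4))
```

Because the statistic maximizes over all bins, its null distribution is far heavier-tailed than chi-square with one degree of freedom, so the bootstrap p-value is substantially larger than the nominal one, which is precisely the miscalibration the abstract warns about.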
Optimal filtering and Bayesian detection for friction-based diagnostics in machines.
Ray, L R; Townsend, J R; Ramasubramanian, A
2001-01-01
Non-model-based diagnostic methods typically rely on measured signals that must be empirically related to process behavior or incipient faults. The difficulty in interpreting a signal that is indirectly related to the fundamental process behavior is significant. This paper presents an integrated non-model and model-based approach to detecting when process behavior varies from a proposed model. The method, which is based on nonlinear filtering combined with maximum likelihood hypothesis testing, is applicable to dynamic systems whose constitutive model is well known, and whose process inputs are poorly known. Here, the method is applied to friction estimation and diagnosis during motion control in a rotating machine. A nonlinear observer estimates friction torque in a machine from shaft angular position measurements and the known input voltage to the motor. The resulting friction torque estimate can be analyzed directly for statistical abnormalities, or it can be directly compared to friction torque outputs of an applicable friction process model in order to diagnose faults or model variations. Nonlinear estimation of friction torque provides a variable on which to apply diagnostic methods that is directly related to model variations or faults. The method is evaluated experimentally by its ability to detect normal load variations in a closed-loop controlled motor driven inertia with bearing friction and an artificially-induced external line contact. Results show an ability to detect statistically significant changes in friction characteristics induced by normal load variations over a wide range of underlying friction behaviors.
Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F
2013-09-01
The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step, in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria such as the number of PLS factors, R(2), and the root mean square errors of calibration, cross-validation, and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three candidate models. These models were tested in the industrial pilot plant during three technical campaigns. Results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained with respect to the API target content. Absolute biases of 0.01 and 0.02% (w/w), respectively, were achieved at methanol content levels of 0.10 and 0.13% (w/w). The repeatability was assessed as sufficient for the on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale.
Furthermore, the successful scale-up of the method proved its capability to be implemented in the manufacturing plant with the launch of the new API process.
Statistical evaluation of stability data: criteria for change-over-time and data variability.
Bar, Raphael
2003-01-01
In the recently issued ICH Q1E guidance on evaluation of stability data of drug substances and products, the need to perform a statistical extrapolation of a shelf-life for a drug product, or of a retest period for a drug substance, is based heavily on whether the data exhibit change-over-time and/or variability. However, this document suggests neither measures nor acceptance criteria for these two parameters. This paper demonstrates a useful application of simple statistical parameters for determining whether sets of stability data from either accelerated or long-term storage programs exhibit change-over-time and/or variability. These parameters are all derived from a simple linear regression analysis first performed on the stability data. The p-value of the slope of the regression line is taken as a measure of change-over-time, and a value of 0.25 is suggested as a limit for insignificant change of the quantitative stability attributes monitored. The minimal process capability index, Cpk, calculated from the standard deviation about the regression line, is suggested as a measure of variability, with a value of 2.5 as a limit for insignificant variability. The usefulness of these two parameters, the p-value and Cpk, was demonstrated on stability data of a refrigerated drug product and on pooled data of three batches of a drug substance. In both cases, the determined parameters allowed characterization of the data in terms of change-over-time and variability. Consequently, complete evaluation of the stability data could be pursued according to the ICH guidance. It is believed that the application of these two parameters with their acceptance criteria will allow a more unified evaluation of stability data.
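Both proposed parameters, the slope p-value and a Cpk derived from the regression residuals, fall out of a single linear fit. A minimal sketch on invented stability data (the assay values, time points, and 95-105% specification limits are illustrative assumptions):

```python
import numpy as np
from scipy import stats

# Example stability data: assay (% of label claim) over storage months.
months = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
assay  = np.array([99.8, 99.9, 99.6, 99.7, 99.5, 99.7, 99.4])

fit = stats.linregress(months, assay)

# Criterion 1 (change-over-time): slope p-value; > 0.25 would indicate
# insignificant change under the paper's suggested limit.
print("slope p-value:", round(fit.pvalue, 3))

# Criterion 2 (variability): Cpk from the spread about the regression
# line; > 2.5 would indicate insignificant variability.
resid = assay - (fit.intercept + fit.slope * months)
s = np.std(resid, ddof=2)          # residual sd (2 fitted parameters)
lsl, usl = 95.0, 105.0             # assumed specification limits
mean = assay.mean()
cpk = min(usl - mean, mean - lsl) / (3 * s)
print("Cpk:", round(cpk, 1))
```

A data set passing both limits would need no statistical extrapolation beyond the guidance's default treatment; failing either limit triggers the fuller ICH Q1E evaluation.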
High Agreement and High Prevalence: The Paradox of Cohen's Kappa.
Zec, Slavica; Soriani, Nicola; Comoretto, Rosanna; Baldi, Ileana
2017-01-01
Cohen's Kappa is the most widely used agreement statistic in the literature. However, under certain conditions it is affected by a paradox which returns biased estimates of the statistic itself. The aim of this study is to provide sufficient information to allow the reader to make an informed choice of the correct agreement measure, by underlining some optimal properties of Gwet's AC1 in comparison to Cohen's Kappa, using a real data example. During the process of literature review, we asked a panel of three evaluators to judge the quality of 57 randomized controlled trials, assigning a score to each trial using the Jadad scale. The quality was evaluated according to the following dimensions: adopted design, randomization unit, and type of primary endpoint. With respect to each of these features, the agreement between the three evaluators was calculated using Cohen's Kappa statistic and Gwet's AC1 statistic, and the values were compared with the observed agreement. The values of Cohen's Kappa statistic would lead one to believe that the agreement levels for the variables Unit, Design, and Primary Endpoint are totally unsatisfactory. The AC1 statistic, on the contrary, shows plausible values in line with the observed agreement. We conclude that it would always be appropriate to adopt the AC1 statistic, thus bypassing any risk of incurring the paradox and drawing wrong conclusions from agreement analysis.
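The paradox is easy to reproduce numerically for two raters. The table below is an invented high-prevalence example (not the study's 57-trial data): the raters agree on 55 of 57 items, yet Kappa is near zero while AC1 tracks the observed agreement:

```python
import numpy as np

def kappa_and_ac1(table):
    """Cohen's Kappa and Gwet's AC1 from a square two-rater agreement
    table (rows = rater A, columns = rater B)."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    po = np.trace(t) / n                    # observed agreement
    pa, pb = t.sum(1) / n, t.sum(0) / n     # marginal proportions
    pe_k = (pa * pb).sum()                  # chance agreement (Kappa)
    kappa = (po - pe_k) / (1 - pe_k)
    pi = (pa + pb) / 2
    q = len(pi)
    pe_g = (pi * (1 - pi)).sum() / (q - 1)  # chance agreement (AC1)
    ac1 = (po - pe_g) / (1 - pe_g)
    return kappa, ac1

# High agreement, high prevalence: 55/57 items agreed "yes".
table = [[55, 1],
         [1, 0]]
k, a = kappa_and_ac1(table)
print("Kappa:", round(k, 2), " AC1:", round(a, 2))
```

Observed agreement here is 55/57 ≈ 0.96, yet Kappa collapses because its chance-agreement term saturates under extreme prevalence; AC1's chance term does not, which is the behavior the abstract recommends relying on.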
Physical mechanism and numerical simulation of the inception of the lightning upward leader
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Qingmin; Lu Xinchang; Shi Wei
2012-12-15
The upward leader is a key physical process in the leader progression model of lightning shielding. The inception mechanism and criterion of the upward leader need further understanding and clarification. Based on leader discharge theory, this paper proposes the critical electric field intensity of the stable upward leader (CEFISUL) and characterizes it by the valve electric field intensity on the conductor surface, E_L, which is the basis of a new inception criterion for the upward leader. Through numerical simulation under various physical conditions, we verified that E_L is mainly related to the conductor radius, and data fitting yields a mathematical expression for E_L. We further establish a computational model of the lightning shielding performance of transmission lines based on the proposed CEFISUL criterion, which reproduces the shielding failure rate of typical UHV transmission lines. The model-based calculation results agree well with statistical data from on-site operations, showing the effectiveness and validity of the CEFISUL criterion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarocki, John Charles; Zage, David John; Fisher, Andrew N.
LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command-line, web, and application programming interfaces (APIs) for the input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, the ontology, and the derived linkograph. Finally, the tool allows the user to perform statistical measurements on the linkograph and to refine the ontology through direct manipulation of the linkograph.
Acceptance Probability (P a) Analysis for Process Validation Lifecycle Stages.
Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep
2016-04-01
This paper introduces an innovative statistical approach to understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that the underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines indicate that processes with Cpk > 1.33 are performing well within statistical control and those with Cpk < 1.0 are "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
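The link between Cpk and acceptance probability quoted above can be checked directly under the normality assumption. For a centered process, Cpk = 1.33 places the specification limits 4 standard deviations from the mean, which corresponds to roughly 63 out-of-specification units per million (the limits and values below are a generic illustration, not the paper's stage-wise AV computation):

```python
import math

def acceptance_probability(mean, sd, lsl, usl):
    """P(a single normal result falls inside [lsl, usl])."""
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return phi((usl - mean) / sd) - phi((lsl - mean) / sd)

def cpk(mean, sd, lsl, usl):
    return min(usl - mean, mean - lsl) / (3 * sd)

# Centered process with limits at 4 sigma, i.e. Cpk = 1.33.
mean, sd, lsl, usl = 100.0, 1.0, 96.0, 104.0
pa = acceptance_probability(mean, sd, lsl, usl)
print(round(cpk(mean, sd, lsl, usl), 2))   # 1.33
print(round((1 - pa) * 1e6))               # 63 defects per million
```

Stage-wise criteria (e.g., USP dosage-uniformity tiers) replace this single interval with a composite event, but each stage's probability is computed from the same normal-distribution machinery.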
English Collocation Learning through Corpus Data: On-Line Concordance and Statistical Information
ERIC Educational Resources Information Center
Ohtake, Hiroshi; Fujita, Nobuyuki; Kawamoto, Takeshi; Morren, Brian; Ugawa, Yoshihiro; Kaneko, Shuji
2012-01-01
We developed an English Collocations On Demand system offering on-line corpus and concordance information to help Japanese researchers acquire a better command of English collocation patterns. The Life Science Dictionary Corpus consists of approximately 90,000,000 words collected from life science related research papers published in academic…
Walenski, Matthew; Swinney, David
2009-01-01
The central question underlying this study revolves around how children process co-reference relationships—such as those evidenced by pronouns (him) and reflexives (himself)—and how a slowed rate of speech input may critically affect this process. Previous studies of child language processing have demonstrated that typical language developing (TLD) children as young as 4 years of age process co-reference relations in a manner similar to adults on-line. In contrast, off-line measures of pronoun comprehension suggest a developmental delay for pronouns (relative to reflexives). The present study examines dependency relations in TLD children (ages 5–13) and investigates how a slowed rate of speech input affects the unconscious (on-line) and conscious (off-line) parsing of these constructions. For the on-line investigations (using a cross-modal picture priming paradigm), results indicate that at a normal rate of speech TLD children demonstrate adult-like syntactic reflexes. At a slowed rate of speech the typical language developing children displayed a breakdown in automatic syntactic parsing (again, similar to the pattern seen in unimpaired adults). As demonstrated in the literature, our off-line investigations (sentence/picture matching task) revealed that these children performed much better on reflexives than on pronouns at a regular speech rate. However, at the slow speech rate, performance on pronouns was substantially improved, whereas performance on reflexives was not different than at the regular speech rate. We interpret these results in light of a distinction between fast automatic processes (relied upon for on-line processing in real time) and conscious reflective processes (relied upon for off-line processing), such that slowed speech input disrupts the former, yet improves the latter. PMID:19343495
Ordering statistics of four random walkers on a line
NASA Astrophysics Data System (ADS)
Helenbrook, Brian; ben-Avraham, Daniel
2018-05-01
We study the ordering statistics of four random walkers on the line, obtaining a much improved estimate for the long-time decay exponent of the probability that a particle leads to time t, P_lead(t) ~ t^-0.91287850, and that a particle lags to time t (never assumes the lead), P_lag(t) ~ t^-0.30763604. Exponents of several other ordering statistics for N = 4 walkers are obtained to eight-digit accuracy as well. The subtle correlations between n walkers that lag jointly, out of a field of N, are discussed: for N = 3 there are no correlations and P_lead(t) ~ P_lag(t)^2. In contrast, our results rule out the possibility that P_lead(t) ~ P_lag(t)^3 for N = 4, although the correlations in this borderline case are tiny.
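The survival probabilities described above can be probed with a quick Monte Carlo sketch (our own illustration with discrete ±1 lattice steps, not the authors' high-precision method; tie-handling conventions are ours):

```python
import random

def p_lead(n_walkers, t_steps, trials, seed=1):
    """Monte Carlo estimate of the probability that walker 0, starting
    level with the others, never falls behind the front-runner over
    t_steps independent +/-1 lattice steps."""
    rng = random.Random(seed)
    leads = 0
    for _ in range(trials):
        pos = [0] * n_walkers
        in_lead = True
        for _ in range(t_steps):
            for k in range(n_walkers):
                pos[k] += rng.choice((-1, 1))
            if pos[0] < max(pos[1:]):
                in_lead = False
                break
        leads += in_lead
    return leads / trials

# The lead-survival probability falls off as a power law in t.
for t in (25, 50, 100):
    print(t, p_lead(4, t, 2000))
```

With enough trials and larger t, the estimated decay approaches the power-law behaviour quoted in the abstract.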
OSO 8 observational limits to the acoustic coronal heating mechanism
NASA Technical Reports Server (NTRS)
Bruner, E. C., Jr.
1981-01-01
An improved analysis of time-resolved line profiles of the C IV resonance line at 1548 A has been used to test the acoustic wave hypothesis of solar coronal heating. It is shown that the observed motions and brightness fluctuations are consistent with the existence of acoustic waves. Specific account is taken of the effect of photon statistics on the observed velocities, and a test is devised to determine whether the motions represent propagating or evanescent waves. It is found that on the average about as much energy is carried upward as downward, such that the net acoustic flux density is statistically consistent with zero. The statistical uncertainty in this null result is three orders of magnitude lower than the flux level needed to heat the corona.
New Data in the ADS Abstract and Article Service
NASA Astrophysics Data System (ADS)
Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.
1996-05-01
In the last few months the data holdings in the ADS have been considerably expanded. In the abstracts databases we have included over 50,000 abstracts from SPIE conference proceedings (provided by SPIE), a complete set of references for lunar and planetary sciences, and abstracts from recent Lunar and Planetary Institute sponsored conferences (both provided by the Lunar and Planetary Institute). We also extended our cooperation with the CDS in Strasbourg, France by providing a link to the list of objects that are in the SIMBAD database for each reference. The ADS article service now holds full-text articles for 20 years of the Astrophysical Journal Letters, the Astronomical Journal, and the Publications of the Astronomical Society of the Pacific, and 5 years of the Astrophysical Journal on-line. The following journals are being processed and some may be on-line for this AAS meeting: Astrophysical Journal (1975-1989), Astronomy and Astrophysics, Proceedings of the Astronomical Society of Australia, Publications of the Astronomical Society of Japan, Revista Mexicana, Bulletin of the Astronomical Society of India, Obs. Reports of Skalnate Pleso, and Baltic Astronomy. We are now working with two scanning companies to speed up the scanning process and hope that by the end of the year we will have all of these journals completely on-line for at least the period 1975-1995. Usage statistics for Jan - March 1996: 13,823 users; 365,812 queries; 7,953,930 references retrieved; 309,866 full abstracts retrieved.
Genetic parameters of product quality and hepatic metabolism in fattened mule ducks.
Marie-Etancelin, C; Basso, B; Davail, S; Gontier, K; Fernandez, X; Vitezica, Z G; Bastianelli, D; Baéza, E; Bernadet, M-D; Guy, G; Brun, J-M; Legarra, A
2011-03-01
Genetic parameters of traits related to hepatic lipid metabolism, carcass composition, and product quality of overfed mule ducks were estimated on both parental lines of this hybrid: the common duck line for the maternal side and the Muscovy line for the paternal side. The originality of the statistical model was to include simultaneously the additive genetic effect of the common ducks and that of the Muscovy ducks, revealing a greater genetic determinism in common than in Muscovy. Plasma metabolic indicators (glucose, triglyceride, and cholesterol contents) were heritable, in particular at the end of the overfeeding period, and heritabilities increased with the overfeeding stage. Carcass composition traits were highly heritable in the common line, with values ranging from 0.15 for liver weight, 0.21 for carcass weight, and 0.25 for abdominal fat weight to 0.32 for breast muscle weight. Heritabilities of technological outputs were greater for the fatty liver (0.19 and 0.08, respectively, on common and Muscovy sides for liver melting rate) than for the pectoralis major muscle (between 0.02 and 0.05 on both parental sides for cooking losses). Fortunately, the processing industry mainly faces problems in liver quality, such as too high a melting rate, rather than in meat quality. The meat quality appraisal criteria (such as texture and cooking losses), usually dependent on pH and the rate of decline of pH, were also very lowly heritable. This study demonstrated that the genetic determinism of meat quality and of overfeeding ability differs between the common population and the Muscovy population; traits related to fattening, muscle development, and BW have heritability values from 2 to 4 times greater on the common line than on the Muscovy line, which is relevant for considering different selection strategies.
Quesada, Miguel A.; Blanco-Portales, Rosario; Posé, Sara; García-Gago, Juan A.; Jiménez-Bermúdez, Silvia; Muñoz-Serrano, Andrés; Caballero, José L.; Pliego-Alfaro, Fernando; Mercado, José A.; Muñoz-Blanco, Juan
2009-01-01
The strawberry (Fragaria × ananassa ‘Chandler’) fruit undergoes a fast softening during ripening. Polygalacturonase (PG) activity is low during this process, but two ripening-related PG genes, FaPG1 and FaPG2, have been cloned. Both genes were up-regulated during fruit ripening and were also negatively regulated by auxin. To further assess the role of FaPG1 on strawberry softening, transgenic plants containing an antisense sequence of this gene under the control of the 35S promoter (APG lines) were obtained. Sixteen out of 30 independent transgenic lines showed fruit yields similar to those of the control. Several quality parameters were measured in ripe fruits from these 16 lines. Fruit weight was slightly reduced in four lines, and most of them showed an increase in soluble solid content. Half of these lines yielded fruits significantly firmer than did the control. Four APG lines were selected, their ripened fruits being on average 163% firmer than the control. The postharvest softening of APG fruits was also diminished. Ripened fruits from the four selected lines showed a 90% to 95% decrease in FaPG1 transcript abundance, whereas the level of FaPG2 was not significantly altered. Total PG activity was reduced in three of these lines when compared with control fruits. Cell wall extracts from APG fruits showed a reduction in pectin solubilization and an increase in pectins covalently bound to the cell wall. A comparative transcriptomic analysis of gene expression between the ripened receptacle of the control and those of the APG fruits (comprising 1,250 receptacle expressed sequence tags) did not show any statistically significant change. These results indicate that FaPG1 plays a central role in strawberry softening. PMID:19395408
Statistics and Intelligence in Developing Countries: A Note.
Kodila-Tedika, Oasis; Asongu, Simplice A; Azia-Dimbu, Florentin
2017-05-01
The purpose of this study is to assess the relationship between intelligence (or human capital) and the statistical capacity of developing countries. The line of inquiry is motivated essentially by the scarce literature on poor statistics in developing countries and an evolving stream of literature on the knowledge economy. A positive association is established between intelligence quotient (IQ) and statistical capacity. The relationship is robust to alternative specifications with varying conditioning information sets and control for outliers. Policy implications are discussed.
NASA Astrophysics Data System (ADS)
Zhang, Wenjun; Deng, Weibing; Li, Wei
2018-07-01
Node properties and node importance identification in networks have been extensively studied in recent decades. In this work, we analyze the properties of links by taking the Worldwide Marine Transport Network (WMTN) as an example, i.e., statistical properties of the shipping lines of the WMTN have been investigated in various aspects. Firstly, we study the feature of loops in the shipping lines by defining the line saturability. It is found that the line saturability decays exponentially with the increase of line length. Secondly, to detect the geographical community structure of shipping lines, the Label Propagation Algorithm with compression of Flow (LPAF) and the Multi-Dimensional Scaling (MDS) method are employed, which show rather consistent communities. Lastly, to analyze the redundancy property of shipping lines of different marine companies, multilayer networks are constructed by aggregating the shipping lines of different marine companies. It is observed that topological quantities, such as average degree and average clustering coefficient, increase smoothly when marine companies are randomly merged (randomly choose two marine companies, then merge their shipping lines together), while the relative entropy decreases when the merging sequence is determined by the Jensen-Shannon distance (choose the two marine companies whose Jensen-Shannon distance is lowest). This indicates the low redundancy of shipping lines among different marine companies.
NASA Astrophysics Data System (ADS)
Maneechote, T.; Luangpaiboon, P.
2010-10-01
A manufacturing process of erbium doped fibre amplifiers is complicated. It needs to meet the customers' requirements under present economic conditions, in which products must be shipped to customers as soon as possible after purchase orders are placed. This research aims to study and improve processes and production lines of erbium doped fibre amplifiers using lean manufacturing systems via an application of computer simulation. Three scenarios of lean tool box systems are selected via the expert system. Firstly, the production schedule based on shipment date is combined with a first-in-first-out control system. The second scenario focuses on a designed flow process plant layout. Finally, the previous flow process plant layout is combined with the production schedule based on shipment date, including the first-in-first-out control system. Computer simulation, using expected values where data are limited, is used to observe the performance of all scenarios. The most preferable lean tool box systems resulting from the computer simulation are selected for implementation in the real production process of erbium doped fibre amplifiers. A comparison is carried out to determine the actual performance measures via an analysis of variance of the response, the production time per unit achieved in each scenario. The adequacy of the linear statistical model is checked via the experimental errors or residuals to verify the normality, constant variance and independence of the residuals. The results show that a hybrid scenario of a lean manufacturing system with first-in-first-out control and a flow process plant layout statistically leads to better performance in terms of the mean and variance of production times.
Evaluation of a Fully 3-D BPF Method for Small Animal PET Images on MIMD Architectures
NASA Astrophysics Data System (ADS)
Bevilacqua, A.
Positron Emission Tomography (PET) images can be reconstructed using Fourier transform methods. This paper describes the performance of a fully 3-D Backprojection-Then-Filter (BPF) algorithm on the Cray T3E machine and on a cluster of workstations. PET reconstruction of small animals is a class of problems characterized by poor counting statistics. The low-count nature of these studies necessitates 3-D reconstruction in order to improve the sensitivity of the PET system: by including axially oblique Lines Of Response (LORs), the sensitivity of the system can be significantly improved by the 3-D acquisition and reconstruction. The BPF method is widely used in clinical studies because of its speed and easy implementation. Moreover, the BPF method is suitable for on-line 3-D reconstruction as it does not need any sinogram or rearranged data. In order to investigate the possibility of on-line processing, we reconstruct a phantom using the data stored in the list-mode format by the data acquisition system. We show how the intrinsically parallel nature of the BPF method makes it suitable for on-line reconstruction on a MIMD system such as the Cray T3E. Lastly, we analyze the performance of this algorithm on a cluster of workstations.
Puchades, R.; Maquieira, A.; Atienza, J.; Herrero, M. A.
1990-01-01
Flow injection analysis (FIA) has emerged as an increasingly used laboratory tool in chemical analysis. Employment of the technique for on-line sample treatment and on-line measurement in chemical process control is a growing trend. This article reviews the recent applications of FIA. Most papers refer to on-line sample treatment. Although FIA is very well suited to continuous on-line process monitoring, few examples have been found in this area; most of them have been applied to water treatment or fermentation processes. PMID:18925271
Kwak, Sehyun; Svensson, J; Brix, M; Ghim, Y-C
2016-02-01
A Bayesian model of the emission spectrum of the JET lithium beam has been developed to infer the intensity of the Li I (2p-2s) line radiation and associated uncertainties. The detected spectrum for each channel of the lithium beam emission spectroscopy system is here modelled by a single Li line modified by an instrumental function, Bremsstrahlung background, instrumental offset, and interference filter curve. Both the instrumental function and the interference filter curve are modelled with non-parametric Gaussian processes. All free parameters of the model, the intensities of the Li line, Bremsstrahlung background, and instrumental offset, are inferred using Bayesian probability theory with a Gaussian likelihood for photon statistics and electronic background noise. The prior distributions of the free parameters are chosen as Gaussians. Given these assumptions, the intensity of the Li line and corresponding uncertainties are analytically available using a Bayesian linear inversion technique. The proposed approach makes it possible to extract the intensity of Li line without doing a separate background subtraction through modulation of the Li beam.
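The closed-form step described above — a Gaussian prior combined with a Gaussian likelihood for a linear forward model yields an analytic posterior — can be sketched on a toy spectral model (all shapes, sizes, and numbers below are illustrative assumptions, not the JET model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: spectrum = a * line shape + b * slope + c * offset.
x = np.linspace(-1.0, 1.0, 50)
line = np.exp(-0.5 * (x / 0.1) ** 2)             # assumed line shape
A = np.column_stack([line, x, np.ones_like(x)])  # design matrix
theta_true = np.array([3.0, 0.8, 0.2])           # intensity, slope, offset
sigma = 0.05                                     # assumed noise level
y = A @ theta_true + rng.normal(0.0, sigma, x.size)

# Gaussian prior N(mu_p, S_p) and Gaussian likelihood N(A theta, sigma^2 I)
# give the posterior mean and covariance in closed form.
mu_p = np.zeros(3)
S_p_inv = np.eye(3) / 10.0 ** 2                  # broad prior
S_n_inv = np.eye(x.size) / sigma ** 2
S_post = np.linalg.inv(A.T @ S_n_inv @ A + S_p_inv)
theta_post = S_post @ (A.T @ S_n_inv @ y + S_p_inv @ mu_p)

print(theta_post)                 # close to theta_true
print(np.sqrt(np.diag(S_post)))  # posterior standard deviations
```

The same linear-Gaussian structure is what makes the line intensity and its uncertainty "analytically available" without a separate background subtraction: the background terms are simply additional columns of the design matrix.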
Asquith, William H.; Vrabel, Joseph; Roussel, Meghan C.
2007-01-01
Analysts and managers of surface-water resources might have interest in selected statistics of daily mean streamflow for U.S. Geological Survey (USGS) streamflow-gaging stations in Texas. The selected statistics are the annual mean, maximum, minimum, and L-scale of daily mean streamflow. Annual L-scale of streamflow is a robust measure of the variability of the daily mean streamflow for a given year. The USGS, in cooperation with the Texas Commission on Environmental Quality, initiated in 2006 a data and reporting process to generate annual statistics for 712 USGS streamflow-gaging stations in Texas. A graphical depiction of the history of the annual statistics for most active and inactive, continuous-record gaging stations in Texas provides valuable information by conveying the historical perspective of streamflow for the watershed. Each figure consists of four time-series plots of the annual statistics of daily mean streamflow for each streamflow-gaging station. Each of the four plots is augmented with horizontal lines that depict the mean and median annual values of the corresponding statistic for the period of record. Monotonic trends for each of the four annual statistics also are identified using Kendall's tau. The history of one or more streamflow-gaging stations could be used in a watershed, river basin, or other regional context by analysts and managers of surface-water resources to guide scientific, regulatory, or other inquiries of streamflow conditions in Texas.
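The L-scale mentioned above is the second sample L-moment: half the expected absolute difference between two randomly drawn observations, which makes it a robust analogue of the standard deviation. A minimal sketch of the standard order-statistics estimator (not USGS code):

```python
def l_scale(values):
    """Sample L-scale (second L-moment), computed from order
    statistics; equals half the mean pairwise absolute difference."""
    x = sorted(values)
    n = len(x)
    b0 = sum(x) / n                                   # lambda-1 (mean)
    # b1 = (1/n) * sum over sorted x of ((rank-1)/(n-1)) * x
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    return 2.0 * b1 - b0                              # lambda-2

# Pairwise |differences| of 1,2,3,4 average 5/3, so the L-scale is 5/6.
print(l_scale([1, 2, 3, 4]))
```

Applied to a year of daily mean streamflow values, this gives the annual L-scale plotted in the report's time-series figures.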
NASA Technical Reports Server (NTRS)
Cole, H. A., Jr.
1973-01-01
Random decrement signatures of structures vibrating in a random environment are studied through use of computer-generated and experimental data. Statistical properties obtained indicate that these signatures are stable in form and scale and hence should have wide application in on-line failure detection and damping measurement. On-line procedures are described and equations for estimating record-length requirements to obtain signatures of a prescribed precision are given.
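The random decrement technique averages the response segments that follow each threshold crossing, so the random excitation cancels and the structure's decay-like signature remains. A minimal sketch (trigger convention, signal, and parameters are our own illustrative choices):

```python
import math
import random

def random_decrement(signal, threshold, seg_len):
    """Average all segments beginning where the signal up-crosses
    `threshold`; the random content averages toward zero, leaving a
    free-decay-like signature of the structure."""
    segments = [signal[i:i + seg_len]
                for i in range(1, len(signal) - seg_len)
                if signal[i - 1] < threshold <= signal[i]]
    if not segments:
        return []
    return [sum(col) / len(segments) for col in zip(*segments)]

# Noisy sinusoid standing in for a structural response measured in a
# random environment.
random.seed(0)
sig = [math.sin(0.3 * i) + random.gauss(0.0, 0.3) for i in range(5000)]
signature = random_decrement(sig, threshold=1.0, seg_len=40)
```

By construction every averaged segment starts at or above the trigger level; for a lightly damped structure the averaged tail approximates the free-decay response, from which damping can be read off.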
49 CFR 1248.4 - Originating and connecting line traffic.
Code of Federal Regulations, 2014 CFR
2014-10-01
... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...
49 CFR 1248.4 - Originating and connecting line traffic.
Code of Federal Regulations, 2010 CFR
2010-10-01
... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...
49 CFR 1248.4 - Originating and connecting line traffic.
Code of Federal Regulations, 2012 CFR
2012-10-01
... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...
49 CFR 1248.4 - Originating and connecting line traffic.
Code of Federal Regulations, 2013 CFR
2013-10-01
... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...
49 CFR 1248.4 - Originating and connecting line traffic.
Code of Federal Regulations, 2011 CFR
2011-10-01
... TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS FREIGHT COMMODITY STATISTICS § 1248.4 Originating and connecting line traffic. (a) Revenue freight reported as received from... or indirectly, so far as apparent from information on the waybills or abstracts. (b) Revenue freight...
A Performance-Based Comparison of Object-Oriented Simulation Tools
1992-04-01
simulation" [Belanger 90a, 90b]. CACI Products Company markets MODSIM II as the commercial version of ModSim, which was created on a US Army contract...
Statistical modeling of SRAM yield performance and circuit variability
NASA Astrophysics Data System (ADS)
Cheng, Qi; Chen, Yijian
2015-03-01
In this paper, we develop statistical models to investigate SRAM yield performance and circuit variability in the presence of a self-aligned multiple patterning (SAMP) process. It is assumed that SRAM fins are fabricated by a positive-tone (spacer-is-line) self-aligned sextuple patterning (SASP) process which accommodates two types of spacers, while gates are fabricated by a more pitch-relaxed self-aligned quadruple patterning (SAQP) process which only allows one type of spacer. A number of possible inverter and SRAM structures are identified and the related circuit multi-modality is studied using the developed failure-probability and yield models. It is shown that SRAM circuit yield is significantly impacted by the multi-modality of fins' spatial variations in a SRAM cell. The sensitivity of 6-transistor SRAM read/write failure probability to SASP process variations is calculated and the specific circuit type with the highest probability to fail in the reading/writing operation is identified. Our study suggests that the 6-transistor SRAM configuration may not be scalable to 7-nm half pitch and more robust SRAM circuit design needs to be researched.
Neutrino flux prediction at MiniBooNE
NASA Astrophysics Data System (ADS)
Aguilar-Arevalo, A. A.; Anderson, C. E.; Bazarko, A. O.; Brice, S. J.; Brown, B. C.; Bugel, L.; Cao, J.; Coney, L.; Conrad, J. M.; Cox, D. C.; Curioni, A.; Djurcic, Z.; Finley, D. A.; Fleming, B. T.; Ford, R.; Garcia, F. G.; Garvey, G. T.; Green, C.; Green, J. A.; Hart, T. L.; Hawker, E.; Imlay, R.; Johnson, R. A.; Karagiorgi, G.; Kasper, P.; Katori, T.; Kobilarcik, T.; Kourbanis, I.; Koutsoliotas, S.; Laird, E. M.; Linden, S. K.; Link, J. M.; Liu, Y.; Liu, Y.; Louis, W. C.; Mahn, K. B. M.; Marsh, W.; Martin, P. S.; McGregor, G.; Metcalf, W.; Meyers, P. D.; Mills, F.; Mills, G. B.; Monroe, J.; Moore, C. D.; Nelson, R. H.; Nguyen, V. T.; Nienaber, P.; Nowak, J. A.; Ouedraogo, S.; Patterson, R. B.; Perevalov, D.; Polly, C. C.; Prebys, E.; Raaf, J. L.; Ray, H.; Roe, B. P.; Russell, A. D.; Sandberg, V.; Schirato, R.; Schmitz, D.; Shaevitz, M. H.; Shoemaker, F. C.; Smith, D.; Soderberg, M.; Sorel, M.; Spentzouris, P.; Stancu, I.; Stefanski, R. J.; Sung, M.; Tanaka, H. A.; Tayloe, R.; Tzanov, M.; van de Water, R.; Wascko, M. O.; White, D. H.; Wilking, M. J.; Yang, H. J.; Zeller, G. P.; Zimmerman, E. D.
2009-04-01
The booster neutrino experiment (MiniBooNE) searches for νμ→νe oscillations using the O(1 GeV) neutrino beam produced by the booster synchrotron at the Fermi National Accelerator Laboratory. The booster delivers protons with 8 GeV kinetic energy (8.89 GeV/c momentum) to a beryllium target, producing neutrinos from the decay of secondary particles in the beam line. We describe the Monte Carlo simulation methods used to estimate the flux of neutrinos from the beam line incident on the MiniBooNE detector for both polarities of the focusing horn. The simulation uses the Geant4 framework for propagating particles, accounting for electromagnetic processes and hadronic interactions in the beam line materials, as well as the decay of particles. The absolute double differential cross sections of pion and kaon production in the simulation have been tuned to match external measurements, as have the hadronic cross sections for nucleons and pions. The statistical precision of the flux predictions is enhanced through reweighting and resampling techniques. Systematic errors in the flux estimation have been determined by varying parameters within their uncertainties, accounting for correlations where appropriate.
Use of a remote computer terminal during field checking of Landsat digital maps
Robinove, Charles J.; Hutchinson, C.F.
1978-01-01
Field checking of small-scale land classification maps made digitally from Landsat data is facilitated by use of a remote portable teletypewriter terminal linked by telephone to the IDIMS (Interactive Digital Image Manipulation System) at the EDC (EROS Data Center), Sioux Falls, S. Dak. When field checking of maps 20 miles northeast of Baker, Calif., during the day showed that changes in classification were needed, the terminal was used at night to combine image statistical files, remap portions of images, and produce new alphanumeric maps for field checking during the next day. The alphanumeric maps can be used without serious difficulty in location in the field even though the scale is distorted, and statistical files created during the field check can be used for full image classification and map output at the EDC. This process makes field checking faster than normal, provides interaction with the statistical data while in the field, and reduces to a minimum the number of trips needed to work interactively with the IDIMS at the EDC, thus saving significant amounts of time and money. The only significant problem is using telephone lines which at times create spurious characters in the printout or prevent the line feed (paper advance) signal from reaching the terminal, thus overprinting lines which should be sequential. We recommend that maps for field checking be made with more spectral classes than are expected because in the field it is much easier to group classes than to reclassify or separate classes when only the remote terminal is available for display.
Application of spatial technology in malaria research & control: some new insights.
Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P
2009-08-01
Geographical Information System (GIS) has emerged as the core of the spatial technology which integrates a wide range of datasets available from different sources including Remote Sensing (RS) and Global Positioning System (GPS). Literature published during the decade (1998-2007) has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules like spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation model (DEM), buffer zone and geo-statistical analysis have been investigated in detail, illustrated with examples of the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and examples drawn have shown that GIS is now widely used for research and decision making in malaria control. Statistical data analysis currently is the most consistent and established set of tools to analyze spatial datasets. The desired future development of GIS is in line with the utilization of geo-statistical tools which, combined with high quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.
The Search for Solar Gravity-Mode Oscillations: an Analysis Using ULYSSES Magnetic Field Data
NASA Astrophysics Data System (ADS)
Denison, David G. T.; Walden, Andrew T.
1999-04-01
In 1995 Thomson, Maclennan, and Lanzerotti (TML) reported on work where they carried out a time-series analysis of energetic particle fluxes measured by Ulysses and Voyager 2 and concluded that solar g-mode oscillations had been detected. The approach is based on finding significant peaks in spectra using a statistical F-test. Using three sets of 2048 hourly averages of Ulysses magnetic field magnitude data, and the same multitaper spectral estimation techniques, we obtain, on average, nine coincidences with the lines listed in the TML paper. We could not reject the hypothesis that the F-test peaks we obtained are uniformly distributed, and further statistical computations show that a sequence of uniformly distributed lines generated on the frequency grid would have, on average, nine coincidences with the lines of TML. Further, we find that a time series generated from a model with a smooth spectrum of the same form as derived from the Ulysses magnetic field magnitude data and having no true spectral lines above 2 μHz, when subjected to the multitaper F-tests, gives rise to essentially the same number of ``identified'' lines and coincident frequencies as found with our Ulysses data. We conclude that our average nine coincidences with the lines found by TML can arise by mechanisms wholly unconnected with the existence of real physical spectral lines and hence find no firm evidence that g-modes can be detected in our sample of magnetic field data.
External Reporting Lines of Academic Special Libraries: A Health Sciences Case Study
ERIC Educational Resources Information Center
Buhler, Amy G.; Ferree, Nita; Cataldo, Tara T.; Tennant, Michele R.
2010-01-01
Very little literature exists on the nature of external reporting lines and funding structures of academic special libraries. This study focuses on academic health sciences libraries. The authors analyze information gathered from statistics published by the Association of Academic Health Sciences Libraries (AAHSL) from 1977 through 2007; an…
How Many Is a Zillion? Sources of Number Distortion
ERIC Educational Resources Information Center
Rips, Lance J.
2013-01-01
When young children attempt to locate the positions of numerals on a number line, the positions are often logarithmically rather than linearly distributed. This finding has been taken as evidence that the children represent numbers on a mental number line that is logarithmically calibrated. This article reports a statistical simulation showing…
NASA Astrophysics Data System (ADS)
Bocz, Péter; Vinkó, Ákos; Posgay, Zoltán
2018-03-01
This paper presents an automatic method for detecting vertical track irregularities on tramway operation using acceleration measurements on trams. For monitoring of tramway tracks, an unconventional measurement setup is developed, which records the data of 3-axes wireless accelerometers mounted on wheel discs. Accelerations are processed to obtain the vertical track irregularities to determine whether the track needs to be repaired. The automatic detection algorithm is based on time-frequency distribution analysis and determines the defect locations. Admissible limits (thresholds) are given for detecting moderate and severe defects using statistical analysis. The method was validated on frequented tram lines in Budapest and accurately detected severe defects with a hit rate of 100%, with no false alarms. The methodology is also sensitive to moderate and small rail surface defects at the low operational speed.
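The thresholding step can be illustrated with a windowed-RMS detector over the vertical acceleration samples (a simplified stand-in for the paper's time-frequency analysis; the window, threshold, and signal below are invented):

```python
import math

def detect_defects(accel, window, threshold):
    """Return start indices where the windowed RMS of the vertical
    acceleration exceeds `threshold` (candidate defect locations)."""
    hits = []
    for i in range(len(accel) - window + 1):
        w = accel[i:i + window]
        rms = math.sqrt(sum(a * a for a in w) / window)
        if rms > threshold:
            hits.append(i)
    return hits

# Smooth running plus a short burst standing in for a rail surface defect.
accel = [0.05] * 200
accel[100:110] = [1.0] * 10
print(detect_defects(accel, window=10, threshold=0.3))
```

In practice the threshold would be set statistically from in-service data, as the paper does for its moderate and severe defect limits, and flagged indices would be mapped back to track positions.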
Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-07-01
To study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids, using near infrared spectroscopy (NIR) as a process analytical technology (PAT). The multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal and one abnormal operation batch) were used to verify the monitoring performance of this model. The results showed that the MSPC model had a good monitoring ability for the column chromatography process. Meanwhile, an NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) using the partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the above two models could effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids, and can be used for on-line analysis of the key quality indexes. This process monitoring method could serve as a reference for the development of process analytical technology in the manufacture of traditional Chinese medicines. Copyright© by the Chinese Pharmaceutical Association.
Resident accuracy of joint line palpation using ultrasound verification.
Rho, Monica E; Chu, Samuel K; Yang, Aaron; Hameed, Farah; Lin, Cindy Yuchin; Hurh, Peter J
2014-10-01
To determine the accuracy of knee and acromioclavicular (AC) joint line palpation among Physical Medicine and Rehabilitation (PM&R) residents using ultrasound (US) verification. Cohort study. PM&R residency program at an academic institution. Twenty-four PM&R residents participating in a musculoskeletal US course (7 PGY-2, 8 PGY-3, and 9 PGY-4 residents). Before the start of the course, the residents were asked to palpate the AC joint and the lateral joint line of the knee in a female and a male model. Once the presumed joint line was localized, the residents were asked to tape an 18-gauge, 1.5-inch, blunt-tip needle parallel to the joint line on the overlying skin. The accuracy of needle placement over the joint line was verified using US. US verification of correct needle placement over the joint line. Overall AC joint palpation accuracy was 16.7%, and knee lateral joint line palpation accuracy was 58.3%. There were no statistically significant differences (P < .05) in the accuracy of joint line palpation by resident level of education. Residents in this study demonstrated poor accuracy of AC joint and lateral knee joint line identification by palpation, using US as the criterion standard for verification. US may be a useful tool to advance current methods of teaching the physical examination in medical education. Copyright © 2014 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
Accountability Indicators from the Viewpoint of Statistical Method.
ERIC Educational Resources Information Center
Jordan, Larry
Few people seriously regard students as "products" coming off an educational assembly line, but notions about accountability and quality improvement in higher education are pervaded by manufacturing ideas and metaphors. Because numerical indicators of quality are inevitably expressed by trend lines or statistical control charts of some kind, they…
On-Line Analysis of Southern FIA Data
Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch
2006-01-01
The Southern On-Line Estimator (SOLE) is a web-based FIA database analysis tool designed with an emphasis on modularity. The Java-based user interface is simple and intuitive to use and the R-based analysis engine is fast and stable. Each component of the program (data retrieval, statistical analysis and output) can be individually modified to accommodate major...
Nonlinear estimation of parameters in biphasic Arrhenius plots.
Puterman, M L; Hrboticky, N; Innis, S M
1988-05-01
This paper presents a formal procedure for the statistical analysis of data on the thermotropic behavior of membrane-bound enzymes generated using the Arrhenius equation, and compares the analysis to several alternatives. The data are modeled by a bent hyperbola. Nonlinear regression is used to obtain estimates and standard errors of the intersection of the line segments, defined as the transition temperature, and of the slopes, defined as the energies of activation of the enzyme reaction. The methodology allows formal tests of the adequacy of a biphasic model against either a single straight line or a curvilinear model. Examples are given using data on the thermotropic behavior of pig brain synaptosomal acetylcholinesterase. The data support the biphasic temperature dependence of this enzyme. The methodology represents a formal procedure for statistical validation of any biphasic data and allows calculation of all line parameters with estimates of precision.
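The two-slope, one-breakpoint structure can be illustrated with a simple segmented-regression sketch. The snippet below is not the authors' procedure (they fit a smooth bent hyperbola by nonlinear regression); it grid-searches the breakpoint of a continuous two-segment line on synthetic Arrhenius-style data, recovering the same transition-point and two-slope quantities.

```python
import numpy as np

def fit_biphasic(x, y, grid):
    """Grid-search the breakpoint of a continuous two-segment (biphasic) line.

    Model: y = a + b*x + c*max(0, x - x0). For each candidate x0 the remaining
    parameters are linear, so ordinary least squares applies; the x0 with the
    smallest residual sum of squares wins."""
    best = None
    for x0 in grid:
        X = np.column_stack([np.ones_like(x), x, np.maximum(0.0, x - x0)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = ((y - X @ coef) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, x0, coef)
    _, x0, (a, b, c) = best
    return x0, b, b + c            # breakpoint, left slope, right slope

# Synthetic Arrhenius-style data: slope changes from -2 to -6 at 1000/T = 3.4.
rng = np.random.default_rng(1)
x = np.linspace(3.0, 3.8, 60)
y = np.where(x < 3.4, -2.0, -6.0) * (x - 3.4) + rng.normal(0, 0.02, x.size)
x0, s_left, s_right = fit_biphasic(x, y, np.linspace(3.1, 3.7, 121))
```

The grid search avoids the nonlinear optimization of the bent-hyperbola fit at the cost of a discretized breakpoint; it does not by itself provide the standard errors the paper obtains.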
EVOLUTION OF THE MAGNETIC FIELD LINE DIFFUSION COEFFICIENT AND NON-GAUSSIAN STATISTICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snodin, A. P.; Ruffolo, D.; Matthaeus, W. H.
The magnetic field line random walk (FLRW) plays an important role in the transport of energy and particles in turbulent plasmas. For magnetic fluctuations that are transverse or almost transverse to a large-scale mean magnetic field, theories describing the FLRW usually predict asymptotic diffusion of magnetic field lines perpendicular to the mean field. Such theories often depend on the assumption that one can relate the Lagrangian and Eulerian statistics of the magnetic field via Corrsin’s hypothesis, and additionally take the distribution of magnetic field line displacements to be Gaussian. Here we take an ordinary differential equation (ODE) model with these underlying assumptions and test how well it describes the evolution of the magnetic field line diffusion coefficient in 2D+slab magnetic turbulence, by comparisons to computer simulations that do not involve such assumptions. In addition, we directly test the accuracy of the Corrsin approximation to the Lagrangian correlation. Over much of the studied parameter space we find that the ODE model is in fairly good agreement with computer simulations, in terms of both the evolution and asymptotic values of the diffusion coefficient. When there is poor agreement, we show that this can be largely attributed to the failure of Corrsin’s hypothesis rather than the assumption of Gaussian statistics of field line displacements. The degree of non-Gaussianity, which we measure in terms of the kurtosis, appears to be an indicator of how well Corrsin’s approximation works.
THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...
CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), and programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions, biological inferenc
NASA Astrophysics Data System (ADS)
Rueda, Sylvia; Udupa, Jayaram K.
2011-03-01
Landmark based statistical object modeling techniques, such as the Active Shape Model (ASM), have proven useful in medical image analysis. Identification of the same homologous set of points in a training set of object shapes is the most crucial step in ASM, which has encountered challenges such as (C1) defining and characterizing landmarks; (C2) ensuring homology; (C3) generalizing to n > 2 dimensions; (C4) achieving practical computations. In this paper, we propose a novel global-to-local strategy that attempts to address C3 and C4 directly and works in R^n. The 2D version starts from two initial corresponding points determined in all training shapes via a method α, and subsequently subdivides the shapes into connected boundary segments by the line determined by these points. A shape analysis method β is applied to each segment to determine a landmark on the segment. This point introduces more pairs of points, and the lines they define are used to further subdivide the boundary segments. This recursive boundary subdivision (RBS) process continues simultaneously on all training shapes, maintaining synchrony of the level of recursion, and thereby automatically keeping correspondence among the generated points via the correspondence of the homologous shape segments in all training shapes. The process terminates when no subdividing lines remain for which method β indicates that a point can be selected on the associated segment. Examples of α and β are presented based on (a) distance; (b) Principal Component Analysis (PCA); and (c) the novel concept of virtual landmarks.
Microbial ecology of Vietnamese Tra fish (Pangasius hypophthalmus) fillets during processing.
Tong Thi, Anh Ngoc; Noseda, Bert; Samapundo, Simbarashe; Nguyen, Binh Ly; Broekaert, Katrien; Rasschaert, Geertrui; Heyndrickx, Marc; Devlieghere, Frank
2013-10-15
There are numerous factors that can have an impact on the microbial ecology and quality of frozen Pangasius hypophthalmus fillets during processing in Vietnam. The presence of spoilage bacteria along the processing line can shorten the shelf-life of thawed frozen fish products. Therefore, the spoilage microbiota throughout the processing chains of two companies (BC: large-scale factory, chlorine-based process; BW: large-scale factory, water-based process; SC: small-scale factory, chlorine-based process) was identified by culture-dependent techniques and 16S rRNA gene sequencing. The microbiological counts were not significantly different (p > 0.05) between BC and BW. Surprisingly, chlorine-treated fillets from the SC line had significantly higher microbial counts than potable-water-treated fillets on the BW line. This was determined to be a result of temperature abuse during processing at SC, with temperatures even greater than 10 °C being recorded from skinning onwards. In contrast, the microbiota related to spoilage on the BC and BW lines was determined by 16S rRNA gene sequencing to be more diverse than that on the SC line. A total of 174 isolates, 20 genera and 38 species were identified along the processing chains. The genera Aeromonas, Acinetobacter, Lactococcus and Enterococcus were prevalent at various processing steps on all the processing lines evaluated. A diverse range of isolates belonging to the Enterobacteriaceae, such as Providencia, Shigella, Klebsiella, Enterobacter and Wautersiella, were isolated from fillets sampled on the SC line, whereas Serratia was only observed on fillets from the BC and BW lines. The results can be used to improve Good Manufacturing Practices for processed Pangasius fillets and to select effective measures to prolong the shelf-life of thawed Vietnamese Pangasius fillet products. © 2013.
Hanus, Josef; Nosek, Tomas; Zahora, Jiri; Bezrouk, Ales; Masin, Vladimir
2013-01-01
We designed and evaluated an innovative computer-aided-learning environment based on the on-line integration of computer controlled medical diagnostic devices and a medical information system for use in the preclinical medical physics education of medical students. Our learning system simulates the actual clinical environment in a hospital or primary care unit. It uses a commercial medical information system for on-line storage and processing of clinical type data acquired during physics laboratory classes. Every student adopts two roles, the role of 'patient' and the role of 'physician'. As a 'physician' the student operates the medical devices to clinically assess 'patient' colleagues and records all results in an electronic 'patient' record. We also introduced an innovative approach to the use of supportive education materials, based on the methods of adaptive e-learning. A survey of student feedback is included and statistically evaluated. The results from the student feedback confirm the positive response of the latter to this novel implementation of medical physics and informatics in preclinical education. This approach not only significantly improves learning of medical physics and informatics skills but has the added advantage that it facilitates students' transition from preclinical to clinical subjects. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Analysis of long-term ionizing radiation effects in bipolar transistors
NASA Technical Reports Server (NTRS)
Stanley, A. G.; Martin, K. E.
1978-01-01
The ionizing radiation effects of electrons on bipolar transistors have been analyzed using the data base from the Voyager project. The data were subjected to statistical analysis, leading to a quantitative characterization of the product and to data on confidence limits which will be useful for circuit design purposes. These newly-developed methods may form the basis for a radiation hardness assurance system. In addition, an attempt was made to identify the causes of the large variations in the sensitivity observed on different product lines. This included a limited construction analysis and a determination of significant design and processes variables, as well as suggested remedies for improving the tolerance of the devices to radiation.
Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic
2017-02-01
Multivariate statistical process control (MSPC) is increasingly popular as a response to the challenge posed by the large multivariate datasets that analytical instruments such as Raman spectroscopy produce when monitoring complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Moreover, unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating way, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analyses such as multi-way principal component analysis (MPCA) in MSPC monitoring. Correlation optimized warping (COW) is a popular data alignment method with satisfactory performance; however, it had not previously been applied to synchronize the acquisition times of spectroscopic datasets in MSPC applications. In this paper we propose, for the first time, to use COW to synchronize batches of varying duration analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operating condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T^2 and Q. As the results illustrate, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition with higher efficacy than traditional diagnosis, which would save time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.
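The T^2/Q scoring at the heart of such charts can be sketched with plain PCA. This is a stand-in, not the paper's MPCA on warped batch trajectories: all data are synthetic, and the control limits are simple empirical NOC percentiles rather than the usual F/weighted-chi-square formulas.

```python
import numpy as np

def fit_pca_monitor(X_noc, n_comp=2, alpha=0.99):
    """Fit a PCA-based monitor on NOC (normal operating condition) data.

    New observations are scored with Hotelling's T^2 (variation within the
    model plane) and Q/SPE (squared residual off the plane)."""
    mu, sd = X_noc.mean(0), X_noc.std(0) + 1e-12
    Z = (X_noc - mu) / sd
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                           # loadings (model plane)
    lam = s[:n_comp] ** 2 / (len(Z) - 1)        # per-component score variance

    def score(X):
        Zn = (np.atleast_2d(X) - mu) / sd
        T = Zn @ P
        t2 = (T ** 2 / lam).sum(1)              # Hotelling's T^2
        q = ((Zn - T @ P.T) ** 2).sum(1)        # Q statistic (SPE)
        return t2, q

    t2_lim, q_lim = (np.quantile(v, alpha) for v in score(X_noc))
    return score, P, (mu, sd), (t2_lim, q_lim)

rng = np.random.default_rng(0)
X_noc = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 6)) \
        + 0.1 * rng.normal(size=(200, 6))       # 2 latent factors, 6 variables
score, P, (mu, sd), (t2_lim, q_lim) = fit_pca_monitor(X_noc)

# A fault that leaves the model plane: push one batch along a residual direction.
r = np.eye(6)[2] - P @ (P.T @ np.eye(6)[2])
z_fault = (X_noc[0] - mu) / sd + 5 * r / np.linalg.norm(r)
t2_f, q_f = score(z_fault * sd + mu)
```

A fault orthogonal to the PCA plane shows up in Q while T^2 may stay normal, which is why both charts are monitored together.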
Fonteyne, Margot; Vercruysse, Jurgen; De Leersnyder, Fien; Besseling, Rut; Gerich, Ad; Oostra, Wim; Remon, Jean Paul; Vervaet, Chris; De Beer, Thomas
2016-09-07
This study focuses on the twin screw granulator of a continuous from-powder-to-tablet production line. Whereas powder can be dosed into the granulation unit from a container of preblended material, a truly continuous process uses several feeders (each dosing an individual ingredient) and relies on a continuous blending step prior to granulation. The aim of the current study was to investigate the in-line blending capacity of this twin screw granulator, equipped with conveying elements only. The feasibility of in-line NIR spectroscopy (SentroPAT, Sentronic GmbH, Dresden, Germany) for evaluating the blend uniformity of powders after the granulator was tested. Anhydrous theophylline was used as a tracer molecule and was blended with lactose monohydrate; theophylline and lactose were each fed from a different feeder into the twin screw granulator barrel. Both homogeneous mixtures and mixing experiments with induced errors were investigated. The in-line spectroscopic analyses showed that the twin screw granulator is a useful tool for in-line blending under different conditions. Blend homogeneity was evaluated by means of a novel statistical method, the moving F-test, in which the variance between two blocks of collected NIR spectra is evaluated. The α- and β-errors of the moving F-test are controlled by choosing an appropriate block size of spectra. The moving F-test proved to be a calibration- and maintenance-free method for evaluating blend homogeneity during continuous mixing. Copyright © 2016 Elsevier B.V. All rights reserved.
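The moving F-test can be sketched as a sliding variance-ratio test on a univariate tracer signal. This is a simplified stand-in for the paper's method (which operates on blocks of NIR spectra); the signal, block size and disturbance below are invented.

```python
import numpy as np

def moving_f_test(signal, block=20):
    """Slide two adjacent blocks along the signal and return (index, F) pairs,
    where F is the ratio of the larger to the smaller block variance.

    A large F flags a change in homogeneity; the block size controls the
    alpha/beta-error trade-off, and a critical value would normally come from
    a quantile of the F(block-1, block-1) distribution."""
    out = []
    for i in range(block, len(signal) - block + 1):
        v1 = np.var(signal[i - block:i], ddof=1)
        v2 = np.var(signal[i:i + block], ddof=1)
        out.append((i, max(v1, v2) / max(min(v1, v2), 1e-12)))
    return out

rng = np.random.default_rng(3)
steady = rng.normal(1.0, 0.01, 120)                          # homogeneous tracer signal
upset = np.concatenate([steady, rng.normal(1.0, 0.08, 60)])  # feeder disturbance
f_at_change = dict(moving_f_test(upset))[120]                # blocks straddle the change
```

When the two blocks straddle the disturbance, the variance ratio jumps well above its in-control range, which is the detection event the paper's chart is built on.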
Apoptosis after gamma irradiation. Is it an important cell death modality?
Siles, E.; Villalobos, M.; Jones, L.; Guerrero, R.; Eady, J. J.; Valenzuela, M. T.; Núñez, M. I.; McMillan, T. J.; Ruiz de Almodóvar, J. M.
1998-01-01
Apoptosis and necrosis are two different forms of cell death that can be induced by cytotoxic stress, such as ionizing radiation. We have studied the importance of apoptotic death induced after treatment with 6 Gy of gamma-irradiation in a panel of eight human tumour cell lines of different radiosensitivities. Three different techniques based on the detection of DNA fragmentation have been used: a qualitative one, DNA ladder formation, and two quantitative approaches, in situ tailing and the comet assay. No statistically significant relationship between the two quantitative assays was found (r = 0.327, P = 0.159), so these methods seem to reflect different aspects of the process of cell death. The presence of the DNA ladder related well to the end-labelling method, in that the least end labelling was seen in samples in which necrotic degradation rather than apoptotic ladders was observed. However, as the results obtained by the comet assay do not agree with the DNA ladder experiments, we suggest that distinguishing between the degraded DNA produced by apoptosis and by necrosis may be difficult with this technique. Finally, although apoptosis has been proposed to be dependent on p53 functionality, which might explain differences in cellular radiosensitivity, no statistically significant relationship was found between these parameters and apoptosis in the eight cell lines studied. PMID:9862569
Factors governing particle number emissions in a waste-to-energy plant.
Ozgen, Senem; Cernuschi, Stefano; Giugliano, Michele
2015-05-01
Particle number concentration and size distribution measurements were performed on the stack gas of a waste-to-energy plant which co-incinerates municipal solid waste, sewage sludge and clinical waste in two lines. The average total number of particles was 4.0·10^5 cm^-3 for the line equipped with a wet flue gas cleaning process and 1.9·10^5 cm^-3 for the line with a dry cleaning system. Ultrafine particles (dp < 100 nm) accounted for about 97% of the total number concentration for both lines, whereas the nanoparticle (dp < 50 nm) contribution differed slightly between the lines (87% and 84%). The experimental data are explored statistically through multivariate pattern-identifying methods such as factor analysis and cluster analysis to help interpret the results regarding the origin of the particles in the flue gas, with the objective of determining the factors governing particle number emissions. The higher moisture of the flue gas in the wet cleaning process was found to increase particle number emissions on average by a factor of about 2, due to increased secondary formation of nanoparticles through nucleation of gaseous precursors such as sulfuric acid, ammonia and water. The influence of flue gas dilution and cooling, monitored through variation of the sampling conditions, also confirms the potential effect of secondary new particle formation in increasing particle number emissions. This finding shows the importance of reporting the experimental conditions in detail to enable comparison and interpretation of particle number emissions. Regarding fuel characteristics, no difference in particle number concentration or size distribution was observed between the clinical waste feed and municipal solid waste co-incinerated with sludge. Copyright © 2015 Elsevier Ltd. All rights reserved.
Comparison of DNQ/novolac resists for e-beam exposure
NASA Astrophysics Data System (ADS)
Fedynyshyn, Theodore H.; Doran, Scott P.; Lind, Michele L.; Lyszczarz, Theodore M.; DiNatale, William F.; Lennon, Donna; Sauer, Charles A.; Meute, Jeff
1999-12-01
We have surveyed the commercial resist market with the dual purpose of identifying diazoquinone/novolac based resists that have potential for use as e-beam mask making resists and baselining these resists for comparison against future mask making resist candidates. For completeness, this survey would require that each resist be compared with an optimized developer and development process. To accomplish this task in an acceptable time period, e-beam lithography modeling was employed to quickly identify the resist and developer combinations that lead to superior resist performance. We describe the verification of a method to quickly screen commercial i-line resists with different developers, by determining modeling parameters for i-line resists from e-beam exposures, modeling the resist performance, and comparing predicted versus actual performance. We determined the lithographic performance of several DNQ/novolac resists whose modeled performance suggests that sensitivities of less than 40 µC/cm² coupled with less than 10-nm CD change per percent change in dose are possible for target 600-nm features. This was accomplished by performing a series of statistically designed experiments on the leading resist candidates to optimize processing variables, followed by comparing the experimentally determined resist sensitivities, latitudes, and profiles of the DNQ/novolac resists at their optimized processes.
Skeletonization of gray-scale images by gray weighted distance transform
NASA Astrophysics Data System (ADS)
Qian, Kai; Cao, Siqi; Bhattacharya, Prabir
1997-07-01
In pattern recognition, thinning algorithms are often a useful tool to represent a digital pattern by means of a skeletonized image, consisting of a set of one-pixel-width lines that highlight its significant features. There has been interest in applying thinning directly to gray-scale images, motivated by the desire to process images whose meaningful information is distributed over different levels of gray intensity. In this paper, a new algorithm is presented which can skeletonize both binary and gray-scale pictures. The algorithm is based on the gray-weighted distance transform, can process gray-scale pictures whose intensities are not uniformly distributed, and preserves the topology of the original picture. The process includes a preliminary investigation of the 'hollows' in the gray-scale image; these hollows are treated as topological constraints for the skeleton structure or not, depending on whether their depth is statistically significant. The algorithm can also be executed on a parallel machine, as all operations are local. Some examples are discussed to illustrate the algorithm.
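A gray-weighted distance transform of the kind the algorithm builds on can be sketched with Dijkstra's algorithm over the pixel grid, where the cost of a path is the sum of the gray values it crosses. This is an illustrative reimplementation, not the paper's parallel algorithm.

```python
import heapq

def gray_weighted_distance(img, seeds):
    """Gray-weighted distance transform with 4-connectivity.

    The distance of a pixel from the seed set is the minimum, over all
    connected paths, of the sum of gray values along the path (seed pixel
    included), computed with Dijkstra's algorithm on the pixel grid."""
    h, w = len(img), len(img[0])
    dist = [[float("inf")] * w for _ in range(h)]
    heap = []
    for r, c in seeds:
        dist[r][c] = img[r][c]
        heapq.heappush(heap, (img[r][c], r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r][c]:
            continue                     # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + img[nr][nc]
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist

# A bright ridge (9s) between dark basins: the cheap path goes around it,
# so the transform respects intensity structure, not just geometry.
img = [[1, 9, 1],
       [1, 9, 1],
       [1, 1, 1]]
dist = gray_weighted_distance(img, seeds=[(0, 0)])
```

Skeletonization then proceeds on ridges of this distance field; the bright column makes the direct route (cost 11) more expensive than the detour along the bottom row (cost 7).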
Line identification studies using traditional techniques and wavelength coincidence statistics
NASA Technical Reports Server (NTRS)
Cowley, Charles R.; Adelman, Saul J.
1990-01-01
Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best, relative to traditional methods, in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
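The core of WCS, counting wavelength coincidences and judging them against chance, can be sketched with a small Monte Carlo test. The line lists below are hypothetical; real applications use laboratory wavelength tables and must respect the small-number-statistics caveats noted above.

```python
import random

def coincidences(observed, lab, tol=0.05):
    """Number of laboratory wavelengths matched by some observed line within tol (Angstrom)."""
    return sum(any(abs(o - l) <= tol for o in observed) for l in lab)

def wcs_pvalue(observed, lab, lo, hi, trials=2000, tol=0.05, seed=0):
    """Compare the real hit count against hit counts for random 'control'
    line lists drawn uniformly over [lo, hi]; the Monte Carlo p-value
    estimates the chance that this many coincidences arise spuriously."""
    rng = random.Random(seed)
    hits = coincidences(observed, lab, tol)
    worse = sum(
        coincidences([rng.uniform(lo, hi) for _ in observed], lab, tol) >= hits
        for _ in range(trials)
    )
    return hits, (worse + 1) / (trials + 1)

lab = [4000.0, 4100.0, 4200.0, 4300.0, 4400.0]   # hypothetical lab lines of one species
observed = [l + 0.01 for l in lab] + [4050.0, 4150.0, 4250.0]  # all 5 present, plus extras
hits, p = wcs_pvalue(observed, lab, 3950.0, 4450.0)
```

A low p-value flags the species for follow-up with traditional identification; with few lab lines (small-number statistics), the same hit count can easily arise by chance, which is the weakness the abstract warns about.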
Interactions Dominate the Dynamics of Visual Cognition
Stephen, Damian G.; Mirman, Daniel
2010-01-01
Many cognitive theories have described behavior as the summation of independent contributions from separate components. Contrasting views have emphasized the importance of multiplicative interactions and emergent structure. We describe a statistical approach to distinguishing additive and multiplicative processes and apply it to the dynamics of eye movements during classic visual cognitive tasks. The results reveal interaction-dominant dynamics in eye movements in each of the three tasks, and that fine-grained eye movements are modulated by task constraints. These findings reveal the interactive nature of cognitive processing and are consistent with theories that view cognition as an emergent property of processes that are broadly distributed over many scales of space and time rather than a componential assembly line. PMID:20070957
Serafim, Vlad; Shah, Ajit; Puiu, Maria; Andreescu, Nicoleta; Coricovac, Dorina; Nosyrev, Alexander; Spandidos, Demetrios A; Tsatsakis, Aristides M; Dehelean, Cristina; Pinzaru, Iulia
2017-10-01
Over the past decade, matrix-assisted laser desorption/ionization time‑of‑flight mass spectrometry (MALDI‑TOF MS) has been established as a valuable platform for microbial identification, and it is also frequently applied in biology and clinical studies to identify new markers expressed in pathological conditions. The aim of the present study was to assess the potential of using this approach for the classification of cancer cell lines as a quantifiable method for the proteomic profiling of cellular organelles. Intact protein extracts isolated from different tumor cell lines (human and murine) were analyzed using MALDI‑TOF MS and the obtained mass lists were processed using principal component analysis (PCA) within Bruker Biotyper® software. Furthermore, reference spectra were created for each cell line and were used for classification. Based on the intact protein profiles, we were able to differentiate and classify six cancer cell lines: two murine melanoma (B16‑F0 and B164A5), one human melanoma (A375), two human breast carcinoma (MCF7 and MDA‑MB‑231) and one human liver carcinoma (HepG2). The cell lines were classified according to cancer type and the species they originated from, as well as by their metastatic potential, offering the possibility to differentiate non‑invasive from invasive cells. The obtained results pave the way for developing a broad‑based strategy for the identification and classification of cancer cells.
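The PCA-plus-reference-spectra workflow can be approximated by a nearest-centroid classifier in PCA score space. This is a stand-in sketch (the Biotyper's actual scoring is proprietary), with synthetic "mass profiles" in place of real spectra and invented class names.

```python
import numpy as np

def pca_nearest_centroid(train, labels, test, n_comp=2):
    """Project profiles with PCA, then assign each test profile the label of
    the nearest class centroid in score space."""
    mu = train.mean(0)
    _, _, Vt = np.linalg.svd(train - mu, full_matrices=False)
    P = Vt[:n_comp].T                                  # PCA loadings
    scores = (train - mu) @ P
    cents = {c: scores[np.array(labels) == c].mean(0) for c in set(labels)}
    return [min(cents, key=lambda c: np.linalg.norm(s - cents[c]))
            for s in (test - mu) @ P]

rng = np.random.default_rng(5)
# Two hypothetical cell-line "reference profiles" over 30 mass bins.
profiles = {"melanoma": rng.normal(0, 1, 30), "hepatoma": rng.normal(0, 1, 30)}
def sample(kind, n):
    return profiles[kind] + 0.1 * rng.normal(size=(n, 30))  # replicate spectra

train = np.vstack([sample("melanoma", 10), sample("hepatoma", 10)])
labels = ["melanoma"] * 10 + ["hepatoma"] * 10
pred = pca_nearest_centroid(train, labels, sample("melanoma", 5))
```

With well-separated reference profiles the first principal components capture the between-class direction, so the centroid classifier recovers the class of held-out replicates.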
76 FR 41756 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... materials and supplies used in production. The economic census will produce basic statistics by kind of business on number of establishments, sales, payroll, employment, inventories, and operating expenses. It also will yield a variety of subject statistics, including sales by product line; sales by class of...
Selecting a Classification Ensemble and Detecting Process Drift in an Evolving Data Stream
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heredia-Langner, Alejandro; Rodriguez, Luke R.; Lin, Andy
2015-09-30
We characterize the commercial behavior of a group of companies in a common line of business using a small ensemble of classifiers on a stream of records containing commercial activity information. This approach is able to effectively find a subset of classifiers that can be used to predict company labels with reasonable accuracy. Performance of the ensemble, its error rate under stable conditions, can be characterized using an exponentially weighted moving average (EWMA) statistic. The behavior of the EWMA statistic can be used to monitor a record stream from the commercial network and determine when significant changes have occurred. Results indicate that larger classification ensembles may not necessarily be optimal, pointing to the need to search the combinatorial classifier space in a systematic way. Results also show that current and past performance of an ensemble can be used to detect when statistically significant changes in the activity of the network have occurred. The dataset used in this work contains tens of thousands of high level commercial activity records with continuous and categorical variables and hundreds of labels, making classification challenging.
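EWMA monitoring of an ensemble's error rate can be sketched as follows. The streams, rates and control-limit constant are invented; a real application would tune the smoothing weight and limit multiplier to the desired false-alarm rate.

```python
def ewma_monitor(errors, p0, lam=0.1, L=3.0):
    """EWMA chart over a 0/1 misclassification stream: z_t = lam*x_t + (1-lam)*z_{t-1}.

    p0 is the assumed in-control error rate; the upper control limit uses the
    asymptotic EWMA standard deviation of a Bernoulli(p0) stream. Returns the
    first index where z exceeds the limit (drift signalled), else None."""
    ucl = p0 + L * (p0 * (1 - p0) * lam / (2 - lam)) ** 0.5
    z = p0
    for t, x in enumerate(errors):
        z = lam * x + (1 - lam) * z
        if z > ucl:
            return t
    return None

# In control: one error every 20 records (5% rate); then drift to a 50% rate.
stream = ([0] * 19 + [1]) * 10 + [1, 0] * 50
t_detect = ewma_monitor(stream, p0=0.05)
```

Because the EWMA discounts old observations geometrically, it reacts within a few records of the drift point while tolerating the stable 5% error rate without a false alarm.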
Locating the cortical bottleneck for slow reading in peripheral vision
Yu, Deyue; Jiang, Yi; Legge, Gordon E.; He, Sheng
2015-01-01
Yu, Legge, Park, Gage, and Chung (2010) suggested that the neural bottleneck for slow peripheral reading is located in nonretinotopic areas. We investigated the potential rate-limiting neural site for peripheral reading using fMRI, and contrasted peripheral reading with recognition of peripherally presented line drawings of common objects. We measured the BOLD responses to both text (three-letter words/nonwords) and line-drawing objects presented either in foveal or peripheral vision (10° lower right visual field) at three presentation rates (2, 4, and 8/second). The statistically significant interaction effect of visual field × presentation rate on the BOLD response for text but not for line drawings provides evidence for distinctive processing of peripheral text. This pattern of results was obtained in all five regions of interest (ROIs). At the early retinotopic cortical areas, the BOLD signal slightly increased with increasing presentation rate for foveal text, and remained fairly constant for peripheral text. In the Occipital Word-Responsive Area (OWRA), Visual Word Form Area (VWFA), and object sensitive areas (LO and PHA), the BOLD responses to text decreased with increasing presentation rate for peripheral but not foveal presentation. In contrast, there was no rate-dependent reduction in BOLD response for line-drawing objects in all the ROIs for either foveal or peripheral presentation. Only peripherally presented text showed a distinctive rate-dependence pattern. Although it is possible that the differentiation starts to emerge at the early retinotopic cortical representation, the neural bottleneck for slower reading of peripherally presented text may be a special property of peripheral text processing in object category selective cortex. PMID:26237299
Women Entrepreneurship Across Racial Lines: Current Status, Critical Issues, and Future Implications
ERIC Educational Resources Information Center
Smith-Hunter, Andrea
2004-01-01
This article begins with a look at women's employment over the years and the historical place of women's entrepreneurship in today's economy. It continues by statistically analyzing data on women entrepreneurs in the United States across racial lines, with a particular focus on Hispanic women entrepreneurs. The article ends by examining the critical…
NASA Astrophysics Data System (ADS)
Gu, Jian
This thesis explores how nanopatterns can be used to control the growth of single-crystal silicon on amorphous substrates at low temperature, with potential applications in flat-panel liquid-crystal displays and 3-dimensional (3D) integrated circuits. I first present excimer laser annealing of amorphous silicon (a-Si) nanostructures on thermally oxidized silicon wafers for controlled formation of single-crystal silicon islands. Preferential nucleation at the pattern center is observed due to substrate-enhanced edge heating. Single-grain silicon is obtained in a 50 nm x 100 nm rectangular pattern by super lateral growth (SLG). Narrow lines (such as 20-nm-wide ones) can serve as artificial heterogeneous nucleation sites during crystallization of large patterns, which could lead to the formation of single-crystal silicon islands in a controlled fashion. In addition to excimer laser annealing, NanoPAtterning and nickel-induced lateral Crystallization (NanoPAC) of a-Si lines is presented. Single-crystal silicon is achieved by NanoPAC. The line width of a-Si significantly affects the grain structure of the crystallized silicon lines. Statistics show that single-crystal silicon is formed for all lines with widths between 50 nm and 200 nm. Using in situ transmission electron microscopy (TEM), nickel-induced lateral crystallization (Ni-ILC) of a-Si inside a pattern is revealed; lithography-constrained single seeding (LISS) is proposed to explain the single-crystal formation. Intragrain line and two-dimensional defects are also studied. To test the electrical properties of NanoPAC silicon films, sub-100 nm thin-film transistors (TFTs) are fabricated using Pattern-controlled crystallization of a Thin a-Si channel layer and High-temperature (850°C) annealing, coined the PaTH process.
PaTH TFTs show excellent device performance compared to traditional solid-phase-crystallized (SPC) TFTs in terms of threshold voltage, threshold voltage roll-off, leakage current, subthreshold swing, on/off current ratio, device-to-device uniformity, etc. Two-dimensional device simulations show that PaTH TFTs are comparable to silicon-on-insulator (SOI) devices, making them a promising candidate for the fabrication of future high-performance, low-power 3D integrated circuits. Finally, an ultrafast nanolithography technique, laser-assisted direct imprint (LADI), is introduced. LADI can pattern nanostructures directly in silicon in nanoseconds with sub-10 nm resolution. The process has potential applications in multiple disciplines and could be extended to other materials and processes.
Rehm, Jürgen
2008-06-01
In summarizing the key themes and results of the second meeting of the German Addiction Research Network 'Understanding Addiction: Mediators and Moderators of Behaviour Change Process', the following concrete steps forward were laid out to improve knowledge. The steps included pleas to (1) redefine substance abuse disorders, especially the concepts of abuse and harmful use; (2) increase the use of longitudinal and life-course studies with more adequate statistical methods such as latent growth modelling; (3) empirically test more specific and theoretically derived common factors and mechanisms of behavioural change processes; and (4) better exploit cross-regional and cross-cultural differences. Funding agencies are urged to support these developments by specifically supporting interdisciplinary research along the lines specified above. This may include improved forms of international funding of groups of researchers from different countries, where each national group conducts a specific part of an integrated proposal. 2008 John Wiley & Sons, Ltd
Experiences in using DISCUS for visualizing human communication
NASA Astrophysics Data System (ADS)
Groehn, Matti; Nieminen, Marko; Haho, Paeivi; Smeds, Riitta
2000-02-01
In this paper, we present further improvements to the DISCUS software, which can be used to record and analyze the flow and contents of business process simulation session discussions. The tool was initially introduced at the 'Visual Data Exploration and Analysis IV' conference. Its initial features enabled the visualization of discussion flow in business process simulation sessions and the creation of SOM analyses. The improvements to the tool consist of additional visualization possibilities that enable quick on-line analyses and improved graphical statistics. We have also created the very first interface to audio data and implemented two ways to visualize it. We also outline additional possibilities to use the tool in other application areas: these include usability testing and the possibility to use the tool for capturing design rationale in a product development process. The data gathered with DISCUS may be used in other applications, and further work may be done with data mining techniques.
Direct evidence for a dual process model of deductive inference.
Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie
2013-07-01
In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences probabilistically, accepting those with high conditional probability. The counterexample strategy rejects inferences when a counterexample shows the inference to be invalid. To discriminate strategy use, we presented reasoners with conditional statements (if p, then q) and explicit statistical information about the relative frequency of q given p (50% vs. 90%). A statistical strategy would accept the more probable inferences more frequently, whereas the counterexample one would reject both. In Experiment 1, reasoners under time pressure used the statistical strategy more but switched to the counterexample strategy when time constraints were removed; the former took less time than the latter. These data are consistent with the hypothesis that the statistical strategy is the default heuristic. Under a free-time condition, reasoners preferred the counterexample strategy and kept it when put under time pressure. Thus, it is not simply a lack of capacity that produces a statistical strategy; instead, it seems that time pressure disrupts the ability to make good metacognitive choices. In line with this conclusion, in a 2nd experiment, we measured reasoners' confidence in their performance; those under time pressure were less confident in the statistical than the counterexample strategy and more likely to switch strategies under free-time conditions. PsycINFO Database Record (c) 2013 APA, all rights reserved.
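The contrast between the two strategies can be sketched as a toy decision rule; the acceptance threshold and inputs below are illustrative assumptions, not the experimental materials:

```python
# Toy contrast of the two strategies for Modus Ponens ("p, therefore q").
# The 0.7 threshold is purely illustrative.

def statistical_strategy(p_q_given_p, threshold=0.7):
    # Accept the inference when the conditional probability P(q|p) is high enough.
    return p_q_given_p >= threshold

def counterexample_strategy(n_counterexamples):
    # Reject as soon as a single "p and not-q" case is known.
    return n_counterexamples == 0
```

On the paper's materials, a statistical strategy would thus accept the 90% inferences more often than the 50% ones, while a counterexample strategy rejects both whenever any counterexample is available.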
S.I.I.A for monitoring crop evolution and anomaly detection in Andalusia by remote sensing
NASA Astrophysics Data System (ADS)
Rodriguez Perez, Antonio Jose; Louakfaoui, El Mostafa; Munoz Rastrero, Antonio; Rubio Perez, Luis Alberto; de Pablos Epalza, Carmen
2004-02-01
A new remote sensing application was developed and incorporated into the Agrarian Integrated Information System (S.I.I.A), a project that integrates the regional farming databases from a geographical point of view, adding new values and uses to the original information. The project is supported by the Studies and Statistical Service of the Regional Government Ministry of Agriculture and Fisheries (CAP). The process integrates NDVI values from daily NOAA-AVHRR and monthly IRS-WIFS images with crop class location maps. Local agrarian and meteorological information is being included in the working process to produce a synergistic effect. An updated crop-growing state evaluation is obtained by 10-day period, crop class, sensor type (including data fusion), and administrative geographical borders. The crop database for the last ten years (1992-2002) has been organized according to these variables. The crop class database can be accessed through an application that helps users with crop statistical analysis. Multi-temporal and multi-geographical comparative analyses can be done by the user, not only for a single year but also from a historical point of view. Moreover, real-time crop anomalies can be detected and analyzed. Most of the output products will be available on the Internet in the near future via an on-line application.
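NDVI, the index integrated here from the AVHRR and WIFS imagery, has the standard definition (NIR − Red) / (NIR + Red); a minimal sketch:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Inputs are reflectances in the near-infrared and red bands;
    the result lies in [-1, 1], with dense green vegetation near 1.
    """
    return (nir - red) / (nir + red)
```

Anomaly detection of the kind described then compares each 10-day NDVI value against the historical distribution for the same crop class and period.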
Hertz-Schünemann, Romy; Streibel, Thorsten; Ehlert, Sven; Zimmermann, Ralf
2013-09-01
A micro-probe (μ-probe) gas sampling device for on-line analysis of gases evolving in confined, small objects by single-photon ionisation time-of-flight mass spectrometry (SPI-TOFMS) was developed. The technique is applied for the first time in a feasibility study to record the formation of volatile and flavour compounds during the roasting process within (inside) or in the direct vicinity (outside) of individual coffee beans. A real-time on-line analysis of evolving volatile and semi-volatile organic compounds (VOC and SVOC) as they are formed under the mild pyrolytic conditions of the roasting process was performed. The soft-ionisation mass spectra depict a molecular ion signature that corresponds well with the existing knowledge of coffee roasting and the evolved compounds. Additionally, it is thereby possible to discriminate between Coffea arabica (Arabica) and Coffea canephora (Robusta). The recognized differences in the roasting gas profiles reflect the differences in the precursor composition of the coffee cultivars very well. Furthermore, a well-known set of marker compounds for Arabica and Robusta, namely the lipids kahweol and cafestol (detected in their dehydrated forms at m/z 296 and m/z 298, respectively), was observed. When the variation in time of different compounds is observed, distinctly different evolution behaviours are detected. Here, phenol (m/z 94) and caffeine (m/z 194) are chosen as examples: whereas phenol shows very sharp emission peaks, caffeine does not have this highly transient behaviour. Finally, the changes of the chemical signature as a function of roasting time, the influence of sampling position (inside, outside), and cultivar (Arabica, Robusta) are investigated by multivariate statistics (PCA). In summary, this pilot study demonstrates the high potential of the measurement technique to enhance the fundamental knowledge of the formation processes of volatile and semi-volatile flavour compounds inside the individual coffee bean.
Novel hyperspectral prediction method and apparatus
NASA Astrophysics Data System (ADS)
Kemeny, Gabor J.; Crothers, Natalie A.; Groth, Gard A.; Speck, Kathy A.; Marbach, Ralf
2009-05-01
Both the power and the challenge of hyperspectral technologies lie in the very large amount of data produced by spectral cameras. While off-line methodologies allow the collection of gigabytes of data, extended data analysis sessions are required to convert the data into useful information. In contrast, real-time monitoring, such as on-line process control, requires that compression of spectral data and analysis occur at a sustained full camera data rate. Efficient, high-speed practical methods for calibration and prediction are therefore sought to optimize the value of hyperspectral imaging. A novel method of matched filtering known as science based multivariate calibration (SBC) was developed for hyperspectral calibration. Classical (MLR) and inverse (PLS, PCR) methods are combined by spectroscopically measuring the spectral "signal" and by statistically estimating the spectral "noise." The accuracy of the inverse model is thus combined with the easy interpretability of the classical model. The SBC method is optimized for hyperspectral data in the Hyper-Cal™ software used for the present work. The prediction algorithms can then be downloaded into a dedicated FPGA-based High-Speed Prediction Engine™ module. Spectral pretreatments and calibration coefficients are stored on interchangeable SD memory cards, and predicted compositions are produced on a USB interface at real-time camera output rates. Applications include minerals, pharmaceuticals, food processing and remote sensing.
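Matched-filter calibration of this kind is commonly written as b = Σ⁻¹s / (sᵀΣ⁻¹s), where s is the measured analyte signal spectrum and Σ the estimated spectral noise covariance. A minimal numpy sketch, assuming s and Σ have already been estimated (function names are illustrative, not the Hyper-Cal™ API):

```python
import numpy as np

# Minimal matched-filter regression sketch in the spirit of SBC:
#   b = Sigma_inv @ s / (s @ Sigma_inv @ s)
# scaled so that the predicted response to the pure signal s equals 1.

def sbc_regression_vector(signal, noise_cov):
    sigma_inv = np.linalg.inv(noise_cov)
    b = sigma_inv @ signal
    return b / (signal @ b)

def predict(spectrum, b):
    # Per-pixel prediction is a single dot product, which is what makes
    # FPGA evaluation at full camera data rate feasible.
    return float(spectrum @ b)
```

The scaling makes the prediction linear in signal strength, so a spectrum containing twice the analyte signal predicts twice the concentration units.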
Volatile metabolomic signature of human breast cancer cell lines
Silva, Catarina L.; Perestrelo, Rosa; Silva, Pedro; Tomás, Helena; Câmara, José S.
2017-01-01
Breast cancer (BC) remains the most prevalent oncologic pathology in women, causing huge psychological, economic and social impacts on our society. Currently, the available diagnostic tools have limited sensitivity and specificity. Metabolome analysis has emerged as a powerful tool for obtaining information about the biological processes that occur in organisms, and is a useful platform for discovering new biomarkers or making disease diagnoses using different biofluids. Volatile organic compounds (VOCs) from the headspace of cultured BC cells and normal human mammary epithelial cells were collected by headspace solid-phase microextraction (HS-SPME) and analyzed by gas chromatography combined with mass spectrometry (GC–MS), thus defining a volatile metabolomic signature. 2-Pentanone, 2-heptanone, 3-methyl-3-buten-1-ol, ethyl acetate, ethyl propanoate and 2-methyl butanoate were detected only in cultured BC cell lines. Multivariate statistical methods were used to verify the volatomic differences between BC cell lines and normal cells in order to find a set of specific VOCs that could be associated with BC, providing comprehensive insight into VOCs as potential cancer biomarkers. The establishment of the volatile fingerprint of BC cell lines presents a powerful approach to finding endogenous VOCs that could be used to improve BC diagnostic tools and explore the associated metabolomic pathways. PMID:28256598
NASA Astrophysics Data System (ADS)
Stan Development Team
2018-01-01
Stan facilitates statistical inference at the frontiers of applied statistics and provides both a modeling language for specifying complex statistical models and a library of statistical algorithms for computing inferences with those models. These components are exposed through interfaces in environments such as R, Python, and the command line.
Multiple-Line Inference of Selection on Quantitative Traits
Riedel, Nico; Khatri, Bhavin S.; Lässig, Michael; Berg, Johannes
2015-01-01
Trait differences between species may be attributable to natural selection. However, quantifying the strength of evidence for selection acting on a particular trait is a difficult task. Here we develop a population genetics test for selection acting on a quantitative trait that is based on multiple-line crosses. We show that using multiple lines increases both the power and the scope of selection inferences. First, a test based on three or more lines detects selection with strongly increased statistical significance, and we show explicitly how the sensitivity of the test depends on the number of lines. Second, a multiple-line test can distinguish between different lineage-specific selection scenarios. Our analytical results are complemented by extensive numerical simulations. We then apply the multiple-line test to QTL data on floral character traits in plant species of the Mimulus genus and on photoperiodic traits in different maize strains, where we find a signature of lineage-specific selection not seen in two-line tests. PMID:26139839
Dongre, A R; Chacko, T V; Banu, S; Bhandary, S; Sahasrabudhe, R A; Philip, S; Deshmukh, P R
2010-11-01
In medical education, using the World Wide Web is a new approach for building the capacity of faculty. However, there is little information available on medical education researchers' needs and their collective learning outcomes in such on-line environments. Hence, the present study attempted: 1) to identify needs for capacity-building of fellows in a faculty development program on the topic of data analysis; and 2) to describe, analyze and understand the collective learning outcomes of the fellows during this need-based on-line session. The present research is based on quantitative (on-line survey for needs assessment) and qualitative (contents of e-mails exchanged in listserv discussion) data that were generated during the October 2009 Mentoring and Learning (M-L) Web discussion on the topic of data analysis. The data sources were shared e-mail responses during the process of planning and executing the M-L Web discussion. Content analysis was undertaken and the categories of discussion were presented as a simple non-hierarchical typology representing the collective learning of the project fellows. We identified the types of learning needs on the topic 'Analysis of Data' to be addressed for faculty development in the field of education research. This need-based M-L Web discussion could then facilitate collective learning on such topics as basic concepts in statistics, tests of significance, Likert scale analysis, bivariate correlation, simple regression analysis, and content analysis of qualitative data. Steps like identifying the learning needs for an on-line M-L Web discussion, addressing the immediate needs of learners, and creating a flexible reflective learning environment on the M-L Web facilitated the collective learning of the fellows on the topic of data analysis. Our outcomes can be useful in the design of on-line pedagogical strategies for supporting research in medical education.
The Importance of Time and Frequency Reference in Quantum Astronomy and Quantum Communications
2007-11-01
simulator, but the same general results are valid for optical fiber and also different quantum state transmission technologies (i.e. Entangled Photons ...protocols [6]). The Matlab simulation starts from a sequence of pulses of duration Ton; the number of photons per pulse has been implemented like a...astrophysical emission mechanisms or scattering processes by measuring the statistics of the arrival time of each incoming photon . This line of research will be
2016-01-01
PURPOSE The storage conditions of impressions affect the dimensional accuracy of the impression materials. The aim of the study was to assess the effects of storage time on dimensional accuracy of five different impression materials by cone beam computed tomography (CBCT). MATERIALS AND METHODS Polyether (Impregum), hydrocolloid (Hydrogum and Alginoplast), and silicone (Zetaflow and Honigum) impression materials were used for impressions taken from an acrylic master model. The impressions were poured and subjected to four different storage times: immediate use, and 1, 3, and 5 days of storage. Line 1 (between right and left first molar mesiobuccal cusp tips) and Line 2 (between right and left canine tips) were measured on a CBCT scanned model, and time dependent mean differences were analyzed by two-way univariate and Duncan's test (α=.05). RESULTS For Line 1, the total mean difference of Impregum and Hydrogum were statistically different from Alginoplast (P<.05), while Zetaflow and Honigum had smaller discrepancies. Alginoplast resulted in more difference than the other impressions (P<.05). For Line 2, the total mean difference of Impregum was statistically different from the other impressions. Significant differences were observed in Line 1 and Line 2 for the different storage periods (P<.05). CONCLUSION The dimensional accuracy of impression material is clinically acceptable if the impression material is stored in suitable conditions. PMID:27826388
Alkurt, Murat; Yeşıl Duymus, Zeynep; Dedeoglu, Numan
2016-10-01
The storage conditions of impressions affect the dimensional accuracy of the impression materials. The aim of the study was to assess the effects of storage time on dimensional accuracy of five different impression materials by cone beam computed tomography (CBCT). Polyether (Impregum), hydrocolloid (Hydrogum and Alginoplast), and silicone (Zetaflow and Honigum) impression materials were used for impressions taken from an acrylic master model. The impressions were poured and subjected to four different storage times: immediate use, and 1, 3, and 5 days of storage. Line 1 (between right and left first molar mesiobuccal cusp tips) and Line 2 (between right and left canine tips) were measured on a CBCT scanned model, and time dependent mean differences were analyzed by two-way univariate and Duncan's test (α=.05). For Line 1, the total mean difference of Impregum and Hydrogum were statistically different from Alginoplast (P<.05), while Zetaflow and Honigum had smaller discrepancies. Alginoplast resulted in more difference than the other impressions (P<.05). For Line 2, the total mean difference of Impregum was statistically different from the other impressions. Significant differences were observed in Line 1 and Line 2 for the different storage periods (P<.05). The dimensional accuracy of impression material is clinically acceptable if the impression material is stored in suitable conditions.
NASA Astrophysics Data System (ADS)
Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael
1994-11-01
This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting for product appearance, human-like inspection ability is required. A feature common to all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finishing determination and surface defect analysis, as well as real-time implementation for on-line inspection in high-speed applications. For surface finishing determination, a Gray Level Difference technique is presented that operates on low-resolution images, that is, non-zoomed images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims to reach on-line performance in automated visual inspection applications when texture is present on flat product surfaces.
One-dimensional turbulence modeling of a turbulent counterflow flame with comparison to DNS
Jozefik, Zoltan; Kerstein, Alan R.; Schmidt, Heiko; ...
2015-06-01
The one-dimensional turbulence (ODT) model is applied to a reactant-to-product counterflow configuration and results are compared with DNS data. The model employed herein solves conservation equations for momentum, energy, and species on a one dimensional (1D) domain corresponding to the line spanning the domain between nozzle orifice centers. The effects of turbulent mixing are modeled via a stochastic process, while the Kolmogorov and reactive length and time scales are explicitly resolved and a detailed chemical kinetic mechanism is used. Comparisons between model and DNS results for spatial mean and root-mean-square (RMS) velocity, temperature, and major and minor species profiles are shown. The ODT approach shows qualitatively and quantitatively reasonable agreement with the DNS data. Scatter plots and statistics conditioned on temperature are also compared for heat release rate and all species. ODT is able to capture the range of results depicted by DNS. As a result, conditional statistics show signs of underignition.
Physical characteristics of welding arc ignition process
NASA Astrophysics Data System (ADS)
Shi, Linan; Song, Yonglun; Xiao, Tianjiao; Ran, Guowei
2012-07-01
Existing research on the welding arc mainly focuses on the stable combustion state, and research on the mechanism of the welding arc ignition process is quite lacking. The tungsten inert gas (TIG) touch arc ignition process is observed via a high speed camera and a high time resolution spectral diagnosis system. A change in the main ionized elements providing the electrons during arc ignition is found: the metallic elements are the main contributors of electrons at the beginning of the discharge, and then the excited shielding gas element takes over this role from the metallic elements. The electron density during the arc ignition period is calculated from the Stark-broadened Hα lines. Through a discussion of the repeatability of the relaxation phenomenon, the statistical regularity of the arc ignition process is analyzed. Similar rules are observed through comparison with laser-assisted arc ignition experiments and metal inert gas (MIG) arc ignition experiments. This research is helpful for further understanding of the generation mechanism of welding arc ignition and also has a certain academic and practical significance for enriching the welding physics theoretical foundation and improving precise monitoring of the automatic arc welding process.
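The electron-density estimate from Stark broadening of the Hα line is commonly written as n_e ≈ 8.02 × 10¹² (Δλ / α) ^ 3/2 cm⁻³, with Δλ the measured FWHM in Å and α a tabulated reduced wavelength that depends on temperature and density. A hedged sketch (the α value below is purely illustrative; the paper does not give its numbers):

```python
# Sketch of the standard Stark-broadening electron-density estimate for
# the H-alpha line. alpha_half (the tabulated reduced wavelength) is an
# illustrative placeholder; a real analysis interpolates it from tables.

def electron_density(dlambda_fwhm_angstrom, alpha_half=0.08):
    """Electron density in cm^-3 from the H-alpha FWHM in Angstrom."""
    return 8.02e12 * (dlambda_fwhm_angstrom / alpha_half) ** 1.5
```

The 3/2 power means the estimate grows faster than linearly with line width, so an accurate FWHM fit matters more than the absolute line intensity.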
Metabolic phenotyping of a model of adipocyte differentiation
Roberts, Lee D.; Virtue, Sam; Vidal-Puig, Antonio; Nicholls, Andrew W.
2009-01-01
The 3T3-L1 murine cell line is a robust and widely used model for the study of adipogenesis and processes occurring in mature adipocytes. The fibroblastic like cells can be induced by hormones to differentiate into mature adipocytes. In this study, the metabolic phenotype associated with differentiation of the 3T3-L1 cell line has been studied using gas chromatography-mass spectrometry, 1H nuclear magnetic resonance spectroscopy, liquid chromatography-mass spectrometry, direct infusion-mass spectrometry, and 13C substrate labeling in conjunction with multivariate statistics. The changes in metabolite concentrations at distinct periods during differentiation have been defined including alterations in the TCA cycle, glycolysis, the production of odd chain fatty acids by α-oxidation, fatty acid synthesis, fatty acid desaturation, polyamine biosynthesis, and trans-esterification to produce complex lipids. The metabolic changes induced during differentiation of the 3T3-L1 cell line were then compared with the metabolic differences between pre- and postdifferentiation primary adipocytes. These metabolic alterations reflect the changing role of the 3T3-L1 cells during differentiation, as well as possibly providing metabolic triggers to stimulate the processes which occur during differentiation. PMID:19602617
Statistical properties of solar flares and coronal mass ejections through the solar cycle
NASA Astrophysics Data System (ADS)
Telloni, Daniele; Carbone, Vincenzo; Lepreti, Fabio; Antonucci, Ester
2016-03-01
Waiting Time Distributions (WTDs) of solar flares are investigated throughout the solar cycle. The same approach applied to Coronal Mass Ejections (CMEs) in a previous work is considered here for flare occurrence. Our analysis reveals that flares and CMEs share some common statistical properties, which turn out to depend on the level of solar activity. Both flares and CMEs seem to occur independently during minimum solar activity phases, whilst their WTDs significantly deviate from a Poisson function at solar maximum, thus suggesting that these events are correlated. The characteristics of the WTDs are constrained by the physical processes generating the eruptions associated with flares and CMEs. A scenario may be drawn in which different mechanisms are at work during different phases of the solar cycle. Stochastic processes, most likely related to random magnetic reconnections of the field lines, seem to play a key role during solar minimum periods. On the other hand, persistent processes, like sympathetic eruptions associated with the variability of the photospheric magnetism, are suggested to dominate during periods of high solar activity. Moreover, despite the similar statistical properties shown by flares and CMEs mentioned above, their WTDs appear different in some respects. During solar minimum periods, the randomness of flare occurrence seems to be more evident than for CMEs. The persistent mechanisms generating interdependent events during maximum periods of solar activity can be suggested to play a more important role for CMEs than for flares, thus mitigating the competitive action of the random processes, which seem instead strong enough to weaken the correlations among flare occurrences during solar minimum periods.
However, it cannot be excluded that the physical processes at the basis of the origin of the temporal correlation between solar events are different for flares and CMEs, or that, more likely, more sophisticated effects are at work at the same time leading to an even more complex picture. This work represents a first step for further investigations.
Text line extraction in free style document
NASA Astrophysics Data System (ADS)
Shen, Xiaolu; Liu, Changsong; Ding, Xiaoqing; Zou, Yanming
2009-01-01
This paper addresses text line extraction in free style documents, such as business cards, envelopes, and posters. In free style documents, global properties such as character size and line direction can hardly be inferred, which reveals a grave limitation of traditional layout analysis. 'Line' is the most prominent and the highest-level structure in our bottom-up method. First, we apply a novel intensity function based on gradient information to locate text areas, where gradients within a window have large magnitude and various directions, and split such areas into text pieces. We build a probability model of lines consisting of text pieces via statistics on training data. For an input image, we group text pieces into lines using a simulated annealing algorithm with a cost function based on the probability model.
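The annealing step can be sketched generically; the toy cost and proposal below are stand-ins for the paper's probability-model-based cost over line groupings:

```python
import math
import random

# Generic simulated-annealing skeleton of the kind used to group text
# pieces into lines: propose a local change, accept it if it lowers the
# cost, or with probability exp(-delta/T) otherwise, then cool T.

def anneal(state, cost, propose, t0=1.0, cooling=0.995, steps=2000, seed=0):
    rng = random.Random(seed)
    best, best_c = state, cost(state)
    cur, cur_c, t = state, best_c, t0
    for _ in range(steps):
        cand = propose(cur, rng)
        delta = cost(cand) - cur_c
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur, cur_c = cand, cur_c + delta      # accept the move
        if cur_c < best_c:
            best, best_c = cur, cur_c             # track the best state seen
        t *= cooling                              # cool the temperature
    return best, best_c
```

In the paper's setting, `state` would be an assignment of text pieces to lines, `propose` a local regrouping, and `cost` the negative log-probability under the trained line model; the toy usage below just minimizes a quadratic.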
NASA Astrophysics Data System (ADS)
Langer, W.
2007-10-01
Star formation activity throughout the Galactic disk depends on the thermal and dynamical state of the interstellar gas, which in turn depends on heating and cooling rates, modulated by the gravitational potential and shock and turbulent pressures. Molecular cloud formation, and thus the star formation, may be regulated by pressures in the interstellar medium (ISM). To understand these processes we need information about the properties of the diffuse atomic and diffuse molecular gas clouds, and Photon Dominated Regions (PDR). An important tracer of these regions is the CII line at 158 microns (1900.5 GHz). We propose a "pencil-beam" survey of CII with HIFI band 7b, based on deep integrations and systematic sparse sampling of the Galactic disk plus selected targets, totaling over 900 lines of sight. We will detect both emission and, against the bright inner Galaxy and selected continuum sources, absorption lines. These spectra will provide the astronomical community with a large rich statistical database of the diffuse cloud properties throughout the Galaxy for understanding the Milky Way ISM and, by extension, other galaxies. It will be extremely valuable for determining the properties of the atomic gas, the role of barometric pressure and turbulence in cloud evolution, and the properties of the interface between the atomic and molecular clouds. The CII line is one of the major ISM cooling lines and is present throughout the Galactic plane. It is the strongest far-IR emission line in the Galaxy, with a total luminosity about a 1000 times that of the CO J=1-0 line. Combined with other data, it can be used to determine density, pressure, and radiation environment in gas clouds, and PDRs, and their dynamics via velocity fields. HSO is the best opportunity over the next several years to probe the ISM in this tracer and will provide a template for large-scale surveys with dedicated small telescopes and future surveys of other important ISM tracers.
SDP_wlanger_3: State of the Diffuse ISM: Galactic Observations of the Terahertz CII Line (GOT CPlus)
NASA Astrophysics Data System (ADS)
Langer, W.
2011-09-01
Large signal design - Performance and simulation of a 3 W C-band GaAs power MMIC
NASA Astrophysics Data System (ADS)
White, Paul M.; Hendrickson, Mary A.; Chang, Wayne H.; Curtice, Walter R.
1990-04-01
This paper describes a C-band GaAs power MMIC amplifier that achieved a gain of 17 dB and a 1 dB compressed CW power output of 34 dBm across a 4.5-6.25-GHz frequency range, without design iteration. This first-pass design success was achieved through the application of a harmonic balance simulator to define the optimum output load, using a large-signal FET model determined statistically on a well-controlled, foundry-ready process line. The measured performance was close to that predicted by a full harmonic balance circuit analysis.
A Comprehensive Computer Package for Ambulatory Surgical Facilities
Kessler, Robert R.
1980-01-01
Ambulatory surgical centers are a cost effective alternative to hospital surgery. Their increasing popularity has contributed to heavy case loads, an accumulation of vast amounts of medical and financial data and economic pressures to maintain a tight control over “cash flow”. Computerization is now a necessity to aid ambulatory surgical centers to maintain their competitive edge. An on-line system is especially necessary as it allows interactive scheduling of surgical cases, immediate access to financial data and rapid gathering of medical and statistical information. This paper describes the significant features of the computer package in use at the Salt Lake Surgical Center, which processes 500 cases per month.
Developing an EEG-based on-line closed-loop lapse detection and mitigation system
Wang, Yu-Te; Huang, Kuan-Chih; Wei, Chun-Shu; Huang, Teng-Yi; Ko, Li-Wei; Lin, Chin-Teng; Cheng, Chung-Kuan; Jung, Tzyy-Ping
2014-01-01
In America, 60% of adults reported that they have driven a motor vehicle while feeling drowsy, and at least 15–20% of fatal car accidents are fatigue-related. This study translates previous laboratory-oriented neurophysiological research to design, develop, and test an On-line Closed-loop Lapse Detection and Mitigation (OCLDM) System featuring a mobile wireless dry-sensor EEG headgear and a cell-phone based real-time EEG processing platform. Eleven subjects participated in an event-related lane-keeping task, in which they were instructed to manipulate a randomly deviated, fixed-speed cruising car on a 4-lane highway. This was simulated in a first-person view with an 8-screen, 8-projector immersive virtual-reality environment. When the subjects experienced lapses or failed to respond to events during the experiment, an auditory warning was delivered to rectify the performance decrements. However, the arousing auditory signals were not always effective. The EEG spectra exhibited statistically significant differences between effective and ineffective arousing signals, suggesting that EEG spectra could be used to assess the efficacy of arousing signals. In this on-line pilot study, the proposed OCLDM System was able to continuously detect EEG signatures of fatigue, deliver arousing warnings to subjects suffering momentary cognitive lapses, and assess the efficacy of the warnings in near real-time to rectify cognitive lapses. The on-line testing results of the OCLDM System validated the efficacy of the arousing signals in improving subjects' response times to subsequent lane-departure events. This study may lead to a practical on-line lapse detection and mitigation system in real-world environments. PMID:25352773
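The kind of spectral comparison described above can be sketched as follows. This is an illustrative reconstruction with simulated signals: the sampling rate, band limits, and trial data are all assumptions, not the study's actual pipeline.

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean power of `signal` within the frequency `band` (Hz), via periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

rng = np.random.default_rng(0)
fs = 250.0  # assumed EEG sampling rate in Hz
t = np.arange(0, 4, 1 / fs)

# Simulated trials: "ineffective" trials carry stronger alpha (8-12 Hz) activity,
# a pattern often associated with drowsiness (illustrative data, not the study's).
effective = [0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size) for _ in range(20)]
ineffective = [2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size) for _ in range(20)]

alpha_eff = np.array([band_power(x, fs, (8, 12)) for x in effective])
alpha_ineff = np.array([band_power(x, fs, (8, 12)) for x in ineffective])

# Welch's t statistic as a simple measure of group separation
t_stat = (alpha_ineff.mean() - alpha_eff.mean()) / np.sqrt(
    alpha_ineff.var(ddof=1) / alpha_ineff.size + alpha_eff.var(ddof=1) / alpha_eff.size)
```

In a closed-loop setting, such a per-trial band-power feature would be compared against a threshold in near real-time to decide whether a delivered warning was effective.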
Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D
2017-11-07
Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed that no significant differences were observed in the charge heterogeneity profile of a mAb between at-line and on-line sampling, and that the on-line method can rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed, shorter strong cation exchange (SCX) assay. Both methods provided similar results, with the distribution of percent acidic, main, and basic species remaining unchanged over a 2-week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to process control in biomanufacturing.
Reuter, Martin; Wolter, Franz-Erich; Shenton, Martha; Niethammer, Marc
2009-01-01
This paper proposes the use of the surface-based Laplace-Beltrami and the volumetric Laplace eigenvalues and -functions as shape descriptors for the comparison and analysis of shapes. These spectral measures are isometry invariant and therefore allow for shape comparisons with minimal shape pre-processing. In particular, no registration, mapping, or remeshing is necessary. The discriminatory power of the 2D surface and 3D solid methods is demonstrated on a population of female caudate nuclei (a subcortical gray matter structure of the brain, involved in memory function, emotion processing, and learning) of normal control subjects and of subjects with schizotypal personality disorder. The behavior and properties of the Laplace-Beltrami eigenvalues and -functions are discussed extensively for both the Dirichlet and Neumann boundary conditions, showing advantages of the Neumann over the Dirichlet spectra in 3D. Furthermore, topological analyses employing the Morse-Smale complex (on the surfaces) and the Reeb graph (in the solids) are performed on selected eigenfunctions, yielding shape descriptors that are capable of localizing geometric properties and detecting shape differences by indirectly registering topological features such as critical points, level sets and integral lines of the gradient field across subjects. The use of these topological features of the Laplace-Beltrami eigenfunctions in 2D and 3D for statistical shape analysis is novel. PMID:20161035
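The core idea — Laplacian eigenvalues as isometry-invariant shape descriptors — can be illustrated with a minimal finite-difference sketch on 2D grid domains. This is a conceptual toy (a 5-point Dirichlet Laplacian on pixel masks), not the authors' surface/solid implementation; the grid size and shapes are arbitrary choices.

```python
import numpy as np

def dirichlet_spectrum(mask, h, k=5):
    """First k eigenvalues of the negative Laplacian with Dirichlet boundary
    conditions on the grid cells where `mask` is True (5-point finite differences)."""
    idx = -np.ones(mask.shape, dtype=int)
    idx[mask] = np.arange(mask.sum())
    n = int(mask.sum())
    L = np.zeros((n, n))
    for (i, j), inside in np.ndenumerate(mask):
        if not inside:
            continue
        r = idx[i, j]
        L[r, r] = 4.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ii, jj = i + di, j + dj
            if 0 <= ii < mask.shape[0] and 0 <= jj < mask.shape[1] and mask[ii, jj]:
                L[r, idx[ii, jj]] = -1.0
    return np.sort(np.linalg.eigvalsh(L / h**2))[:k]

# Two shapes of comparable scale: the unit square and a disk of radius 1/2.
h = 1.0 / 30                              # grid spacing; 29x29 interior points
xs = (np.arange(29) + 1) * h
X, Y = np.meshgrid(xs, xs, indexing="ij")
square = np.ones((29, 29), dtype=bool)
disk = (X - 0.5) ** 2 + (Y - 0.5) ** 2 < 0.25

ev_square = dirichlet_spectrum(square, h)  # analytic first eigenvalue: 2*pi^2
ev_disk = dirichlet_spectrum(disk, h)      # analytic: (j_{0,1}/0.5)^2 ~ 23.1
```

The two spectra differ, so they discriminate the shapes, yet each is unchanged under rotation or translation of its domain — the property that removes the need for registration.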
ERIC Educational Resources Information Center
Guthrie, Gerry D.
The objective of this study was to provide the library community with basic statistical data from on-line activity in the Ohio State University Libraries' Circulation System. Over 1.6 million archive records in the circulation system for 1972 were investigated to produce subject reports of circulation activity, activity reports by collection…
Singer, Heike; Walier, Maja; Nüsgen, Nicole; Meesters, Christian; Schreiner, Felix; Woelfle, Joachim; Fimmers, Rolf; Wienker, Thomas; Kalscheuer, Vera M; Becker, Tim; Schwaab, Rainer; Oldenburg, Johannes; El-Maarri, Osman
2012-01-01
LINE-1 repeats account for ~17% of the human genome. Little is known about their individual methylation patterns, because their repetitive, almost identical sequences make them difficult to target individually. Here, we used bisulfite conversion to study methylation at individual LINE-1 repeats. The loci studied included 39 X-linked loci and 5 autosomal loci. On the X chromosome in women, we found significantly less methylation at almost all L1Hs compared with men. Methylation at L1P and L1M did not correlate with the inactivation status of the host DNA, while the majority of L1Hs that could be studied lie in inactivated regions. To investigate whether the male-female differences at L1Hs on the X are linked to the inactivation process itself rather than to a mere influence of gender, we analyzed six of the L1Hs loci on the X chromosome in individuals with Turner and Klinefelter syndromes, who have female and male phenotypes, respectively, but with reversed numbers of X chromosomes. We confirmed that all samples with two X chromosomes are hypomethylated at the L1Hs loci. Therefore, the inactive X is hypomethylated at L1Hs; the latter could play an exclusive role in the X chromosome inactivation process. At autosomal L1Hs, methylation levels tended to correlate with genome size, with higher methylation observed at most loci in individuals with one X chromosome and the lowest in XXY individuals. In summary, locus-specific LINE-1 methylation levels show considerable plasticity and depend on genomic position and constitution.
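A per-locus comparison of methylation between two groups can be illustrated with a nonparametric rank-sum test. The methylation fractions below are hypothetical placeholders (XX samples hypomethylated relative to XY, as the abstract reports), and the normal-approximation z-score is a simplification of what a full analysis would use.

```python
import numpy as np

def rank_sum_z(a, b):
    """Wilcoxon rank-sum (Mann-Whitney) z-score via the normal approximation
    (no tie correction); negative z means sample `a` tends to be smaller."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n1, n2 = a.size, b.size
    ranks = np.empty(n1 + n2)
    order = np.argsort(np.concatenate([a, b]))
    ranks[order] = np.arange(1, n1 + n2 + 1)
    u = ranks[:n1].sum() - n1 * (n1 + 1) / 2          # U statistic for sample a
    mu = n1 * n2 / 2
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (u - mu) / sigma

# Hypothetical per-individual methylation fractions at one L1Hs locus
xx = [0.42, 0.45, 0.39, 0.48, 0.41, 0.44, 0.40, 0.46]  # two X chromosomes
xy = [0.61, 0.58, 0.65, 0.57, 0.63, 0.60, 0.59, 0.62]  # one X chromosome

z = rank_sum_z(xx, xy)   # strongly negative: XX group hypomethylated
```

In practice such a test would be run per locus with a multiple-testing correction across the 39 X-linked loci.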
Singer, Heike; Walier, Maja; Nüsgen, Nicole; Meesters, Christian; Schreiner, Felix; Woelfle, Joachim; Fimmers, Rolf; Wienker, Thomas; Kalscheuer, Vera M.; Becker, Tim; Schwaab, Rainer; Oldenburg, Johannes; El-Maarri, Osman
2012-01-01
PMID:21972244
Microjets in the penumbra of a sunspot
NASA Astrophysics Data System (ADS)
Drews, Ainar; Rouppe van der Voort, Luc
2017-06-01
Context. Penumbral microjets (PMJs) are short-lived jets found in the penumbra of sunspots, first observed in wide-band Ca II H line observations as localized brightenings, and are thought to be caused by magnetic reconnection. Earlier work on PMJs has focused on smaller samples of by-eye selected events and case studies. Aims: It is our goal to present an automated study of a large sample of PMJs to place the basic statistics of PMJs on a firm footing and to study the PMJ Ca II 8542 Å spectral profile in detail. Methods: High spatial resolution and spectrally well-sampled observations in the Ca II 8542 Å line obtained from the Swedish 1-m Solar Telescope (SST) were reduced by a principal component analysis and subsequently used in the automated detection of PMJs using the simple machine learning algorithm k-nearest neighbour. PMJ detections were verified with co-temporal Ca II H line observations. Results: We find a total of 453 tracked PMJ events, 4253 PMJ detections tallied over all timeframes, and a detection rate of 21 events per timestep. From these, an average length, width and lifetime of 640 km, 210 km and 90 s are obtained. The average PMJ Ca II 8542 Å line profile is characterized by enhanced inner wings, often in the form of one or two distinct peaks, and a brighter line core as compared to the quiet-Sun average. Average blue and red peak positions are determined at -10.4 km s-1 and +10.2 km s-1 offsets from the Ca II 8542 Å line core. We find several clusters of PMJ hot-spots within the sunspot penumbra, in which PMJ events occur in the same general area repeatedly over time. Conclusions: Our results indicate smaller average PMJ sizes and longer lifetimes compared to previously published values, but with statistics still of the same orders of magnitude. The investigation and analysis of the PMJ line profiles strengthens the proposed heating of PMJs to transition region temperatures.
The presented statistics on PMJs form a solid basis for future investigations and numerical modelling of PMJs.
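The detection scheme above — classify PCA-reduced spectral profiles with k-nearest neighbour — can be sketched as follows. The two-dimensional features and cluster locations below are invented stand-ins for PCA-reduced Ca II 8542 Å profiles, not the paper's data.

```python
import numpy as np

def knn_predict(train_X, train_y, query_X, k=5):
    """Majority-vote k-nearest-neighbour classification (Euclidean distance)."""
    preds = []
    for q in query_X:
        d = np.linalg.norm(train_X - q, axis=1)
        nearest = train_y[np.argsort(d)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

rng = np.random.default_rng(1)
# Toy 2-D feature vectors standing in for PCA-reduced spectra:
# class 1 ("PMJ") has enhanced inner wings -> a shifted feature cluster.
quiet = rng.normal(0.0, 0.5, (100, 2))
pmj = rng.normal(2.0, 0.5, (100, 2))
X = np.vstack([quiet, pmj])
y = np.array([0] * 100 + [1] * 100)

test_pts = np.array([[0.1, -0.2], [2.1, 1.9]])
labels = knn_predict(X, y, test_pts, k=5)   # expected classes: quiet, PMJ
```

With a labelled training set from by-eye identifications, the same classifier can be swept over every pixel and timeframe to produce the automated detection counts quoted in the abstract.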
NASA Astrophysics Data System (ADS)
Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.
2006-09-01
Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of the diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm, which was tested in two-, three- and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
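The circle Hough transform that underlies this kind of bubble/drop reconstruction can be sketched minimally: each edge point votes for all centres that could produce a circle of a given radius through it, and the accumulator peak locates the object. This is a generic fixed-radius sketch with synthetic edge points, not the paper's improved algorithm.

```python
import numpy as np

def hough_circle(edge_points, shape, radius):
    """Accumulate votes for circle centres of a fixed `radius` from edge points."""
    acc = np.zeros(shape)
    thetas = np.linspace(0, 2 * np.pi, 180, endpoint=False)
    for (y, x) in edge_points:
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return acc

# Synthetic "bubble": edge points on a circle of radius 12 centred at (40, 50)
angles = np.linspace(0, 2 * np.pi, 120, endpoint=False)
edges = [(40 + 12 * np.sin(a), 50 + 12 * np.cos(a)) for a in angles]

acc = hough_circle(edges, (80, 100), radius=12)
cy, cx = np.unravel_index(acc.argmax(), acc.shape)   # recovered centre
```

A practical implementation would also scan over a range of radii and detect multiple peaks, one per bubble or drop.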
NASA Astrophysics Data System (ADS)
Szyjka, Sebastian P.
The purpose of this study was to determine the extent to which six cognitive and attitudinal variables predicted pre-service elementary teachers' performance on line graphing. Predictors included Illinois teacher education basic skills sub-component scores in reading comprehension and mathematics, logical thinking performance scores, as well as measures of attitudes toward science, mathematics and graphing. This study also determined the strength of the relationship between each prospective predictor variable and the line graphing performance variable, as well as the extent to which measures of attitude towards science, mathematics and graphing mediated relationships between scores on mathematics, reading, logical thinking and line graphing. Ninety-four pre-service elementary education teachers enrolled in two different elementary science methods courses during the spring 2009 semester at Southern Illinois University Carbondale participated in this study. Each subject completed five different instruments designed to assess science, mathematics and graphing attitudes as well as logical thinking and graphing ability. Sixty subjects provided copies of primary basic skills score reports that listed subset scores for both reading comprehension and mathematics. The remaining scores were supplied by a faculty member who had access to a database from which the scores were drawn. Seven subjects, whose scores could not be found, were eliminated from final data analysis. Confirmatory factor analysis (CFA) was conducted in order to establish validity and reliability of the Questionnaire of Attitude Toward Line Graphs in Science (QALGS) instrument. CFA tested the statistical hypothesis that the five main factor structures within the Questionnaire of Attitude Toward Statistical Graphs (QASG) would be maintained in the revised QALGS. Stepwise Regression Analysis with backward elimination was conducted in order to generate a parsimonious and precise predictive model. 
This procedure allowed the researcher to explore the relationships among the affective and cognitive variables that were included in the regression analysis. The results for CFA indicated that the revised QALGS measure was sound in its psychometric properties when tested against the QASG. Reliability statistics indicated that the overall reliability for the 32 items in the QALGS was .90. The learning preferences construct had the lowest reliability (.67), while enjoyment (.89), confidence (.86) and usefulness (.77) constructs had moderate to high reliabilities. The first four measurement models fit the data well as indicated by the appropriate descriptive and statistical indices. However, the fifth measurement model did not fit the data well statistically, and only fit well with two descriptive indices. The results addressing the research question indicated that mathematical and logical thinking ability were significant predictors of line graph performance among the remaining group of variables. These predictors accounted for 41% of the total variability on the line graph performance variable. Partial correlation coefficients indicated that mathematics ability accounted for 20.5% of the variance on the line graphing performance variable when removing the effect of logical thinking. The logical thinking variable accounted for 4.7% of the variance on the line graphing performance variable when removing the effect of mathematics ability.
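Stepwise regression with backward elimination, as used above, can be sketched as follows. The predictor names, data, and the partial-F cutoff are illustrative assumptions (simulated data, not the study's), and the sketch drops one predictor at a time while its partial F statistic stays below the cutoff.

```python
import numpy as np

def ols_rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def backward_eliminate(X, y, names, f_crit=4.0):
    """Backward elimination: repeatedly drop the predictor whose removal costs
    the least, while its partial F statistic stays below `f_crit` (~p > 0.05)."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        n, p = X.shape[0], len(keep) + 1   # +1 for the intercept
        full = np.column_stack([np.ones(n)] + [X[:, j] for j in keep])
        rss_full = ols_rss(full, y)
        fs = []
        for j in keep:
            red = np.column_stack([np.ones(n)] + [X[:, c] for c in keep if c != j])
            fs.append((ols_rss(red, y) - rss_full) / (rss_full / (n - p)))
        worst = int(np.argmin(fs))
        if fs[worst] >= f_crit:
            break
        keep.pop(worst)
    return [names[j] for j in keep]

rng = np.random.default_rng(2)
n = 80
math_skill = rng.normal(size=n)
logic = rng.normal(size=n)
attitude = rng.normal(size=n)                   # unrelated predictor in this toy setup
graphing = 0.8 * math_skill + 0.5 * logic + rng.normal(0, 0.5, n)

X = np.column_stack([math_skill, logic, attitude])
selected = backward_eliminate(X, graphing, ["math", "logic", "attitude"])
```

In this simulated setup the two genuinely predictive variables survive elimination, mirroring the study's finding that mathematics and logical thinking ability carried the predictive weight.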
Lip line preference for variant face types.
Anwar, Nabila; Fida, Mubassar
2012-06-01
To determine the effect of altered lip line on attractiveness and to find the preferred lip line for vertical face types in both genders. Cross-sectional analytical study. The Aga Khan University Hospital, Karachi, from May to July 2009. Photographs of two selected subjects were altered to produce three face types for the same individual with the aim of keeping the frame of the smile constant. The lip line was then altered for both subjects as follows: both dentitions visible, upper incisors visible, upper incisors plus 2 mm of gum visible, and upper incisors plus 4 mm of gum visible. The pictures were rated by different professionals for attractiveness. Descriptive statistics for the raters and multiple-factor ANOVA were used to find the most attractive lip line. The total number of raters was 100, with a mean age of 30.3 ± 8 years. The alterations in the smile parameters produced statistically significant differences in the attractiveness of the faces, whereas the perception difference was found to be insignificant among raters of different professions. The preferred lip line was the one showing only the upper incisors in dolicho- and mesofacial males and females, whereas a 2 mm gum show was preferred in brachyfacial subjects. The variability in lip line showed significant differences in perceived attractiveness. The preferred lip line was the one showing only the upper incisors in dolicho- and mesofacial subjects of both genders, whereas a 2 mm gum show was preferred in brachyfacial subjects.
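The ANOVA used to compare lip-line conditions can be illustrated with a one-way F statistic. The ratings below are hypothetical (the study's data are not reproduced here), and a full analysis would be multi-factor with face type and gender as additional factors.

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F statistic: between-group vs. within-group variance."""
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, n = len(groups), all_x.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical attractiveness ratings (1-10) for three lip-line conditions
upper_incisors = np.array([8, 7, 9, 8, 7, 8, 9, 8], dtype=float)
gum_2mm = np.array([6, 7, 6, 5, 7, 6, 6, 7], dtype=float)
gum_4mm = np.array([4, 3, 5, 4, 4, 3, 5, 4], dtype=float)

F = anova_f([upper_incisors, gum_2mm, gum_4mm])   # large F -> conditions differ
```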
NASA Astrophysics Data System (ADS)
Zhu, Hao
Sparsity plays an instrumental role in a plethora of scientific fields, including statistical inference for variable selection, parsimonious signal representations, and solving under-determined systems of linear equations, which has led to the ground-breaking result of compressive sampling (CS). This thesis leverages ideas from sparse signal reconstruction to develop sparsity-cognizant algorithms and analyze their performance. The vision is to devise tools exploiting the 'right' form of sparsity for the 'right' application domain: multiuser communication systems, array signal processing systems, and the emerging challenges in the smart power grid. Two important power system monitoring tasks are addressed first by capitalizing on hidden sparsity. To robustify power system state estimation, a sparse outlier model is leveraged to capture possible corruption in every datum, while the problem's nonconvexity due to nonlinear measurements is handled using the semidefinite relaxation technique. Different from existing iterative methods, the proposed algorithm approximates the global optimum well regardless of the initialization. In addition, for enhanced situational awareness, a novel sparse overcomplete representation is introduced to capture (possibly multiple) line outages, and real-time algorithms are developed for solving the combinatorially complex identification problem. The proposed algorithms exhibit near-optimal performance while incurring only linear complexity in the number of lines, which makes it possible to quickly bring contingencies to attention. This thesis also accounts for two basic issues in CS, namely fully-perturbed models and the finite-alphabet property. The sparse total least-squares (S-TLS) approach is proposed to furnish CS algorithms for fully-perturbed linear models, leading to statistically optimal and computationally efficient solvers.
The S-TLS framework is well motivated for grid-based sensing applications and exhibits higher accuracy than existing sparse algorithms. On the other hand, exploiting the finite alphabet of unknown signals emerges naturally in communication systems, along with sparsity coming from the low activity of each user. Compared to approaches only accounting for either one of the two, joint exploitation of both leads to statistically optimal detectors with improved error performance.
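The sparse-recovery building block behind this line of work can be sketched with iterative soft thresholding (ISTA) for the lasso — a standard CS solver, not the thesis's S-TLS or outage-identification algorithms. The measurement sizes, regularization weight, and the line-outage interpretation of the sparse vector are illustrative assumptions.

```python
import numpy as np

def ista(A, y, lam, steps=2000):
    """Iterative soft-thresholding for the lasso: min 0.5*||y - Ax||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = x + A.T @ (y - A @ x) / L          # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(3)
n, m = 40, 100                                 # 40 measurements, 100 unknowns
A = rng.normal(size=(n, m)) / np.sqrt(n)       # random sensing matrix
x_true = np.zeros(m)
x_true[[5, 37, 81]] = [2.0, -1.5, 1.0]         # e.g. a sparse outage indicator vector
y = A @ x_true + rng.normal(0, 0.01, n)        # noisy under-determined measurements

x_hat = ista(A, y, lam=0.05)
support = np.where(np.abs(x_hat) > 0.3)[0]     # recovered sparse support
```

Despite having far fewer measurements than unknowns, the sparsity prior lets the solver recover which few entries are active — the same principle the thesis exploits for identifying line outages.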
NASA Astrophysics Data System (ADS)
Ouriev, Boris; Windhab, Erich; Braun, Peter; Zeng, Yuantong; Birkhofer, Beat
2003-12-01
In the present work an in-line ultrasonic method for investigating the rheological flow behavior of concentrated suspensions was created. It is based on a nondestructive rheological measuring technique for pilot-plant and industrial-scale applications. Elsewhere, the author discusses the strong need for in-line rheological characterization of highly concentrated suspensions exposed to pressure-driven shear flow conditions. Most existing on-line methods are based on destructive macro actuators, which are not suitable for materials whose structure is sensitive to applied deformation. Since the process of our basic interest influences the structure of the suspension, it would be difficult to separate the effects of rheometric measurement from the weakly pronounced structural changes arising from a fine adjustment of the process parameters. The magnitude of these effects is usually associated with the complex flow dynamics of structured liquids and is sensitive to density or temperature fluctuations around the moving rheometric actuator. Interpretation of the results of such measurements can be hindered by process parameter influences on the liquid product structure. Therefore, the author introduces an in-line noninvasive rheometric method, which is implemented in a pre-crystallization process of chocolate suspension. Use of the ultrasound velocity profile pressure difference (UVP-PD) technique enabled process monitoring of the chocolate pre-crystallization process. The influence of seed crystals on the rheology of the chocolate suspension was recorded and monitored on-line. It was shown that even slight velocity pulsations in the chocolate main stream can strongly influence rheological properties as well as flow velocity profiles. Based on a power-law fit to the raw velocity profiles and calculation of the wall shear stress from the pressure difference measurement, a viscosity function was calculated and monitored on-line.
On-line results were found to be in good agreement with off-line data. The results of the industrial test of the UVP-PD system brought practical knowledge and stimulated further development of a Smart UVP-PD noninvasive on-line rheometer.
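The UVP-PD calculation chain — fit a power-law index to the measured velocity profile, get the wall shear stress from the pressure drop, and combine them into an apparent viscosity — can be sketched as follows. All pipe dimensions and fluid parameters are invented for illustration; the actual system measures the profile ultrasonically.

```python
import numpy as np

def fit_powerlaw_index(r, u, R):
    """Fit the flow behaviour index n of a power-law pipe profile
    u(r) = u_max * (1 - (r/R)**((n+1)/n)) by a brute-force least-squares scan."""
    best_n, best_err = None, np.inf
    u_max = u.max()
    for n in np.linspace(0.1, 2.0, 191):
        model = u_max * (1 - (np.abs(r) / R) ** ((n + 1) / n))
        err = np.sum((u - model) ** 2)
        if err < best_err:
            best_n, best_err = n, err
    return best_n

R, L_pipe, dP = 0.02, 1.0, 8000.0            # pipe radius (m), length (m), pressure drop (Pa)
n_true, u_max = 0.5, 0.3                     # shear-thinning fluid, centreline velocity (m/s)
r = np.linspace(0, R, 50)                    # "measured" velocity profile (synthetic)
u = u_max * (1 - (r / R) ** ((n_true + 1) / n_true))

n_fit = fit_powerlaw_index(r, u, R)
tau_w = dP * R / (2 * L_pipe)                # wall shear stress from pressure difference
V = u_max * (n_true + 1) / (3 * n_true + 1)  # mean velocity of a power-law profile
gamma_w = (3 * n_fit + 1) / (4 * n_fit) * (8 * V / (2 * R))  # Rabinowitsch-Mooney wall shear rate
eta_w = tau_w / gamma_w                      # apparent viscosity at the wall (Pa s)
```

Because both the profile and the pressure drop are acquired continuously, this chain yields a viscosity value per measurement cycle, which is what allows on-line monitoring of the pre-crystallization state.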
Collecting maple sap with unvented spouts, using aerial and ground lines
H. Clay Smith; Carter B. Gibbs
1971-01-01
Two methods of using plastic tubing to collect sugar maple sap were tried: aerial lines and ground lines. Unvented spouts were used in both. We found that the sap yields collected from the aerial and ground lines were not statistically different from each other.
ERIC Educational Resources Information Center
Stevenson, Jill L.; Moore, Dale A.; Newman, Jerry; Schmidt, Janet L.; Smith, Sarah M.; Smith, Jean; Kerr, Susan; Wallace, Michael; BoyEs, Pat
2011-01-01
An on-line module on disease prevention was created for 4-H volunteer leaders who work with livestock projects in Washington to better prepare them to teach youth about bio-security and its importance in 4-H livestock projects. Evaluation of the module and usage statistics since the module's debut were collected and evaluated. The module increases…
Plagiarism governance in nurse education; dispositions, dimensions and tensions.
Welsh, Marion
2017-11-01
The reality of managing plagiarism in nurse education is indicative of multilayered and cumulative governance processes, which exist to fit the needs of both the higher education institution and the Professional Statutory and Regulatory Body. However, the relationship between these entities is diffuse, particularly when this involves major plagiarism by post-qualified learners. This study sought to explore the strategic governance of plagiarism in Scottish higher education institutions offering nurse education and its articulation with the professional requirements of nurse education. The design involved a retrospective quantitative documentary analysis of plagiarism policies within 11 Scottish higher education institutions and a national on-line survey involving nurse educators with an active teaching role (n = 187). The documentary analysis demonstrated deficits and variations in how Scottish higher education institutions communicated the dimensions of plagiarism and its subsequent management. Statistically significant findings from the on-line survey provided a clear mandate for educational providers to make visible the connectivity between organisational and professional governance processes, to support responsive and proportional approaches to managing plagiarism by nurse learners. Significant findings also confirmed role implications and responsibilities, which nurse educators in this study viewed as primarily pedagogical but which crucially remain professionally centric. Copyright © 2017 Elsevier Ltd. All rights reserved.
An On-Line Virtual Environment for Teaching Statistical Sampling and Analysis
ERIC Educational Resources Information Center
Marsh, Michael T.
2009-01-01
Regardless of the related discipline, students in statistics courses invariably have difficulty understanding the connection between the numerical values calculated for end-of-the-chapter exercises and their usefulness in decision making. This disconnect is, in part, due to the lack of time and opportunity to actually design the experiments and…
Seismicity map tools for earthquake studies
NASA Astrophysics Data System (ADS)
Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos
2014-05-01
We report on the development of a new, online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript, and MySQL) and the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data within Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple horizontal and vertical straight lines, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shifts between segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b'-value, etc. What is novel for the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
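The b-value calculation mentioned above is commonly done with the Aki/Utsu maximum-likelihood estimator; here is a minimal sketch on a synthetic Gutenberg-Richter catalogue (illustrative, not the platform's actual code; the completeness magnitude and catalogue size are assumptions).

```python
import numpy as np

def b_value(mags, mc, dm=0.0):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= completeness mc,
    with optional magnitude-binning correction dm."""
    m = np.asarray(mags, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2))

rng = np.random.default_rng(4)
# Synthetic catalogue following Gutenberg-Richter with b = 1.0 above mc = 2.0:
# magnitudes above mc are exponentially distributed with scale log10(e)/b.
b_true, mc = 1.0, 2.0
mags = mc + rng.exponential(np.log10(np.e) / b_true, 5000)

b_hat = b_value(mags, mc)   # should recover b ~ 1.0
```

In the platform's workflow, the same estimate would be recomputed per regional segment or per time window selected by the user's filters.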
Geometry-based across wafer process control in a dual damascene scenario
NASA Astrophysics Data System (ADS)
Krause, Gerd; Hofmann, Detlef; Habets, Boris; Buhl, Stefan; Gutsch, Manuela; Lopez-Gomez, Alberto; Thrun, Xaver
2018-03-01
Dual damascene is an established patterning process in the back-end-of-line used to generate copper interconnects and lines. One of the critical output parameters is the electrical resistance of the metal lines. In our 200 mm line, this is currently controlled by a feed-forward control from the etch process to the final step in the CMP process. In this paper, we investigate the impact of an alternative feed-forward control using a calibrated physical model that estimates the impact on the electrical resistance of the metal lines. This is done by simulation on a large set of wafers. Three different approaches are evaluated, one of which uses different feed-forward settings for different radial zones in the CMP process.
NASA Astrophysics Data System (ADS)
Cai, Yuanji; Guan, Yonggang; Liu, Weidong
2017-06-01
Transient enclosure voltage (TEV), which is a phenomenon induced by the inner dielectric breakdown of SF6 during disconnector operations in a gas-insulated switchgear (GIS), may cause issues relating to shock hazard and electromagnetic interference to secondary equipment. This is a critical factor regarding the electromagnetic compatibility of ultra-high-voltage (UHV) substations. In this paper, the statistical characteristics of TEV at UHV level are collected from field experiments, and are analyzed and compared to those from a repeated strike process. The TEV waveforms during disconnector operations are recorded by a self-developed measurement system first. Then, statistical characteristics, such as the pulse number, duration of pulses, frequency components, magnitude and single pulse duration, are extracted. The transmission line theory is introduced to analyze the TEV and is validated by the experimental results. Finally, the relationship between the TEV and the repeated strike process is analyzed. This proves that the pulse voltage of the TEV is proportional to the corresponding breakdown voltage. The results contribute to the definition of the standard testing waveform of the TEV, and can aid the protection of electronic devices in substations by minimizing the threat of this phenomenon.
Pinning time statistics for vortex lines in disordered environments.
Dobramysl, Ulrich; Pleimling, Michel; Täuber, Uwe C
2014-12-01
We study the pinning dynamics of magnetic flux (vortex) lines in a disordered type-II superconductor. Using numerical simulations of a directed elastic line model, we extract the pinning time distributions of vortex line segments. We compare different model implementations for the disorder in the surrounding medium: discrete, localized pinning potential wells that are either attractive and repulsive or purely attractive, and whose strengths are drawn from a Gaussian distribution; as well as continuous Gaussian random potential landscapes. We find that both schemes yield power-law distributions in the pinned phase as predicted by extreme-event statistics, yet they differ significantly in their effective scaling exponents and their short-time behavior.
A search for spectral lines in gamma-ray bursts using TGRS
NASA Astrophysics Data System (ADS)
Kurczynski, P.; Palmer, D.; Seifert, H.; Teegarden, B. J.; Gehrels, N.; Cline, T. L.; Ramaty, R.; Hurley, K.; Madden, N. W.; Pehl, R. H.
1998-05-01
We present the results of an ongoing search for narrow spectral lines in gamma-ray burst data. TGRS, the Transient Gamma-Ray Spectrometer aboard the Wind satellite, is a high-energy-resolution Ge device. Thus it is uniquely suited among the array of space-based, burst-sensitive instruments to search for line features in gamma-ray burst spectra. Our search strategy adopts a two-tiered approach. An automated `quick look' scan searches spectra for statistically significant deviations from the continuum. We analyzed all possible time accumulations of spectra as well as individual spectra for each burst. Follow-up analysis of potential line candidates uses model fitting with F-test and χ2 tests for statistical significance.
Interactions dominate the dynamics of visual cognition.
Stephen, Damian G; Mirman, Daniel
2010-04-01
Many cognitive theories have described behavior as the summation of independent contributions from separate components. Contrasting views have emphasized the importance of multiplicative interactions and emergent structure. We describe a statistical approach to distinguishing additive and multiplicative processes and apply it to the dynamics of eye movements during classic visual cognitive tasks. The results reveal interaction-dominant dynamics in eye movements in each of the three tasks, and that fine-grained eye movements are modulated by task constraints. These findings reveal the interactive nature of cognitive processing and are consistent with theories that view cognition as an emergent property of processes that are broadly distributed over many scales of space and time rather than a componential assembly line. Copyright 2009 Elsevier B.V. All rights reserved.
Statistical Fine Structure in the Inhomogeneously Broadened Electronic Origin of Pentacene in p-Terphenyl
Carter, T. P.; Manavi, M.; Moerner, W. E.
1988-01-29
Prepared for publication in the Journal of Chemical Physics. ...of pentacene in p-terphenyl using laser FM spectroscopy. Statistical fine structure is time-independent structure on the inhomogeneous line caused by...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming, E-mail: liyanrong@mail.ihep.ac.cn
Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
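The sum-of-displaced-Gaussians representation of the transfer function can be sketched as follows; the weights, centers, and width are illustrative values invented here, not numbers fitted in the paper:

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def transfer_function(tau, weights, centers, width):
    """Transfer function expressed as a sum of relatively displaced
    Gaussian response functions; arbitrary shapes emerge from the
    weights, which would be determined from the RM data."""
    return sum(w * gaussian(tau, c, width) for w, c in zip(weights, centers))

# Delay grid and a two-component response: clouds echoing at ~5 and ~15 days.
dt = 0.1
taus = [dt * n for n in range(301)]
psi = [transfer_function(t, [0.7, 0.3], [5.0, 15.0], 1.5) for t in taus]

# The centroid of the transfer function is the mean reverberation lag.
mean_lag = sum(t * p for t, p in zip(taus, psi)) / sum(psi)
print(round(mean_lag, 1))  # ≈ 8.0 days = 0.7*5 + 0.3*15
```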
Excursion Processes Associated with Elliptic Combinatorics
NASA Astrophysics Data System (ADS)
Baba, Hiroya; Katori, Makoto
2018-06-01
Researching elliptic analogues for equalities and formulas is a new trend in enumerative combinatorics which has followed the previous trend of studying q-analogues. Recently Schlosser proposed a lattice path model in the square lattice with a family of totally elliptic weight-functions including several complex parameters and discussed an elliptic extension of the binomial theorem. In the present paper, we introduce a family of discrete-time excursion processes on Z starting from the origin and returning to the origin in a given time duration 2 T associated with Schlosser's elliptic combinatorics. The processes are inhomogeneous both in space and time and hence expected to provide new models in non-equilibrium statistical mechanics. By numerical calculation we show that the maximum likelihood trajectories on the spatio-temporal plane of the elliptic excursion processes and of their reduced trigonometric versions are not straight lines in general but are nontrivially curved depending on parameters. We analyze asymptotic probability laws in the long-term limit T → ∞ for a simplified trigonometric version of excursion process. Emergence of nontrivial curves of trajectories in a large scale of space and time from the elementary elliptic weight-functions exhibits a new aspect of elliptic combinatorics.
Analysis of grinding of superalloys and ceramics for off-line process optimization
NASA Astrophysics Data System (ADS)
Sathyanarayanan, G.
The present study has compared the performance of resinoid, vitrified, and electroplated CBN wheels in creep feed grinding of M42 and D2 tool steels. Responses such as specific energy, normal and tangential forces, and surface roughness were used as measures of performance. It was found that creep feed grinding with resinoid, vitrified, and electroplated CBN wheels each has its own advantages, but no single wheel could provide good finish, low specific energy, and high material removal rates simultaneously. To optimize CBN grinding with different bonded wheels, a Multiple Criteria Decision Making (MCDM) methodology was used. Creep feed grinding of the superalloys Ti-6Al-4V and Inconel 718 has been modeled using neural networks to optimize the grinding process. A parallel effort was directed at creep feed grinding of alumina ceramics with diamond wheels to investigate the influence of process variables on responses, based on experimental results and statistical analysis. A conflicting influence of variables was observed, which led to the formulation of the ceramic grinding process as a multi-objective nonlinear mixed-integer problem.
Modelling the line-of-sight contribution in substructure lensing
NASA Astrophysics Data System (ADS)
Despali, Giulia; Vegetti, Simona; White, Simon D. M.; Giocoli, Carlo; van den Bosch, Frank C.
2018-04-01
We investigate how Einstein rings and magnified arcs are affected by small-mass dark-matter haloes placed along the line of sight to gravitational lens systems. By comparing the gravitational signature of line-of-sight haloes with that of substructures within the lensing galaxy, we derive a mass-redshift relation that allows us to rescale the detection threshold (i.e. lowest detectable mass) for substructures to a detection threshold for line-of-sight haloes at any redshift. We then quantify the line-of-sight contribution to the total number density of low-mass objects that can be detected through strong gravitational lensing. Finally, we assess the degeneracy between substructures and line-of-sight haloes of different mass and redshift to provide a statistical interpretation of current and future detections, with the aim of distinguishing between cold dark matter and warm dark matter. We find that line-of-sight haloes statistically dominate with respect to substructures, by an amount that strongly depends on the source and lens redshifts, and on the chosen dark-matter model. Substructures represent about 30 per cent of the total number of perturbers for low lens and source redshifts (as for the SLACS lenses), but less than 10 per cent for high-redshift systems. We also find that for data with high enough signal-to-noise ratio and angular resolution, the non-linear effects arising from a double-lens-plane configuration are such that one is able to observationally recover the line-of-sight halo redshift with an absolute error of 0.15 at the 68 per cent confidence level.
Effect of the image resolution on the statistical descriptors of heterogeneous media.
Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime
2018-02-01
The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction due to progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function follows the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function deviates from the reference function. Moreover, the equally weighted sum of the average of the squared difference between the discrete correlation functions of the decimated images and the reference functions leads to the definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps.
At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help us to restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost, when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintaining the statistical quality of the digitized sample.
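As a hedged illustration of the descriptors involved (not the authors' code), the horizontal two-point probability S2(r) of a binary image, and one of the three decimation procedures named above (random), can be sketched as:

```python
import random

def two_point(img, r):
    """Horizontal two-point probability S2(r): the chance that two pixels
    a distance r apart along a row both belong to the phase of interest
    (pixel value 1). S2(0) equals the volume fraction."""
    hits = total = 0
    for row in img:
        for x in range(len(row) - r):
            total += 1
            hits += row[x] * row[x + r]
    return hits / total

def decimate_random(img, step):
    """Random decimation: keep one randomly chosen pixel from each
    step-wide block along every row (bilinear/bicubic decimation would
    interpolate instead of sampling)."""
    return [[row[x + random.randrange(step)]
             for x in range(0, len(row) - step + 1, step)]
            for row in img]

random.seed(1)
img = [[1 if random.random() < 0.4 else 0 for _ in range(256)] for _ in range(256)]
print(round(two_point(img, 0), 2))  # volume fraction, ≈ 0.4
print(round(two_point(img, 8), 2))  # uncorrelated medium: ≈ 0.4**2 = 0.16
half = decimate_random(img, 2)      # rows reduced from 256 to 128 pixels
```

For this uncorrelated toy image, decimation preserves the volume fraction; the paper's point is that for real microstructures, information is lost once the decimation step exceeds the correlation length.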
Statistical Investigation of Supersonic Downflows in the Transition Region above Sunspots
NASA Astrophysics Data System (ADS)
Samanta, Tanmoy; Tian, Hui; Prasad Choudhary, Debi
2018-06-01
Downflows at supersonic speeds have been observed in the transition region (TR) above sunspots for more than three decades. These downflows are often seen in different TR spectral lines above sunspots. We have performed a statistical investigation of these downflows using a large sample that was missing previously. The Interface Region Imaging Spectrograph (IRIS) has provided a wealth of observational data of sunspots at high spatial and spectral resolutions in the past few years. We have identified 60 data sets obtained with IRIS raster scans. Using an automated code, we identified the locations of strong downflows within these sunspots. We found that around 80% of our sample shows supersonic downflows in the Si IV 1403 Å line. These downflows mostly appear in the penumbral regions, though some of them are found in the umbrae. We also found that almost half of these downflows show signatures in chromospheric lines. Furthermore, a detailed spectral analysis was performed by selecting a small spectral window containing the O IV 1400/1401 Å and Si IV 1403 Å lines. Six Gaussian functions were simultaneously fitted to these three spectral lines and their satellite lines associated with the supersonic downflows. We calculated the intensity, Doppler velocity, and line width for these lines. Using the O IV 1400/1401 Å line ratio, we find that the downflow components are around one order of magnitude less dense than the regular components. Results from our statistical analysis suggest that these downflows may originate from the corona and that they are independent of the background TR plasma.
Fuzzy self-learning control for magnetic servo system
NASA Technical Reports Server (NTRS)
Tarn, J. H.; Kuo, L. T.; Juang, K. Y.; Lin, C. E.
1994-01-01
It is known that an effective control system is the key condition for successful implementation of high-performance magnetic servo systems. Major issues in designing such control systems are nonlinearity; unmodeled dynamics, such as secondary effects of copper resistance, stray fields, and saturation; and disturbance rejection for the load effect, which acts directly on the servo system without transmission elements. One typical approach to designing control systems under these conditions is a special type of nonlinear feedback called gain scheduling, which accommodates linear regulators whose parameters are changed as a function of operating conditions in a preprogrammed way. In this paper, an on-line learning fuzzy control strategy is proposed. To inherit the wealth of linear control design, the relations between linear feedback and fuzzy logic controllers have been established. The exercise of engineering axioms of linear control design is thus transformed into the tuning of appropriate fuzzy parameters. Furthermore, fuzzy logic control extends the domain of candidate control laws from linear to nonlinear, bringing new prospects to the design of the local controllers. On the other hand, a self-learning scheme is utilized to automatically tune the fuzzy rule base. It is based on a network learning infrastructure; statistical approximation to assign credit; an animal learning method to update the reinforcement map with a fast learning rate; and a temporal-difference predictive scheme to optimize the control laws. Different from supervised and statistical unsupervised learning schemes, the proposed method learns on-line from past experience and information from the process, and forms the rule base of an FLC system from randomly assigned initial control rules.
Li, Yang; Wu, Zhi-Sheng; Pan, Xiao-Ning; Shi, Xin-Yuan; Guo, Ming-Ye; Xu, Bing; Qiao, Yan-Jiang
2014-10-01
The quality of Chinese materia medica (CMM) is affected by every step of CMM manufacturing. Given the multi-unit complexity of CMM production, on-line near-infrared (NIR) spectroscopy is used as an evaluation technology owing to its advantages of being rapid, non-destructive, and pollution-free. Drawing on research from academic institutions, the application of on-line NIR to process analysis and control of CMM is described systematically, and the construction of an on-line NIR platform is used as an example to demonstrate the feasibility of on-line NIR technology in the CMM manufacturing process. Then, from the perspective of pharmaceutical companies, current on-line NIR research on CMM and its production is comprehensively summarized. The types of CMM products are classified into two formulations (liquid and solid dosage forms): the different production processes (extraction, concentration, alcohol precipitation, etc.) serve as the distinguishing points for liquid formulations, and the different product types (tablets, capsules, plasters, etc.) for solid dosage forms. The reliability of on-line NIR throughout the whole CMM production process is supported by a survey of the literature of the past 10 years, which can support the modernization of CMM production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasca, Anthony P.; Chen, James; Pevtsov, Alexei A., E-mail: anthony.rasca.ctr@nrl.navy.mil
Recent observations of the photosphere using high spatial and temporal resolution show small dynamic features at or below the current resolving limits. A new pixel dynamics method has been developed to analyze spectral profiles and quantify changes in line displacement, width, asymmetry, and peakedness of photospheric absorption lines. The algorithm evaluates variations of line profile properties in each pixel and determines the statistics of such fluctuations averaged over all pixels in a given region. The method has been used to derive statistical characteristics of pixel fluctuations in observed quiet-Sun regions, an active region with no eruption, and an active region with an ongoing eruption. Using Stokes I images from the Vector Spectromagnetograph (VSM) of the Synoptic Optical Long-term Investigations of the Sun (SOLIS) telescope on 2012 March 13, variations in line width and peakedness of Fe I 6301.5 Å are shown to have a distinct spatial and temporal relationship with an M7.9 X-ray flare in NOAA 11429. This relationship is observed as stationary and contiguous patches of pixels adjacent to a sunspot exhibiting intense flattening in the line profile and line-center displacement as the X-ray flare approaches peak intensity, which is not present in area scans of the non-eruptive active region. The analysis of pixel dynamics allows one to extract quantitative information on differences in plasma dynamics on sub-pixel scales in these photospheric regions. The analysis can be extended to include the Stokes parameters and study signatures of vector components of magnetic fields and coupled plasma properties.
Hu, Tao; Zuo, Yu-ling; Zhou, Xue-dong
2004-08-01
It has been demonstrated that when a high-speed handpiece stops rotating, negative pressure forms, so that contaminated fluid carrying many kinds of bacteria and viruses from the external environment is drawn back into various compartments of the handpiece and the dental unit. The purpose of this study was to compare the effectiveness of anti-suction and conventional handpieces in preventing viral contamination at different numbers of runs. Twenty handpieces, with or without an anti-suction device (10 of each), were used in the study. Each handpiece was submerged in a 10(-6) microg/microl HBV particle solution and run 5 or 10 times (each run lasting 10 seconds). Samples were obtained from the water line and chip air line of the handpieces and examined by RT-PCR. At the same number of runs, the difference in viral concentration between the two kinds of handpieces was statistically significant (P < 0.05); however, within each group there was no statistically significant difference in viral concentration between different numbers of runs (P > 0.05). Contamination of both the water and air lines of the dental handpiece was not enhanced by increasing the number of runs. The anti-suction devices installed in the water line and chip air line were demonstrated to prevent viral contamination effectively.
Prediction of Multiple-Trait and Multiple-Environment Genomic Data Using Recommender Systems.
Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José C; Mota-Sanchez, David; Estrada-González, Fermín; Gillberg, Jussi; Singh, Ravi; Mondal, Suchismita; Juliana, Philomin
2018-01-04
In genomic-enabled prediction, the task of improving the accuracy of the prediction of lines in environments is difficult because the available information is generally sparse and usually has low correlations between traits. In current genomic selection, although researchers have a large amount of information and appropriate statistical models to process it, there is still limited computing efficiency to do so. Although some statistical models are usually mathematically elegant, many of them are also computationally inefficient, and they are impractical for many traits, lines, environments, and years because they need to sample from huge normal multivariate distributions. For these reasons, this study explores two recommender systems: item-based collaborative filtering (IBCF) and the matrix factorization algorithm (MF) in the context of multiple traits and multiple environments. The IBCF and MF methods were compared with two conventional methods on simulated and real data. Results of the simulated and real data sets show that the IBCF technique was slightly better in terms of prediction accuracy than the two conventional methods and the MF method when the correlation was moderately high. The IBCF technique is very attractive because it produces good predictions when there is high correlation between items (environment-trait combinations) and its implementation is computationally feasible, which can be useful for plant breeders who deal with very large data sets. Copyright © 2018 Montesinos-Lopez et al.
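A generic item-based collaborative filtering step of the kind described, with environment-trait combinations as "items" and lines as "users", might be sketched as follows; the toy matrix is invented for illustration and this is a textbook IBCF formula, not the authors' exact implementation:

```python
import math

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def ibcf_predict(matrix, line, item):
    """Predict the phenotype of `line` in `item` (an environment-trait
    combination) as a similarity-weighted average of that line's values
    in the other items."""
    n_items = len(matrix[0])
    cols = [[row[j] for row in matrix] for j in range(n_items)]
    num = den = 0.0
    for j in range(n_items):
        if j == item:
            continue
        s = cosine(cols[item], cols[j])
        num += s * matrix[line][j]
        den += abs(s)
    return num / den if den else 0.0

# Toy data: rows = breeding lines, columns = environment-trait combinations.
m = [[4.0, 4.1, 3.9],
     [5.2, 5.1, 5.3],
     [3.1, 3.0, 3.2]]
print(round(ibcf_predict(m, 0, 0), 2))  # ≈ 4.0, close to the observed 4.0
```

The appeal noted in the abstract is visible here: prediction needs only column similarities and weighted sums, with no sampling from large multivariate normal distributions.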
NASA Astrophysics Data System (ADS)
Pollard, D.; Chang, W.; Haran, M.; Applegate, P.; DeConto, R.
2015-11-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~ 20 000 years. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree quite well with the more advanced techniques, but only for a large ensemble with full factorial parameter sampling. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds. Each run is extended 5000 years into the "future" with idealized ramped climate warming. In the majority of runs with reasonable scores, this produces grounding-line retreat deep into the West Antarctic interior, and the analysis provides sea-level-rise envelopes with well defined parametric uncertainty bounds.
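The "simple averaging weighted by the aggregate score" can be sketched as below. How a misfit score is converted to a weight is a modeling choice; here the score itself is used directly as the weight, and the run values are invented, so this is an assumption-laden illustration rather than the paper's recipe:

```python
def weighted_stats(values, weights):
    """Score-weighted ensemble mean and variance over model runs, e.g.
    values = equivalent sea-level rise per run (m), weights = aggregate
    model-data fit scores (higher = better fit)."""
    total = sum(weights)
    mean = sum(v * w for v, w in zip(values, weights)) / total
    var = sum(w * (v - mean) ** 2 for v, w in zip(values, weights)) / total
    return mean, var

# Four hypothetical ensemble members: sea-level-rise equivalents and scores.
slr = [3.2, 4.1, 2.8, 5.0]
score = [0.9, 0.6, 0.8, 0.1]
mean, var = weighted_stats(slr, score)
print(round(mean, 2), round(var, 2))
```

Poorly fitting runs (low score) contribute little, which is why the simple average can approach the more expensive emulation-and-calibration estimates when the ensemble samples the parameter space densely enough.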
Mozumdar, Biswita C; Hornsby, Douglas Neal; Gogate, Adheet S; Intriere, Lisa A; Hanson, Richard; McGreal, Karen; Kelly, Pauline; Ros, Pablo
2003-08-01
To study end-user attitudes and preferences with respect to radiology scheduling systems and to assess implications for retention and extension of the referral base. A study of the institution's historical data indicated reduced satisfaction with the process of patient scheduling in recent years. Sixty physicians who referred patients to a single, large academic radiology department received the survey. The survey was designed to identify (A) the preferred vehicle for patient scheduling (on-line versus telephone scheduling) and (B) whether ease of scheduling was a factor in physicians referring patients to other providers. Referring physicians were asked to forward the survey to any appropriate office staff member in case the latter scheduled appointments for patients. Users were asked to provide comments and suggestions for improvement. The statistical method used was the analysis of proportions. Thirty-three responses were received, corresponding to a return rate of 55%. Twenty-six of the 33 respondents (78.8%, P < .01) stated they were willing to try an on-line scheduling system; 16 of them tried the system. Twelve of the 16 (75%, P < .05) preferred the on-line application to the telephone system, citing logistical simplification as the primary reason. Three (18.75%) did not consider on-line scheduling more convenient than traditional telephone scheduling. One respondent did not indicate any preference. Eleven of 33 users (33.33%, P < .001) stated that they would change radiology service providers if expectations of scheduling ease were not met. On-line scheduling applications are becoming the preferred scheduling vehicle. Augmenting their capabilities and availability can simplify the scheduling process, improve referring-physician satisfaction, and provide a competitive advantage. Referrers are willing to change providers if scheduling expectations are not met.
Ion temperatures in HIP-1 and SUMMA from charge-exchange neutral optical emission spectra
NASA Technical Reports Server (NTRS)
Patch, R. W.; Lauver, M. R.
1976-01-01
Ion temperatures were obtained from observations of the Hα, Dα, and He 587.6 nm lines emitted from hydrogen, deuterium, and helium plasmas in the SUMMA and HIP-1 mirror devices at Lewis Research Center. Steady state discharges were formed by applying a radially inward dc electric field between cylindrical or annular anodes and hollow cathodes located at the peaks of the mirrors. The ion temperatures were found from the Doppler broadening of the charge-exchange components of spectral lines. A statistical method was developed for obtaining scaling relations of ion temperature as a function of current, voltage, and magnetic flux density. Derivations are given that take into account triangular monochromator slit functions, loss cones, and superimposed charge-exchange processes. In addition, the Doppler broadening was found to be sensitive to the influence of drift on charge-exchange cross section. The effects of finite ion-cyclotron radius, cascading, and delayed emission are reviewed.
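The Doppler-broadening relation underlying this method can be sketched as follows (a generic illustration with assumed example numbers, not the report's actual reduction, which also corrects for slit functions and loss cones): for a Maxwellian plasma the ion temperature follows from the Gaussian FWHM of a line via T = m c² (Δλ/λ)² / (8 k ln 2).

```python
import math

C = 2.99792458e8    # speed of light, m/s
K_B = 1.380649e-23  # Boltzmann constant, J/K

def ion_temperature(fwhm_nm, line_nm, ion_mass_kg):
    """Ion temperature (K) from the Doppler FWHM of an emission line,
    assuming a purely Gaussian (Maxwellian) profile."""
    ratio = fwhm_nm / line_nm
    return ion_mass_kg * C**2 * ratio**2 / (8 * K_B * math.log(2))

# Example: H-alpha (656.28 nm) with an assumed 0.05 nm Doppler FWHM
M_PROTON = 1.67262192e-27  # kg
t_ion = ion_temperature(0.05, 656.28, M_PROTON)
```

With these example numbers t_ion comes out near 1.1 × 10⁴ K; in practice instrumental broadening must be deconvolved before applying the formula.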
NASA Technical Reports Server (NTRS)
Sundqvist, Jon O.; Owocki, Stanley P.; Cohen, David H.; Leutenegger, Maurice A.; Townsend, Richard H. D.
2002-01-01
We present a generalised formalism for treating the porosity-associated reduction in continuum opacity that occurs when individual clumps in a stochastic medium become optically thick. As in previous work, we concentrate on developing bridging laws between the limits of optically thin and thick clumps. We consider geometries resulting in either isotropic or anisotropic effective opacity, and, in addition to an idealised model in which all clumps have the same local overdensity and scale, we also treat an ensemble of clumps with optical depths set by Markovian statistics. This formalism is then applied to the specific case of bound-free absorption of X-rays in hot star winds, a process not directly affected by clumping in the optically thin limit. We find that the Markov model gives surprisingly similar results to those found previously for the single clump model, suggesting that porous opacity is not very sensitive to details of the assumed clump distribution function. Further, an anisotropic effective opacity favours escape of X-rays emitted in the tangential direction (the venetian blind effect), resulting in a bump of higher flux close to line centre as compared to profiles computed from isotropic porosity models. We demonstrate how this characteristic line shape may be used to diagnose the clump geometry, and we confirm previous results that for optically thick clumping to significantly influence X-ray line profiles, very large porosity lengths, defined as the mean free path between clumps, are required. Moreover, we present the first X-ray line profiles computed directly from line-driven instability simulations using a 3-D patch method, and find that porosity effects from such models also are very small. This further supports the view that porosity has, at most, a marginal effect on X-ray line diagnostics in O stars, and therefore that these diagnostics do indeed provide a good clumping-insensitive method for deriving O star mass-loss rates.
Malisch, Jessica L; deWolski, Karen; Meek, Thomas H; Acosta, Wendy; Middleton, Kevin M; Crino, Ondi L; Garland, Theodore
In vertebrates, acute stressors, although short in duration, can influence physiology and behavior over a longer time course, which might have important ramifications under natural conditions. In laboratory rats, for example, acute stress has been shown to increase anxiogenic behaviors for days after a stressor. In this study, we quantified voluntary wheel-running behavior for 22 h following a restraint stress and glucocorticoid levels 24 h postrestraint. We utilized mice from four replicate lines that have been selectively bred for high voluntary wheel-running activity (HR mice) for 60 generations and their nonselected control (C) lines to examine potential interactions between exercise propensity and sensitivity to stress. Following 6 d of wheel access on a 12L∶12D photo cycle (0700-1900 hours, as during the routine selective breeding protocol), 80 mice were physically restrained for 40 min, beginning at 1400 hours, while another 80 were left undisturbed. Relative to unrestrained mice, wheel running increased for both HR and C mice during the first hour postrestraint (P < 0.0001) but did not differ 2 or 3 h postrestraint. Wheel running was also examined at four distinct phases of the photoperiod. Running in the period of 1600-1840 hours was unaffected by restraint stress and did not differ statistically between HR and C mice. During the period of peak wheel running (1920-0140 hours), restrained mice tended to run fewer revolutions (-11%; two-tailed P = 0.0733), while HR mice ran 473% more than C (P = 0.0008), with no restraint × line type interaction. Wheel running declined for all mice in the latter part of the scotophase (0140-0600 hours); restraint had no statistical effect on wheel running, but HR again ran more than C (+467%; P = 0.0122). Finally, during the start of the photophase (0720-1200 hours), restraint increased running by an average of 53% (P = 0.0443) in both line types, but HR and C mice did not differ statistically.
Mice from HR lines had statistically higher plasma corticosterone concentrations than C mice, with no statistical effect of restraint and no interaction between line type and restraint. Overall, these results indicate that acute stress can affect locomotor activity (or activity patterns) for many hours, with the most prominent effect being an increase in activity during a period of typical inactivity at the start of the photophase, 15-20 h poststressor.
Prevalence and molecular profiles of Salmonella collected at a commercial turkey processing plant.
Nde, Chantal W; Sherwood, Julie S; Doetkott, Curt; Logue, Catherine M
2006-08-01
In this study, whole carcasses were sampled at eight stages on a turkey-processing line and Salmonella prevalence was determined using enrichment techniques. Recovered Salmonella was further characterized using serotyping and the molecular profiles were determined using pulsed-field gel electrophoresis (PFGE). Prevalence data showed that contamination rates varied along the line and were greatest after defeathering and after chilling. Analysis of contamination in relation to serotypes and PFGE profiles found that on some visits the same serotype was present all along the processing line while on other days, additional serotypes were recovered that were not detected earlier on the line, suggesting that the birds harbored more than one serotype of Salmonella or there was cross-contamination occurring during processing. Overall, this study found fluctuations in Salmonella prevalence along a turkey-processing line. Following washing, Salmonella prevalence was significantly reduced, suggesting that washing is critical for Salmonella control in turkey processing and has significant application for controlling Salmonella at the postdefeathering and prechill stages where prevalence increased.
Identification of nuclear genes controlling chlorophyll synthesis in barley by RNA-seq.
Shmakov, Nickolay A; Vasiliev, Gennadiy V; Shatskaya, Natalya V; Doroshkov, Alexey V; Gordeeva, Elena I; Afonnikov, Dmitry A; Khlestkina, Elena K
2016-11-16
Albinism in plants is characterized by lack of chlorophyll and results in photosynthesis impairment, abnormal plant development and premature death. These abnormalities are frequently encountered in interspecific crosses and tissue culture experiments. Analysis of albino mutant phenotypes with full or partial chlorophyll deficiency can shed light on genetic determinants and molecular mechanisms of albinism. Here we report analysis of RNA-seq transcription profiling of barley (Hordeum vulgare L.) near-isogenic lines, one of which carries a mutant allele of the Alm gene for the albino lemma and pericarp phenotype (line i:BwAlm). A total of 1221 genome fragments showed statistically significant changes in expression levels between lines i:BwAlm and Bowman: 148 fragments had increased expression levels in line i:BwAlm, and 1073 genome fragments, including 42 plastid operons, had decreased levels of expression in line i:BwAlm. We detected functional dissimilarity between genes with higher and lower levels of expression in the i:BwAlm line. Genes with lower levels of expression in the i:BwAlm line are mostly associated with photosynthesis and chlorophyll synthesis, while genes with higher expression levels are functionally associated with vesicle transport. Differentially expressed genes are shown to be involved in several metabolic pathways, with the largest fraction of such genes observed for the Calvin-Benson-Bassham cycle. Finally, the de novo assembly of the transcriptome contains several transcripts not annotated in the current H. vulgare genome version. Our results provide new information about genes that may be involved in formation of the albino lemma and pericarp phenotype, and demonstrate the interplay between nuclear and chloroplast genomes in this physiological process.
NASA Astrophysics Data System (ADS)
Zhao, Yinan; Ge, Jian; Yuan, Xiaoyong; Li, Xiaolin; Zhao, Tiffany; Wang, Cindy
2018-01-01
Metal absorption line systems in distant quasar spectra have been used as one of the most powerful tools to probe gas content in the early Universe. The Mg II λλ2796, 2803 doublet is one of the most widely used metal absorption lines and has been used to trace gas and global star formation at redshifts between ~0.5 and 2.5. In the past, machine learning algorithms such as Principal Component Analysis, Gaussian Processes, and decision trees have been used to detect absorption line systems in large sky surveys, but the overall detection process is not only complicated but also time consuming. It usually takes a few months to go through the entire quasar spectral dataset from each Sloan Digital Sky Survey (SDSS) data release. In this work, we applied deep neural network ("deep learning") algorithms to the most recent SDSS DR14 quasar spectra and were able to randomly search 20,000 quasar spectra and detect 2887 strong Mg II absorption features in just 9 seconds. Our detection algorithms were verified against the previously released DR12 and DR7 data and published Mg II catalogs, and the detection accuracy is 90%. This is the first time that a deep neural network has demonstrated its promising power, in both speed and accuracy, in replacing tedious, repetitive human work in searching for narrow absorption patterns in a big dataset. We will present our detection algorithms and also statistical results of the newly detected Mg II absorption lines.
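Independently of the neural network, the doublet's fixed rest-frame separation makes a brute-force redshift scan easy to sketch. The toy code below (an illustration with an invented depth threshold and synthetic spectrum, not the authors' pipeline) flags redshifts where the continuum-normalized flux dips at both doublet positions simultaneously.

```python
import numpy as np

MG2_A, MG2_B = 2796.35, 2803.53  # Mg II doublet rest wavelengths (Angstrom)

def doublet_score(wave, flux, z):
    """Smaller of the two absorption depths at the redshifted doublet
    positions: large only when BOTH lines are absorbed."""
    depths = []
    for rest in (MG2_A, MG2_B):
        i = np.argmin(np.abs(wave - rest * (1.0 + z)))
        depths.append(1.0 - flux[i])
    return min(depths)

def scan_redshifts(wave, flux, zmin, zmax, dz=1e-4, threshold=0.3):
    """Return candidate absorber redshifts where both doublet lines dip."""
    zs = np.arange(zmin, zmax, dz)
    scores = np.array([doublet_score(wave, flux, z) for z in zs])
    return zs[scores > threshold]

# Synthetic continuum-normalized spectrum with a fake absorber at z = 1.0
wave = np.arange(3500.0, 9000.0, 0.5)
flux = np.ones_like(wave)
for rest in (MG2_A, MG2_B):
    flux -= 0.6 * np.exp(-0.5 * (wave - rest * 2.0) ** 2)  # 1 Angstrom sigma
candidates = scan_redshifts(wave, flux, 0.9, 1.1)
```

Requiring both members of the doublet suppresses single-line false positives; a real search would additionally check the 2:1 doublet-ratio regime and the line widths.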
Searching for the 3.5 keV Line in the Stacked Suzaku Observations of Galaxy Clusters
NASA Technical Reports Server (NTRS)
Bulbul, Esra; Markevitch, Maxim; Foster, Adam; Miller, Eric; Bautz, Mark; Lowenstein, Mike; Randall, Scott W.; Smith, Randall K.
2016-01-01
We perform a detailed study of the stacked Suzaku observations of 47 galaxy clusters, spanning a redshift range of 0.01-0.45, to search for the unidentified 3.5 keV line. This sample provides an independent test for the previously detected line. We detect a 2σ-significant spectral feature at 3.5 keV in the spectrum of the full sample. When the sample is divided into two subsamples (cool-core and non-cool core clusters), the cool-core subsample shows no statistically significant positive residuals at the line energy. A very weak (≈2σ confidence) spectral feature at 3.5 keV is permitted by the data from the non-cool-core clusters sample. The upper limit on a neutrino decay mixing angle of sin²(2θ) = 6.1 × 10⁻¹¹ from the full Suzaku sample is consistent with the previous detections in the stacked XMM-Newton sample of galaxy clusters (which had a higher statistical sensitivity to faint lines), M31, and Galactic center, at a 90% confidence level. However, the constraint from the present sample, which does not include the Perseus cluster, is in tension with the previously reported line flux observed in the core of the Perseus cluster with XMM-Newton and Suzaku.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morioka, A.; Misawa, H.; Obara, T.
Solar micro-type III radio bursts are elements of the so-called type III storms and are characterized by short-lived, continuous, and weak emissions. Their frequency of occurrence with respect to radiation power is quite different from that of ordinary type III bursts, suggesting that the generation process is not flare-related, but due to some recurrent acceleration processes around the active region. We examine the relationship of micro-type III radio bursts with coronal streamers. We also explore the propagation channel of bursts in the outer corona, the acceleration process, and the escape route of electron beams. It is observationally confirmed that micro-type III bursts occur near the edge of coronal streamers. The magnetic field line of the escaping electron beams is tracked on the basis of the frequency drift rate of micro-type III bursts and the electron density distribution model. The results demonstrate that electron beams are trapped along closed dipolar field lines in the outer coronal region, which arise from the interface region between the active region and the coronal hole. A 22 year statistical study reveals that the apex altitude of the magnetic loop ranges from 15 to 50 R_S. The distribution of the apex altitude has a sharp upper limit around 50 R_S, suggesting that an unknown but universal condition regulates the upper boundary of the streamer dipolar field.
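The mapping from burst frequency to emission altitude rests on a coronal density model. The sketch below is a minimal illustration assuming fundamental plasma emission and a simple r⁻² solar-wind density profile normalized to ~7 cm⁻³ at 1 AU; it is not the paper's actual density model, only a demonstration of the conversion.

```python
# Assumed electron density model: n_e(r) = N0 / r^2 (cm^-3, r in solar radii),
# with N0 chosen so that n_e ~ 7 cm^-3 at 1 AU (215 R_S).
N0 = 7.0 * 215.0 ** 2

def radius_from_frequency(f_mhz, n0=N0):
    """Heliocentric distance (solar radii) of the layer where the local
    plasma frequency equals f_mhz, for fundamental emission:
    f_p[MHz] ~= 8.98e-3 * sqrt(n_e[cm^-3])."""
    n_e = (f_mhz / 8.98e-3) ** 2
    return (n0 / n_e) ** 0.5

# Example: hectometric frequencies bracketing the reported loop apexes
r_low = radius_from_frequency(0.3)   # higher frequency -> lower altitude
r_high = radius_from_frequency(0.1)  # lower frequency -> higher altitude
```

Under these assumptions, 0.3-0.1 MHz corresponds to roughly 17-51 R_S, the same range as the 15-50 R_S apex altitudes quoted above.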
Relationship of the functional movement screen in-line lunge to power, speed, and balance measures.
Hartigan, Erin H; Lawrence, Michael; Bisson, Brian M; Torgerson, Erik; Knight, Ryan C
2014-05-01
The in-line lunge of the Functional Movement Screen (FMS) evaluates lateral stability, balance, and movement asymmetries. Athletes who score poorly on the in-line lunge should avoid activities requiring power or speed until scores are improved, yet relationships between the in-line lunge scores and other measures of balance, power, and speed are unknown. (1) Lunge scores will correlate with center of pressure (COP), maximum jump height (MJH), and 36.6-meter sprint time and (2) there will be no differences between limbs on lunge scores, MJH, or COP. Descriptive laboratory study. Level 3. Thirty-seven healthy, active participants completed the first 3 tasks of the FMS (eg, deep squat, hurdle step, in-line lunge), unilateral drop jumps, and 36.6-meter sprints. A 3-dimensional motion analysis system captured MJH. Force platforms measured COP excursion. A laser timing system measured 36.6-m sprint time. Statistical analyses were used to determine whether a relationship existed between lunge scores and COP, MJH, and 36.6-m speed (Spearman rho tests) and whether differences existed between limbs in lunge scores (Wilcoxon signed-rank test), MJH, and COP (paired t tests). Lunge scores were not significantly correlated with COP, MJH, or 36.6-m sprint time. Lunge scores, COP excursion, and MJH were not statistically different between limbs. Performance on the FMS in-line lunge was not related to balance, power, or speed. Healthy participants were symmetrical in lunging measures and MJH. Scores on the FMS in-line lunge should not be attributed to power, speed, or balance performance without further examination. However, assessing limb symmetry appears to be clinically relevant.
Zero Autocorrelation Waveforms: A Doppler Statistic and Multifunction Problems
2006-01-01
It is natural to refer to A as the ambiguity function of u, since in the usual setting on the real line R, the analogue ambiguity... Doppler statistic |Cu,uek(j)| is excellent and provable for detecting the desired Doppler frequency shift [11] (see Fig. 2). Also, if one graphs only
Consistent Tolerance Bounds for Statistical Distributions
NASA Technical Reports Server (NTRS)
Mezzacappa, M. A.
1983-01-01
Assumption that sample comes from population with particular distribution is made with confidence C if data lie between certain bounds. These "confidence bounds" depend on C and assumption about distribution of sampling errors around regression line. Graphical test criteria using tolerance bounds are applied in industry where statistical analysis influences product development and use. Applied to evaluate equipment life.
Nodal portraits of quantum billiards: Domains, lines, and statistics
NASA Astrophysics Data System (ADS)
Jain, Sudhir Ranjan; Samajdar, Rhine
2017-10-01
This is a comprehensive review of the nodal domains and lines of quantum billiards, emphasizing a quantitative comparison of theoretical findings to experiments. The nodal statistics are shown to distinguish not only between regular and chaotic classical dynamics but also between different geometric shapes of the billiard system itself. How a random superposition of plane waves can model chaotic eigenfunctions is discussed and the connections of the complex morphology of the nodal lines thereof to percolation theory and Schramm-Loewner evolution are highlighted. Various approaches to counting the nodal domains—using trace formulas, graph theory, and difference equations—are also illustrated with examples. The nodal patterns addressed pertain to waves on vibrating plates and membranes, acoustic and electromagnetic modes, wave functions of a "particle in a box" as well as to percolating clusters, and domains in ferromagnets, thus underlining the diversity and far-reaching implications of the problem.
49 CFR Schedule G to Subpart B of... - Selected Statistical Data
Code of Federal Regulations, 2010 CFR
2010-10-01
... 49 Transportation 8 2010-10-01 2010-10-01 false Selected Statistical Data G Schedule G to Subpart... Statistical Data [Dollars in thousands] () Greyhound Lines, Inc. () Trailways combined () All study carriers.... 9002, L. 9, col. (b) Other Statistics: Number of regular route intercity passenger miles Sch. 9002...
Statistical equilibrium calculations for silicon in early-type model stellar atmospheres
NASA Technical Reports Server (NTRS)
Kamp, L. W.
1976-01-01
Line profiles of 36 multiplets of silicon (Si) II, III, and IV were computed for a grid of model atmospheres covering the range from 15,000 to 35,000 K in effective temperature and 2.5 to 4.5 in log (gravity). The computations involved simultaneous solution of the steady-state statistical equilibrium equations for the populations and of the equation of radiative transfer in the lines. The variables were linearized, and successive corrections were computed until a minimal accuracy of 1/1000 in the line intensities was reached. The common assumption of local thermodynamic equilibrium (LTE) was dropped. The model atmospheres used also were computed by non-LTE methods. Some effects that were incorporated into the calculations were the depression of the continuum by free electrons, hydrogen and ionized helium line blocking, and auto-ionization and dielectronic recombination, which were later found to be insignificant. Use of radiation damping and detailed electron (quadratic Stark) damping constants had small but significant effects on the strong resonance lines of Si III and IV. For weak and intermediate-strength lines, large differences with respect to LTE computations, the results of which are also presented, were found in line shapes and strengths. For the strong lines the differences are generally small, except for the models at the hot, low-gravity extreme of our range. These computations should be useful in the interpretation of the spectra of stars in the spectral range B0-B5, luminosity classes III, IV, and V.
Self-tuning digital Mössbauer detection system
NASA Astrophysics Data System (ADS)
Veiga, A.; Grunfeld, C. M.; Pasquevich, G. A.; Mendoza Zélis, P.; Martínez, N.; Sánchez, F. H.
2014-01-01
Long-term gamma spectroscopy experiments involving single-channel analyzer equipment depend upon the thermal stability of the detector and its associated high-voltage supply. Assuming constant discrimination levels, a drift in the detector gain impacts the output rate, producing an effect on the output spectrum. In some cases (e.g. single-energy resonant absorption experiments) the data of interest can be completely lost. We present a digital self-adapting discrimination strategy that tracks emission line shifts using statistical measurements on a predefined region of interest of the spectrum. It is implemented as a synthesizable module that can be intercalated in the digital processing chain. It requires a moderate to small amount of digital resources and can be easily activated and deactivated.
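One common form of such self-tuning, sketched below as a hypothetical illustration (the abstract does not specify the exact statistic used), re-centres the region of interest on the count-weighted centroid of the spectrum after each acquisition, so that the discrimination window follows a slow gain drift.

```python
import numpy as np

def retune_roi(counts, lo, hi):
    """Shift the [lo, hi) region of interest so that the count-weighted
    centroid of the spectrum inside it sits at the window centre."""
    channels = np.arange(lo, hi)
    weights = counts[lo:hi].astype(float)
    centroid = float((channels * weights).sum() / weights.sum())
    shift = int(round(centroid - (lo + hi - 1) / 2.0))
    return lo + shift, hi + shift

# Example: a Gaussian line that has drifted to channel 120 while the
# discrimination window is still centred near channel 110
ch = np.arange(256)
spectrum = 1000.0 * np.exp(-0.5 * ((ch - 120) / 5.0) ** 2)
lo, hi = retune_roi(spectrum, 90, 130)
```

One pass moves the window onto the drifted peak; a second pass leaves it unchanged, i.e. the tracking has converged.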
DOE Office of Scientific and Technical Information (OSTI.GOV)
Songaila, A.; Cowie, L. L., E-mail: acowie@ifa.hawaii.edu
2014-10-01
The unequivocal demonstration of temporal or spatial variability in a fundamental constant of nature would be of enormous significance. Recent attempts to measure the variability of the fine-structure constant α over cosmological time, using high-resolution spectra of high-redshift quasars observed with 10 m class telescopes, have produced conflicting results. We use the many multiplet (MM) method with Mg II and Fe II lines on very high signal-to-noise, high-resolution (R = 72,000) Keck HIRES spectra of eight narrow quasar absorption systems. We consider both systematic uncertainties in spectrograph wavelength calibration and also velocity offsets introduced by complex velocity structure in even apparently simple and weak narrow lines, and analyze their effect on claimed variations in α. We find no significant change in α, Δα/α = (0.43 ± 0.34) × 10⁻⁵, in the redshift range z = 0.7-1.5, where this includes both statistical and systematic errors. We also show that the scatter in measurements of Δα/α arising from absorption line structure can be considerably larger than assigned statistical errors even for apparently simple and narrow absorption systems. We find a null result of Δα/α = (-0.59 ± 0.55) × 10⁻⁵ in a system at z = 1.7382 using lines of Cr II, Zn II, and Mn II, whereas using Cr II and Zn II lines in a system at z = 1.6614 we find a systematic velocity trend that, if interpreted as a shift in α, would correspond to Δα/α = (1.88 ± 0.47) × 10⁻⁵, where both results include both statistical and systematic errors. This latter result is almost certainly caused by varying ionic abundances in subcomponents of the line: using Mn II, Ni II, and Cr II in the analysis changes the result to Δα/α = (-0.47 ± 0.53) × 10⁻⁵. Combining the Mg II and Fe II results with estimates based on Mn II, Ni II, and Cr II gives Δα/α = (-0.01 ± 0.26) × 10⁻⁵. We conclude that spectroscopic measurements of quasar absorption lines are not yet capable of unambiguously detecting variation in α using the MM method.
NASA Astrophysics Data System (ADS)
Lee, S.; Oh, S.; Lee, J.; Hong, S.
2013-12-01
We have investigated statistical relationships of solar active region properties for predicting solar flare events, analyzing a sunspot catalogue newly constructed from SOHO MDI observation data over the period from 1996 to 2011 (Solar Cycles 23 & 24) by the ASSA (Automatic Solar Synoptic Analyzer) algorithms. The prediction relation has been built with machine-learning algorithms to establish a short-term flare prediction model for operational use in the near future. In this study, continuum and magnetogram images observed by SOHO have been processed to yield a 15-year sunspot group catalogue that contains various physical parameters such as sunspot area, extent, asymmetry measure of the largest penumbral sunspot, and roughness of the magnetic neutral line, as well as McIntosh and Mt. Wilson classification results. The latest results of our study will be presented and the new approach to the prediction of solar flares will be discussed.
Jopp, Eilin; Scheffler, Christiane; Hermanussen, Michael
2014-01-01
Screening is an important issue in medicine and is used for the early identification of unrecognised diseases in persons who are apparently in good health. Screening strongly relies on the concept of "normal values". Normal values are defined as values that are frequently observed in a population and usually range within certain statistical limits. Screening for obesity should start early, as the prevalence of obesity consolidates already at early school age. Though widely practiced, measuring BMI is not the ultimate solution for detecting obesity. Children with high BMI may be "robust" in skeletal dimensions. Assessing skeletal robustness and, in particular, assessing developmental tempo in adolescents are also important issues in health screening. Yet, in spite of the necessity of screening investigations, appropriate reference values are often missing. Meanwhile, new concepts of growth diagrams have been developed. Stage line diagrams are useful for tracking developmental processes over time. Functional data analyses have efficiently been used for analysing longitudinal growth in height and assessing the tempo of maturation. Convenient low-cost statistics have also been developed for generating synthetic national references.
Stand-off transmission lines and method for making same
Tuckerman, David B.
1991-01-01
Standoff transmission lines in an integrated circuit structure are formed by etching away or removing the portion of the dielectric layer separating the microstrip metal lines and the ground plane from the regions that are not under the lines. The microstrip lines can be fabricated by a subtractive process of etching a metal layer, an additive process of direct laser writing fine lines followed by plating up the lines or a subtractive/additive process in which a trench is etched over a nucleation layer and the wire is electrolytically deposited. Microstrip lines supported on freestanding posts of dielectric material surrounded by air gaps are produced. The average dielectric constant between the lines and ground plane is reduced, resulting in higher characteristic impedance, less crosstalk between lines, increased signal propagation velocities, and reduced wafer stress.
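The claimed impedance benefit follows directly from the lower effective dielectric constant. The sketch below is a generic illustration using the common textbook closed-form approximation for a narrow microstrip (with made-up dimensions, not the patent's own analysis), showing how the characteristic impedance rises when the solid dielectric under the line is replaced by air.

```python
import math

def microstrip_z0(w, h, eps_r):
    """Characteristic impedance (ohms) of a narrow microstrip (w/h <= 1),
    using the standard closed-form approximation for the effective
    dielectric constant and impedance."""
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (
        (1 + 12 * h / w) ** -0.5 + 0.04 * (1 - w / h) ** 2
    )
    return 60.0 / math.sqrt(eps_eff) * math.log(8 * h / w + w / (4 * h))

# Same made-up geometry (arbitrary units), solid oxide vs. air under the line
z_solid = microstrip_z0(w=2.0, h=4.0, eps_r=3.9)  # SiO2-like dielectric
z_air = microstrip_z0(w=2.0, h=4.0, eps_r=1.0)    # dielectric etched away
```

For this toy geometry the impedance rises from about 100 Ω to about 167 Ω, illustrating the mechanism (higher impedance, faster propagation) the patent exploits; the real structure keeps thin dielectric posts, so its effective permittivity lies between the two extremes.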
Are some BL Lac objects artefacts of gravitational lensing?
NASA Technical Reports Server (NTRS)
Ostriker, J. P.; Vietri, M.
1985-01-01
It is proposed here that a significant fraction of BL Lac objects are optically violently variable quasars whose continuum emission has been greatly amplified, relative to the line emission, by pointlike gravitational lenses in intervening galaxies. Several anomalous physical and statistical properties of BL Lacs can be understood on the basis of this model, which is immediately testable on the basis of absorption line studies and by direct imaging.
Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output
NASA Astrophysics Data System (ADS)
Milroy, D.; Hammerling, D.; Baker, A. H.
2017-12-01
Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical differences caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprising millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the Kernel GENerator (KGEN) application created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
Yang, Chan; Xu, Bing; Zhang, Zhi-Qiang; Wang, Xin; Shi, Xin-Yuan; Fu, Jing; Qiao, Yan-Jiang
2016-10-01
Blending uniformity is essential to ensure the homogeneity of Chinese medicine formula particles within each batch. This study was based on the blending process of ebony spray-dried powder and dextrin (the proportion of dextrin was 10%), in which near infrared (NIR) diffuse reflectance spectra collected from six different sampling points were analyzed in combination with a moving window F test method in order to assess the uniformity of the blending process. The method was validated against the changes of citric acid content determined by HPLC. The results of the moving window F test method showed that the ebony spray-dried powder and dextrin were homogeneous during 200-300 r and segregated during 300-400 r. An advantage of this method is that the threshold value is defined statistically, not empirically, and thus does not suffer from the threshold ambiguities common with the moving block standard deviation (MBSD). This method could be employed to monitor other blending processes of Chinese medicine powders on line. Copyright© by the Chinese Pharmaceutical Association.
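The moving-window F statistic can be sketched as a ratio of sample variances in adjacent windows sliding along the blend record. The code below is a minimal illustration with an invented window size and synthetic data, not the paper's exact procedure (which applies the test to NIR spectral responses).

```python
import numpy as np

def moving_window_f(x, w):
    """F statistics (larger sample variance over smaller) between adjacent
    windows of width w sliding along a blend signal x."""
    stats = []
    for i in range(len(x) - 2 * w + 1):
        a, b = x[i:i + w], x[i + w:i + 2 * w]
        va, vb = a.var(ddof=1), b.var(ddof=1)
        stats.append(max(va, vb) / min(va, vb))
    return np.array(stats)

# Toy blend record: homogeneous first half, segregated (high-variance) second half
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 100), rng.normal(0, 5, 100)])
f_stats = moving_window_f(signal, w=20)
```

Windows straddling the change point give F values far above the F(19,19) critical value, while windows inside the homogeneous segment stay near 1; the statistically defined threshold the paper refers to would come from the F distribution rather than from an empirical cutoff.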
Etzioni, Ruth; Gulati, Roman
2013-04-01
In our article about limitations of basing screening policy on screening trials, we offered several examples of ways in which modeling, using data from large screening trials and population trends, provided insights that differed somewhat from those based only on empirical trial results. In this editorial, we take a step back and consider the general question of whether randomized screening trials provide the strongest evidence for clinical guidelines concerning population screening programs. We argue that randomized trials provide a process that is designed to protect against certain biases but that this process does not guarantee that inferences based on empirical results from screening trials will be unbiased. Appropriate quantitative methods are key to obtaining unbiased inferences from screening trials. We highlight several studies in the statistical literature demonstrating that conventional survival analyses of screening trials can be misleading and list a number of key questions concerning screening harms and benefits that cannot be answered without modeling. Although we acknowledge the centrality of screening trials in the policy process, we maintain that modeling constitutes a powerful tool for screening trial interpretation and screening policy development.
VHSIC Electronics and the Cost of Air Force Avionics in the 1990s
1990-11-01
circuit. LRM Line replaceable module. LRU Line replaceable unit. LSI Large-scale integration. LSTTL Low-power Schottky Transistor-to-Transistor Logic...displays, communications/navigation/identification, electronic combat equipment, dispensers, and computers. These CERs, which statistically relate the...some of the reliability numbers, and adding the F-15 and F-16 to obtain the data sample shown in Table 6. Both suite costs and reliability statistics
Basu, Anindya; Leong, Susanna Su Jan
2012-02-03
The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacteria host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that facilitate their direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process which delivered bioactive HBx at a productivity of 0.21 mg/ml/h at a refolding yield of 54% (at 10 mg/ml refolding concentration), which was 4.4-fold higher than that achieved in dilution refolding. The systematic DoE methodology adopted for this study enabled us to obtain important insights into the effect of different bioprocess parameters like the effect of buffer exchange gradients on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, and hence helping to resolve validation and speed-to-market challenges faced by the biopharmaceutical industry today. Copyright © 2011 Elsevier B.V. All rights reserved.
Statistics of Magnetic Reconnection X-Lines in Kinetic Turbulence
NASA Astrophysics Data System (ADS)
Haggerty, C. C.; Parashar, T.; Matthaeus, W. H.; Shay, M. A.; Wan, M.; Servidio, S.; Wu, P.
2016-12-01
In this work we examine the statistics of magnetic reconnection (x-lines) and their associated reconnection rates in intermittent current sheets generated in turbulent plasmas. Although such statistics have been studied previously for fluid simulations (e.g. [1]), they have not yet been generalized to fully kinetic particle-in-cell (PIC) simulations. A significant problem with PIC simulations, however, is electrostatic fluctuations generated due to numerical particle counting statistics. We find that analyzing gradients of the magnetic vector potential from the raw PIC field data identifies numerous artificial (or non-physical) x-points. Using small Orszag-Tang vortex PIC simulations, we analyze x-line identification and show that these artificial x-lines can be removed using sub-Debye length filtering of the data. We examine how turbulent properties such as the magnetic spectrum and scale dependent kurtosis are affected by particle noise and sub-Debye length filtering. We subsequently apply these analysis methods to a large scale kinetic PIC turbulent simulation. Consistent with previous fluid models, we find a range of normalized reconnection rates as large as ½, with the bulk of the rates less than approximately 0.1. [1] Servidio, S., W. H. Matthaeus, M. A. Shay, P. A. Cassak, and P. Dmitruk (2009), Magnetic reconnection and two-dimensional magnetohydrodynamic turbulence, Phys. Rev. Lett., 102, 115003.
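The x-point identification step described above can be sketched from its definition: on a 2-D grid of the out-of-plane magnetic vector potential A, an X-point is a saddle, i.e. a local maximum along one lattice axis and a local minimum along the other. This is a generic discrete saddle detector, not the authors' code; in practice the field would first be smoothed at sub-Debye-length scales to suppress the artificial x-points created by particle noise.

```python
import numpy as np

def find_x_points(A):
    """Return grid indices (i, j) where the vector potential A has a
    saddle point: a local extremum of opposite sense along each axis,
    the discrete signature of a reconnection X-point."""
    pts = []
    ni, nj = A.shape
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            dx = (A[i - 1, j] - A[i, j], A[i + 1, j] - A[i, j])
            dy = (A[i, j - 1] - A[i, j], A[i, j + 1] - A[i, j])
            min_x = dx[0] > 0 and dx[1] > 0   # local min along x
            max_x = dx[0] < 0 and dx[1] < 0   # local max along x
            min_y = dy[0] > 0 and dy[1] > 0
            max_y = dy[0] < 0 and dy[1] < 0
            if (min_x and max_y) or (max_x and min_y):
                pts.append((i, j))
    return pts
```

On an analytic test field A = x² − y², the detector recovers the single saddle at the origin.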
Capture cross sections on unstable nuclei
NASA Astrophysics Data System (ADS)
Tonchev, A. P.; Escher, J. E.; Scielzo, N.; Bedrossian, P.; Ilieva, R. S.; Humby, P.; Cooper, N.; Goddard, P. M.; Werner, V.; Tornow, W.; Rusev, G.; Kelley, J. H.; Pietralla, N.; Scheck, M.; Savran, D.; Löher, B.; Yates, S. W.; Crider, B. P.; Peters, E. E.; Tsoneva, N.; Goriely, S.
2017-09-01
Accurate neutron-capture cross sections on unstable nuclei near the line of beta stability are crucial for understanding s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure because the measurements require both highly radioactive samples and intense neutron sources. Essential ingredients for describing the γ decays following neutron capture are the γ-ray strength function and level densities. We will compare different indirect approaches for obtaining the most relevant observables that can constrain Hauser-Feshbach statistical-model calculations of capture cross sections. Specifically, we will consider photon scattering using monoenergetic and 100% linearly polarized photon beams. Challenges that exist on the path to obtaining neutron-capture cross sections for reactions on isotopes near and far from stability will be discussed.
Bringing a transgenic crop to market: where compositional analysis fits.
Privalle, Laura S; Gillikin, Nancy; Wandelt, Christine
2013-09-04
In the process of developing a biotechnology product, thousands of genes and transformation events are evaluated to select the event that will be commercialized. The ideal event is identified on the basis of multiple characteristics including trait efficacy, the molecular characteristics of the insert, and agronomic performance. Once selected, the commercial event is subjected to a rigorous safety evaluation taking a multipronged approach including examination of the safety of the gene and gene product - the protein, plant performance, impact of cultivating the crop on the environment, agronomic performance, and equivalence of the crop/food to conventional crops/food - by compositional analysis. The compositional analysis comprises a comparison of the nutrient and antinutrient composition of the crop containing the event, its parental line (variety), and other conventional lines (varieties). Different geographies have different requirements for the compositional analysis studies. Parameters that vary include the number of years (seasons) and locations (environments) to be evaluated, the appropriate comparator(s), analytes to be evaluated, and statistical analysis. Specific examples of compositional analysis results will be presented.
Monte Carlo simulations within avalanche rescue
NASA Astrophysics Data System (ADS)
Reiweger, Ingrid; Genswein, Manuel; Schweizer, Jürg
2016-04-01
Refining concepts for avalanche rescue involves calculating suitable settings for rescue strategies such as an adequate probing depth for probe line searches or an optimal time for performing resuscitation for a recovered avalanche victim in case of additional burials. In the latter case, treatment decisions have to be made in the context of triage. However, given the low number of incidents it is rarely possible to derive quantitative criteria based on historical statistics in the context of evidence-based medicine. For these rare, but complex rescue scenarios, most of the associated concepts, theories, and processes involve a number of unknown "random" parameters which have to be estimated in order to calculate anything quantitatively. An obvious approach for incorporating a number of random variables and their distributions into a calculation is to perform a Monte Carlo (MC) simulation. We here present Monte Carlo simulations for calculating the most suitable probing depth for probe line searches depending on search area and an optimal resuscitation time in case of multiple avalanche burials. The MC approach reveals, e.g., new optimized values for the duration of resuscitation that differ from previous, mainly case-based assumptions.
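The probing-depth side of such a Monte Carlo calculation can be sketched in a few lines: draw burial depths from an assumed distribution and read off the depth a probe line would need to reach a target fraction of victims. The lognormal shape and its parameters below are illustrative assumptions, not the distribution used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed burial-depth distribution in metres (illustrative only).
depths = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)

# Probing depth needed to reach 90% of the simulated victims.
probe_depth_90 = np.quantile(depths, 0.90)
coverage = float(np.mean(depths <= probe_depth_90))
```

The same machinery extends to the resuscitation-time question: each unknown (number of buried victims, extraction time, survival curve) becomes a sampled random variable, and the optimum is read off the distribution of simulated outcomes.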
The effect of CNC and manual laser machining on electrical resistance of HDPE/MWCNT composite
NASA Astrophysics Data System (ADS)
Mohammadi, Fatemeh; Farshbaf Zinati, Reza; Fattahi, A. M.
2018-05-01
In this study, the electrical conductivity of a high-density polyethylene (HDPE)/multi-walled carbon nanotube (MWCNT) composite was investigated after laser machining. To this end, nano-composite samples produced using a plastic injection process were laser machined with various combinations of input parameters such as feed rate (35, 45, and 55 mm/min), feed angle with the injection flow direction (0°, 45°, and 90°), and MWCNT content (0.5, 1, and 1.5 wt%). The angle between the laser feed and the injected flow direction was set via either of two different methods: CNC programming and manual setting. The results showed that, in manual setting, both the angle between the laser line and the melt flow direction and the feed rate had statistically significant and physically meaningful effects on the electrical resistance of the samples. Maximum conductivity was observed when the angle between the laser line and the melt flow direction was set to 90° in manual setting, and at a feed rate of 55 mm/min in both CNC programming and manual setting.
Marital satisfaction and break-ups differ across on-line and off-line meeting venues
Cacioppo, John T.; Cacioppo, Stephanie; Gonzaga, Gian C.; Ogburn, Elizabeth L.; VanderWeele, Tyler J.
2013-01-01
Marital discord is costly to children, families, and communities. The advent of the Internet, social networking, and on-line dating has affected how people meet future spouses, but little is known about the prevalence or outcomes of these marriages or the demographics of those involved. We addressed these questions in a nationally representative sample of 19,131 respondents who married between 2005 and 2012. Results indicate that more than one-third of marriages in America now begin on-line. In addition, marriages that began on-line, when compared with those that began through traditional off-line venues, were slightly less likely to result in a marital break-up (separation or divorce) and were associated with slightly higher marital satisfaction among those respondents who remained married. Demographic differences were identified between respondents who met their spouse through on-line vs. traditional off-line venues, but the findings for marital break-up and marital satisfaction remained significant after statistically controlling for these differences. These data suggest that the Internet may be altering the dynamics and outcomes of marriage itself. PMID:23733955
NASA Astrophysics Data System (ADS)
García-Romero, Leví; Hernández-Cordero, Antonio; Hernández-Calvento, Luis; Hesp, Patrick A.
2017-04-01
In recent decades, important environmental changes have been detected in dune systems around the world. Vegetation on the foredune provides stability to the coastal dunefields, capturing and accumulating sediments, which is an important function among other ecosystem services. For this reason, vegetation has been used as an indicator when studying anthropogenic and natural processes in foredunes, especially when an increase in vulnerability has been detected. Foredunes of arid dunefields have been little studied. They present significant differences with respect to the foredunes of other climatic zones. Traganum moquinii is the predominant plant species in the foredune of arid dunefields around the Canary Islands (including South Morocco, Mauritania and other nearby archipelagos, like Cape Verde). This bush species plays an important geomorphological role: its interaction with the aeolian sedimentary processes generates nebkhas, shadow dunes and arid parabolic shaped dunes. The objective of this work is to show the morphometric evolution of the foredune of an arid dunefield of the Canary Islands, Maspalomas (Gran Canaria), as well as to explain the function of Traganum moquinii in it. One morphometric variable (number of nebkhas) and six morphologic variables of the Traganum moquinii species (density, mean distance between Traganum moquinii individuals, number of Traganum moquinii individuals in line one, mean diameter of Traganum moquinii individuals in line one, mean distance between Traganum moquinii individuals in line one, density of Traganum moquinii individuals in line one) have been measured in ten observation plots, from the 1960s to the present, through detailed historical aerial photographs and orthophotos, using GIS. The morphometric changes have been identified, and the variables have been related through statistical analysis to detect the function exerted by the Traganum moquinii species in the foredune.
The change in the number of nebkhas enables the characterization of three types of foredune environments, which lie N-S. Measured variables in the first line of the foredune present significant relations with the number of nebkhas. The changes detected and the relationships observed between variables are related to natural processes and anthropogenic impacts. This information can be useful for arid coastal dune system management, as well as for restoration tasks in arid foredunes.
A new statistical PCA-ICA algorithm for location of R-peaks in ECG.
Chawla, M P S; Verma, H K; Kumar, Vinod
2008-09-16
The success of ICA in separating independent components from a mixture depends on the properties of the electrocardiogram (ECG) recordings. This paper discusses some of the conditions of independent component analysis (ICA) that could affect the reliability of the separation, and evaluates issues related to the properties of the signals and the number of sources. Principal component analysis (PCA) scatter plots are plotted to indicate the diagnostic features in the presence and absence of base-line wander when interpreting the ECG signals. In this analysis, a newly developed statistical algorithm by the authors, based on combined PCA-ICA of two correlated channels of 12-channel ECG data, is proposed. The ICA technique has been successfully implemented in identifying and removing noise and artifacts from ECG signals. Cleaned ECG signals are obtained using statistical measures such as kurtosis and variance of variance after ICA processing. This paper also deals with the detection of QRS complexes in electrocardiograms using the combined PCA-ICA algorithm. The efficacy of the combined PCA-ICA algorithm lies in the fact that the location of the R-peaks is bounded from above and below by the location of the cross-over points, hence none of the peaks are ignored or missed.
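The two screening statistics mentioned above, kurtosis and variance of variance, can be sketched directly: ECG components are strongly super-Gaussian because of the sharp QRS peaks, Gaussian noise components have excess kurtosis near zero, and nonstationary artifacts such as baseline wander show a large variance of block-wise variances. The synthetic signals and any thresholds are illustrative assumptions, not values from the paper.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3: ~0 for Gaussian noise,
    large and positive for spiky (QRS-like) components."""
    x = x - np.mean(x)
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

def variance_of_variance(x, n_blocks=10):
    """Variance of per-block variances: large for nonstationary
    artifacts such as baseline wander or electrode movement."""
    blocks = np.array_split(np.asarray(x), n_blocks)
    return np.var([np.var(b) for b in blocks])
```

After ICA, components scoring near zero on the first measure and high on the second would be flagged as noise or artifact and excluded before reconstruction.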
Ladegaard, Yun; Skakon, Janne; Elrond, Andreas Friis; Netterstrøm, Bo
2017-08-28
To examine how line managers experience and manage the return to work process of employees on sick leave due to work-related stress and to identify supportive and inhibiting factors. Semi-structured interviews with 15 line managers who have had employees on sick leave due to work-related stress. The grounded theory approach was employed. Even though managers may accept the overall concept of work-related stress, they focus on personality and individual circumstances when an employee is sick-listed due to work-related stress. The lack of a common understanding of stress creates room for this focus. Line managers experience cross-pressure, discrepancies between strategic and human-relationship perspectives and a lack of organizational support in the return to work process. Organizations should aim to provide support for line managers. Research-based knowledge and guidelines on work-related stress and the return to work process are essential, as is the involvement of coworkers. A commonly accepted definition of stress and a systematic risk assessment are also important. Cross-pressure on line managers should be minimized and room for adequate preventive actions should be provided, as such an approach could support both the return to work process and the implementation of important interventions in the work environment. Implications for rehabilitation: Organizations should aim to provide support for line managers handling the return to work process. Cross-pressure on line managers should be minimized and adequate preventive actions should be provided in relation to the return to work process. Research-based knowledge and guidelines on work-related stress and return to work are essential. A common and formal definition of stress should be emphasized in the workplace.
OT1_ipascucc_1: Understanding the Origin of Transition Disks via Disk Mass Measurements
NASA Astrophysics Data System (ADS)
Pascucci, I.
2010-07-01
Transition disks are a distinguished group of few-Myr-old systems caught in the phase of dispersing their inner dust disk. Three different processes have been proposed to explain this inside-out clearing: grain growth, photoevaporation driven by the central star, and dynamical clearing by a forming giant planet. Which of these processes leads to a transition disk? Distinguishing between them requires the combined knowledge of stellar accretion rates and disk masses. We propose here to use 43.8 hours of PACS spectroscopy to detect the [OI] 63 micron emission line from a sample of 21 well-known transition disks with measured mass accretion rates. We will use this line, in combination with ancillary CO millimeter lines, to measure their gas disk mass. Because gas dominates the mass of protoplanetary disks, our approach and choice of lines will enable us to trace the bulk of the disk mass that resides beyond tens of AU from young stars. Our program will quadruple the number of transition disks currently observed with Herschel in this setting and for which disk masses can be measured. We will then place the transition and the ~100 classical/non-transition disks of similar age (from the Herschel KP "Gas in Protoplanetary Systems") in the mass accretion rate-disk mass diagram with two main goals: 1) reveal which gaps have been created by grain growth, photoevaporation, or giant planet formation and 2) from the statistics, determine the main disk dispersal mechanism leading to a transition disk.
ERIC Educational Resources Information Center
Schirmeier, Matthias K.; Derwing, Bruce L.; Libben, Gary
2004-01-01
Two types of experiments investigate the visual on-line and off-line processing of German ver-verbs (e.g., verbittern "to embitter"). In Experiments 1 and 2 (morphological priming), latency patterns revealed the existence of facilitation effects for the morphological conditions (BITTER-VERBITTERN and BITTERN-VERBITTERN) as compared to the neutral…
Draelos, Zoe Diana; Kononov, Tatiana; Fox, Theresa
2016-09-01
A 14-week single-center clinical usage study was conducted to test the efficacy of a peptide treatment serum and supporting skincare regimen in 29 women with mild to moderately photodamaged facial skin. The peptide treatment serum contained gamma-aminobutyric acid (GABA) and various peptides with neurotransmitter inhibiting and cell signaling properties. It was hypothesized that the peptide treatment serum would ameliorate eye and facial expression lines including crow's feet and forehead lines. The efficacy of the supporting skincare regimen was also evaluated. An expert investigator examined the subjects at rest and at maximum smile. Additionally, the subjects completed self-assessment questionnaires. At week 14, the expert investigator found a statistically significant improvement in facial lines, facial wrinkles, eye lines, and eye wrinkles at rest when compared to baseline results. The expert investigator also found statistically significant improvement at week 14 in facial lines, eye lines, and eye wrinkles when compared to baseline results at maximum smile. In addition, there was continued highly statistically significant improvement in smoothness, softness, firmness, radiance, luminosity, and overall appearance at rest when compared to baseline results at the 14-week time point. The test regimen was well perceived by the subjects for efficacy and product attributes. The products were well tolerated with no adverse events.
J Drugs Dermatol. 2016;15(9):1100-1106.
NASA Astrophysics Data System (ADS)
Esbrand, C.; Royle, G.; Griffiths, J.; Speller, R.
2009-07-01
The integration of technology with healthcare has undoubtedly propelled the medical imaging sector well into the twenty-first century. The concept of digital imaging introduced during the 1970s has since paved the way for established imaging techniques, where digital mammography, phase contrast imaging and CT imaging are just a few examples. This paper presents a prototype intelligent digital mammography system designed and developed by a European consortium. The final system, the I-ImaS system, utilises CMOS monolithic active pixel sensor (MAPS) technology promoting on-chip data processing, enabling the acts of data processing and image acquisition to be achieved simultaneously; consequently, statistical analysis of tissue is achievable in real-time for the purpose of x-ray beam modulation via a feedback mechanism during the image acquisition procedure. The imager implements a dual array of twenty 520 pixel × 40 pixel CMOS MAPS sensing devices with a 32μm pixel size, each individually coupled to a 100μm thick thallium-doped structured CsI scintillator. This paper presents the first intelligent images of real excised breast tissue obtained from the prototype system, where the x-ray exposure was modulated via the statistical information extracted from the breast tissue itself. Conventional images were experimentally acquired where the statistical analysis of the data was done off-line, resulting in the production of simulated real-time intelligently optimised images. The results obtained indicate real-time image optimisation using the statistical information extracted from the breast as a means of a feedback mechanism is beneficial and foreseeable in the near future.
Palanichamy, A; Jayas, D S; Holley, R A
2008-01-01
The Canadian Food Inspection Agency required the meat industry to ensure Escherichia coli O157:H7 does not survive (experiences ≥ 5 log CFU/g reduction) in dry fermented sausage (salami) during processing after a series of foodborne illness outbreaks resulting from this pathogenic bacterium occurred. The industry is in need of an effective technique like predictive modeling for estimating bacterial viability, because traditional microbiological enumeration is a time-consuming and laborious method. The accuracy and speed of artificial neural networks (ANNs) make them an attractive alternative for this purpose within predictive microbiology, especially for on-line processing in industry. Data from a study of interactive effects of different levels of pH, water activity, and the concentrations of allyl isothiocyanate at various times during sausage manufacture in reducing numbers of E. coli O157:H7 were collected. Data were used to develop predictive models using a general regression neural network (GRNN), a form of ANN, and a statistical linear polynomial regression technique. Both models were compared for their predictive error, using various statistical indices. GRNN predictions for training and test data sets had less serious errors when compared with the statistical model predictions. GRNN models were better for the training set and slightly better for the test set than the statistical model. Also, GRNN accurately predicted the level of allyl isothiocyanate required to ensure a 5-log reduction when an appropriate production set was created by interpolation. Because they are simple to generate, fast, and accurate, ANN models may be of value for industrial use in dry fermented sausage manufacture to reduce the hazard associated with E. coli O157:H7 in fresh beef and permit production of consistently safe products from this raw material.
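A GRNN of the kind used here is, at its core, Specht's Gaussian-kernel-weighted average of the training targets, which makes the "simple to generate, fast" claim concrete: there is no iterative training, only a smoothing parameter. This is a generic sketch under assumed data, not the authors' model of E. coli reduction.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """General regression neural network: each prediction is a
    Gaussian-kernel-weighted average of the training targets;
    sigma is the only free parameter."""
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = np.sum((X_train - x) ** 2, axis=1)   # squared distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))      # kernel weights
        preds.append(np.sum(w * y_train) / np.sum(w))
    return np.array(preds)
```

In the study's setting, X_train would hold the pH, water activity, allyl isothiocyanate and time values, and y_train the measured log reductions; the small sigma used in the example below is tuned to the synthetic data, not a recommended value.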
Effects of drain bias on the statistical variation of double-gate tunnel field-effect transistors
NASA Astrophysics Data System (ADS)
Choi, Woo Young
2017-04-01
The effects of drain bias on the statistical variation of double-gate (DG) tunnel field-effect transistors (TFETs) are discussed in comparison with DG metal-oxide-semiconductor FETs (MOSFETs). Statistical variation corresponds to the variation of threshold voltage (V th), subthreshold swing (SS), and drain-induced barrier thinning (DIBT). The unique statistical variation characteristics of DG TFETs and DG MOSFETs with the variation of drain bias are analyzed by using full three-dimensional technology computer-aided design (TCAD) simulation in terms of the three dominant variation sources: line-edge roughness (LER), random dopant fluctuation (RDF) and workfunction variation (WFV). It is observed that, unlike DG MOSFETs, DG TFETs suffer from less severe statistical variation as drain voltage increases.
Two-dimensional signal processing with application to image restoration
NASA Technical Reports Server (NTRS)
Assefi, T.
1974-01-01
A recursive technique for modeling and estimating a two-dimensional signal contaminated by noise is presented. A two-dimensional signal is assumed to be an undistorted picture, where the noise introduces the distortion. Both the signal and the noise are assumed to be wide-sense stationary processes with known statistics. Thus, to estimate the two-dimensional signal is to enhance the picture. The picture representing the two-dimensional signal is converted to one dimension by scanning the image horizontally one line at a time. The scanner output becomes a nonstationary random process due to the periodic nature of the scanner operation. Procedures to obtain a dynamical model corresponding to the autocorrelation function of the scanner output are derived. Utilizing the model, a discrete Kalman estimator is designed to enhance the image.
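For a first-order autoregressive model of a single scan line, the recursive estimator described above reduces to a scalar Kalman filter. This is a minimal sketch of that special case, not the paper's full two-dimensional design; the AR coefficient and noise variances are assumed known, as the known signal and noise statistics are in the paper.

```python
import numpy as np

def kalman_denoise_line(z, a=0.95, q=0.1, r=1.0):
    """Scalar Kalman filter along one scan line.
    State model:   x[k] = a*x[k-1] + w,  w ~ N(0, q)
    Measurement:   z[k] = x[k] + v,      v ~ N(0, r)"""
    x_hat, p = 0.0, 1.0                  # initial estimate and variance
    out = np.empty(len(z), dtype=float)
    for k, zk in enumerate(z):
        x_pred = a * x_hat               # time update (predict)
        p_pred = a * a * p + q
        gain = p_pred / (p_pred + r)     # measurement update
        x_hat = x_pred + gain * (zk - x_pred)
        p = (1.0 - gain) * p_pred
        out[k] = x_hat
    return out
```

Applied line by line to a scanned image, the filtered output has lower mean-squared error than the raw measurements whenever the model roughly matches the image statistics.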
Stand-off transmission lines and method for making same
Tuckerman, D.B.
1991-05-21
Standoff transmission lines in an integrated circuit structure are formed by etching away or removing the portion of the dielectric layer separating the microstrip metal lines and the ground plane from the regions that are not under the lines. The microstrip lines can be fabricated by a subtractive process of etching a metal layer, an additive process of direct laser writing fine lines followed by plating up the lines or a subtractive/additive process in which a trench is etched over a nucleation layer and the wire is electrolytically deposited. Microstrip lines supported on freestanding posts of dielectric material surrounded by air gaps are produced. The average dielectric constant between the lines and ground plane is reduced, resulting in higher characteristic impedance, less crosstalk between lines, increased signal propagation velocities, and reduced wafer stress. 16 figures.
Sobkowiak, Alicja; Jończyk, Maciej; Jarochowska, Emilia; Biecek, Przemysław; Trzcinska-Danielewicz, Joanna; Leipner, Jörg; Fronk, Jan; Sowiński, Paweł
2014-06-01
Maize, despite being thermophilic due to its tropical origin, demonstrates high intraspecific diversity in cold-tolerance. To search for molecular mechanisms of this diversity, the transcriptomic response to cold was studied in two inbred lines of contrasting cold-tolerance. Microarray analysis was followed by extensive statistical elaboration of the data, literature data mining, and gene ontology-based classification. The lines used had been bred earlier specifically for determination of QTLs for cold-performance of photosynthesis. This allowed direct comparison of the present transcriptomic data with the earlier QTL mapping results. Cold-treated (14 h at 8/6 °C) maize seedlings of the cold-tolerant ETH-DH7 and cold-sensitive ETH-DL3 lines at the V3 stage showed a strong, consistent response of the third-leaf transcriptome: several thousand probes showed similar, statistically significant change in both lines, while only tens responded differently in the two lines. The most striking difference between the responses of the two lines to cold was the induction of expression of ca. twenty genes encoding membrane/cell wall proteins exclusively in the cold-tolerant ETH-DH7 line. The common response comprised mainly repression of numerous genes related to photosynthesis and induction of genes related to basic biological activity: transcription, regulation of gene expression, protein phosphorylation, and cell wall organization. Among the genes showing differential response, several were close to the QTL regions identified in earlier studies with the same inbred lines and associated with biometrical, physiological or biochemical parameters. These transcripts, including two apparently non-protein-coding ones, are particularly attractive candidates for future studies on mechanisms determining divergent cold-tolerance of inbred maize lines.
NASA Astrophysics Data System (ADS)
Csáki, Endre; Csörgő, Miklós; Földes, Antónia; Révész, Pál
2018-04-01
We consider random walks on the square lattice of the plane along the lines of Heyde (J Stat Phys 27:721-730, 1982, Stochastic processes, Springer, New York, 1993) and den Hollander (J Stat Phys 75:891-918, 1994), whose studies have in part been inspired by the so-called transport phenomena of statistical physics. Two-dimensional anisotropic random walks with anisotropic density conditions à la Heyde (J Stat Phys 27:721-730, 1982, Stochastic processes, Springer, New York, 1993) yield fixed column configurations, and nearest-neighbour random walks in a random environment on the square lattice of the plane as in den Hollander (J Stat Phys 75:891-918, 1994) result in random column configurations. In both cases we conclude simultaneous weak Donsker and strong Strassen type invariance principles in terms of appropriately constructed anisotropic Brownian motions on the plane, with self-contained proofs in both cases. The style of presentation throughout will be that of a semi-expository survey of related results in a historical context.
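A fixed-column-configuration walk of the Heyde type can be simulated directly: the probability of stepping horizontally depends only on the walker's current column, and the remaining probability is split between the two vertical moves. The two-column probability pattern below is an illustrative assumption, not a configuration from the papers cited.

```python
import numpy as np

def anisotropic_walk(n_steps, p_horizontal, seed=0):
    """2-D lattice walk: at column x the walker moves horizontally with
    probability p_horizontal[x] (split evenly left/right) and vertically
    otherwise -- a fixed-column-configuration anisotropic walk."""
    rng = np.random.default_rng(seed)
    x = y = 0
    path = [(0, 0)]
    for _ in range(n_steps):
        p = p_horizontal[x % len(p_horizontal)]  # column-dependent rate
        u = rng.random()
        if u < p / 2:
            x += 1
        elif u < p:
            x -= 1
        elif u < p + (1 - p) / 2:
            y += 1
        else:
            y -= 1
        path.append((x, y))
    return np.array(path)
```

Rescaling such paths is the numerical counterpart of the Donsker-type limits discussed in the abstract: columns with a high horizontal rate speed up horizontal spreading, producing an anisotropic Brownian limit.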
On-line estimation of nonlinear physical systems
Christakos, G.
1988-01-01
Recursive algorithms for estimating states of nonlinear physical systems are presented. Orthogonality properties are rediscovered and the associated polynomials are used to linearize state and observation models of the underlying random processes. This requires some key hypotheses regarding the structure of these processes, which may then take account of a wide range of applications. The latter include streamflow forecasting, flood estimation, environmental protection, earthquake engineering, and mine planning. The proposed estimation algorithm may be compared favorably to Taylor series-type filters, nonlinear filters which approximate the probability density by Edgeworth or Gram-Charlier series, as well as to conventional statistical linearization-type estimators. Moreover, the method has several advantages over nonrecursive estimators like disjunctive kriging. To link theory with practice, some numerical results for a simulated system are presented, in which responses from the proposed and extended Kalman algorithms are compared. © 1988 International Association for Mathematical Geology.
Computerized EEG analysis for studying the effect of drugs on the central nervous system.
Rosadini, G; Cavazza, B; Rodriguez, G; Sannita, W G; Siccardi, A
1977-11-01
Samples of our experience in quantitative pharmaco-EEG are reviewed to discuss and define its applicability and limits. Simple processing systems, such as the computation of Hjorth's descriptors, are useful for on-line monitoring of drug-induced EEG modifications that are also evident on visual analysis. Power spectral analysis is suitable for identifying and quantifying EEG effects not evident on visual inspection. It has demonstrated how the EEG effects of compounds in a long-acting formulation vary according to the sampling time and the explored cerebral area. EEG modifications not detected by power spectral analysis can be defined by statistically comparing (F test) the spectral values of the EEG from a single lead at the different samples (longitudinal comparison), or the spectral values from different leads at any sample (intrahemispheric comparison). The presently available procedures of quantitative pharmaco-EEG are effective when applied to the study of multilead EEG recordings in a statistically significant sample of the population. They do not seem reliable for monitoring or directing neuropsychiatric therapies in single patients, due to individual variability of drug effects.
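Hjorth's descriptors mentioned above (activity, mobility, complexity) have a standard definition in terms of the signal variance and its successive differences; a minimal sketch:

```python
import numpy as np

def hjorth(x):
    """Hjorth descriptors of a 1-D signal: activity (variance),
    mobility, and complexity, computed from first and second
    differences of the sampled signal."""
    dx, ddx = np.diff(x), np.diff(x, n=2)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

# A pure 10 Hz sinusoid sampled at 1 kHz: complexity should be ~1
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
act, mob, comp = hjorth(np.sin(2 * np.pi * 10 * t))
```

Because all three quantities come from running variances, they are cheap enough for the on-line monitoring role the abstract describes.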
Comprehensive analysis of line-edge and line-width roughness for EUV lithography
NASA Astrophysics Data System (ADS)
Bonam, Ravi; Liu, Chi-Chun; Breton, Mary; Sieg, Stuart; Seshadri, Indira; Saulnier, Nicole; Shearer, Jeffrey; Muthinti, Raja; Patlolla, Raghuveer; Huang, Huai
2017-03-01
Pattern transfer fidelity is always a major challenge for any lithography process and needs continuous improvement. Lithographic processes in the semiconductor industry are primarily driven by optical imaging on photosensitive polymeric materials (resists). The quality of pattern transfer can be assessed by quantifying multiple parameters such as feature size uniformity (CD), placement, roughness, and sidewall angle. Roughness in features primarily corresponds to variation of the line edge or line width and has gained considerable significance, particularly due to shrinking feature sizes and feature variations of the same order. This has caused downstream processes (reactive ion etch (RIE), chemical mechanical polish (CMP), etc.) to reconsider their respective tolerance levels. A very important aspect of this work is the relevance of roughness metrology from pattern formation at resist to subsequent processes, particularly electrical validity. A major drawback of the current LER/LWR metric (sigma) is its lack of relevance across multiple downstream processes, which affects material selection at various unit processes. In this work we present a comprehensive assessment of line edge and line width roughness at multiple lithographic transfer processes. To simulate the effect of roughness, a pattern was designed with periodic jogs on the edges of lines with varying amplitudes and frequencies. There are numerous methodologies proposed to analyze roughness, and in this work we apply them to programmed roughness structures to assess each technique's sensitivity. This work also aims to identify a methodology for quantifying roughness that remains relevant across downstream processes.
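The sigma metric discussed above is conventionally reported as 3σ of the edge (or width) deviations; a minimal sketch of that baseline metric, with a programmed-roughness example in the spirit of the jogged test patterns (the amplitude and pitch are made-up numbers, not the paper's designs):

```python
import numpy as np

def ler_3sigma(edge):
    """Conventional LER metric: 3 sigma of edge-position deviations
    about the mean edge line."""
    return 3.0 * np.asarray(edge, float).std(ddof=1)

def lwr_3sigma(left, right):
    """Conventional LWR metric: 3 sigma of the local line width."""
    return 3.0 * (np.asarray(right, float) - np.asarray(left, float)).std(ddof=1)

# Programmed roughness: periodic jogs of amplitude 2 on one edge only,
# so all width variation comes from that edge
y = np.arange(200)
left = np.zeros(200)
right = 30.0 + 2.0 * np.sign(np.sin(2 * np.pi * y / 50))
lwr = lwr_3sigma(left, right)
```

The single sigma number discards the frequency content of the jogs, which is exactly the shortcoming the abstract raises.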
Persistent Spectral Hole-Burning: Photon-Gating and Fundamental Statistical Limits
1989-11-03
pentacene inhomogeneous line that results from the statistics of independent, additive random variables. For this data, Nil - 10'. The rms amplitude...features in inhomogeneous lines. To illustrate this, Figure 5 shows a portion of the optical spectrum of pentacene in p-terphenyl before and after a...contained in each irradiated spot of recording medium. The stress-induced variations in the local environment of the storage centers are random in nature
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed.
Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.
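The static measurements named above (executable statement counts, cyclomatic complexity) are easy to illustrate; a minimal sketch for Python source using the standard `ast` module, where complexity is approximated as 1 + the number of branch points (a simplification of McCabe's metric, not a full implementation):

```python
import ast

def static_metrics(source):
    """Two simple static-analysis metrics for Python source: the
    statement count (a proxy for executable lines) and a McCabe-style
    cyclomatic complexity, 1 + number of branch points."""
    tree = ast.parse(source)
    stmts = sum(isinstance(n, ast.stmt) for n in ast.walk(tree))
    branches = sum(isinstance(n, (ast.If, ast.For, ast.While,
                                  ast.Try, ast.BoolOp))
                   for n in ast.walk(tree))
    return stmts, branches + 1

src = "def f(x):\n    if x > 0:\n        return x\n    return -x\n"
metrics = static_metrics(src)
```

A dynamic analyzer, by contrast, would need to execute `f` on concrete inputs to observe which of those statements actually run.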
A Process for Assessing NASA's Capability in Aircraft Noise Prediction Technology
NASA Technical Reports Server (NTRS)
Dahl, Milo D.
2008-01-01
An acoustic assessment is being conducted by NASA that has been designed to assess the current state of the art in NASA's capability to predict aircraft-related noise and to establish baselines for gauging future progress in the field. The process for determining NASA's current capabilities includes quantifying the differences between noise predictions and measurements of noise from experimental tests. The computed noise predictions are being obtained from semi-empirical, analytical, statistical, and numerical codes. In addition, errors and uncertainties are being identified and quantified both in the predictions and in the measured data to further enhance the credibility of the assessment. Because the assessment project has not been fully completed, this paper presents preliminary results, based on the contributions of many researchers, and shows a select sample of the types of results obtained regarding the prediction of aircraft noise at both the system and component levels. The system level results are for engines and aircraft. The component level results are for fan broadband noise, for jet noise from a variety of nozzles, and for airframe noise from flaps and landing gear parts. There are also sample results for sound attenuation in lined ducts with flow and the behavior of acoustic lining in ducts.
Is [Ne II] a Tracer for X-Rays in Disks around T Tauri Stars?
NASA Astrophysics Data System (ADS)
Guedel, Manuel
2007-10-01
Although dust grains dominate the appearance of protoplanetary disks because of their high opacity, the key processes for disk evolution and planetesimal formation are driven through the dynamical state of the gas. In contrast to the dust component, we do not have a similar knowledge of the gas component. One of the Spitzer breakthroughs was the detection of the [Ne II] 12.8um line. Glassgold et al. (2007) proposed that this line provides diagnostics for a warm disk surface layer that is heated and ionized by stellar X-rays. A correlation of the [Ne II] luminosity with the X-ray luminosity is expected. The statistical sample so far available is insufficient to test this hypothesis. We aim at significantly enlarging the sample, with the goal of confirming or refuting this model.
Detection of Interstellar Urea with Carma
NASA Astrophysics Data System (ADS)
Kuo, H.-L.; Snyder, L. E.; Friedel, D. N.; Looney, L. W.; McCall, B. J.; Remijan, A. J.; Lovas, F. J.; Hollis, J. M.
2010-06-01
Urea, a molecule discovered in human urine by H. M. Rouelle in 1773, has a significant role in prebiotic chemistry. Previous BIMA observations have suggested that interstellar urea [(NH_2)_2CO] is a compact hot core molecule, like other large molecules such as methyl formate and acetic acid (2009, 64th OSU Symposium On Molecular Spectroscopy, WI05). We have conducted an extensive search for urea toward the high-mass hot molecular core Sgr B2(N-LMH) using CARMA and the IRAM 30 m. Because the spectral lines of heavy molecules like urea tend to be weak and hot cores display lines from a wide range of molecules, a major problem in identifying urea lines is confusion with lines of other molecules. Therefore, it is necessary to detect a number of urea lines and apply sophisticated statistical tests before having confidence in an identification. The 1 mm resolution of CARMA enables favorable coupling of the source size and synthesized beam size, which was found to be essential for the detection of weak signals. The 2.5″ × 2″ synthesized beam of CARMA significantly resolves out the contamination by extended emission and reveals the eight weak urea lines that were previously blended with nearby transitions. Our analysis indicates that these lines are likely to be urea, since the resulting observed line frequencies are coincident with a set of overlapping connecting urea lines, and the observed line intensities are consistent with the expected line strengths of urea. In addition, we have developed a new statistical approach to examine the spatial correlation between the observed lines by applying the Student T-test to the high-resolution channel maps obtained from CARMA. The T-test shows similar spatial distributions for all eight candidate lines, suggesting a common molecular origin, urea.
Our T-test method could have a broad impact on the next generation of arrays, such as ALMA, because the new arrays will require a method to systematically determine the credibility of detections of weaker signals from new and larger interstellar molecules.
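The abstract does not give the exact form of the Student T-test applied to the channel maps; the following is only an illustrative sketch of the general idea, comparing the peak-normalized pixel-value distributions of two maps with a pooled two-sample t statistic (|t| well below ~2 read as consistency with a common spatial origin):

```python
import numpy as np

def t_statistic(map_a, map_b):
    """Pooled two-sample Student t statistic between the pixel values
    of two peak-normalized channel maps. Illustrative sketch only,
    not the authors' exact procedure."""
    a = map_a.ravel() / np.abs(map_a).max()
    b = map_b.ravel() / np.abs(map_b).max()
    na, nb = a.size, b.size
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(sp2 * (1.0 / na + 1.0 / nb))

# Two noisy views of the same (synthetic) emission pattern versus a
# genuinely different pattern
rng = np.random.default_rng(1)
common = rng.random((16, 16))
t_same = t_statistic(common + 0.01 * rng.normal(size=(16, 16)),
                     common + 0.01 * rng.normal(size=(16, 16)))
t_diff = t_statistic(common, common ** 3)
```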
Musicians' edge: A comparison of auditory processing, cognitive abilities and statistical learning.
Mandikal Vasuki, Pragati Rao; Sharma, Mridula; Demuth, Katherine; Arciuli, Joanne
2016-12-01
It has been hypothesized that musical expertise is associated with enhanced auditory processing and cognitive abilities. Recent research has examined the relationship between musicians' advantage and implicit statistical learning skills. In the present study, we assessed a variety of auditory processing skills, cognitive processing skills, and statistical learning (auditory and visual forms) in age-matched musicians (N = 17) and non-musicians (N = 18). Musicians had significantly better performance than non-musicians on frequency discrimination, and backward digit span. A key finding was that musicians had better auditory, but not visual, statistical learning than non-musicians. Performance on the statistical learning tasks was not correlated with performance on auditory and cognitive measures. Musicians' superior performance on auditory (but not visual) statistical learning suggests that musical expertise is associated with an enhanced ability to detect statistical regularities in auditory stimuli. Copyright © 2016 Elsevier B.V. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-23
...-02] RIN 0694-AE98 Simplified Network Application Processing System, On-Line Registration and Account...'') electronically via BIS's Simplified Network Application Processing (SNAP-R) system. Currently, parties must... Network Applications Processing System (SNAP-R) in October 2006. The SNAP-R system provides a Web based...
Taxonomy and clustering in collaborative systems: The case of the on-line encyclopedia Wikipedia
NASA Astrophysics Data System (ADS)
Capocci, A.; Rao, F.; Caldarelli, G.
2008-01-01
In this paper we investigate the nature and structure of the relation between imposed classifications and real clustering in a particular case of a scale-free network given by the on-line encyclopedia Wikipedia. We find a statistical similarity in the distributions of community sizes both by using the top-down approach of the categories division present in the archive and in the bottom-up procedure of community detection given by an algorithm based on the spectral properties of the graph. Despite the statistically similar behaviour, the two methods provide rather different divisions of the articles, thereby signaling that the presence of power laws is a general feature of these systems and cannot be used as a benchmark to evaluate the suitability of a clustering method.
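Comparing a top-down and a bottom-up division reduces, in part, to comparing two community-size distributions; a minimal sketch of the two-sample Kolmogorov-Smirnov statistic commonly used for that purpose (the paper does not specify which similarity test it uses):

```python
import numpy as np

def ks_statistic(sizes_a, sizes_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of two community-size samples."""
    a, b = np.sort(sizes_a), np.sort(sizes_b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return np.abs(cdf_a - cdf_b).max()

# Two heavy-tailed size samples drawn from the same law look similar
rng = np.random.default_rng(0)
sizes_1 = (1 + rng.pareto(2.0, 500)).round()
sizes_2 = (1 + rng.pareto(2.0, 500)).round()
d_same = ks_statistic(sizes_1, sizes_2)
```

Note the paper's point survives this sketch: two partitions can have indistinguishable size distributions while grouping the articles quite differently.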
The Japanese and the American First-Line Supervisor.
ERIC Educational Resources Information Center
Bryan, Leslie A., Jr.
1982-01-01
Compares the American and Japanese first-line supervisor: production statistics, supervisory style, company loyalty, management style, and communication. Also suggests what Americans might learn from the Japanese methods. (CT)
Using the Properties of Broad Absorption Line Quasars to Illuminate Quasar Structure
NASA Astrophysics Data System (ADS)
Yong, Suk Yee; King, Anthea L.; Webster, Rachel L.; Bate, Nicholas F.; O'Dowd, Matthew J.; Labrie, Kathleen
2018-06-01
A key to understanding quasar unification paradigms is the emission properties of broad absorption line quasars (BALQs). The fact that only a small fraction of quasar spectra exhibit deep absorption troughs blueward of the broad permitted emission lines provides a crucial clue to the structure of quasar emitting regions. To learn whether it is possible to discriminate between the BALQ and non-BALQ populations given the observed spectral properties of a quasar, we employ two approaches: one based on statistical methods and the other supervised machine learning classification, applied to quasar samples from the Sloan Digital Sky Survey. The features explored include continuum and emission line properties, in particular the absolute magnitude, redshift, spectral index, line width, asymmetry, strength, and relative velocity offsets of high-ionisation C IV λ1549 and low-ionisation Mg II λ2798 lines. We consider a complete population of quasars, and assume that the statistical distributions of properties represent all angles where the quasar is viewed without obscuration. The distributions of the BALQ and non-BALQ sample properties show few significant differences. None of the observed continuum and emission line features are capable of differentiating between the two samples. Most published narrow disk-wind models are inconsistent with these observations, and an alternative disk-wind model is proposed. The key feature of the proposed model is a disk-wind filling a wide opening angle with multiple radial streams of dense clumps.
THREE-POINT PHASE CORRELATIONS: A NEW MEASURE OF NONLINEAR LARGE-SCALE STRUCTURE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolstenhulme, Richard; Bonvin, Camille; Obreschkow, Danail
2015-05-10
We derive an analytical expression for a novel large-scale structure observable: the line correlation function. The line correlation function, which is constructed from the three-point correlation function of the phase of the density field, is a robust statistical measure allowing the extraction of information in the nonlinear and non-Gaussian regime. We show that, in perturbation theory, the line correlation is sensitive to the coupling kernel F_2, which governs the nonlinear gravitational evolution of the density field. We compare our analytical expression with results from numerical simulations and find a 1σ agreement for separations r ≳ 30 h^-1 Mpc. Fitting formulae for the power spectrum and the nonlinear coupling kernel at small scales allow us to extend our prediction into the strongly nonlinear regime, where we find a 1σ agreement with the simulations for r ≳ 2 h^-1 Mpc. We discuss the advantages of the line correlation relative to standard statistical measures like the bispectrum. Unlike the latter, the line correlation is independent of the bias, in the regime where the bias is local and linear. Furthermore, the variance of the line correlation is independent of the Gaussian variance on the modulus of the density field. This suggests that the line correlation can probe more precisely the nonlinear regime of gravity, with less contamination from the power spectrum variance.
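The basic object behind the line correlation, the phase of the density field, is straightforward to extract numerically; a minimal sketch (the estimator of the full three-point phase correlation is not reproduced here):

```python
import numpy as np

def phase_field(delta):
    """Phase factors epsilon_k = delta_k / |delta_k| of a density
    field: the line correlation is built from these, discarding the
    Fourier moduli that the power spectrum already measures."""
    dk = np.fft.fftn(delta)
    mod = np.abs(dk)
    safe = np.where(mod > 0, mod, 1.0)   # avoid division by zero
    return np.where(mod > 0, dk / safe, 0.0)

# Whiten a random 32^3 density cube; every surviving mode has unit modulus
rng = np.random.default_rng(3)
eps = phase_field(rng.normal(size=(32, 32, 32)))
```

Because every nonzero mode is mapped to the unit circle, any statistic built from `eps` is insensitive to the amplitude (and hence, in the local linear regime, to the bias), as the abstract emphasizes.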
Baldwin Effect and Additional BLR Component in AGN with Superluminal Jets
NASA Astrophysics Data System (ADS)
Patiño Álvarez, Víctor; Torrealba, Janet; Chavushyan, Vahram; Cruz González, Irene; Arshakian, Tigran; León Tavares, Jonathan; Popovic, Luka
2016-06-01
We study the Baldwin Effect (BE) in 96 core-jet blazars with optical and ultraviolet spectroscopic data from a radio-loud AGN sample obtained from the MOJAVE 2 cm survey. A statistical analysis is presented of the equivalent widths W_λ of the emission lines Hβ λ4861, Mg II λ2798, and C IV λ1549, and of the continuum luminosities at 5100, 3000, and 1350 angstroms. The BE is found to be statistically significant (confidence level c.l. > 95%) in the Hβ and C IV emission lines, while for Mg II the trend is slightly less significant (c.l. = 94.5%). The slopes of the BE in the studied samples for Hβ and Mg II are found to be steeper, with a statistically significant difference, than those of a comparison radio-quiet sample. We present simulations of the expected BE slopes produced by the contribution to the total continuum of the non-thermal boosted emission from the relativistic jet, and by variability of the continuum components. We find that the slopes of the BE for radio-quiet and radio-loud AGN should not differ, under the assumption that the broad line is emitted only by the canonical broad-line region around the black hole. We discuss that the BE slope steepening in radio AGN is due to a jet-associated broad-line region.
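The BE slope itself is just a log-log regression of equivalent width on continuum luminosity; a minimal sketch on synthetic data (the -0.2 slope, scatter, and luminosity range are made-up numbers for illustration, not the paper's fits):

```python
import numpy as np

def baldwin_slope(l_cont, ew):
    """Least-squares slope of log10(EW) on log10(L); the Baldwin
    effect corresponds to a negative slope, i.e. equivalent width
    decreasing with continuum luminosity."""
    beta, _ = np.polyfit(np.log10(l_cont), np.log10(ew), 1)
    return beta

# Synthetic sample with EW ~ L^-0.2 plus 0.05 dex log-normal scatter
rng = np.random.default_rng(0)
L = 10.0 ** rng.uniform(44, 47, size=200)
ew = 50.0 * (L / 1e45) ** -0.2 * 10.0 ** rng.normal(0.0, 0.05, size=200)
slope = baldwin_slope(L, ew)
```

The paper's simulations ask how this fitted slope shifts when a beamed non-thermal component is added to `L` without adding line flux; that contamination flattens or steepens the recovered slope relative to the input value.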
Duraipandian, Shiyamala; Sylvest Bergholt, Mads; Zheng, Wei; Yu Ho, Khek; Teh, Ming; Guan Yeoh, Khay; Bok Yan So, Jimmy; Shabbir, Asim; Huang, Zhiwei
2012-08-01
Optical spectroscopic techniques including reflectance, fluorescence and Raman spectroscopy have shown promising potential for in vivo precancer and cancer diagnostics in a variety of organs. However, data analysis has mostly been limited to post-processing and off-line algorithm development. In this work, we develop a fully automated on-line Raman spectral diagnostics framework integrated with a multimodal image-guided Raman technique for real-time in vivo cancer detection at endoscopy. A total of 2748 in vivo gastric tissue spectra (2465 normal and 283 cancer) were acquired from 305 patients recruited to construct a spectral database for diagnostic algorithm development. The novel diagnostic scheme implements on-line preprocessing and outlier detection based on principal component analysis statistics (i.e., Hotelling's T2 and Q-residuals) for tissue Raman spectra verification, as well as organ-specific probabilistic diagnostics using different diagnostic algorithms. A free-running optical diagnosis and processing time of < 0.5 s can be achieved, which is critical to realizing real-time in vivo tissue diagnostics during clinical endoscopic examination. The optimized partial least squares-discriminant analysis (PLS-DA) models based on the randomly resampled training database (80% for learning and 20% for testing) provide a diagnostic accuracy of 85.6% [95% confidence interval (CI): 82.9% to 88.2%] [sensitivity of 80.5% (95% CI: 71.4% to 89.6%) and specificity of 86.2% (95% CI: 83.6% to 88.7%)] for the detection of gastric cancer. The PLS-DA algorithms were further applied prospectively to 10 gastric patients at gastroscopy, achieving a predictive accuracy of 80.0% (60/75) [sensitivity of 90.0% (27/30) and specificity of 73.3% (33/45)] for in vivo diagnosis of gastric cancer. Receiver operating characteristic curves further confirmed the efficacy of Raman endoscopy together with PLS-DA algorithms for in vivo prospective diagnosis of gastric cancer.
This work successfully moves the biomedical Raman spectroscopic technique into real-time, on-line clinical cancer diagnosis, especially in routine endoscopic diagnostic applications.
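The Hotelling's T2 and Q-residual statistics used here for spectrum verification come from a PCA model: T2 measures distance within the model, Q measures distance from it. A minimal sketch (an illustrative implementation on synthetic "spectra", not the authors' pipeline):

```python
import numpy as np

def pca_outlier_stats(X, n_comp=2):
    """Hotelling's T2 (distance within the PCA subspace) and Q
    residuals (squared distance from the subspace) for each row of X."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_comp].T                 # projections onto PCs
    pc_var = (s[:n_comp] ** 2) / (len(X) - 1)   # variance of each PC
    t2 = np.sum(scores ** 2 / pc_var, axis=1)   # Hotelling's T2
    resid = Xc - scores @ Vt[:n_comp]           # out-of-model part
    q = np.sum(resid ** 2, axis=1)              # Q residuals
    return t2, q

# Synthetic spectra: rank-2 signal plus noise, with one spiked outlier
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 100)) \
    + 0.01 * rng.normal(size=(50, 100))
X[7, 3] += 10.0                                 # artefact in sample 7
t2, q = pca_outlier_stats(X, n_comp=2)
```

In an on-line setting, a new spectrum whose T2 or Q exceeds a control limit derived from the training set would be rejected before classification.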
Estimating the Propagation of Interdependent Cascading Outages with Multi-Type Branching Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, Junjian; Ju, Wenyun; Sun, Kai
In this paper, the multi-type branching process is applied to describe the statistics and interdependencies of line outages, the load shed, and isolated buses. The offspring mean matrix of the multi-type branching process is estimated by the Expectation Maximization (EM) algorithm and can quantify the extent of outage propagation. The joint distribution of two types of outages is estimated by the multi-type branching process via the Lagrange-Good inversion. The proposed model is tested with data generated by the AC OPA cascading simulations on the IEEE 118-bus system. The largest eigenvalues of the offspring mean matrix indicate that the system is closer to criticality when the interdependence of different types of outages is considered. Compared with empirically estimating the joint distribution of the total outages, a good estimate is obtained by using the multi-type branching process with a much smaller number of cascades, thus greatly improving the efficiency. It is shown that the multi-type branching process can effectively predict the distribution of the load shed and isolated buses and their conditional largest possible total outages even when no data on them are available.
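The criticality criterion above, the largest eigenvalue of the offspring mean matrix, can be sketched as follows (the matrix entries are made-up numbers for a hypothetical two-type process, not estimates from the paper):

```python
import numpy as np

# Hypothetical two-type offspring mean matrix: M[i, j] is the mean
# number of type-j children produced by one type-i parent
# (type 0 = line outages, type 1 = load-shed events, say)
M = np.array([[0.7, 0.2],
              [0.1, 0.5]])

def criticality(M):
    """Largest eigenvalue (in modulus) of the offspring mean matrix;
    propagation dies out when it is below 1, grows when above 1."""
    return np.max(np.abs(np.linalg.eigvals(M)))

def expected_generations(z0, M, n):
    """Expected per-type counts per generation: E[Z_{k+1}] = E[Z_k] M."""
    out = [np.asarray(z0, float)]
    for _ in range(n):
        out.append(out[-1] @ M)
    return out

gens = expected_generations([10.0, 0.0], M, 5)
```

With this M the process is subcritical (largest eigenvalue ≈ 0.77), so expected generation sizes shrink; the EM estimation of M from observed cascades is a separate step not sketched here.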
ERIC Educational Resources Information Center
Sisto, Michelle
2009-01-01
Students increasingly need to learn to communicate statistical results clearly and effectively, as well as to become competent consumers of statistical information. These two learning goals are particularly important for business students. In line with reform movements in Statistics Education and the GAISE guidelines, we are working to implement…
Kail, Robert V.
2013-01-01
According to dual-process models that include analytic and heuristic modes of processing, analytic processing is often expected to become more common with development. Consistent with this view, on reasoning problems, adolescents are more likely than children to select alternatives that are backed by statistical evidence. It is shown here that this pattern depends on the quality of the statistical evidence and the quality of the testimonial that is the typical alternative to statistical evidence. In Experiment 1, 9- and 13-year-olds (N = 64) were presented with scenarios in which solid statistical evidence was contrasted with casual or expert testimonial evidence. When testimony was casual, children relied on it but adolescents did not; when testimony was expert, both children and adolescents relied on it. In Experiment 2, 9- and 13-year-olds (N = 83) were presented with scenarios in which casual testimonial evidence was contrasted with weak or strong statistical evidence. When statistical evidence was weak, children and adolescents relied on both testimonial and statistical evidence; when statistical evidence was strong, most children and adolescents relied on it. Results are discussed in terms of their implications for dual-process accounts of cognitive development. PMID:23735681
The One Micron Fe II Lines in Active Galaxies and Emission Line Stars
NASA Astrophysics Data System (ADS)
Rudy, R. J.; Mazuk, S.; Puetter, R. C.; Hamann, F. W.
1999-05-01
The infrared multiplet of Fe II lines at 0.9997, 1.0501, 1.0863, and 1.1126 microns is particularly strong relative to other red and infrared Fe II features. These lines reach their greatest strength, relative to the hydrogen lines, in the Seyfert 1 galaxy I Zw 1, and are a common, although not ubiquitous, feature in the broad line regions of active galaxies. In addition, they are seen in a diverse assortment of Galactic sources including young stars, Herbig Ae and Be stars, luminous blue variables, proto-planetary nebulae, and symbiotic novae. They are probably excited by Lyman alpha fluorescence, but the exact path of the cascade to their upper levels is uncertain. They arise in dense, sheltered regions of low ionization and are frequently observed together with the infrared Ca II triplet and the Lyman beta excited O I lines 8446 and 11287. The strengths of the four Fe II features, relative to each other, are nearly constant from object to object, suggesting a statistical population of their common upper multiplet. Their intensities, in comparison to the Paschen lines, indicate that they can be important coolants for regions with high optical depths in the hydrogen lines. In addition to I Zw 1 and other active galaxies, we present spectra for the Galactic sources MWC 17, MWC 84, MWC 340, MWC 922, PU Vul, and M 1-92. We review the status of the Fe II observations and discuss the excitation process and possible implications. This work was supported by the IR&D program of the Aerospace Corporation. RCP and FWH acknowledge support from NASA.
Prediction of strontium bromide laser efficiency using cluster and decision tree analysis
NASA Astrophysics Data System (ADS)
Iliev, Iliycho; Gocheva-Ilieva, Snezhana; Kulin, Chavdar
2018-01-01
The subject of investigation is a new high-powered strontium bromide (SrBr2) vapor laser emitting in a multiline region of wavelengths. The laser is an alternative to atomic strontium lasers and free-electron lasers, especially at the 6.45 μm line, which is used in surgery for processing biological tissues and bones with minimal damage. In this paper the experimental data from measurements of operational and output characteristics of the laser are statistically processed by means of cluster analysis and tree-based regression techniques. The aim is to extract from the available data the more important relationships and dependences that influence the increase of the overall laser efficiency. A set of cluster models is constructed and analyzed. It is shown, using different cluster methods, that the seven investigated operational characteristics (laser tube diameter, length, supplied electrical power, and others) and the laser efficiency combine into 2 clusters. Regression tree models built with the Classification and Regression Trees (CART) technique yield dependences that predict the efficiency, and in particular the maximum efficiency, with over 95% accuracy.
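The core step of the CART technique used here is choosing a split threshold that minimizes squared error in the two resulting leaves; a minimal sketch of one such step (the example data are made-up numbers, not the laser measurements):

```python
import numpy as np

def best_split(x, y):
    """One CART regression step: the threshold on a single feature x
    minimizing the summed squared error about the two leaf means."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_thr, best_sse = None, np.inf
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:          # cannot split between ties
            continue
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_thr, best_sse = (xs[i - 1] + xs[i]) / 2, sse
    return best_thr, best_sse

# Hypothetical data: efficiency steps up above a supplied-power threshold
power = np.array([1.0, 2.0, 3.0, 10.0, 11.0, 12.0])
eff = np.array([1.0, 1.0, 1.0, 5.0, 5.0, 5.0])
thr, sse = best_split(power, eff)
```

A full CART model recurses this split on each leaf, over all candidate features, until a stopping rule is met.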
Ferraz, Eduardo Gomes; Andrade, Lucio Costa Safira; dos Santos, Aline Rode; Torregrossa, Vinicius Rabelo; Rubira-Bullen, Izabel Regina Fischer; Sarmento, Viviane Almeida
2013-12-01
The aim of this study was to evaluate the accuracy of virtual three-dimensional (3D) reconstructions of human dry mandibles, produced from two segmentation protocols ("outline only" and "all-boundary lines"). Twenty virtual three-dimensional (3D) images were built from computed tomography exam (CT) of 10 dry mandibles, in which linear measurements between anatomical landmarks were obtained and compared to an error probability of 5 %. The results showed no statistically significant difference among the dry mandibles and the virtual 3D reconstructions produced from segmentation protocols tested (p = 0,24). During the designing of a virtual 3D reconstruction, both "outline only" and "all-boundary lines" segmentation protocols can be used. Virtual processing of CT images is the most complex stage during the manufacture of the biomodel. Establishing a better protocol during this phase allows the construction of a biomodel with characteristics that are closer to the original anatomical structures. This is essential to ensure a correct preoperative planning and a suitable treatment.
NASA Astrophysics Data System (ADS)
Mohandas, Gopakumar; Pessah, Martin E.; Heng, Kevin
2018-05-01
We apply the picket fence treatment to model the effects brought about by spectral lines on the thermal structure of irradiated atmospheres. The lines may be due to pure absorption processes, pure coherent scattering processes, or some combination of absorption and scattering. If the lines arise as a pure absorption process, the surface layers of the atmosphere are cooler, whereas this surface cooling is completely absent if the lines are due to pure coherent isotropic scattering. The lines also lead to a warming of the deeper atmosphere. The warming of the deeper layers is, however, independent of the nature of line formation. Accounting for coherent isotropic scattering in the shortwave and longwave continuum results in anti-greenhouse cooling and greenhouse warming on an atmosphere-wide scale. The effects of coherent isotropic scattering in the line and continuum operate in tandem to determine the resulting thermal structure of the irradiated atmosphere.
NASA Astrophysics Data System (ADS)
Qiu, Hao; Mizutani, Tomoko; Saraya, Takuya; Hiramoto, Toshiro
2015-04-01
The four commonly used write-stability metrics were measured and compared on the same set of 2048 (2k) six-transistor (6T) static random access memory (SRAM) cells fabricated in a 65 nm bulk technology. A preferred metric should be effective for yield estimation and help predict the edge of stability. The results demonstrate that all four metrics identify the same worst SRAM cell. On the other hand, compared with the butterfly curve, which exhibits non-normality, and the write N-curve, in which no cell state flip occurs, the bit-line and word-line margins show good normality as well as almost perfect mutual correlation. As a result, the bit-line and word-line methods prove to be the preferred write-stability metrics.
On-line estimation of error covariance parameters for atmospheric data assimilation
NASA Technical Reports Server (NTRS)
Dee, Dick P.
1995-01-01
A simple scheme is presented for on-line estimation of covariance parameters in statistical data assimilation systems. The scheme is based on a maximum-likelihood approach in which estimates are produced on the basis of a single batch of simultaneous observations. Single-sample covariance estimation is reasonable as long as the number of available observations exceeds the number of tunable parameters by two or three orders of magnitude. Not much is known at present about model error associated with actual forecast systems. Our scheme can be used to estimate some important statistical model error parameters such as regionally averaged variances or characteristic correlation length scales. The advantage of the single-sample approach is that it does not rely on any assumptions about the temporal behavior of the covariance parameters: time-dependent parameter estimates can be continuously adjusted on the basis of current observations. This is of practical importance since it is likely that both model error and observation error strongly depend on the actual state of the atmosphere. The single-sample estimation scheme can be incorporated into any four-dimensional statistical data assimilation system that involves explicit calculation of forecast error covariances, including optimal interpolation (OI) and the simplified Kalman filter (SKF). The computational cost of the scheme is high but not prohibitive; on-line estimation of one or two covariance parameters in each analysis box of an operational boxed-OI system is currently feasible. A number of numerical experiments performed with an adaptive SKF and an adaptive version of OI, using a linear two-dimensional shallow-water model and artificially generated model error, are described. The performance of the nonadaptive versions of these methods turns out to depend rather strongly on correct specification of model error parameters.
These parameters are estimated under a variety of conditions, including uniformly distributed model error and time-dependent model error statistics.
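The single-sample maximum-likelihood idea can be sketched on a toy problem. All values and the covariance structure below are illustrative assumptions, not the operational system: one batch of innovations is drawn with covariance B + σ_o²I, and the tunable observation-error parameter σ_o is recovered by maximizing the Gaussian likelihood of that single batch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative): p simultaneous observations whose innovation
# covariance is  S(sigma_o) = B + sigma_o^2 * I,  with B a known
# background-error covariance and sigma_o the tunable parameter.
p = 200
B = 0.5 * np.exp(-np.abs(np.subtract.outer(range(p), range(p))) / 10.0)
true_sigma_o = 1.2
S_true = B + true_sigma_o**2 * np.eye(p)
d = rng.multivariate_normal(np.zeros(p), S_true)  # one batch of innovations

def neg_log_likelihood(sigma_o, d, B):
    """Gaussian negative log-likelihood of a single innovation batch."""
    S = B + sigma_o**2 * np.eye(len(d))
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * (logdet + d @ np.linalg.solve(S, d))

# Single-sample maximum-likelihood estimate by a simple grid search.
grid = np.linspace(0.5, 2.0, 151)
nll = [neg_log_likelihood(s, d, B) for s in grid]
sigma_o_hat = grid[int(np.argmin(nll))]
```

With a fresh batch of observations at each analysis time, re-running the same search yields the time-dependent parameter estimates described above.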
Kück, Patrick; Struck, Torsten H
2014-01-01
BaCoCa (BAse COmposition CAlculator) is a user-friendly software that combines multiple statistical approaches (like RCFV and C value calculations) to identify biases in aligned sequence data which potentially mislead phylogenetic reconstructions. As a result of its speed and flexibility, the program provides the possibility to analyze hundreds of pre-defined gene partitions and taxon subsets in one single process run. BaCoCa is command-line driven and can be easily integrated into automatic process pipelines of phylogenomic studies. Moreover, given the tab-delimited output style the results can be easily used for further analyses in programs like Excel or statistical packages like R. A built-in option of BaCoCa is the generation of heat maps with hierarchical clustering of certain results using R. As input files BaCoCa can handle FASTA and relaxed PHYLIP, which are commonly used in phylogenomic pipelines. BaCoCa is implemented in Perl and works on Windows PCs, Macs and Linux operating systems. The executable source code as well as example test files and a detailed documentation of BaCoCa are freely available at http://software.zfmk.de. Copyright © 2013 Elsevier Inc. All rights reserved.
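To illustrate the kind of statistic BaCoCa reports, a minimal RCFV-style (relative composition frequency variability) calculation is sketched below. The function name and input format are hypothetical, and the real program computes many more indices over pre-defined partitions.

```python
def rcfv(seqs, states="ACGT"):
    """Relative composition frequency variability for an alignment.

    seqs: dict mapping taxon name -> sequence string. For each state, sum
    over taxa the absolute deviation of that taxon's state frequency from
    the mean frequency, divided by the number of taxa. Larger values
    indicate stronger compositional heterogeneity among taxa.
    """
    n = len(seqs)
    freqs = {}
    for taxon, s in seqs.items():
        total = sum(s.count(c) for c in states)
        freqs[taxon] = {c: s.count(c) / total for c in states}
    value = 0.0
    for c in states:
        mean_c = sum(f[c] for f in freqs.values()) / n
        value += sum(abs(f[c] - mean_c) for f in freqs.values()) / n
    return value

# Identical base compositions give an RCFV of zero...
homogeneous = {"t1": "ACGTACGT", "t2": "ACGTACGT"}
# ...while an A/T-biased taxon yields a positive RCFV.
biased = {"t1": "ACGTACGT", "t2": "AAATAAAT"}
```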
Buda, N; Piskunowicz, M; Porzezińska, M; Kosiak, W; Zdrojewski, Z
2016-08-01
Patients with a diagnosed systemic connective tissue disease require regular monitoring for interstitial lung disease. The main aim of this work is a description of the criteria for pulmonary fibrosis, and of the degree of severity of the fibrosis during the course of interstitial lung disease, by transthoracic lung ultrasound (TLU). 52 patients with diagnosed diffuse interstitial lung disease were qualified for this research, together with 50 volunteers in the control group. The patients in both groups were over 18 years of age and were of both sexes. The TLU results of the patients underwent statistical analysis and were compared to high-resolution computed tomography (HRCT) results. As a consequence of the statistical analysis, we defined our own criteria for pulmonary fibrosis in TLU: irregularity of the pleural line, tightening of the pleural line, fragmentation of the pleural line, blurring of the pleural line, thickening of the pleural line, B-line artifacts (≤ 3 and ≥ 4), Am-line artifacts, and subpleural consolidations < 5 mm. As a result of the conducted research, a scale of severity of pulmonary fibrosis in TLU was devised (UFI - Ultrasound Fibrosis Index), enabling a division to be made into mild, moderate and severe cases. TLU offers a new outlook on the diagnosis of pulmonary fibrosis, being non-invasive and devoid of ionising radiation. This research work has allowed the discovery of two new ultrasound symptoms of pulmonary fibrosis (the blurred pleural line and Am lines). © Georg Thieme Verlag KG Stuttgart · New York.
Real-Time Measurement of Width and Height of Weld Beads in GMAW Processes.
Pinto-Lopera, Jesús Emilio; S T Motta, José Mauricio; Absi Alfaro, Sadek Crisostomo
2016-09-15
Closely associated with weld quality, the weld bead geometry is one of the most important parameters in welding processes. It is a significant requirement in a welding project, especially in automatic welding systems where a specific width, height, or penetration of the weld bead is needed. This paper presents a novel technique for real-time measurement of the width and height of weld beads in gas metal arc welding (GMAW) using a single high-speed camera and a long-pass optical filter in a passive vision system. The measuring method is based on digital image processing techniques, and the image calibration process is based on projective transformations. The measurement process takes less than 3 milliseconds per image, which allows a transfer rate of more than 300 frames per second. The proposed methodology can be used in any metal transfer mode of a gas metal arc welding process and does not have occlusion problems. The responses of the measurement system, presented here, are in good agreement with off-line data collected by a common laser-based 3D scanner. Each measurement was compared using a statistical Welch's t-test of the null hypothesis; in no case did the test exceed the significance level α = 0.01, validating the results and the performance of the proposed vision system.
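The projective-transformation calibration step can be sketched as follows. The homography matrix and the pixel coordinates below are invented for illustration (a pure scaling transform standing in for a full calibration) and are not taken from the paper.

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 pixel coordinates through a 3x3 projective transform H."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # de-homogenize

# Hypothetical calibration: a pure scaling homography, 10 px per mm.
H = np.diag([0.1, 0.1, 1.0])

# Bead edges detected at these pixel positions (illustrative values):
left_edge_px, right_edge_px = (120.0, 50.0), (220.0, 50.0)
(xl, yl), (xr, yr) = apply_homography(H, [left_edge_px, right_edge_px])
bead_width_mm = xr - xl
```

In a real system H would be estimated once from a calibration target; a general homography also corrects perspective distortion, which a plain scale factor cannot.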
Diffusive processes in a stochastic magnetic field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, H.; Vlad, M.; Vanden Eijnden, E.
1995-05-01
The statistical representation of a fluctuating (stochastic) magnetic field configuration is studied in detail. The Eulerian correlation functions of the magnetic field are determined, taking into account all geometrical constraints: these objects form a nondiagonal matrix. The Lagrangian correlations, within the reasonable Corrsin approximation, are reduced to a single scalar function, determined by an integral equation. The mean square perpendicular deviation of a geometrical point moving along a perturbed field line is determined by a nonlinear second-order differential equation. The separation of neighboring field lines in a stochastic magnetic field is studied. We find exponentiation lengths of both signs describing, in particular, a decay (on the average) of any initial anisotropy. The vanishing sum of these exponentiation lengths ensures the existence of an invariant which was overlooked in previous works. Next, the separation of a particle's trajectory from the magnetic field line to which it was initially attached is studied by a similar method. Here too an initial phase of exponential separation appears. Assuming the existence of a final diffusive phase, anomalous diffusion coefficients are found for both weakly and strongly collisional limits. The latter is identical to the well known Rechester-Rosenbluth coefficient, which is obtained here by a more quantitative (though not entirely deductive) treatment than in earlier works.
A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping
NASA Astrophysics Data System (ADS)
Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming
2016-11-01
Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by the observational data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
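The idea of expressing the transfer function as a sum of relatively displaced Gaussian responses can be sketched as follows. The component centers, widths, and weights are illustrative assumptions, not fitted values from the paper.

```python
import numpy as np

def transfer_function(tau, centers, widths, weights):
    """Transfer function as a sum of displaced, normalized Gaussians."""
    psi = np.zeros_like(tau)
    for c, w, a in zip(centers, widths, weights):
        psi += a * np.exp(-0.5 * ((tau - c) / w) ** 2) / (w * np.sqrt(2 * np.pi))
    return psi

# Hypothetical BLR response: two Gaussian components at 5 and 15 days.
tau = np.linspace(0.0, 40.0, 4001)
psi = transfer_function(tau, centers=[5.0, 15.0], widths=[1.5, 2.0],
                        weights=[0.7, 0.3])

# The line light curve is the continuum convolved with the transfer
# function (a toy sinusoidal continuum here):
continuum = 1.0 + 0.1 * np.sin(2 * np.pi * tau / 30.0)
dtau = tau[1] - tau[0]
line = np.convolve(continuum, psi, mode="full")[: len(tau)] * dtau
```

In the non-parametric fit the weights (and possibly widths) of many such components become free parameters, so the recovered shape is driven by the data rather than by an assumed functional form.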
NASA Astrophysics Data System (ADS)
Chunder, Anindarupa; Latypov, Azat; Chen, Yulu; Biafore, John J.; Levinson, Harry J.; Bailey, Todd
2017-03-01
Minimization and control of line-edge roughness (LER) and contact-edge roughness (CER) is one of the current challenges limiting EUV line-space and contact-hole printability. One significant contributor to feature roughness and CD variability in EUV is photon shot noise (PSN); others are the physical and chemical processes in photoresists, known as resist stochastic effects. Different approaches are available to mitigate each of these contributions. In order to facilitate this mitigation, it is important to assess the magnitude of each of these contributions separately from the others. In this paper, we present and test a computational approach based on the concept of an 'ideal resist'. An ideal resist is assumed to be devoid of all resist stochastic effects. Hence, such an ideal resist can only be simulated as an 'ideal resist model' (IRM), through explicit utilization of the Poisson statistics of PSN or direct Monte Carlo simulation of photon absorption in the resist. LER estimated using the IRM thus quantifies the exclusive contribution of PSN to LER. The simulation study done using the IRM indicates a large contribution (60%) from PSN to the total (final) LER for a sufficiently optimized, high-dose, state-of-the-art EUV chemically amplified resist (CAR) model.
NASA Astrophysics Data System (ADS)
Min, M.
2017-10-01
Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line lists for these molecules. For many species, the line lists available today contain up to several billion lines. Computing the spectral line profile created by pressure and temperature broadening, the Voigt profile, for all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focuses the computation time on the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (≈3.5 × 10⁵ lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
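The opacity-preserving sampling idea can be sketched as follows. This is a minimal illustration, not the paper's recipe: it exploits the fact that the Voigt profile is the convolution of a Gaussian and a Lorentzian, so a Voigt-distributed frequency is the sum of a Gaussian and a Cauchy deviate; all line parameters are invented.

```python
import numpy as np

def sample_line_opacity(nu_grid, nu0, strength, gamma_g, gamma_l,
                        n_samples, rng):
    """Deposit a line's integrated opacity onto a wavenumber grid by
    Monte Carlo sampling of the Voigt profile.

    Each sample carries an equal share of the line strength, so the
    integrated opacity is preserved (up to samples falling off the grid)
    regardless of how few samples a weak line receives.
    """
    nu = (nu0 + rng.normal(0.0, gamma_g, n_samples)
              + gamma_l * rng.standard_cauchy(n_samples))
    hist, _ = np.histogram(nu, bins=nu_grid)
    dnu = np.diff(nu_grid)
    # opacity per bin = (strength / n_samples) * counts / bin width
    return strength * hist / (n_samples * dnu)

rng = np.random.default_rng(1)
nu_grid = np.linspace(990.0, 1010.0, 401)  # cm^-1; hypothetical line at 1000
kappa = sample_line_opacity(nu_grid, 1000.0, strength=2.0,
                            gamma_g=0.1, gamma_l=0.05,
                            n_samples=20000, rng=rng)
```

Strong or isolated lines would receive many samples (sharp, accurate profiles); the millions of weak lines can each receive only a handful, yet their summed contribution still reproduces the correct continuum opacity.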
Structure of small-scale magnetic fields in the kinematic dynamo theory.
Schekochihin, Alexander; Cowley, Steven; Maron, Jason; Malyshkin, Leonid
2002-01-01
A weak fluctuating magnetic field embedded into a turbulent conducting medium grows exponentially while its characteristic scale decays. In the interstellar medium and protogalactic plasmas, the magnetic Prandtl number is very large, so a broad spectrum of growing magnetic fluctuations is excited at small (subviscous) scales. The condition for the onset of nonlinear back reaction depends on the structure of the field lines. We study the statistical correlations that are set up in the field pattern and show that the magnetic-field lines possess a folding structure, where most of the scale decrease is due to the field variation across itself (rapid transverse direction reversals), while the scale of the field variation along itself stays approximately constant. Specifically, we find that, though both the magnetic energy and the mean-square curvature of the field lines grow exponentially, the field strength and the field-line curvature are anticorrelated, i.e., the curved field is relatively weak, while the growing field is relatively flat. The detailed analysis of the statistics of the curvature shows that it possesses a stationary limiting distribution with the bulk located at values of curvature comparable to the characteristic wave number of the velocity field and a power tail extending to large values of curvature, where it is eventually cut off by the resistive regularization. The regions of large curvature, therefore, occupy only a small fraction of the total volume of the system. Our theoretical results are corroborated by direct numerical simulations. The implication of the folding effect is that the advent of the Lorentz back reaction occurs when the magnetic energy approaches that of the smallest turbulent eddies. Our results also directly apply to the problem of statistical geometry of the material lines in a random flow.
Explorations in Statistics: Correlation
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2010-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…
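The quantity being explored, Pearson's product-moment correlation coefficient, can be computed directly from its definition; the height/weight data below are invented for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation: covariance scaled by both standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical sample: heights (m) vs. weights (kg) of five subjects.
heights = [1.60, 1.65, 1.70, 1.75, 1.80]
weights = [55.0, 60.0, 66.0, 69.0, 76.0]
r = pearson_r(heights, weights)  # close to +1: strong straight-line relation
```

A value of r near ±1 indicates a strong straight-line relationship; values near 0 indicate none, which is exactly the "magnitude" the installment describes estimating.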
Effect of open rhinoplasty on the smile line.
Tabrizi, Reza; Mirmohamadsadeghi, Hoori; Daneshjoo, Danadokht; Zare, Samira
2012-05-01
Open rhinoplasty is an esthetic surgical technique that is becoming increasingly popular, and it can affect the nose and upper lip compartments. The aim of this study was to evaluate the effect of open rhinoplasty on tooth show and the smile line. The study participants were 61 patients with a mean age of 24.3 years (range, 17.2 to 39.6 years). The surgical procedure consisted of an esthetic open rhinoplasty without alar resection. Analysis of tooth show was limited to pre- and postoperative (at 12 months) tooth show measurements at rest and at maximum smile with a ruler (with participants holding their heads naturally). Statistical analyses were performed with SPSS 13.0, and paired-sample t tests were used to compare tooth show means before and after the operation. Analysis of the rest position showed no statistically significant change in tooth show (P = .15), but analysis of participants' maximum smile data showed a statistically significant increase in tooth show after surgery (P < .05). Furthermore, Pearson correlation analysis showed a positive relation between rhinoplasty and tooth show increases at maximum smile, especially in subjects with high smile lines. This study shows that the nasolabial compartment is a single unit and any change in 1 part may influence the other parts. Further studies should be conducted to investigate these interactions. Copyright © 2012 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ghasemi Nejhad, M. N.
1993-04-01
The on-line consolidation of thermoplastic composites is a relatively new technology that can be used to manufacture composite parts with complex geometries. The localized melting/solidification technique employed in this process can reduce the residual stresses and allow for improved dimensional stability and performance. An additional advantage of this technique is the elimination of the curing steps which are necessary in the processing of thermoset-matrix composites. This article presents the effects of processing parameters on processability in on-line consolidation of thermoplastic composites for tape-laying and filament-winding processes employing anisotropic thermal analyses. The results show that the heater size, preheating conditions, and tow thickness can significantly affect the processing window which, in turn, affects the production rate and the quality of the parts.
Optimization and Improvement of Test Processes on a Production Line
NASA Astrophysics Data System (ADS)
Sujová, Erika; Čierna, Helena
2018-06-01
The paper deals with increasing the efficiency of processes at a production line for engine cylinder heads in a company operating in the automotive industry. The goal is to achieve improvement and optimization of the test processes on the production line. It analyzes options for improving the capacity, availability and productivity of the output-test processes by using modern technology available on the market. We have focused on the analysis of operation times before and after optimization of the test processes at specific production sections. By analyzing the measured results we have determined the differences in time before and after improvement of the process. We have determined the overall equipment effectiveness (OEE) coefficient and, by comparing outputs, we have confirmed a real improvement of the output-test process for cylinder heads.
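The OEE coefficient is conventionally the product of three rates: availability, performance, and quality. A minimal sketch with invented figures for the output-test station:

```python
def oee(availability, performance, quality):
    """Overall equipment effectiveness as the product of three rates."""
    return availability * performance * quality

# Hypothetical figures for the output-test station before optimization:
before = oee(availability=0.90, performance=0.85, quality=0.98)
# ...and after reducing the test-cycle time (performance rate improves):
after = oee(availability=0.90, performance=0.95, quality=0.98)
improvement = after - before
```

Comparing OEE before and after an optimization, as done in the paper, isolates whether the gain came from less downtime, faster cycles, or fewer rejects.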
Parallel Electric Field on Auroral Magnetic Field Lines.
NASA Astrophysics Data System (ADS)
Yeh, Huey-Ching Betty
1982-03-01
The interaction of Birkeland (magnetic-field-aligned) current carriers and the Earth's magnetic field results in electrostatic potential drops along magnetic field lines. The statistical distributions of the field-aligned potential difference φ∥ were determined from the energy spectra of electron inverted-V events observed at ionospheric altitude for different conditions of geomagnetic activity as indicated by the AE index. Data on 1270 electron inverted-V's were obtained from Low-Energy Electron measurements of the Atmosphere Explorer-C and -D satellites (despun mode) in the interval January 1974-April 1976. In general, φ∥ is largest in the dusk to pre-midnight sector, smaller in the post-midnight to dawn sector, and smallest in the near-noon sector during quiet and disturbed geomagnetic conditions; there is a steady dusk-dawn-noon asymmetry of the global φ∥ distribution. As the geomagnetic activity level increases, the φ∥ pattern expands to lower invariant latitudes, and the magnitude of φ∥ in the 13-24 magnetic local time sector increases significantly. The spatial structure and intensity variation of the global φ∥ distribution are statistically more variable, and the magnitudes of φ∥ have smaller correlation with the AE index, in the post-midnight to dawn sector. A strong correlation is found to exist between upward Birkeland current systems and global parallel potential drops, and between auroral electron precipitation patterns and parallel potential drops, regarding their morphology, their intensity and their dependence on geomagnetic activity.
An analysis of the fine-scale simultaneous current-voltage relationship for upward Birkeland currents in Region 1 shows that typical field-aligned potential drops are consistent with model predictions based on linear acceleration of the charge carriers through an electrostatic potential drop along convergent magnetic field lines to maintain current continuity. In a steady state, this model of simple electrostatic acceleration without anomalous resistivity also predicts observable relations between global parallel currents and parallel potential drops and between global energy deposition and parallel potential drops. The temperature, density, and species of the unaccelerated charge carriers are the relevant parameters of the model. The dusk-dawn-noon asymmetry of the global φ∥ distribution can be explained by the above steady-state φ∥ process if we associate the source regions of upward Birkeland current carriers in Region 1, Region 2, and the cusp region with the plasma sheet boundary layer, the near-Earth plasma sheet, and the magnetosheath, respectively. The results of this study provide observational information on the global distribution of parallel potential drops and the prevailing process of generating and maintaining potential gradients (parallel electric fields) along auroral magnetic field lines.
Multivariable Time Series Prediction for the Icing Process on Overhead Power Transmission Line
Li, Peng; Zhao, Na; Zhou, Donghua; Cao, Min; Li, Jingjie; Shi, Xinling
2014-01-01
The design of monitoring and predictive alarm systems is necessary for successfully managing overhead power transmission line icing. Given the characteristics of complexity, nonlinearity, and fitfulness in the line icing process, a model based on a multivariable time series is presented here to predict the icing load of a transmission line. In this model, the time effects of micrometeorology parameters on the icing process have been analyzed. Phase-space reconstruction theory and a machine learning method were then applied to establish the prediction model, which fully utilizes the history of multivariable time series data in local monitoring systems to represent the mapping relationship between icing load and micrometeorology factors. Relevant to the characteristic of fitfulness in line icing, simulations were carried out within the same icing process and across different processes to test the model's prediction precision and robustness. According to the simulation results for the Tao-Luo-Xiong Transmission Line, this model demonstrates good prediction accuracy across different processes if the prediction length is less than two hours, and would be helpful for power grid departments when deciding to take action in advance to address potential icing disasters. PMID:25136653
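The phase-space reconstruction step can be sketched as a time-delay embedding. The embedding dimension, delay, and load values below are illustrative assumptions, not the paper's settings.

```python
def delay_embed(series, dim, tau):
    """Phase-space reconstruction by time-delay embedding.

    Returns dim-dimensional state vectors
    [x(t), x(t - tau), ..., x(t - (dim - 1) * tau)].
    """
    start = (dim - 1) * tau
    return [[series[t - k * tau] for k in range(dim)]
            for t in range(start, len(series))]

# Hypothetical icing-load series, one sample per monitoring interval:
load = [0.0, 0.1, 0.1, 0.3, 0.6, 0.9, 1.4, 1.9, 2.3, 2.8]
states = delay_embed(load, dim=3, tau=2)
# Each state couples the current load with its recent history; these
# vectors (optionally augmented with micrometeorology series) become the
# inputs of the machine-learning predictor.
```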
NASA Technical Reports Server (NTRS)
Scholtz, P.; Smyth, P.
1992-01-01
This article describes an investigation of a statistical hypothesis testing method for detecting changes in the characteristics of an observed time series. The work is motivated by the need for practical automated methods for on-line monitoring of Deep Space Network (DSN) equipment to detect failures and changes in behavior. In particular, on-line monitoring of the motor current in a DSN 34-m beam waveguide (BWG) antenna is used as an example. The algorithm is based on a measure of the information theoretic distance between two autoregressive models: one estimated with data from a dynamic reference window and one estimated with data from a sliding reference window. The Hinkley cumulative sum stopping rule is utilized to detect a change in the mean of this distance measure, corresponding to the detection of a change in the underlying process. The basic theory behind this two-model test is presented, and the problem of practical implementation is addressed, examining windowing methods, model estimation, and detection parameter assignment. Results from the five fault-transition simulations are presented to show the possible limitations of the detection method, and suggestions for future implementation are given.
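The Hinkley cumulative-sum stopping rule for detecting an increase in the mean of the distance measure can be sketched as follows; the assumed jump size, threshold, and data are illustrative, not DSN values.

```python
def hinkley_cusum(xs, mu0, delta, threshold):
    """Hinkley's cumulative-sum rule for an upward shift in the mean.

    Accumulates deviations from mu0 + delta/2 (delta is the assumed jump
    size); an alarm is raised when the cumulative sum rises more than
    `threshold` above its running minimum. Returns the index of the first
    alarm, or None if no change is detected.
    """
    s = 0.0
    s_min = 0.0
    for i, x in enumerate(xs):
        s += x - mu0 - delta / 2.0
        s_min = min(s_min, s)
        if s - s_min > threshold:
            return i
    return None

# Distance measure hovers near 0, then jumps to ~2 at index 50:
data = [0.0] * 50 + [2.0] * 20
alarm = hinkley_cusum(data, mu0=0.0, delta=1.0, threshold=3.0)
```

In the monitoring scheme described above, `xs` would be the information-theoretic distance between the two autoregressive models, so an alarm flags a change in the underlying process.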
TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information
Struck, Torsten H
2014-01-01
Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, in phylogenetic reconstructions one must be aware of artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals might eventually mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, there has been no program allowing the detection of such effects in combination with an implementation into automatic process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests), which utilize tree-based information like nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and being command-line driven, it can be integrated into automatic process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118
SU-E-CAMPUS-T-04: Statistical Process Control for Patient-Specific QA in Proton Beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
LAH, J; SHIN, D; Kim, G
Purpose: To evaluate and improve the reliability of the proton QA process and to provide an optimal customized tolerance level using statistical process control (SPC) methodology, with the aim of suggesting suitable guidelines for the patient-specific QA process. Methods: We investigated the constancy of the dose output and range to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis and used process capability indices to suggest suitable guidelines for patient-specific QA in proton beams. Patient QA plans were classified into 6 treatment sites: head and neck (41 cases), spinal cord (29 cases), lung (28 cases), liver (30 cases), pancreas (26 cases), and prostate (24 cases). Results: The deviations for the dose output and range of the daily QA process were ±0.84% and ±0.19%, respectively. Our results show that the patient-specific range measurements are capable at a specification limit of ±2% in all treatment sites except spinal cord cases. In spinal cord cases, comparison of process capability indices (Cp, Cpm, Cpk ≥ 1, but Cpmk ≤ 1) indicated that the process is capable but not centered; the process mean deviates from its target value. The UCL (upper control limit), CL (center line), and LCL (lower control limit) for spinal cord cases were 1.37%, −0.27%, and −1.89%, respectively. On the other hand, the range differences in prostate cases showed good agreement between calculated and measured values. The UCL, CL, and LCL for prostate cases were 0.57%, −0.11%, and −0.78%, respectively. Conclusion: SPC methodology has potential as a useful tool to customize optimal tolerance levels and to suggest suitable guidelines for patient-specific QA in clinical proton beams.
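The capability indices and control limits used above can be sketched as follows. The sample data and specification limits are invented, and only Cp, Cpk, and Cpm are shown (the Cpmk variant is omitted).

```python
import statistics

def capability_indices(data, lsl, usl, target=None):
    """Process capability indices Cp and Cpk (and Cpm if a target is given)."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6.0 * sigma)                    # spread vs. spec width
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)       # penalizes off-center mean
    out = {"Cp": cp, "Cpk": cpk}
    if target is not None:
        tau = (sigma**2 + (mu - target) ** 2) ** 0.5    # Taguchi dispersion
        out["Cpm"] = (usl - lsl) / (6.0 * tau)
    return out

def control_limits(data):
    """Individuals-chart style limits: center line +/- 3 sigma."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return mu - 3.0 * sigma, mu, mu + 3.0 * sigma

# Hypothetical range differences (%) for one treatment site, spec +/-2%:
diffs = [-0.3, 0.1, -0.5, 0.4, -0.2, 0.0, -0.4, 0.2, -0.1, -0.6]
indices = capability_indices(diffs, lsl=-2.0, usl=2.0, target=0.0)
lcl, cl, ucl = control_limits(diffs)
```

A site with Cp ≥ 1 but Cpk < 1 would reproduce the spinal cord finding: the spread fits the specification, but the mean is off target.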
Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V
2017-03-01
A mechanistic model-based soft sensor is developed and validated for 550L filamentous fungus fermentations operated at Novozymes A/S. The soft sensor is comprised of a parameter estimation block based on a stoichiometric balance, coupled to a dynamic process model. The on-line parameter estimation block models the changing rates of formation of product, biomass, and water, and the rate of consumption of feed, using standard, available on-line measurements. This parameter estimation block is coupled to a mechanistic process model, which solves the current states of biomass, product, substrate, dissolved oxygen and mass, as well as other process parameters including kLa, viscosity and the partial pressure of CO2. State estimation at this scale requires a robust mass model including evaporation, which is a factor not often considered at smaller scales of operation. The model is developed using a historical data set of 11 batches from the fermentation pilot plant (550L) at Novozymes A/S. The model is then implemented on-line in 550L fermentation processes operated at Novozymes A/S in order to validate the state estimator on 14 new batches utilizing a new strain. The product concentration in the validation batches was predicted with an average root mean sum of squared error (RMSSE) of 16.6%. In addition, calculation of the Janus coefficient for the validation batches shows a suitably calibrated model. The robustness of the model prediction is assessed with respect to the accuracy of the input data, and parameter estimation uncertainty is also quantified. The application of this on-line state estimator allows for on-line monitoring of pilot scale batches, including real-time estimates of multiple parameters which cannot otherwise be monitored on-line. With the successful application of a soft sensor at this scale, this allows for improved process monitoring, as well as opening up further possibilities for on-line control algorithms utilizing these on-line model outputs.
Biotechnol. Bioeng. 2017;114: 589-599. © 2016 Wiley Periodicals, Inc.
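The two validation statistics can be sketched as follows. The exact RMSSE normalization used in the paper is not stated here, so the percentage-of-mean form below is an assumption, as are all data values; the Janus coefficient is taken as the ratio of validation to calibration mean squared error.

```python
def rmsse_percent(measured, predicted):
    """Root mean squared error, as a percentage of the mean measurement
    (assumed normalization; the paper's exact definition may differ)."""
    n = len(measured)
    mse = sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n
    mean_m = sum(measured) / n
    return 100.0 * mse ** 0.5 / mean_m

def janus_coefficient(cal_sq_errors, val_sq_errors):
    """Janus coefficient J^2: validation MSE over calibration MSE.

    Values near 1 indicate the model predicts new batches about as well
    as the batches it was calibrated on.
    """
    mse_val = sum(val_sq_errors) / len(val_sq_errors)
    mse_cal = sum(cal_sq_errors) / len(cal_sq_errors)
    return mse_val / mse_cal

# Hypothetical end-of-batch product concentrations (arbitrary units):
measured = [10.0, 12.0, 9.0, 11.0]
predicted = [9.0, 13.0, 8.5, 10.5]
err = rmsse_percent(measured, predicted)
```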
Capture cross sections on unstable nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonchev, A. P.; Escher, J. E.; Scielzo, N.
2017-09-13
Accurate neutron-capture cross sections on unstable nuclei near the line of beta stability are crucial for understanding s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure due to the fact that the measurements require both highly radioactive samples and intense neutron sources. Essential ingredients for describing the γ decays following neutron capture are the γ-ray strength function and level densities. We will compare different indirect approaches for obtaining the most relevant observables that can constrain Hauser-Feshbach statistical-model calculations of capture cross sections. Specifically, we will consider photon scattering using monoenergetic and 100% linearly polarized photon beams. Here, challenges that exist on the path to obtaining neutron-capture cross sections for reactions on isotopes near and far from stability will be discussed.
Akshatha, B K; Karuppiah, Karpagaselvi; Manjunath, G S; Kumarswamy, Jayalakshmi; Papaiah, Lokesh; Rao, Jyothi
2017-01-01
Introduction: The three common odontogenic cysts are radicular cysts (RCs), dentigerous cysts (DCs), and odontogenic keratocysts (OKCs). Among these 3 cysts, OKC has recently been reclassified as a benign keratocystic odontogenic tumor owing to its aggressive behavior, recurrence rate, and malignant potential. The present study involved qualitative and quantitative analysis of inducible nitric oxide synthase (iNOS) expression in the epithelial lining of RCs, DCs, and OKCs, compared iNOS expression across the epithelial linings of all 3 cysts, and assessed whether overexpression of iNOS in OKCs might contribute to their aggressive behavior and malignant potential. Aims: The present study investigates the role of iNOS in the pathogenesis of OKCs, DCs, and RCs by evaluating iNOS expression in the epithelial lining of these cysts. Subjects and Methods: iNOS expression in the epithelial lining cells of 20 RCs, 20 DCs, and 20 OKCs was analyzed using immunohistochemistry. Statistical Analysis Used: The percentage of positive cells and the intensity of staining were assessed and compared among all 3 cysts using the contingency coefficient. Kappa statistics were computed to assess interobserver agreement between the two observers. Results: The percentage of iNOS-positive cells was found to be remarkably high in OKCs (12/20, 57.1%) as compared to RCs (6/20, 28.6%) and DCs (3/20, 14.3%). Interobserver agreement on the percentage of iNOS-positive cells was statistically significant for OKCs (P < 0.001) and RCs (P < 0.001), with no significant agreement for DCs. No statistical difference exists among the 3 study samples with regard to the intensity of staining with iNOS. Conclusions: Increased iNOS expression in OKCs may contribute to bone resorption and accumulation of wild-type p53, hence making OKCs more aggressive. PMID:29391711
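The interobserver agreement reported above is Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch for two raters over nominal labels (the label values below are illustrative, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters scoring the same items.
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: fraction of items with identical labels.
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal label frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_exp = sum(c1[label] * c2[label] for label in set(c1) | set(c2)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1 for perfect agreement and 0 when agreement is at chance level; `scipy`/`scikit-learn` users can reach the same quantity via `sklearn.metrics.cohen_kappa_score`.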
Assessment of Cell Line Models of Primary Human Cells by Raman Spectral Phenotyping
Swain, Robin J.; Kemp, Sarah J.; Goldstraw, Peter; Tetley, Teresa D.; Stevens, Molly M.
2010-01-01
Abstract Researchers have previously questioned the suitability of cell lines as models for primary cells. In this study, we used Raman microspectroscopy to characterize live A549 cells from a unique molecular biochemical perspective to shed light on their suitability as a model for primary human pulmonary alveolar type II (ATII) cells. We also investigated a recently developed transduced type I (TT1) cell line as a model for alveolar type I (ATI) cells. Single-cell Raman spectra provide unique biomolecular fingerprints that can be used to characterize cellular phenotypes. A multivariate statistical analysis of Raman spectra indicated that the spectra of A549 and TT1 cells are characterized by significantly lower phospholipid content compared to ATII and ATI spectra because their cytoplasm contains fewer surfactant lamellar bodies. Furthermore, we found that A549 spectra are statistically more similar to ATI spectra than to ATII spectra. The spectral variation permitted phenotypic classification of cells based on Raman spectral signatures with >99% accuracy. These results suggest that A549 cells are not a good model for ATII cells, but TT1 cells do provide a reasonable model for ATI cells. The findings have far-reaching implications for the assessment of cell lines as suitable primary cellular models in live cultures. PMID:20409492
The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability
NASA Astrophysics Data System (ADS)
Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing
2018-01-01
Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analyzing the effect of lightning on power grid reliability. As a result, lightning protection design tends to be excessive for ordinary transmission lines yet insufficient for key lines. To address this problem, a method for analyzing lightning strikes on transmission lines from the perspective of power grid reliability is presented. Full-wave process theory is used to analyze lightning back-flashover, and the leader propagation model is used to describe the shielding failure of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.
Moving line model and avalanche statistics of Bingham fluid flow in porous media.
Chevalier, Thibaud; Talon, Laurent
2015-07-01
In this article, we propose a simple model to understand the critical behavior of path opening during flow of a yield stress fluid in porous media, as numerically observed by Chevalier and Talon (2015). This model can be mapped to the problem of a contact line moving in a heterogeneous field. Close to the critical point, this line exhibits avalanche dynamics, where the front advances through a succession of waiting times and large burst events. These burst events are then related to the non-flowing (i.e., unyielded) areas. Remarkably, the statistics of these areas reproduce the same properties as in the direct numerical simulations. Furthermore, even if our exponents seem to be close to the mean-field universal exponents, we report an unusual bump in the distribution which depends on the disorder. Finally, we identify a scaling invariance of the cluster spatial shape that is well fit, to first order, by a self-affine parabola.
Adaptive variable-length coding for efficient compression of spacecraft television data.
NASA Technical Reports Server (NTRS)
Rice, R. F.; Plaunt, J. R.
1971-01-01
An adaptive variable-length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample-to-sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can simply be achieved by utilizing previous-line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
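The per-block code selection described above is the essence of what later became known as Rice coding. A minimal illustrative sketch, assuming nonnegative integer prediction residuals and a family of Golomb-Rice codes (the actual Basic Compressor's concatenated codes and 21-pixel blocks are simplified here):

```python
def rice_encode(value, k):
    """Golomb-Rice code: unary-coded quotient (value >> k), a '0'
    terminator, then the k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    remainder = format(r, '0{}b'.format(k)) if k else ''
    return '1' * q + '0' + remainder

def encode_block(residuals, ks=(0, 1, 2, 3)):
    """Per block, pick the parameter k that minimizes the total encoded
    length -- the spirit of selecting one of several codes per block."""
    best = min(ks, key=lambda k: sum(len(rice_encode(v, k)) for v in residuals))
    return best, ''.join(rice_encode(v, best) for v in residuals)
```

Small residuals favor small k (short remainders), while larger residuals favor larger k (shorter unary parts); switching k per block is what lets the coder track rapid changes in source statistics.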
VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data
Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel
2014-01-01
This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198
NASA Technical Reports Server (NTRS)
La Dous, Constanze
1991-01-01
IUE observations of dwarf novae at maximum and at quiescence, and of novalike objects in the high-brightness state, are analyzed for effects of the inclination angle on the emitted continuum and line radiation. A clear pattern in the continuum flux distribution is exhibited only by dwarf novae at maximum, where some 80 percent of the non-double-eclipsing systems show essentially identical distributions. This result is not in disagreement with theoretical expectations. All classes of objects exhibit a clear, but in each case different, dependence of the line radiation on the inclination angle.
NASA Astrophysics Data System (ADS)
Li, Xuebao; Cui, Xiang; Lu, Tiebing; Ma, Wenzuo; Bian, Xingming; Wang, Donglai; Hiziroglu, Huseyin
2016-03-01
Corona-generated audible noise (AN) has become one of the decisive factors in the design of high-voltage direct current (HVDC) transmission lines. The AN from transmission lines can be attributed to sound pressure pulses generated by the multiple corona sources formed on the conductors. In this paper, the detailed time-domain characteristics of the sound pressure pulses generated by DC corona discharges formed over the surface of a stranded conductor are investigated systematically in a laboratory setting using a corona cage structure. The amplitudes of the sound pressure pulses and their time intervals are extracted by observing a direct correlation between corona current pulses and corona-generated sound pressure pulses. Based on these statistical characteristics, a stochastic model is presented for simulating the sound pressure pulses due to DC corona discharges occurring on conductors. The proposed stochastic model is validated by comparing the calculated and measured A-weighted sound pressure level (SPL). The model is then used to analyze the influence of the pulse amplitudes and pulse rate on the SPL. Furthermore, a mathematical relationship is found between the SPL and the conductor diameter, electric field, and radial distance.
49 CFR Schedule G to Subpart B of... - Selected Statistical Data
Code of Federal Regulations, 2011 CFR
2011-10-01
... 49 Transportation 8 2011-10-01 2011-10-01 false Selected Statistical Data G Schedule G to Subpart... Statistical Data [Dollars in thousands] () Greyhound Lines, Inc. () Trailways combined () All study carriers... purpose of Schedule G is to develop selected property, labor and operational data for use in evaluating...
John F. Caratti
2006-01-01
The FIREMON Line Intercept (LI) method is used to assess changes in plant species cover for a macroplot. This method uses multiple line transects to sample within plot variation and quantify statistically valid changes in plant species cover and height over time. This method is suited for most forest and rangeland communities, but is especially useful for sampling...
NASA Astrophysics Data System (ADS)
Bennadji, A.
2013-07-01
Construction methods in the North East of Scotland are similar to the most popular building typologies in the UK. This typology can vary in terms of external material (granite, brick, or stone) but typically has a secondary timber sub-frame with a lining on its interior. Insulation was seldom a consideration when such buildings were completed. Statistics show that 80% of existing buildings in the UK will need to be upgraded. The lack of knowledge in dealing with the manipulation of old building fabric has a negative impact on buildings' integrity. Documenting this process is an important step that building actors should undertake to communicate practical knowledge that is still at the incubation stage. We wanted this documentation to be visual, as written descriptions might mislead both non-specialists and specialists in the field, given the innovative approach with which our method was conducted. In the Scottish context, this research/experiment concentrates on existing granite-walled buildings with plastered lath internal walls. It is unfortunate to see the commonly beautiful interiors of Scottish buildings disappear when the internal linings are removed: skips are filled with old plaster and lath, new linings have to be supplied and fitted, and excessive waste is created in the change. This paper is based on a historic building energy improvement case study financed by the European Commission and the Scottish Government. The pilot study consists of insulating an 18th-century house using an innovative product and method. The project was a response to a call by the CIC Start (Construction Innovation Club), which aims to establish links between SMEs and universities, and was carried out in collaboration with Icynene Canada and KDL Kishorn (see the full list in the acknowledgments). This paper describes the process through which the team improved the building envelope without damaging the building's original features (Loveday et al.).
The energy efficiency improvement consists of improving the walls' U-value by introducing an insulation material, Icynene (Sadineni, France & Boehm 2011), into the wall cavity. The U-value was improved by 50%, no redecoration was needed after the operation, and there was no disturbance to the building's occupants.
Modulations of the processing of line discontinuities under selective attention conditions?
Giersch, Anne; Fahle, Manfred
2002-01-01
We examined whether the processing of discontinuities involved in figure-ground segmentation, like line ends, can be modulated under selective attention conditions. Subjects decided whether a gap in collinear or parallel lines was located to the right or left. Two stimuli were displayed in immediate succession. When the gaps were on the same side, reaction times (RTs) for the second stimulus increased when collinear lines followed parallel lines, or the reverse, but only when the two stimuli shared the same orientation and location. The effect did not depend on the global form of the stimuli or on the relative orientation of the gaps. A frame drawn around collinear elements affected the results, suggesting a crucial role of the "amodal" orthogonal lines produced when line ends are aligned. Including several gaps in the first stimulus also eliminated RT variations. By contrast, RT variations remained stable across several experimental blocks and were significant for interstimulus intervals from 50 to 600 msec between the two stimuli. These results are interpreted in terms of a modulation of the processing of line ends or the production of amodal lines, arising when attention is selectively drawn to a gap.
Intuitive Face Judgments Rely on Holistic Eye Movement Pattern
Mega, Laura F.; Volz, Kirsten G.
2017-01-01
Non-verbal signals such as facial expressions are of paramount importance for social encounters. Their perception predominantly occurs without conscious awareness and is effortlessly integrated into social interactions. In other words, face perception is intuitive. Contrary to classical intuition tasks, this work investigates intuitive processes in the realm of every-day type social judgments. Two differently instructed groups of participants judged the authenticity of emotional facial expressions, while their eye movements were recorded: an ‘intuitive group,’ instructed to rely on their “gut feeling” for the authenticity judgments, and a ‘deliberative group,’ instructed to make their judgments after careful analysis of the face. Pixel-wise statistical maps of the resulting eye movements revealed a differential viewing pattern, wherein the intuitive judgments relied on fewer, longer and more centrally located fixations. These markers have been associated with a global/holistic viewing strategy. The holistic pattern of intuitive face judgments is in line with evidence showing that intuition is related to processing the “gestalt” of an object, rather than focusing on details. Our work thereby provides further evidence that intuitive processes are characterized by holistic perception, in an understudied and real world domain of intuition research. PMID:28676773
Ostadan, Fatemeh; Centeno, Carla; Daloze, Jean-Felix; Frenn, Mira; Lundbye-Jensen, Jesper; Roig, Marc
2016-12-01
A single bout of cardiovascular exercise performed immediately after practicing a motor task improves the long-term retention of the skill through an optimization of memory consolidation. However, the specific brain mechanisms underlying the effects of acute cardiovascular exercise on procedural memory are poorly understood. We sought to determine whether a single bout of exercise modifies corticospinal excitability (CSE) during the early stages of memory consolidation. In addition, we investigated whether changes in CSE are associated with exercise-induced off-line gains in procedural memory. Participants practiced a serial reaction time task followed by either a short bout of acute exercise or a similar rest period. To monitor changes in CSE we used transcranial magnetic stimulation applied to the primary motor cortex (M1) at baseline and at 15, 35, 65, and 125 min after exercise or rest. Participants in the exercise condition showed larger (∼24%) improvements in procedural memory through consolidation, although differences between groups did not reach statistical significance. Exercise promoted an increase in CSE, which remained elevated 2 h after exercise. More importantly, global increases in CSE following exercise correlated with the magnitude of off-line gains in skill level assessed in a retention test performed 8 h after motor practice. A single bout of exercise modulates short-term neuroplasticity mechanisms subserving consolidation processes that predict off-line gains in procedural memory. Copyright © 2016 Elsevier Inc. All rights reserved.
Novel Phenotype Issues Raised in Cross-National Epidemiological Research on Drug Dependence
Anthony, James C.
2010-01-01
Stage-transition models based on the American Diagnostic and Statistical Manual (DSM) generally are applied in epidemiology and genetics research on drug dependence syndromes associated with cannabis, cocaine, and other internationally regulated drugs (IRD). Difficulties with DSM stage-transition models have surfaced during cross-national research intended to provide a truly global perspective, such as the work of the World Mental Health Surveys (WMHS) Consortium. Alternative, simpler dependence-related phenotypes are possible, including population-level count process models for early steps, before the coalescence of clinical features into a coherent syndrome (e.g., zero-inflated Poisson regression). Selected findings are reviewed, based on ZIP modeling of alcohol, tobacco, and IRD count processes, with an illustration that may stimulate new research on genetic susceptibility traits. The annual National Surveys on Drug Use and Health can be readily modified for this purpose, along the lines of a truly anonymous research approach that can help make NSDUH-type cross-national epidemiological surveys more useful in the context of subsequent genome-wide association (GWAS) research and post-GWAS investigations with a truly global health perspective. PMID:20201862
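The zero-inflated Poisson (ZIP) model mentioned above mixes a point mass at zero (people with no exposure opportunity or no use at all, the "structural zeros") with an ordinary Poisson count. A minimal sketch of the probability mass function, with illustrative parameter values:

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: with probability pi the count is a
    structural zero; otherwise the count is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson
```

In a full ZIP regression, both `pi` and `lam` are linked to covariates; in Python, `statsmodels` provides a `ZeroInflatedPoisson` estimator for fitting such models to survey count data.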
2000-08-01
luminance performance and aviation, many aviators develop ametropias refractive error having comparable effects on during their careers. We were... statistically (0.04 logMAR, the non-aviator group. Separate investigators at p=0.01), but not clinically significant (<1/2 line different research facilities... statistically significant (0.11 ± 0.1 logCS, t=4.0, sensitivity on the SLCT decreased for the aviator p<0.001), yet there is significant overlap group at a
ERIC Educational Resources Information Center
Love, Tracy; Walenski, Matthew; Swinney, David
2009-01-01
The central question underlying this study revolves around how children process co-reference relationships--such as those evidenced by pronouns ("him") and reflexives ("himself")--and how a slowed rate of speech input may critically affect this process. Previous studies of child language processing have demonstrated that typical language…
The relationship between forward head posture and temporomandibular disorders.
Lee, W Y; Okeson, J P; Lindroth, J
1995-01-01
This study investigated the relationship between forward head posture and temporomandibular disorder symptoms. Thirty-three temporomandibular disorder patients with predominant complaints of masticatory muscle pain were compared with an age- and gender-matched control group. Head position was measured from photographs taken with a plumb line drawn from the ceiling to the lateral malleolus of the ankle and with a horizontal plane that was perpendicular to the plumb line and that passed through the spinous process of the seventh cervical vertebra. The distances from the plumb line to the ear, to the seventh vertebra, and to the shoulder were measured. Two angles were also measured: (1) ear-seventh cervical vertebra-horizontal plane and (2) eye-ear-seventh cervical vertebra. The only measurement that revealed a statistically significant difference was angle ear-seventh cervical vertebra-horizontal plane. This angle was smaller in the patients with temporomandibular disorders than in the control subjects. In other words, when evaluating the ear position with respect to the seventh cervical vertebra, the head was positioned more forward in the group with temporomandibular disorders than in the control group (P < .05).
DOE Office of Scientific and Technical Information (OSTI.GOV)
House, L.L.; Querfeld, C.W.; Rees, D.E.
1982-04-15
Coronal magnetic fields influence the intensity and linear polarization of light scattered by coronal Fe XIV ions. Interpreting polarization measurements of the Fe XIV 5303 Å coronal emission requires a detailed understanding of the dependence of the emitted Stokes vector on coronal magnetic field direction, electron density, and temperature, and on the height of origin. The required dependence is included in the solutions of statistical equilibrium for the ion, which are solved explicitly for 34 magnetic sublevels in both the ground and four excited terms. The full solutions are reduced to equivalent simple analytic forms which clearly show the required dependence on coronal conditions. The analytic forms of the reduced solutions are suitable for routine analysis of 5303 Å green line polarimetric data obtained at Pic du Midi and from the Solar Maximum Mission Coronagraph/Polarimeter.
The effects of organization on medical utilization: an analysis of service line organization.
Byrne, Margaret M; Charns, Martin P; Parker, Victoria A; Meterko, Mark M; Wray, Nelda P
2004-01-01
To determine whether clinical service lines in primary care and mental health reduce inpatient and urgent care utilization. All VHA medical centers were surveyed to determine whether service lines had been established in primary care or mental health care prior to the beginning of fiscal year 1997 (FY97). Facility-level data on medical utilization from Veterans Health Affairs (VHA) administrative databases were used for descriptive and multivariate regression analyses of utilization and of changes in measures between FY97 and FY98. Nine primary care-related and five mental health-related variables were analyzed. Primary care and mental health service lines had been established in approximately half of all facilities. Service lines varied in duration and extent of restructuring. The mere presence of a service line had no positive effects, and several negative effects, on the measured outcome variables. More detailed analyses showed that some types of service lines have statistically significant and mostly negative effects on both mental health and primary care-related measures. Newly implemented service lines showed significantly less improvement in measures over time than facilities with no service line. Health care organizations are implementing innovative organizational structures in hopes of improving quality of care and reducing resource utilization. We found that service lines in primary care and mental health may lead to an initial period of disruption, with little evidence of a beneficial effect on performance for longer-duration service lines.
Kail, Robert V
2013-11-01
According to dual-process models that include analytic and heuristic modes of processing, analytic processing is often expected to become more common with development. Consistent with this view, on reasoning problems, adolescents are more likely than children to select alternatives that are backed by statistical evidence. It is shown here that this pattern depends on the quality of the statistical evidence and the quality of the testimonial that is the typical alternative to statistical evidence. In Experiment 1, 9- and 13-year-olds (N=64) were presented with scenarios in which solid statistical evidence was contrasted with casual or expert testimonial evidence. When testimony was casual, children relied on it but adolescents did not; when testimony was expert, both children and adolescents relied on it. In Experiment 2, 9- and 13-year-olds (N=83) were presented with scenarios in which casual testimonial evidence was contrasted with weak or strong statistical evidence. When statistical evidence was weak, children and adolescents relied on both testimonial and statistical evidence; when statistical evidence was strong, most children and adolescents relied on it. Results are discussed in terms of their implications for dual-process accounts of cognitive development. Copyright © 2013 Elsevier Inc. All rights reserved.
Characterization of palmprints by wavelet signatures via directional context modeling.
Zhang, Lei; Zhang, David
2004-06-01
The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification.
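The statistical signatures named in the abstract above (gravity center, density, energy) are computed over the predominant coefficients of each wavelet subband. A minimal sketch under stated assumptions: the thresholding rule and the exact signature definitions below are illustrative simplifications, not the paper's directional-context formulation:

```python
import numpy as np

def subband_signatures(coeffs, threshold):
    """Illustrative signatures of one wavelet subband: keep coefficients
    whose magnitude exceeds a threshold, then summarize them."""
    c = np.abs(np.asarray(coeffs, dtype=float))
    mask = c > threshold
    ys, xs = np.nonzero(mask)
    selected = c[mask]
    density = float(mask.mean())            # fraction of predominant coeffs
    energy = float(np.sum(selected ** 2))   # energy of the kept coefficients
    gravity = ((float(xs.mean()), float(ys.mean()))
               if selected.size else (float('nan'), float('nan')))
    return {'gravity_center': gravity, 'density': density, 'energy': energy}
```

Concatenating such signature vectors across subbands yields a compact palmprint descriptor, which is what lets the scheme use far fewer data than line-segment or interest-point matching.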
Order-disorder phenomena in the low-temperature phase of BaTiO3
NASA Astrophysics Data System (ADS)
Völkel, G.; Müller, K. A.
2007-09-01
X- and Q-band electron paramagnetic resonance measurements are reported on Mn4+-doped BaTiO3 single crystals in the rhombohedral low-temperature phase. The Mn4+ probe ion statistically substitutes for the isovalent Ti4+ ion. The critical line broadening observed when approaching the phase transition to the orthorhombic phase demonstrates the presence of order-disorder processes within the off-center Ti subsystem and the formation of dynamic precursor clusters with a structure compatible with that of the orthorhombic phase. From the data it is concluded that BaTiO3 shows a special type of phase transition where displacive and order-disorder character are present not only at the cubic-tetragonal transition, but also at the orthorhombic-rhombohedral transition at low temperatures. The disappearance of the Mn4+ spectrum in the orthorhombic, tetragonal, and cubic phases can be interpreted as a consequence of the strong line broadening caused by changes of the instantaneous off-center positions in time around the averaged off-center position along a body diagonal.
Miles, P C
1999-03-20
An optical diagnostic system based on line imaging of Raman-scattered light has been developed to study the mixing processes in internal combustion engines. The system permits multipoint, single laser-shot measurements of CO(2), O(2), N(2), C(3)H(8), and H(2)O mole fractions with submillimeter spatial resolution. Selection of appropriate system hardware is discussed, as are subsequent data reduction and analysis procedures. Results are reported for data obtained at multiple crank angles and in two different engine flow fields. Measurements are made at 12 locations simultaneously, each location having measurement volume dimensions of 0.5 mm x 0.5 mm x 0.9 mm. The data are analyzed to obtain statistics of species mole fractions: mean, rms, histograms, and both spatial and cross-species covariance functions. The covariance functions are used to quantify the accuracy of the measured rms mole fraction fluctuations, to determine the integral length scales of the mixture inhomogeneities, and to quantify the cycle-to-cycle fluctuations in bulk mixture composition under well-mixed conditions.
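The spatial covariance and correlation statistics described above can be estimated from repeated single-shot, multi-point profiles. A minimal sketch, assuming the data are arranged as one row per laser shot and one column per measurement location (the array layout is an assumption, not the paper's):

```python
import numpy as np

def spatial_covariance(profiles):
    """Covariance of mole-fraction fluctuations between measurement
    points. profiles: array of shape (n_shots, n_points)."""
    fluct = profiles - profiles.mean(axis=0)   # remove the per-point mean
    return fluct.T @ fluct / (profiles.shape[0] - 1)

def spatial_correlation(profiles):
    """Normalized spatial correlation matrix; integrating the correlation
    over point separation gives an estimate of the integral length scale
    of the mixture inhomogeneities."""
    cov = spatial_covariance(profiles)
    std = np.sqrt(np.diag(cov))
    return cov / np.outer(std, std)
```

The same fluctuation arrays also yield the rms mole-fraction statistics and, computed across species rather than across points, the cross-species covariances used to quantify measurement accuracy.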
Visual feature-tolerance in the reading network.
Rauschecker, Andreas M; Bowen, Reno F; Perry, Lee M; Kevan, Alison M; Dougherty, Robert F; Wandell, Brian A
2011-09-08
A century of neurology and neuroscience shows that seeing words depends on ventral occipital-temporal (VOT) circuitry. Typically, reading is learned using high-contrast line-contour words. We explored whether a specific VOT region, the visual word form area (VWFA), learns to see only these words or recognizes words independent of the specific shape-defining visual features. Word forms were created using atypical features (motion-dots, luminance-dots) whose statistical properties control word-visibility. We measured fMRI responses as word form visibility varied, and we used TMS to interfere with neural processing in specific cortical circuits, while subjects performed a lexical decision task. For all features, VWFA responses increased with word-visibility and correlated with performance. TMS applied to motion-specialized area hMT+ disrupted reading performance for motion-dots, but not line-contours or luminance-dots. A quantitative model describes feature-convergence in the VWFA and relates VWFA responses to behavioral performance. These findings suggest how visual feature-tolerance in the reading network arises through signal convergence from feature-specialized cortical areas. Copyright © 2011 Elsevier Inc. All rights reserved.
Cytotoxicity induced by cypermethrin in Human Neuroblastoma Cell Line SH-SY5Y.
Raszewski, Grzegorz; Lemieszek, Marta Kinga; Łukawski, Krzysztof
2016-01-01
The purpose of this study was to evaluate the cytotoxic potential of cypermethrin (CM) on cultured human neuroblastoma SH-SY5Y cells. SH-SY5Y cells were treated with CM at 0-200 µM for 24, 48, and 72 h in vitro. It was found that CM induced the death of neuroblastoma cells in a dose- and time-dependent manner, as shown by LDH assays. Next, some aspects of the process of cell death triggered by CM in the human SH-SY5Y cell line were investigated. It was revealed that the pan-caspase inhibitor Q-VD-OPh sensitizes SH-SY5Y cells to necroptosis caused by CM. Furthermore, the signal transduction inhibitors PD98059, SL-327, SB202190, and SP600125 failed to attenuate the effect of the pesticide. Finally, it was shown that inhibition of TNF-α by pomalidomide (PLD) caused a statistically significant reduction in CM-induced cytotoxicity. Overall, the data obtained suggest that CM induces neurotoxicity in SH-SY5Y cells by necroptosis.
Tannamala, Pavan Kumar; Pulagam, Mahesh; Pottem, Srinivas R; Swapna, B
2012-04-01
The purpose of this study was to compare the sagittal condylar angles set in the Hanau articulator by use of a method of obtaining an intraoral protrusive record to those angles found using a panoramic radiographic image. Ten patients, free of signs and symptoms of temporomandibular disorder and with intact dentition, were selected. The dental stone casts of the subjects were mounted on a Hanau articulator with a springbow and poly(vinyl siloxane) interocclusal records. For all patients, the protrusive records were obtained when the mandible moved forward by approximately 6 mm. All procedures for recording, mounting, and setting were done in the same session. The condylar guidance angles obtained were tabulated. A panoramic radiographic image of each patient was made with the Frankfurt horizontal plane parallel to the floor. Tracings of the radiographic images were made. The horizontal reference line was marked by joining the orbitale and porion. The most superior and most inferior points of the curvatures were identified. These two points were connected by a straight line representing the mean curvature line. Angles made by the intersection of the mean curvature line and the horizontal reference line were measured. The results were subjected to statistical analysis with a significance level of p < 0.05. The radiographic values were on average 4° greater than the values obtained by the protrusive interocclusal record method. The difference in mean condylar guidance angle between the right and left sides was not statistically significant for either method. Comparisons of mean condylar guidance angles between the two methods on the right side and on the left side (p = 0.071 and p = 0.057, respectively) were likewise not statistically significant.
Within the limitations of this study, it was concluded that the protrusive condylar guidance angles obtained by panoramic radiograph may be used in programming semi-adjustable articulators. © 2012 by the American College of Prosthodontists.
Girgis, Erian H; Mahoney, John P; Khalil, Rafaat H; Soliman, Magdi R
2010-07-01
Studies conducted in our lab have indicated that thalidomide cytotoxicity in the KG-1a human acute myelogenous leukemia (AML) cell line was enhanced by combining it with arsenic trioxide. The current investigation was conducted in order to evaluate the effect of thalidomide, either alone or in combination with arsenic trioxide, on the release of tumor necrosis factor-α (TNF-α) and vascular endothelial growth factor (VEGF) from this cell line, in an attempt to clarify its possible cytotoxic mechanism(s). The human AML cell line KG-1a was used in this study. The cells were cultured for 48 h in the presence or absence of thalidomide (5 mg/l) and/or arsenic trioxide (4 μM). The levels of TNF-α and VEGF in the supernatant were determined by ELISA. The results obtained indicate that the levels of TNF-α in the supernatant of KG-1a cell cultures incubated with thalidomide, arsenic trioxide, or the combination were statistically lower than those observed in the supernatant of control cells (2.89, 5.07, 4.15 and 16.88 pg/ml, respectively). However, the levels of VEGF in the supernatant of thalidomide-treated cells were statistically higher than those in the supernatant of control cells (69.61 vs. 11.48 pg/l). Arsenic trioxide, whether alone or in combination with thalidomide, did not produce any statistically significant difference in the levels of VEGF as compared to the control or thalidomide-treated cell supernatant. These findings indicate that the inhibition of TNF-α production in KG-1a cells by thalidomide and arsenic trioxide may play an important role in their cytotoxic effect.
Online Statistical Modeling (Regression Analysis) for Independent Responses
NASA Astrophysics Data System (ADS)
Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus
2017-06-01
Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Statistical models have now been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical modelling techniques is available mostly in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We have previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modeling easier to apply and makes it easier to compare models in order to find the most appropriate one for the data.
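As a minimal sketch of the simplest model family the interface exposes, here is a plain linear model fitted by ordinary least squares. This is illustrative Python, not the R/Shiny code the abstract describes, and the synthetic data (true intercept 2, slope 3) are an assumption for demonstration.

```python
import numpy as np

# Hypothetical data: response y depends linearly on one explanatory variable x.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 200)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, 200)

def fit_lm(X, y):
    """Ordinary least squares: prepend an intercept column, solve least squares."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta  # [intercept, slope, ...]

beta = fit_lm(x.reshape(-1, 1), y)
```

With 200 observations and modest noise, the fitted coefficients land close to the true intercept and slope; GLM, GAM, and GAMLSS generalize this same response-predictor structure.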
Model-Based PAT for Quality Management in Pharmaceuticals Freeze-Drying: State of the Art
Fissore, Davide
2017-01-01
Model-based process analytical technologies can be used for the in-line control and optimization of a pharmaceuticals freeze-drying process, as well as for the off-line design of the process, i.e., the identification of the optimal operating conditions. This paper aims at presenting the state of the art in this field, focusing, particularly, on three groups of systems, namely, those based on the temperature measurement (i.e., the soft sensor), on the chamber pressure measurement (i.e., the systems based on the test of pressure rise and of pressure decrease), and on the sublimation flux estimate (i.e., the tunable diode laser absorption spectroscopy and the valveless monitoring system). The application of these systems for in-line process optimization (e.g., using a model predictive control algorithm) and to get a true quality by design (e.g., through the off-line calculation of the design space of the process) is presented and discussed. PMID:28224123
Isotropic Inelastic Collisions in a Multiterm Atom with Hyperfine Structure
NASA Astrophysics Data System (ADS)
Belluzzi, Luca; Landi Degl'Innocenti, Egidio; Trujillo Bueno, Javier
2015-10-01
A correct modeling of the scattering polarization profiles observed in some spectral lines of diagnostic interest, the sodium doublet being one of the most important examples, requires taking hyperfine structure (HFS) and quantum interference between different J-levels into account. An atomic model suitable for taking these physical ingredients into account is the so-called multiterm atom with HFS. In this work, we introduce and study the transfer and relaxation rates due to isotropic inelastic collisions with electrons, which enter the statistical equilibrium equations (SEE) for the atomic density matrix of this atomic model. Under the hypothesis that the electron-atom interaction is described by a dipolar operator, we provide useful relations between the rates describing the transfer and relaxation of quantum interference between different levels (whose numerical values are in most cases unknown) and the usual rates for the atomic level populations, for which experimental data and/or approximate theoretical expressions are generally available. For the particular case of a two-term atom with HFS, we present an analytical solution of the SEE for the spherical statistical tensors of the upper term, including both radiative and collisional processes, and we derive the expression of the emission coefficient in the four Stokes parameters. Finally, an illustrative application to the Na I D1 and D2 lines is presented.
Rezende, Enrico L; Kelly, Scott A; Gomes, Fernando R; Chappell, Mark A; Garland, Theodore
2006-01-01
Selective breeding for over 35 generations has led to four replicate (S) lines of laboratory house mice (Mus domesticus) that run voluntarily on wheels about 170% more than four random-bred control (C) lines. We tested whether S lines have evolved higher running performance by increasing running economy (i.e., decreasing energy spent per unit of distance) as a correlated response to selection, using a recently developed method that allows for nearly continuous measurements of oxygen consumption (VO2) and running speed in freely behaving animals. We estimated slope (incremental cost of transport [COT]) and intercept for regressions of power (the dependent variable, VO2/min) on speed for 49 males and 47 females, as well as their maximum VO2 and speeds during wheel running, under conditions mimicking those that these lines face during the selection protocol. For comparison, we also measured COT and maximum aerobic capacity (VO2max) during forced exercise on a motorized treadmill. As in previous studies, the increased wheel running of S lines was mainly attributable to increased average speed, with males also showing a tendency for increased time spent running. On a whole-animal basis, combined analysis of males and females indicated that COT during voluntary wheel running was significantly lower in the S lines (one-tailed P = 0.015). However, mice from S lines are significantly smaller and attain higher maximum speeds on the wheels; with either body mass or maximum speed (or both) entered as a covariate, the statistical significance of the difference in COT is lost (one-tailed P ≥ 0.2). Thus, both body size and behavior are key components of the reduction in COT. Several statistically significant sex differences were observed, including lower COT and higher resting metabolic rate in females. In addition, maximum voluntary running speeds were negatively correlated with COT in females but not in males.
Moreover, males (but not females) from the S lines exhibited significantly higher treadmill VO2max as compared to those from C lines. The sex-specific responses to selection may in part be consequences of sex differences in body mass and running style. Our results highlight how differences in size and running speed can account for lower COT in S lines and suggest that lower COT may have coadapted in response to selection for higher running distances in these lines.
Kurra, Swamy; Metkar, Umesh; Yirenkyi, Henaku; Tallarico, Richard A; Lavelle, William F
We retrospectively reviewed surgeries between 2011 and 2015 of patients who underwent posterior spinal deformity instrumentation with constructs involving fusion to the pelvis and encompassing at least five levels. The aim was to measure the radiographic outcomes of coronal malalignment (CM) after use of an intraoperative T-square-shaped instrument in posterior spinal deformity surgeries with at least five levels of fusion and extension to the pelvis. Neuromuscular children have been found to benefit from the intraoperative T-square technique, which helps achieve proper coronal spinal balance with extensive fusions. This intraoperative technique was used in our posterior spine deformity instrumentation surgeries meeting the aforementioned parameters. There were 50 patients: n = 16 with an intraoperative T-square-shaped device and n = 34 without. Subgroups were divided based on greater than 20 mm and greater than 40 mm displacement of the C7 plumb line from the central sacral vertical line on either side in preoperative radiographs. We analyzed the demographics and the pre- and postoperative radiographic parameters of standing films: standing CM (displacement of the C7 plumb line from the central sacral vertical line) and major coronal Cobb angles, in the total sample and subgroups, and compared T-square-shaped device use with no device use by analysis of variance. A p value ≤ .05 was considered statistically significant. In the total sample, although the postoperative CM means were not statistically different, we observed greater CM correction in patients where a T-square-shaped device was used (70%) versus not used (18%). In the >20 mm and >40 mm subgroups, the postoperative mean CM values were statistically lower for the patients where a T-square-shaped device was used (p = .016 and p = .003, respectively). Cobb corrections were also statistically higher with T-square-shaped device use in both the >20 mm and >40 mm subgroups (68% in each).
The intraoperative T-square-shaped device technique had a positive effect on the amount of coronal malalignment correction achieved and on the correction of lumbar and thoracic coronal Cobb angles. Level III. Copyright © 2017 Scoliosis Research Society. Published by Elsevier Inc. All rights reserved.
40 CFR 63.603 - Standards for new sources.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.603 Standards for new sources. (a) Wet process phosphoric acid process line. On and after the date on which the... equivalent P2O5 feed (0.01350 lb/ton). (b) Superphosphoric acid process line. On and after the date on which...
40 CFR 63.603 - Standards for new sources.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.603 Standards for new sources. (a) Wet process phosphoric acid process line. On and after the date on which the... equivalent P2O5 feed (0.01350 lb/ton). (b) Superphosphoric acid process line. On and after the date on which...
40 CFR 63.603 - Standards for new sources.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) National Emission Standards for Hazardous Air Pollutants From Phosphoric Acid Manufacturing Plants § 63.603 Standards for new sources. (a) Wet process phosphoric acid process line. On and after the date on which the... equivalent P2O5 feed (0.01350 lb/ton). (b) Superphosphoric acid process line. On and after the date on which...
Rybicka, Marta; Stachowska, Ewa; Gutowska, Izabela; Parczewski, Miłosz; Baśkiewicz, Magdalena; Machaliński, Bogusław; Boroń-Kaczmarska, Anna; Chlubek, Dariusz
2011-04-27
The aim of this study was to investigate the effect of conjugated linoleic acids (CLAs) on macrophage reactive oxygen species synthesis and the activity and expression of the antioxidant enzymes catalase (Cat), glutathione peroxidase (GPx), and superoxide dismutase (SOD). The macrophages were obtained from the THP-1 monocytic cell line. Cells were incubated with the addition of cis-9,trans-11 CLA, trans-10,cis-12 CLA, or linoleic acid. Reactive oxygen species (ROS) formation was estimated by flow cytometry. Enzyme activity was measured spectrophotometrically. Antioxidant enzyme mRNA expression was estimated by real-time reverse transcriptase polymerase chain reaction (RT-PCR). Statistical analysis was based on nonparametric tests [Friedman analysis of variance (ANOVA) and Wilcoxon signed-rank test]. cis-9,trans-11 CLA significantly increased the activity of Cat, while trans-10,cis-12 CLA notably influenced GPx activity. Both isomers significantly decreased mRNA expression of Cat. Only trans-10,cis-12 significantly influenced mRNA expression of SOD-2. The CLAs activate processes of ROS formation in macrophages. Adverse metabolic effects of each isomer's action were observed.
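The Friedman ANOVA used above ranks each subject's (block's) measurements across treatments and tests whether the treatment rank sums differ. A minimal sketch of the test statistic in pure NumPy (no tie correction; the 5 x 3 example data are hypothetical, not the study's measurements):

```python
import numpy as np

def friedman_stat(data):
    """Friedman chi-square statistic for an (n blocks x k treatments) array.
    Assumes no tied values within a block (no tie correction applied)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    ranks = data.argsort(axis=1).argsort(axis=1) + 1  # within-block ranks, 1..k
    rbar = ranks.mean(axis=0)                          # mean rank per treatment
    return 12.0 * n / (k * (k + 1)) * np.sum((rbar - (k + 1) / 2.0) ** 2)

# Five blocks, three treatments, identical ordering in every block
# (maximal agreement gives the maximal statistic for n = 5, k = 3):
q = friedman_stat([[1.0, 2.0, 3.0]] * 5)  # q == 10.0
```

The statistic is referred to a chi-square distribution with k - 1 degrees of freedom; library implementations (e.g. SciPy's `friedmanchisquare`) also correct for ties.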
Comparison of line shortening assessed by aerial image and wafer measurements
NASA Astrophysics Data System (ADS)
Ziegler, Wolfram; Pforr, Rainer; Thiele, Joerg; Maurer, Wilhelm
1997-02-01
Increasing numbers of patterns per area and decreasing linewidths demand enhancement technologies for optical lithography. OPC, the correction of systematic non-linearity in the pattern transfer process by correction of the design data, is one possibility to tighten process control and to increase the lifetime of existing lithographic equipment. The two most prominent proximity effects to be corrected by OPC are CD variation and line shortening. Line shortening measured on a wafer is up to 2 times larger than full resist simulation results. Therefore, the influence of mask geometry on line shortening is a key item in parameterizing lithography. The following paper discusses the effect of adding small serifs to line ends in a 0.25 micrometer ground-rule design. For reticles produced on an ALTA 3000 with a standard wet etch process, the corner rounding on the mask can be reduced by adding serifs of a certain size. The corner rounding was measured, and its effect on line shortening on the wafer was determined. This was investigated by resist measurements on the wafer, aerial image plus resist simulation, and aerial image measurements on the AIMS microscope.
Akshatha, B K; Karuppiah, Karpagaselvi; Manjunath, G S; Kumarswamy, Jayalakshmi; Papaiah, Lokesh; Rao, Jyothi
2017-01-01
The three common odontogenic cysts are radicular cysts (RCs), dentigerous cysts (DCs), and odontogenic keratocysts (OKCs). Among these 3 cysts, the OKC has recently been reclassified as a benign keratocystic odontogenic tumor owing to its aggressive behavior, recurrence rate, and malignant potential. The present study involved qualitative and quantitative analysis of inducible nitric oxide synthase (iNOS) expression in the epithelial lining of RCs, DCs, and OKCs, compared iNOS expression in the epithelial linings of all 3 cysts, and assessed whether overexpression of iNOS in OKCs might contribute to their aggressive behavior and malignant potential. The aim of the present study was to investigate the role of iNOS in the pathogenesis of OKCs, DCs, and RCs by evaluating iNOS expression in the epithelial lining of these cysts. Analysis of iNOS expression in the epithelial lining cells of 20 RCs, 20 DCs, and 20 OKCs was done using immunohistochemistry. The percentage of positive cells and the intensity of staining were assessed and compared among all 3 cysts using the contingency coefficient. Kappa statistics for the two observers were computed to determine interobserver agreement. The percentage of iNOS-positive cells was found to be remarkably high in OKCs (12/20, 57.1%) as compared to RCs (6/20, 28.6%) and DCs (3/20, 14.3%). The interobserver agreement (kappa) for the percentage of iNOS-positive cells was statistically significant for OKCs (P = 0.000) and RCs (P = 0.001), with no significant values for DCs. No statistical difference exists among the 3 study samples with regard to the intensity of staining with iNOS. Increased iNOS expression in OKCs may contribute to bone resorption and accumulation of wild-type p53, hence making OKCs more aggressive.
Studying Lyman-alpha escape and reionization in Green Pea galaxies
NASA Astrophysics Data System (ADS)
Yang, Huan; Malhotra, Sangeeta; Rhoads, James E.; Gronke, Max; Leitherer, Claus; Wofford, Aida; Dijkstra, Mark
2017-01-01
Green Pea galaxies are low-redshift galaxies with extreme [OIII]λ5007 emission lines. We built the first statistical sample of Green Peas observed with HST/COS and used them as analogs of high-z Lyman-alpha emitters to study Ly-alpha escape and Ly-alpha sizes. Using the HST/COS 2D spectra, we found that the Ly-alpha sizes of Green Peas are larger than their UV continuum sizes. We found many correlations between the Ly-alpha escape fraction and galactic properties: dust extinction, Ly-alpha kinematic features, the [OIII]/[OII] ratio, and gas outflow velocities. We fit an empirical relation to predict the Ly-alpha escape fraction from dust extinction and the Ly-alpha red-peak velocity. In the JWST era, this relation can be used to derive the IGM HI column density along the line of sight of each high-z Ly-alpha emitter and probe the reionization process.
MHSS: a material handling system simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pomernacki, L.; Hollstien, R.B.
1976-04-07
A Material Handling System Simulator (MHSS) program is described that provides specialized functional blocks for modeling and simulation of nuclear material handling systems. Models of nuclear fuel fabrication plants may be built using functional blocks that simulate material receiving, storage, transport, inventory, processing, and shipping operations as well as the control and reporting tasks of operators or on-line computers. Blocks are also provided that allow the user to observe and gather statistical information on the dynamic behavior of simulated plants over single or replicated runs. Although it is currently being developed for the nuclear materials handling application, MHSS can be adapted to other industries in which material accountability is important. In this paper, emphasis is on the simulation methodology of the MHSS program with application to the nuclear material safeguards problem.
Applications of statistical and atomic physics to spectral line broadening and stock markets
NASA Astrophysics Data System (ADS)
Volodko, Dmitriy
The purpose of this investigation is twofold: the application of time-correlation-function methodology to the theoretical study of the shift of hydrogen and hydrogen-like spectral lines due to the interaction of electrons and ions with the spectral line emitters (the dipole ionic-electronic shift, DIES), and the description of stock-market behavior in terms of a simple physical model whose simulated index obeys the same Levy statistical distribution as the real stock-market index. Using the Generalized Theory of Stark broadening of electrons in plasma, we discovered a new source of the shift of hydrogen and hydrogen-like spectral lines that we called the dipole ionic-electronic shift (DIES). This shift results from the indirect coupling of electron and ion microfields in plasmas, which is facilitated by the radiating atom/ion. We have shown that the DIES, unlike all previously known shifts, is highly nonlinear and has a different sign for different ranges of plasma parameters. The most favorable conditions for observing the DIES correspond to plasmas of high density but relatively low temperature. For the Balmer-alpha line of hydrogen under the most favorable observational conditions, Ne > 10^18 cm^-3 and T < 2 eV, the DIES has already been confirmed experimentally. Based on the study of the time correlations and of the probability distribution of fluctuations in the stock market, we developed a relatively simple physical model which simulates the Dow Jones Industrials index and makes short-term (a couple of days) predictions of its trend.
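The stock-market side of this abstract rests on heavy-tailed (Levy-type) step statistics. A toy sketch of such a process follows; it is illustrative only, and the tail exponent, series length, and power-law sampling scheme are assumptions, not the paper's model.

```python
import numpy as np

# Toy heavy-tailed step process: step magnitudes drawn from a Pareto-type
# law, P(|step| > s) ~ s^(-alpha), via inverse-power transform of uniforms.
rng = np.random.default_rng(42)
alpha = 1.5                     # assumed tail exponent in the Levy range (0, 2)
n = 5000
u = rng.uniform(size=n)
steps = rng.choice([-1.0, 1.0], size=n) * u ** (-1.0 / alpha)
index = 100.0 + np.cumsum(steps)  # toy "index" trajectory
```

Because alpha < 2 the step variance diverges, so rare huge jumps dominate the trajectory, which is the qualitative feature that distinguishes Levy statistics from a Gaussian random walk.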
Altmann, Gerry T M
2017-01-05
Statistical approaches to emergent knowledge have tended to focus on the process by which experience of individual episodes accumulates into generalizable experience across episodes. However, there is a seemingly opposite, but equally critical, process that such experience affords: the process by which, from a space of types (e.g. onions, a semantic class that develops through exposure to individual episodes involving individual onions), we can perceive or create, on-the-fly, a specific token (a specific onion, perhaps one that is chopped) in the absence of any prior perceptual experience with that specific token. This article reviews a selection of statistical learning studies that lead to the speculation that this process (the generation, on the basis of semantic memory, of a novel episodic representation) is itself an instance of a statistical, in fact associative, process. The article concludes that the same processes that enable statistical abstraction across individual episodes to form semantic memories also enable the generation, from those semantic memories, of representations that correspond to individual tokens, and of novel episodic facts about those tokens. Statistical learning is a window onto these deeper processes that underpin cognition. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).
EuroPhenome: a repository for high-throughput mouse phenotyping data
Morgan, Hugh; Beck, Tim; Blake, Andrew; Gates, Hilary; Adams, Niels; Debouzy, Guillaume; Leblanc, Sophie; Lengger, Christoph; Maier, Holger; Melvin, David; Meziane, Hamid; Richardson, Dave; Wells, Sara; White, Jacqui; Wood, Joe; de Angelis, Martin Hrabé; Brown, Steve D. M.; Hancock, John M.; Mallon, Ann-Marie
2010-01-01
The broad aim of biomedical science in the postgenomic era is to link genomic and phenotype information to allow deeper understanding of the processes leading from genomic changes to altered phenotype and disease. The EuroPhenome project (http://www.EuroPhenome.org) is a comprehensive resource for raw and annotated high-throughput phenotyping data arising from projects such as EUMODIC. EUMODIC is gathering data from the EMPReSSslim pipeline (http://www.empress.har.mrc.ac.uk/) which is performed on inbred mouse strains and knock-out lines arising from the EUCOMM project. The EuroPhenome interface allows the user to access the data via the phenotype or genotype. It also allows the user to access the data in a variety of ways, including graphical display, statistical analysis and access to the raw data via web services. The raw phenotyping data captured in EuroPhenome is annotated by an annotation pipeline which automatically identifies statistically different mutants from the appropriate baseline and assigns ontology terms for that specific test. Mutant phenotypes can be quickly identified using two EuroPhenome tools: PhenoMap, a graphical representation of statistically relevant phenotypes, and mining for a mutant using ontology terms. To assist with data definition and cross-database comparisons, phenotype data is annotated using combinations of terms from biological ontologies. PMID:19933761
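A sketch of the kind of mutant-versus-baseline comparison such an annotation pipeline performs: the abstract does not specify the pipeline's actual tests or thresholds, so this is a plain Welch t statistic on hypothetical phenotype values, for illustration only.

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    se2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(se2)

mutant = [1.0, 2.0, 3.0, 4.0, 5.0]        # hypothetical phenotype values
baseline = [11.0, 12.0, 13.0, 14.0, 15.0]  # hypothetical strain baseline
t = welch_t(mutant, baseline)  # large |t| flags the mutant line as different
```

In a real pipeline the statistic would be referred to the appropriate t distribution, corrected for multiple testing across lines and parameters, and the flagged result annotated with ontology terms.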
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
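The percentile and moment procedures compared above can be sketched side by side. This is a minimal illustration: the formulas follow Folk & Ward for the percentile statistics and standard weight-moment definitions, while the synthetic symmetric grain-size distribution and its class spacing are assumptions chosen for a checkable example.

```python
import numpy as np

def folk_ward(phi, wt):
    """Folk & Ward percentile mean and sorting from class midpoints (phi)
    and weight percentages."""
    cum = 100.0 * np.cumsum(wt) / np.sum(wt)   # cumulative weight percent
    p = lambda q: np.interp(q, cum, phi)       # phi value at cumulative % q
    mean = (p(16) + p(50) + p(84)) / 3.0
    sorting = (p(84) - p(16)) / 4.0 + (p(95) - p(5)) / 6.6
    return mean, sorting

def moments(phi, wt):
    """Weight-moment mean, sorting (sd), and skewness; the third moment
    weights tail classes heavily, unlike the percentile method."""
    f = np.asarray(wt, dtype=float) / np.sum(wt)
    m = np.sum(f * phi)
    sd = np.sqrt(np.sum(f * (phi - m) ** 2))
    skew = np.sum(f * (phi - m) ** 3) / sd ** 3
    return m, sd, skew

phi = np.arange(0.0, 8.25, 0.25)              # class midpoints in phi units
wt = np.exp(-0.5 * ((phi - 4.0) / 1.0) ** 2)  # symmetric (Gaussian) weights
fw_mean, fw_sort = folk_ward(phi, wt)
m_mean, m_sd, m_skew = moments(phi, wt)
```

For this symmetric single-population example the two procedures agree closely; the divergence the paper documents appears when a small finer tail population is mixed in, inflating the moment skewness while barely moving the percentile skewness.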
Pedersen, Camilla; Bräuner, Elvira V.; Rod, Naja H.; Albieri, Vanna; Andersen, Claus E.; Ulbak, Kaare; Hertel, Ole; Johansen, Christoffer; Schüz, Joachim; Raaschou-Nielsen, Ole
2014-01-01
We investigated whether there is an interaction between distance from residence at birth to nearest power line and domestic radon and traffic-related air pollution, respectively, in relation to childhood leukemia risk. Further, we investigated whether adjusting for potential confounders alters the association between distance to nearest power line and childhood leukemia. We included 1024 cases aged <15, diagnosed with leukemia during 1968–1991, from the Danish Cancer Registry and 2048 controls randomly selected from the Danish childhood population and individually matched by gender and year of birth. We used geographical information systems to determine the distance between residence at birth and the nearest 132–400 kV overhead power line. Concentrations of domestic radon and traffic-related air pollution (NOx at the front door) were estimated using validated models. We found a statistically significant interaction between distance to nearest power line and domestic radon regarding risk of childhood leukemia (p = 0.01) when using the median radon level as cut-off point but not when using the 75th percentile (p = 0.90). We found no evidence of an interaction between distance to nearest power line and traffic-related air pollution (p = 0.73). We found almost no change in the estimated association between distance to power line and risk of childhood leukemia when adjusting for socioeconomic status of the municipality, urbanization, maternal age, birth order, domestic radon and traffic-related air pollution. The statistically significant interaction between distance to nearest power line and domestic radon was based on few exposed cases and controls and sensitive to the choice of exposure categorization and might, therefore, be due to chance. PMID:25259740
Statistical Equilibrium of Copper in the Solar Atmosphere
NASA Astrophysics Data System (ADS)
Shi, J. R.; Gehren, T.; Zeng, J. L.; Mashonkina, L.; Zhao, G.
2014-02-01
Non-local thermodynamic equilibrium (NLTE) line formation for neutral copper in one-dimensional solar atmospheres is presented for an atomic model including 96 terms of Cu I and the ground state of Cu II. Accurate oscillator strengths for all the line transitions in the model atom, as well as photoionization cross sections, were calculated using the R-matrix method in the Russell-Saunders coupling scheme. The main NLTE mechanism for Cu I is ultraviolet overionization. We find that NLTE leads to systematically depleted total absorption in the Cu I lines and, accordingly, positive abundance corrections. Inelastic collisions with neutral hydrogen atoms produce minor effects on the statistical equilibrium of Cu I in the solar atmosphere. For the solar Cu I lines, the departures from LTE are found to be small, with a mean NLTE abundance correction of ~0.01 dex. We also find that the six low-excitation lines, with lower-level excitation energy E exc <= 1.64 eV, give a 0.14 dex lower mean solar abundance than the six E exc > 3.7 eV lines when the experimental gf-values of Kock & Richter are applied. Without the two strong resonance transitions, the solar mean NLTE abundance from 10 lines of Cu I is log ɛ⊙(Cu) = 4.19 ± 0.10, which is consistent within the error bars with the meteoritic value of 4.25 ± 0.05 from Lodders et al. The discrepancy between the E exc = 1.39-1.64 eV and E exc > 3.7 eV lines can be removed when the calculated gf-values are adopted, yielding a mean solar abundance of log ɛ⊙(Cu) = 4.24 ± 0.08.
On-Line Control of Metal Processing. Report of the Committee on On-Line Control of Metal Processing
1989-02-01
Materials Engineering. His work has concentrated on the electroprocessing of metals in molten salts. He is a member of TMS, AIME, ES, the Canadian Institute... [List-of-figures residue: continuous ingot casting process with three discrete control loops; Figure 4-2, controller incorporating process model; Figure 4-3, real-time molten ...] ...and others while providing a controlled macrostructure and solidification substructure. In this process, molten metal continuously flows from a
Research Progresses on Small Flux Ropes
NASA Astrophysics Data System (ADS)
Huang, J.; Liu, Y.; Peng, J.; Klecker, B.
2017-12-01
Small flux ropes (SFRs) have attracted much attention in recent years, but their origins are still debated. In order to investigate their source regions and formation mechanisms, we present a case study and a statistical study in this work. First, we make a multi-spacecraft study of an SFR entrained by rolling-back magnetic field lines around 1 AU; such SFRs have seldom been reported in the literature. This SFR was adjacent to a heliospheric plasma sheet (HPS), and the two showed similar plasma signatures (except plasma beta), alpha-particle-to-proton density ratios (Nα/Np), and heavy-ion ionization states, implying that they may have a similar origin in the corona. The composition and the configuration of the rolling-back magnetic field lines suggest that this SFR originated from the streamer belt through interchange reconnection. Combining the observations of STEREO and ACE, the SFR was shown to have an axis tilted to the ecliptic plane, and its radius may vary with spatial position. In this study, we suggest that interchange reconnection can play an important role in the origin of at least some SFRs and of the slow solar wind. Second, we make a statistical study of the distributions of the iron average charge state (Q) in SFRs. Previous studies of magnetic clouds classified the Q distributions into five types, labelled A to E. We investigate the SFRs, except "very small flux ropes", from 1998 to 2009, and find that type A cases are absent. Furthermore, we also try to identify their sources. Based on these analyses, we suppose that the twist structures of SFRs originating in the solar corona are generally formed after their eruptions, whereas SFRs originating in interplanetary space may involve complicated magnetic reconnection processes, which may result in more complicated Q distributions.
An atlas of synthetic line profiles of Planetary Nebulae
NASA Astrophysics Data System (ADS)
Morisset, C.; Stasinska, G.
2008-04-01
We have constructed a grid of photoionization models of spherical, elliptical and bipolar planetary nebulae. Assuming different velocity fields, we have computed line profiles corresponding to different orientations, slit sizes and positions. The atlas is meant both for didactic purposes and for the interpretation of data on real nebulae. As an application, we have shown that line profiles are often degenerate, and that recovering the geometry and velocity field from observations requires lines from ions with different masses and different ionization potentials. We have also shown that the empirical way of measuring mass-weighted expansion velocities from observed line widths is reasonably accurate when the HWHM is used. For distant nebulae entirely covered by the slit, the unknown geometry and orientation do not statistically alter the measured velocities. The atlas is freely accessible on the internet. The Cloudy_3D suite and the associated VISNEB tool are available on request.
NASA Astrophysics Data System (ADS)
Goto, Shin-itiro; Umeno, Ken
2018-03-01
Maps on a parameter space for expressing distribution functions are exactly derived from the Perron-Frobenius equations for a generalized Boole transform family. Here the generalized Boole transform family is a one-parameter family of maps defined on a subset of the real line, whose invariant probability distribution is the Cauchy distribution with some parameters. With this reduction, some relations between the statistical picture and the orbital one are shown. From the viewpoint of information geometry, the parameter space can be identified with a statistical manifold, and it is shown that the derived maps can be characterized geometrically. Also, with a symplectic structure induced from the statistical structure, symplectic and information-geometric aspects of the derived maps are discussed.
Byabagambi, John B; Broughton, Edward; Heltebeitel, Simon; Wuliji, Tana; Karamagi, Esther
2017-01-01
Inadequate medication dispensing and management by healthcare providers can contribute to poor outcomes among HIV-positive patients. Gaps in medication availability, often associated with pharmacy workforce shortages, are an important barrier to retention in HIV care in Uganda. An intervention to address pharmacy staffing constraints through strengthening pharmaceutical management, dispensing practices, and general competencies of facility clinical and pharmacy staff was implemented in 14 facilities in three districts in eastern Uganda. Teams of staff were organised in each facility and supported to apply quality improvement (QI) methods to address deficits in availability and rational use of HIV drugs. To evaluate the intervention, baseline and end line data were collected 24 months apart. Dispensing practices, clinical wellness and adherence to antiretrovirals improved by 45%, 28% and 20% from baseline to end line, respectively. All clients at end line received the medications prescribed, and medications were correctly, completely and legibly labelled more often. Clients better understood when, how much and for how long they were supposed to take their prescribed medicines at end line. Pharmaceutical management practices also improved from baseline in most categories by statistically significant margins. Facilities significantly improved on correctly recording stock information about antiretroviral drugs (53% vs 100%, P<0.0001). Coinciding with existing staff taking on pharmaceutical roles, facilities improved management of unwanted and expired drugs, notably by optimising use of existing health workers and making pharmaceutical management processes more efficient. Implementation of this improvement intervention in the 14 facilities appeared to have a positive impact on client outcomes, pharmacy department management and providers' self-reported knowledge of QI methods.
These results were achieved at a cost of about US$5.50 per client receiving HIV services at participating facilities.
CargoTIPS: an innovative approach to combating cargo theft
NASA Astrophysics Data System (ADS)
Toth, Gail E.
1998-12-01
Cargo theft has been estimated by the Federal Bureau of Investigation at $6 billion annually, while others believe it to be more than $10 billion annually. Opportunistic thieves, street gangs, traditional organized crime groups, and new organized crime groups have been targeting cargo. They steal from warehouses, terminals, equipment, truck stops, or any place where freight comes to rest. With zero inventory levels, our trailers have become virtual warehouses on wheels and easy targets for thieves. Without information and communication, cargo thieves can thrive, and the industry and law enforcement are forced into being reactive instead of developing proactive policies and procedures. Cargo thieves cross town lines, county lines, state lines and country borders, which makes communication within the law enforcement community imperative. CargoTIPS (cargo theft information processing system) was developed in response to the need for cargo theft information. The system allows us to collect cargo theft statistics to analyze the problem, assess the threat and develop a response on a national level. CargoTIPS includes a bulletin board, which allows users to communicate with each other, pass on alerts or seek information. The system is also used as an investigative tool: CargoTIPS can identify the mode of transportation (truck, small parcel, air, rail or ocean), and it was designed to take in international data. Currently the system has identified food products as the number one targeted commodity, followed by electronic products and, third, computers and computer parts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yi; Guildenbecher, Daniel R.; Hoffmeister, Kathryn N. G.
The combustion of molten metals is an important area of study with applications ranging from solid aluminized rocket propellants to fireworks displays. Our work uses digital in-line holography (DIH) to experimentally quantify the three-dimensional position, size, and velocity of aluminum particles during combustion of ammonium perchlorate (AP) based solid-rocket propellants. Additionally, spatially resolved particle temperatures are simultaneously measured using two-color imaging pyrometry. To allow for fast characterization of the properties of tens of thousands of particles, automated data processing routines are proposed. In using these methods, statistics from aluminum particles with diameters ranging from 15 to 900 µm are collected at an ambient pressure of 83 kPa. In the first set of DIH experiments, increasing initial propellant temperature is shown to enhance the agglomeration of nascent aluminum at the burning surface, resulting in ejection of large molten aluminum particles into the exhaust plume. The resulting particle number and volume distributions are quantified. In the second set of simultaneous DIH and pyrometry experiments, particle size and velocity relationships as well as temperature statistics are explored. The average measured temperatures are found to be 2640 ± 282 K, which compares well with previous estimates of the range of particle and gas-phase temperatures. The novel methods proposed here represent new capabilities for simultaneous quantification of the joint size, velocity, and temperature statistics during the combustion of molten metal particles. The proposed techniques are expected to be useful for detailed performance assessment of metalized solid-rocket propellants.
2017-05-05
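Two-color pyrometry, as used in these experiments, infers temperature from the ratio of thermal emission intensities at two wavelengths, so that a wavelength-independent (gray-body) emissivity cancels. A minimal sketch under the Wien approximation follows; the filter wavelengths are illustrative assumptions, not the instrument's actual calibration:

```python
import math

C2 = 1.4388e-2  # second radiation constant hc/k [m*K]

def wien_intensity(lam, T):
    """Wien approximation to Planck's law (valid for lam*T << C2),
    up to a constant factor that cancels in the two-color ratio."""
    return lam ** -5 * math.exp(-C2 / (lam * T))

def two_color_temperature(ratio, lam1, lam2):
    """Gray-body temperature from the intensity ratio I(lam1)/I(lam2)."""
    return C2 * (1 / lam1 - 1 / lam2) / (5 * math.log(lam2 / lam1) - math.log(ratio))

lam1, lam2 = 700e-9, 900e-9   # illustrative filter wavelengths [m]
T_true = 2640.0               # K, the mean particle temperature reported above
r = wien_intensity(lam1, T_true) / wien_intensity(lam2, T_true)
T_est = two_color_temperature(r, lam1, lam2)
```

The inversion is exact for a gray body; in practice, finite filter bandwidths and wavelength-dependent emissivity introduce the kind of uncertainty reflected in the ±282 K spread quoted above.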
SAFARI, an On-Line Text-Processing System User's Manual.
ERIC Educational Resources Information Center
Chapin, P.G.; And Others.
This report describes for the potential user a set of procedures for processing textual materials on-line. In this preliminary model an information analyst can scan through messages, reports, and other documents on a display scope and select relevant facts, which are processed linguistically and then stored in the computer in the form of logical…
Development of evaluation technique of GMAW welding quality based on statistical analysis
NASA Astrophysics Data System (ADS)
Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua
2014-11-01
Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality controllability and prediction of the GMAW process. Existing approaches to on-line welding quality controllability and prediction have several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient evaluation technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), is developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as the Euclidean distance because the covariance matrix used for calculating MD accounts for correlations and scaling in the data. The values of MD obtained from the welding current and arc voltage are assumed to follow a normal distribution, characterized by the mean µ and standard deviation σ of the data. In the proposed evaluation technique used by the WQT, values of MD in the range from zero to µ+3σ are regarded as "good". Two experiments, one changing the flow of shielding gas and one smearing paint on the surface of the substrate, are conducted in order to verify the sensitivity of the proposed evaluation technique and the feasibility of the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality controllability and prediction, which is of great importance for the design of novel weld quality detection equipment.
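The MD-based gate described above can be sketched as follows. The reference current/voltage data, the fault point, and the µ+3σ threshold below are synthetic illustrations, not the WQT's calibration data:

```python
import numpy as np

def mahalanobis(X, ref):
    """Mahalanobis distance of each row of X from the reference data;
    the covariance matrix accounts for correlation and scaling."""
    mu = ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))
    d = X - mu
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))

rng = np.random.default_rng(1)
# Hypothetical welding-current [A] / arc-voltage [V] samples from good welds
ref = rng.multivariate_normal([200.0, 24.0], [[25.0, 3.0], [3.0, 1.0]], 500)
md_ref = mahalanobis(ref, ref)
limit = md_ref.mean() + 3 * md_ref.std()   # "good" if MD <= mu + 3*sigma

new = np.array([[201.0, 24.2],    # near the nominal operating point
                [260.0, 30.0]])   # far-off fault condition
flags = mahalanobis(new, ref) <= limit
```

The first sample passes the gate and the second is flagged, mirroring how the WQT separates normal welds from fault conditions such as disturbed shielding gas.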
Statistical properties of several models of fractional random point processes
NASA Astrophysics Data System (ADS)
Bendjaballah, C.
2011-08-01
Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
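The reduced-variance criterion mentioned above compares the variance of the counts to their mean (the Fano factor): a value of 1 indicates Poisson statistics, and values below 1 are the "nonclassical" signature. A minimal illustration with synthetic counts (not the paper's fractional point processes):

```python
import numpy as np

def fano_factor(counts):
    """Reduced variance Var(N)/<N>; equals 1 for a Poisson process,
    and is below 1 for sub-Poissonian (nonclassical) counting."""
    counts = np.asarray(counts)
    return counts.var() / counts.mean()

rng = np.random.default_rng(2)
poisson_counts = rng.poisson(8.0, 100_000)         # homogeneous Poisson counting
binomial_counts = rng.binomial(10, 0.8, 100_000)   # sub-Poissonian example, F = 1-p

F_poisson = fano_factor(poisson_counts)
F_sub = fano_factor(binomial_counts)
```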
Coakley, K J; Imtiaz, A; Wallis, T M; Weber, J C; Berweger, S; Kabos, P
2015-03-01
Near-field scanning microwave microscopy offers great potential to facilitate characterization, development and modeling of materials. By acquiring microwave images at multiple frequencies and amplitudes (along with the other modalities) one can study material and device physics at different lateral and depth scales. Images are typically noisy and contaminated by artifacts that can vary from scan line to scan line and planar-like trends due to sample tilt errors. Here, we level images based on an estimate of a smooth 2-d trend determined with a robust implementation of a local regression method. In this robust approach, features and outliers which are not due to the trend are automatically downweighted. We denoise images with the Adaptive Weights Smoothing method. This method smooths out additive noise while preserving edge-like features in images. We demonstrate the feasibility of our methods on topography images and microwave |S11| images. For one challenging test case, we demonstrate that our method outperforms alternative methods from the scanning probe microscopy data analysis software package Gwyddion. Our methods should be useful for massive image data sets where manual selection of landmarks or image subsets by a user is impractical. Published by Elsevier B.V.
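As a rough illustration of trend leveling with outlier downweighting, the sketch below fits a global plane by iteratively reweighted least squares, so that bright features do not bias the background estimate. This is a simpler stand-in for the paper's robust local regression and Adaptive Weights Smoothing, and all names and weight choices are illustrative:

```python
import numpy as np

def robust_level(img, n_iter=5):
    """Remove a planar background from an image by iteratively
    reweighted least squares, downweighting outlier (feature) pixels."""
    ny, nx = img.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    A = np.column_stack([np.ones(img.size), xx.ravel(), yy.ravel()])
    z = img.ravel().astype(float)
    w = np.ones_like(z)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(A * sw[:, None], z * sw, rcond=None)
        r = z - A @ coef
        s = 1.4826 * np.median(np.abs(r)) + 1e-12  # robust scale (MAD)
        w = 1.0 / (1.0 + (r / (3.0 * s)) ** 2)     # Cauchy-type weights
    return (z - A @ coef).reshape(ny, nx)
```

Applied to a tilted image containing a bright feature, the tilt is removed while the feature height is preserved, which is the behavior needed before quantitative comparison of scan lines.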
REMPI-TOFMS for on-line monitoring and controlling the coffee roasting process
NASA Astrophysics Data System (ADS)
Dorfner, Ralph; Ferge, Thomas; Yeretzian, Chahan; Zimmermann, Ralf; Kettrup, Antonius
2001-08-01
REMPI@266nm-TOFMS is used for on-line analysis of the coffee roasting process. Volatile and flavor-active compounds of coffee were ionized by REMPI@266nm and monitored on-line and in real time by TOFMS during the coffee roasting process. The phenol and 4-vinylguaiacol time-intensity profiles, for example, show typical behavior for different roasting temperatures and provide an indicator of the achieved degree of roasting. The impact of the moisture level of the green coffee beans on the time shift of a typical (commercial) roasting profile correlates with REMPI-TOFMS measurements and literature data.
MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.
Alic, Andy S; Blanquer, Ignacio
2016-09-01
Usually, the information known a priori about a newly sequenced organism is limited. Even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5 capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run on any software or hardware environment, in command line or graphically, and in browser or standalone. It presents information such as average length, base distribution, quality scores distribution, k-mer histogram, and homopolymers analysis. MuffinInfo improves upon the existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable parameters involved in the processing, all in one software. At the moment, the extractor works with all base space technologies such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for mild intensive tasks encountered in bioinformatics.
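Statistics of the kind MuffinInfo reports can be gathered in a single pass over the records. The following is an illustrative FastQ summary (mean read length, base composition, k-mer histogram), not MuffinInfo's actual HTML5/JavaScript implementation:

```python
from collections import Counter

def fastq_stats(lines, k=3):
    """Single pass over FastQ records (4 lines each): mean read length,
    base composition, and a k-mer histogram."""
    reads = [lines[i + 1] for i in range(0, len(lines), 4)]  # sequence lines
    bases = Counter()
    kmers = Counter()
    for seq in reads:
        bases.update(seq)
        kmers.update(seq[j:j + k] for j in range(len(seq) - k + 1))
    mean_len = sum(map(len, reads)) / len(reads)
    return mean_len, bases, kmers

# Tiny in-memory FastQ example (two reads)
fastq = ["@r1", "ACGTACGT", "+", "IIIIIIII",
         "@r2", "ACGTAA", "+", "IIIIII"]
mean_len, bases, kmers = fastq_stats(fastq)
```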
Searching for the 3.5 keV Line in the Deep Fields with Chandra: The 10 Ms Observations
NASA Astrophysics Data System (ADS)
Cappelluti, Nico; Bulbul, Esra; Foster, Adam; Natarajan, Priyamvada; Urry, Megan C.; Bautz, Mark W.; Civano, Francesca; Miller, Eric; Smith, Randall K.
2018-02-01
We report a systematic search for an emission line around 3.5 keV in the spectrum of the cosmic X-ray background using a total of ∼10 Ms Chandra observations toward the COSMOS Legacy and Extended Chandra Deep Field South survey fields. We find marginal evidence of a feature at an energy of ∼3.51 keV with a significance of 2.5–3σ, depending on the choice of statistical treatment. The line intensity is best fit at (8.8 ± 2.9) × 10‑7 ph cm‑2 s‑1 when using a simple Δχ2 or (10.2 +0.2/‑0.4) × 10‑7 ph cm‑2 s‑1 when Markov chain Monte Carlo is used. Based on our knowledge of Chandra and the reported detection of the line by other instruments, an instrumental origin for the line remains unlikely. We cannot, however, rule out a statistical fluctuation, and in that case our results provide a 3σ upper limit of 1.85 × 10‑6 ph cm‑2 s‑1. We discuss the interpretation of this observed line in terms of the iron line background, S XVI charge exchange, as well as potentially being from sterile neutrino decay. We note that our detection is consistent with previous measurements of this line toward the Galactic center and can be modeled as the result of sterile neutrino decay from the Milky Way for a dark matter distribution modeled as a Navarro–Frenk–White profile. For this case, we estimate a mass mν ∼ 7.01 keV and a mixing angle sin2(2θ) = (0.83–2.75) × 10‑10. These derived values are in agreement with independent estimates from galaxy clusters, the Galactic center, and M31.
NASA Astrophysics Data System (ADS)
Ouriev, Boris; Windhab, Erich; Braun, Peter; Birkhofer, Beat
2004-10-01
In-line visualization and on-line characterization of nontransparent fluids are becoming an important subject for process development in the food and nonfood industries. In our work, a noninvasive Doppler ultrasound-based technique is introduced and applied to the investigation of nonstationary flow in the chocolate precrystallization process. Unstable flow conditions were induced by abrupt flow interruption and were followed by strong flow pulsations in the piping system. Relying only on available process information, such as absolute pressures and temperatures, no analysis of flow conditions or characterization of suspension properties could possibly be done. It is obvious that chocolate flow properties are sensitive to flow boundary conditions. It therefore becomes essential to perform reliable structure state monitoring, particularly in application to nonstationary flow processes; such flow instabilities in chocolate processing can often lead to failed product quality and interruption of the mainstream production. As will be discussed, a combination of flow velocity profiles, an on-line fit to the flow profiles, and pressure difference measurement is sufficient for reliable analysis of fluid properties and flow boundary conditions as well as monitoring of the flow state. Analyses of the flow state and flow properties of the chocolate suspension are based on on-line measurement of one-dimensional velocity profiles across the flow channel and their on-line characterization with the power-law model. Conclusions about flow boundary conditions were drawn from the calculated standard mean deviation of the velocity, the parameters of the power-law fit to the velocity profiles, and volumetric flow rate information.
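The on-line characterization step fits a power-law model to each measured velocity profile. A minimal sketch of such a fit follows, using the profile shape of fully developed power-law pipe flow and an illustrative grid search rather than the authors' fitting routine; all parameter values are assumptions:

```python
import numpy as np

def power_law_profile(r, R, u_max, n):
    """Fully developed pipe-flow profile of a power-law fluid:
    u(r) = u_max * (1 - (|r|/R)^((n+1)/n))."""
    return u_max * (1.0 - (np.abs(r) / R) ** ((n + 1.0) / n))

def fit_flow_index(r, u, R):
    """Fit the flow-behaviour index n and centreline velocity by a
    coarse grid search over n with a linear solve for u_max."""
    best = None
    for n in np.linspace(0.1, 2.0, 191):
        shape = 1.0 - (np.abs(r) / R) ** ((n + 1.0) / n)
        u_max = (shape @ u) / (shape @ shape)   # least-squares amplitude
        err = np.sum((u - u_max * shape) ** 2)
        if best is None or err < best[0]:
            best = (err, n, u_max)
    return best[1], best[2]

R = 0.02                                 # pipe radius [m], illustrative
r = np.linspace(-R, R, 101)
u = power_law_profile(r, R, 0.5, 0.4)    # shear-thinning suspension, n < 1
n_fit, u_fit = fit_flow_index(r, u, R)
```

A flow index n < 1 (shear thinning) flattens the profile relative to the parabolic Newtonian case, which is why the fitted n is a useful structure-state indicator for suspensions such as chocolate.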
Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini
2018-08-01
Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and mitigate CO 2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT), with high-selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper will also include a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.
Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M
2009-01-01
Plasma optical spectroscopy is widely employed in on-line welding diagnostics. The determination of the plasma electron temperature, which is typically selected as the output monitoring parameter, implies the identification of the atomic emission lines. As a consequence, additional processing stages are required, with a direct impact on the real-time performance of the technique. The line-to-continuum method is a feasible alternative spectroscopic approach that is particularly interesting in terms of its computational efficiency. However, the monitoring signal depends strongly on the chosen emission line. In this paper, a feature selection methodology is proposed to resolve the uncertainty regarding the selection of the optimum spectral band, which allows the employment of the line-to-continuum method for on-line welding diagnostics. Field tests have been conducted to demonstrate the feasibility of the solution.
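The line-to-continuum signal itself reduces to a ratio of the background-subtracted line emission to the nearby continuum level. A minimal sketch with a synthetic spectrum and illustrative band limits (not the paper's selected bands or real welding data):

```python
import numpy as np

def line_to_continuum(wavelength, intensity, line_band, cont_band):
    """Ratio of background-subtracted line emission (integrated over
    line_band) to the continuum level estimated in cont_band."""
    wl = np.asarray(wavelength, float)
    I = np.asarray(intensity, float)
    dwl = wl[1] - wl[0]                    # assumes a uniform wavelength grid
    in_line = (wl >= line_band[0]) & (wl <= line_band[1])
    in_cont = (wl >= cont_band[0]) & (wl <= cont_band[1])
    continuum = np.median(I[in_cont])
    line = np.sum(I[in_line] - continuum) * dwl
    return line / continuum

# Synthetic spectrum: flat continuum plus one Gaussian emission line [nm]
wl = np.linspace(500.0, 520.0, 2001)
spec = 100.0 + 1000.0 * np.exp(-0.5 * ((wl - 510.0) / 0.2) ** 2)
ratio = line_to_continuum(wl, spec, (508.0, 512.0), (500.0, 505.0))
```

Because only band sums are involved, the computation is far cheaper than fitting identified lines for an electron-temperature estimate, which is the efficiency advantage the abstract refers to.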
MICROLENSING OF QUASAR BROAD EMISSION LINES: CONSTRAINTS ON BROAD LINE REGION SIZE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerras, E.; Mediavilla, E.; Jimenez-Vicente, J.
2013-02-20
We measure the differential microlensing of the broad emission lines between 18 quasar image pairs in 16 gravitational lenses. We find that the broad emission lines are in general weakly microlensed. The results show, at a modest level of confidence (1.8σ), that high-ionization lines such as C IV are more strongly microlensed than low-ionization lines such as Hβ, indicating that the high-ionization line emission regions are more compact. If we statistically model the distribution of microlensing magnifications, we obtain estimates for the broad-line region size of r_s = 24^{+22}_{-15} and r_s = 55^{+150}_{-35} lt-day (90% confidence) for the high- and low-ionization lines, respectively. When the samples are divided into higher- and lower-luminosity quasars, we find that the line emission regions of more luminous quasars are larger, with a slope consistent with the expected scaling from photoionization models. Our estimates also agree well with the results from local reverberation mapping studies.
Wheat crown rot pathogens Fusarium graminearum and F. pseudograminearum lack specialization.
Chakraborty, Sukumar; Obanor, Friday; Westecott, Rhyannyn; Abeywickrama, Krishanthi
2010-10-01
This article reports a lack of pathogenic specialization among Australian Fusarium graminearum and F. pseudograminearum causing crown rot (CR) of wheat using analysis of variance (ANOVA), principal component and biplot analysis, Kendall's coefficient of concordance (W), and κ statistics. Overall, F. pseudograminearum was more aggressive than F. graminearum, supporting earlier delineation of the crown-infecting group as a new species. Although significant wheat line-pathogen isolate interaction in ANOVA suggested putative specialization when seedlings of 60 wheat lines were inoculated with 4 pathogen isolates or 26 wheat lines were inoculated with 10 isolates, significant W and κ showed agreement in rank order of wheat lines, indicating a lack of specialization. The first principal component representing nondifferential aggressiveness explained a large part (up to 65%) of the variation in CR severity. The differential components were small and more pronounced in seedlings than in adult plants. By maximizing variance on the first two principal components, biplots were useful for highlighting the association between isolates and wheat lines. A key finding of this work is that a range of analytical tools are needed to explore pathogenic specialization, and a statistically significant interaction in an ANOVA cannot be taken as conclusive evidence of specialization. With no highly resistant wheat cultivars, Fusarium isolates mostly differ in aggressiveness; however, specialization may appear as more resistant cultivars become widespread.
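Kendall's coefficient of concordance W, used above to test agreement in the rank order of wheat lines across isolates, can be computed directly from a severity-score matrix. A sketch with hypothetical scores (ties ignored for simplicity):

```python
import numpy as np

def kendalls_w(scores):
    """Kendall's W for an (m raters x n items) score matrix:
    1 = perfect agreement in rank order, 0 = no agreement.
    Ties are ignored for simplicity."""
    m, n = scores.shape
    ranks = scores.argsort(axis=1).argsort(axis=1) + 1  # ranks within each rater
    R = ranks.sum(axis=0)                               # rank sums per item
    S = ((R - R.mean()) ** 2).sum()
    return 12.0 * S / (m ** 2 * (n ** 3 - n))

# Four hypothetical isolates scoring CR severity on five wheat lines;
# every isolate ranks the lines in the same order, so W = 1.
agree = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
                  [1.1, 2.2, 3.1, 4.2, 5.1],
                  [0.9, 1.8, 3.3, 3.9, 5.2],
                  [1.2, 2.1, 2.9, 4.1, 4.9]])
w_agree = kendalls_w(agree)
```

A high, significant W alongside a significant line-isolate interaction is exactly the combination the article uses to argue that the interaction reflects differences in aggressiveness rather than true specialization.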
Evolution of high-mass star-forming regions.
NASA Astrophysics Data System (ADS)
Giannetti, A.; Leurini, S.; Wyrowski, F.; Urquhart, J.; König, C.; Csengeri, T.; Güsten, R.; Menten, K. M.
Observational identification of a coherent evolutionary sequence for high-mass star-forming regions is still missing. We use the progressive heating of the gas caused by the feedback of high-mass young stellar objects to prove the statistical validity of the most common schemes used to observationally define an evolutionary sequence for high-mass clumps, and to identify which physical process dominates in the different phases. From the spectroscopic follow-ups carried out towards the TOP100 sample between 84 and 365 GHz, we selected several multiplets of CH3CN, CH3CCH, and CH3OH lines to derive the physical properties of the gas in the clumps along the evolutionary sequence. We demonstrate that the evolutionary sequence is statistically valid, and we define intervals in L/M separating the compression, collapse-and-accretion, and disruption phases. The first hot cores and ZAMS stars appear at L/M ≈ 10 L_⊙/M_⊙.
The chromospheres of late-type stars. I - Epsilon Eridani as a test case of multiline modelling
NASA Technical Reports Server (NTRS)
Thatcher, John D.; Robinson, Richard D.; Rees, David E.
1991-01-01
A new model of the lower chromosphere of the dwarf K2 star Epsilon Eridani is derived by matching flux profiles of the Ca II IR triplet lines at 8498 and 8542 A, the H-alpha and H-beta lines, and the Na D lines (all observed simultaneously at the AAT), and the Ca II K line. The coupled non-LTE equations of statistical equilibrium and radiative transfer are solved under the constraint of hydrostatic equilibrium using the Carlsson (1986) code. Within the framework of the model, the Na D lines are an important photospheric diagnostic, and the Ca II IR triplet lines can be used to locate the temperature minimum. The computed H-alpha and H-beta depths are highly sensitive constraints on the transition zone gradients and base pressures, allowing us to derive a pressure at the base of the transition zone of 0.9 dyn/cm^2.
Origin of the correlations between exit times in pedestrian flows through a bottleneck
NASA Astrophysics Data System (ADS)
Nicolas, Alexandre; Touloupas, Ioannis
2018-01-01
Robust statistical features have emerged from the microscopic analysis of dense pedestrian flows through a bottleneck, notably with respect to the time gaps between successive passages. We pinpoint the mechanisms at the origin of these features thanks to simple models that we develop and analyse quantitatively. We disprove the idea that anticorrelations between successive time gaps (i.e. an alternation between shorter ones and longer ones) are a hallmark of a zipper-like intercalation of pedestrian lines and show that they simply result from the possibility that pedestrians from distinct ‘lines’ or directions cross the bottleneck within a short time interval. A second feature concerns the bursts of escapes, i.e. egresses that come in fast succession. Despite the ubiquity of exponential distributions of burst sizes, entailed by a Poisson process, we argue that anomalous (power-law) statistics arise if the bottleneck is nearly congested, albeit only in a tiny portion of parameter space. The generality of the proposed mechanisms implies that similar statistical features should also be observed for other types of particulate flows.
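The burst-size statistics entailed by a Poisson process can be illustrated with a short simulation; all parameter values below are arbitrary, chosen only for illustration.

```python
import numpy as np

# For a Poisson egress process, time gaps between successive exits are
# exponential, so a "burst" (a run of exits separated by gaps shorter than
# tau) has a geometric size distribution, i.e. an exponential tail.
rng = np.random.default_rng(1)
lam, tau, n = 1.0, 0.5, 200_000
gaps = rng.exponential(1 / lam, size=n)   # gaps between successive exits
short = gaps < tau                        # each gap is 'short' with p = 1 - exp(-lam*tau)

sizes = []                                # burst size = 1 + length of each run of short gaps
run = 1
for s in short:
    if s:
        run += 1
    else:
        sizes.append(run)
        run = 1
sizes = np.asarray(sizes)

p_emp = 1 - 1 / sizes.mean()              # recover the geometric parameter from the mean size
p_theory = 1 - np.exp(-lam * tau)
```

For a geometric size distribution the two estimates agree; the anomalous power-law tail argued for near congestion would show up as a systematic departure from this exponential decay.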
Selimović-Dragaš, Mediha; Hasić-Branković, Lajla; Korać, Fehim; Đapo, Nermin; Huseinbegović, Amina; Kobašlija, Sedin; Lekić, Meliha; Hatibović-Kofman, Šahza
2013-08-01
Fluoride release is an important characteristic of glass-ionomer cements: the quantity of fluoride ions released from a glass-ionomer cement largely defines its biological activity. The objectives of this study were to define the quantity of fluoride ions released from the experimental glass-ionomer cements and to define the effect of the released fluoride ions on their cytotoxicity. Concentrations of the fluoride ions released by the evaluated glass-ionomer cements were measured indirectly, with a fluoride-selective WTW F500 electrode combined with an R503/D reference electrode. Statistical analyses of the F-ion concentrations released by all evaluated glass-ionomers at two time points, after 8 and after 24 hours, show statistically higher fluoride release from the RMGICs (Vitrebond, Fuji II LC, and Fuji Plus) than from the conventional glass-ionomer cements (Fuji Triage, Fuji IX GP Fast, and Ketac Silver) at both time points. The correlation coefficient between the concentrations of fluoride ion released by the evaluated glass-ionomer cements and the cytotoxic response of the UMR-106 osteoblast cell line is relatively high but does not reach the level of biological significance. The correlation between the concentrations of fluoride ion released and the cytotoxic response of the NIH3T3 mouse fibroblast cell line after 8 hours is high, positive, and statistically significant for the conventional GICs Fuji Triage and Fuji IX GP Fast and for the RMGIC Fuji II LC. A statistically significant correlation coefficient between the concentrations of fluoride ion released and the cytotoxic response of the NIH3T3 cell line after 24 hours is found for the RMGIC Fuji II LC only.
On-line identification of fermentation processes for ethanol production.
Câmara, M M; Soares, R M; Feital, T; Naomi, P; Oki, S; Thevelein, J M; Amaral, M; Pinto, J C
2017-07-01
A strategy for monitoring fermentation processes, specifically the simultaneous saccharification and fermentation (SSF) of corn mash, was developed. The strategy covered the development and use of a first-principles, semimechanistic, unstructured process model based on the major kinetic phenomena, along with mass and energy balances. The model was then used as a reference model within an identification procedure capable of running on-line. The on-line identification procedure consists of updating the reference model by estimating corrective parameters for certain reaction rates from the most recent process measurements. The strategy makes use of standard laboratory measurements for sugar quantification and of in situ temperature and liquid-level data. The model, along with the on-line identification procedure, has been tested against real industrial data and has been able to accurately predict the main variables of operational interest, i.e., the state variables and their dynamics, and key process indicators. The results demonstrate that the strategy is capable of monitoring, in real time, this complex industrial biomass fermentation. This new tool provides great support for decision-making and opens a new range of opportunities for industrial optimization.
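The corrective-parameter step can be illustrated with a deliberately simplified model: a single first-order consumption rate r = k·S (hypothetical, far simpler than the SSF kinetics used in the paper) whose nominal rate constant is rescaled by a multiplier estimated from recent measurements.

```python
import numpy as np

def simulate(k, s0, dt, n):
    """Explicit Euler integration of dS/dt = -k*S (a stand-in 'plant')."""
    s = np.empty(n)
    s[0] = s0
    for i in range(1, n):
        s[i] = s[i - 1] - k * s[i - 1] * dt
    return s

dt, n = 0.1, 50
reference_k = 0.30                          # nominal rate constant in the reference model
true_k = 0.42                               # the plant actually consumes substrate faster
measured = simulate(true_k, 100.0, dt, n)   # 'process measurements'

# On-line correction: fit a multiplier alpha so that dS ~ -alpha*k_ref*S*dt
# matches the most recent measured decrements (ordinary least squares).
dS = np.diff(measured)
pred = -reference_k * measured[:-1] * dt
alpha = float(pred @ dS / (pred @ pred))
corrected_k = alpha * reference_k           # recovers the plant's actual rate
```

In a real implementation the window of "most recent" measurements would slide forward and alpha would be re-estimated at each update, so the corrected model tracks the plant as conditions change.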
Thermodynamic Spectrum of Solar Flares Based on SDO/EVE Observations: Techniques and First Results
NASA Technical Reports Server (NTRS)
Wang, Yuming; Zhou, Zhenjun; Zhang, Jie; Liu, Kai; Liu, Rui; Shen, Chenglong; Chamberlin, Phillip C.
2016-01-01
The Solar Dynamics Observatory (SDO)/EUV Variability Experiment (EVE) provides rich information on the thermodynamic processes of solar activities, particularly on solar flares. Here, we develop a method to construct thermodynamic spectrum (TDS) charts based on the EVE spectral lines. This tool could potentially be useful for extreme ultraviolet (EUV) astronomy to learn about the eruptive activities on distant astronomical objects. Through several cases, we illustrate what we can learn from the TDS charts. Furthermore, we apply the TDS method to 74 flares equal to or greater than the M5.0 class, and reach the following statistical results. First, EUV peaks are always behind the soft X-ray (SXR) peaks and stronger flares tend to have faster cooling rates. There is a power-law correlation between the peak delay times and the cooling rates, suggesting a coherent cooling process of flares from SXR to EUV emissions. Second, there are two distinct temperature drift patterns, called Type I and Type II. For Type I flares, the enhanced emission drifts from high to low temperature like a quadrilateral, whereas for Type II flares the drift pattern looks like a triangle. Statistical analysis suggests that Type II flares are more impulsive than Type I flares. Third, for late-phase flares, the peak intensity ratio of the late phase to the main phase is roughly correlated with the flare class, and the flares with a strong late phase are all confined. We believe that the re-deposition of the energy carried by a flux rope, which unsuccessfully erupts out, into thermal emissions is responsible for the strong late phase found in a confined flare. Furthermore, we show the signatures of the flare thermodynamic process in the chromosphere and transition region in the TDS charts. These results provide new clues to advance our understanding of the thermodynamic processes of solar flares and associated solar eruptions, e.g., coronal mass ejections.
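The power-law correlation between peak delay times and cooling rates is the kind of relation typically quantified by a straight-line fit in log-log space; a minimal sketch on synthetic data follows (the exponent and prefactor below are invented, not the measured ones).

```python
import numpy as np

# A power law y = a * x**b is linear in log-log coordinates:
# log y = log a + b * log x, so ordinary least squares recovers (a, b).
rng = np.random.default_rng(2)
a_true, b_true = 3.0, -0.7
x = rng.uniform(1, 100, size=500)                           # e.g. peak delay times (arbitrary units)
y = a_true * x ** b_true * rng.lognormal(0, 0.1, size=500)  # multiplicative scatter
b_fit, log_a_fit = np.polyfit(np.log(x), np.log(y), 1)
a_fit = np.exp(log_a_fit)                                   # fitted prefactor and exponent
```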
THERMODYNAMIC SPECTRUM OF SOLAR FLARES BASED ON SDO/EVE OBSERVATIONS: TECHNIQUES AND FIRST RESULTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yuming; Zhou, Zhenjun; Liu, Kai
2016-03-15
The Solar Dynamics Observatory (SDO)/EUV Variability Experiment (EVE) provides rich information on the thermodynamic processes of solar activities, particularly on solar flares. Here, we develop a method to construct thermodynamic spectrum (TDS) charts based on the EVE spectral lines. This tool could potentially be useful for extreme ultraviolet (EUV) astronomy to learn about the eruptive activities on distant astronomical objects. Through several cases, we illustrate what we can learn from the TDS charts. Furthermore, we apply the TDS method to 74 flares equal to or greater than the M5.0 class, and reach the following statistical results. First, EUV peaks are always behind the soft X-ray (SXR) peaks and stronger flares tend to have faster cooling rates. There is a power-law correlation between the peak delay times and the cooling rates, suggesting a coherent cooling process of flares from SXR to EUV emissions. Second, there are two distinct temperature drift patterns, called Type I and Type II. For Type I flares, the enhanced emission drifts from high to low temperature like a quadrilateral, whereas for Type II flares the drift pattern looks like a triangle. Statistical analysis suggests that Type II flares are more impulsive than Type I flares. Third, for late-phase flares, the peak intensity ratio of the late phase to the main phase is roughly correlated with the flare class, and the flares with a strong late phase are all confined. We believe that the re-deposition of the energy carried by a flux rope, which unsuccessfully erupts out, into thermal emissions is responsible for the strong late phase found in a confined flare. Furthermore, we show the signatures of the flare thermodynamic process in the chromosphere and transition region in the TDS charts. These results provide new clues to advance our understanding of the thermodynamic processes of solar flares and associated solar eruptions, e.g., coronal mass ejections.
CADDIS Volume 4. Data Analysis: Download Software
Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.
NASA Astrophysics Data System (ADS)
Csengeri, T.; Leurini, S.; Wyrowski, F.; Urquhart, J. S.; Menten, K. M.; Walmsley, M.; Bontemps, S.; Wienen, M.; Beuther, H.; Motte, F.; Nguyen-Luong, Q.; Schilke, P.; Schuller, F.; Zavagno, A.; Sanna, C.
2016-02-01
Context. The processes leading to the birth of high-mass stars are poorly understood. The key first step to reveal their formation processes is characterising the clumps and cores from which they form. Aims: We define a representative sample of massive clumps in different evolutionary stages selected from the APEX Telescope Large Area Survey of the Galaxy (ATLASGAL), from which we aim to establish a census of molecular tracers of their evolution. As a first step, we study the shock tracer, SiO, mainly associated with shocks from jets probing accretion processes. In low-mass young stellar objects (YSOs), outflow and jet activity decreases with time during the star formation processes. Recently, a similar scenario was suggested for massive clumps based on SiO observations. Here we analyse observations of the SiO (2-1) and (5-4) lines in a statistically significant sample to constrain the change of SiO abundance and the excitation conditions as a function of evolutionary stage of massive star-forming clumps. Methods: We performed an unbiased spectral line survey covering the 3-mm atmospheric window between 84-117 GHz with the IRAM 30 m telescope of a sample of 430 sources of the ATLASGAL survey, covering various evolutionary stages of massive clumps. A smaller sample of 128 clumps has been observed in the SiO (5-4) transition with the APEX telescope to complement the (2-1) line and probe the excitation conditions of the emitting gas. We derived detection rates to assess the star formation activity of the sample, and we estimated the column density and abundance using both an LTE approximation and non-LTE calculations for a smaller subsample, where both transitions have been observed. Results: We characterise the physical properties of the selected sources, whose number greatly exceeds that of the largest samples studied so far, and show that they are representative of different evolutionary stages. 
We report a high detection rate of >75% of the SiO (2-1) line and a >90% detection rate from the dedicated follow-ups in the (5-4) transition. Up to 25% of the infrared-quiet clumps exhibit high-velocity line wings, suggesting that molecular tracers are more efficient tools to determine the level of star formation activity than infrared colour criteria. We also find infrared-quiet clumps that exhibit only a low-velocity component (FWHM ~ 5-6 km s-1) SiO emission in the (2-1) line. In the current picture, where this is attributed to low-velocity shocks from cloud-cloud collisions, this can be used to pinpoint the youngest, thus, likely prestellar massive structures. Using the optically thin isotopologue (29SiO), we estimate that the (2-1) line is optically thin towards most of the sample. Furthermore, based on the line ratio of the (5-4) to the (2-1) line, our study reveals a trend of changing excitation conditions that lead to brighter emission in the (5-4) line towards more evolved sources. Our models show that a proper treatment of non-LTE effects and beam dilution is necessary to constrain trends in the SiO column density and abundance. Conclusions: We conclude that the SiO (2-1) line with broad line profiles and high detection rates is a powerful probe of star formation activity in the deeply embedded phase of the evolution of massive clumps. The ubiquitous detection of SiO in all evolutionary stages suggests a continuous star formation process in massive clumps. Our analysis delivers a more robust estimate of SiO column density and abundance than previous studies and questions the decrease of jet activity in massive clumps as a function of age. The observed increase of excitation conditions towards the more evolved clumps suggests a higher pressure in the shocked gas towards more evolved or more massive clumps in our sample. 
Full Tables 4, 6, 7 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/586/A149
NASA Astrophysics Data System (ADS)
Schultz, David R.; Nash, Jeffrey K.
1996-07-01
The need for atomic data is one which continues to expand in a wide variety of applications including fusion energy, astrophysics, laser-produced plasma research, and plasma processing. Modern computer database and communications technology enables this data to be placed on-line and obtained by users over the INTERNET. Presented here is a summary of the observations and conclusions regarding such on-line atomic data access derived from a forum held at the Tenth APS Topical Conference on Atomic Processes in Plasmas.
NASA Astrophysics Data System (ADS)
Liu, Eric; Ko, Akiteru; O'Meara, David; Mohanty, Nihar; Franke, Elliott; Pillai, Karthik; Biolsi, Peter
2017-05-01
Dimension shrinkage has been a major driving force in the development of integrated circuit processing over a number of decades. The Self-Aligned Quadruple Patterning (SAQP) technique is widely adopted for sub-10nm nodes in order to achieve the desired feature dimensions. This technique provides theoretical feasibility of multiple pitch-halving from 193nm immersion lithography by using various pattern transferring steps. The major concept of this approach is to create a spacer-defined self-aligned pattern using a single lithography print. By repeating the process steps, pitch division by two, four, or eight can theoretically be achieved. In these small architectures, line roughness control becomes extremely important since it may contribute a significant portion of process and device performance variations. In addition, the complexity of the SAQP processing flow makes roughness improvement indirect and ineffective. It is necessary to discover a new approach in order to improve the roughness in the current SAQP technique. In this presentation, we demonstrate a novel method to improve line roughness performance in a 30nm pitch SAQP flow. We discover that the line roughness performance is strongly related to stress management. By selecting different stress levels of the film deposited onto the substrate, we can manipulate the roughness performance in line and space patterns. In addition, the impact of the curvature change caused by applied film stress on SAQP line roughness performance is also studied. No significant correlation is found between wafer curvature and line roughness performance. We discuss in detail the physical performance of each processing step in terms of critical dimension (CD), critical dimension uniformity (CDU), line width roughness (LWR), and line edge roughness (LER). Finally, we summarize the process needed to reach the full-wafer performance targets of LWR/LER of 1.07nm/1.13nm on a 30nm pitch line and space pattern.
2017-01-01
Statistical approaches to emergent knowledge have tended to focus on the process by which experience of individual episodes accumulates into generalizable experience across episodes. However, there is a seemingly opposite, but equally critical, process that such experience affords: the process by which, from a space of types (e.g. onions—a semantic class that develops through exposure to individual episodes involving individual onions), we can perceive or create, on-the-fly, a specific token (a specific onion, perhaps one that is chopped) in the absence of any prior perceptual experience with that specific token. This article reviews a selection of statistical learning studies that lead to the speculation that this process—the generation, on the basis of semantic memory, of a novel episodic representation—is itself an instance of a statistical, in fact associative, process. The article concludes that the same processes that enable statistical abstraction across individual episodes to form semantic memories also enable the generation, from those semantic memories, of representations that correspond to individual tokens, and of novel episodic facts about those tokens. Statistical learning is a window onto these deeper processes that underpin cognition. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872378
Hill, Ryley; Masui, Kiyoshi W; Scott, Douglas
2018-05-01
Cosmic background (CB) radiation, encompassing the sum of emission from all sources outside our own Milky Way galaxy across the entire electromagnetic spectrum, is a fundamental phenomenon in observational cosmology. Many experiments have been conceived to measure it (or its constituents) since the extragalactic Universe was first discovered; in addition to estimating the bulk (cosmic monopole) spectrum, directional variations have also been detected over a wide range of wavelengths. Here we gather the most recent of these measurements and discuss the current status of our understanding of the CB from radio to γ-ray energies. Using available data in the literature, we piece together the sky-averaged intensity spectrum and discuss the emission processes responsible for what is observed. We examine the effect of perturbations to the continuum spectrum from atomic and molecular line processes and comment on the detectability of these signals. We also discuss how one could, in principle, obtain a complete census of the CB by measuring the full spectrum of each spherical harmonic expansion coefficient. This set of spectra of multipole moments effectively encodes the entire statistical history of nuclear, atomic, and molecular processes in the Universe.
NASA Astrophysics Data System (ADS)
Hill, Ryley; Masui, Kiyoshi W.; Scott, Douglas
2018-05-01
The cosmic background (CB) radiation, encompassing the sum of emission from all sources outside our own Milky Way galaxy across the entire electromagnetic spectrum, is a fundamental phenomenon in observational cosmology. Many experiments have been conceived to measure it (or its constituents) since the extragalactic Universe was first discovered; in addition to estimating the bulk (cosmic monopole) spectrum, directional variations have also been detected over a wide range of wavelengths. Here we gather the most recent of these measurements and discuss the current status of our understanding of the CB from radio to γ-ray energies. Using available data in the literature, we piece together the sky-averaged intensity spectrum and discuss the emission processes responsible for what is observed. We examine the effect of perturbations to the continuum spectrum from atomic and molecular line processes and comment on the detectability of these signals. We also discuss how one could, in principle, obtain a complete census of the CB by measuring the full spectrum of each spherical harmonic expansion coefficient. This set of spectra of multipole moments effectively encodes the entire statistical history of nuclear, atomic, and molecular processes in the Universe.
Multispectral and geomorphic studies of processed Voyager 2 images of Europa
NASA Technical Reports Server (NTRS)
Meier, T. A.
1984-01-01
High resolution images of Europa taken by the Voyager 2 spacecraft were used to study a portion of Europa's dark lineations and the major white line feature Agenor Linea. Initial image processing of images 1195J2-001 (violet filter), 1198J2-001 (blue filter), 1201J2-001 (orange filter), and 1204J2-001 (ultraviolet filter) was performed at the U.S.G.S. Branch of Astrogeology in Flagstaff, Arizona. Processing was completed through the stages of image registration and color ratio image construction. Pixel printouts were used in a new technique of linear feature profiling to compensate for image misregistration through the mapping of features on the printouts. In all, 193 dark lineation segments were mapped and profiled. The more accurate multispectral data derived by this method was plotted using a new application of the ternary diagram, with orange, blue, and violet relative spectral reflectances serving as end members. Statistical techniques were then applied to the ternary diagram plots. The image products generated at LPI were used mainly to cross-check and verify the results of the ternary diagram analysis.
Statistical mechanics of image processing by digital halftoning
NASA Astrophysics Data System (ADS)
Inoue, Jun-Ichi; Norimatsu, Wataru; Saika, Yohei; Okada, Masato
2009-03-01
We consider the problem of digital halftoning (DH). DH is an image processing technique representing each grayscale in an image in terms of black and white dots, and it is achieved by making use of a threshold dither mask: each pixel is determined as black if its grayscale value is greater than or equal to the mask value, and as white otherwise. To determine the mask for a given grayscale image, we assume that human eyes recognize the BW dots as the corresponding grayscale through linear filters. Then, the Hamiltonian is constructed as a distance between the original and recognized images, which is written in terms of the mask. Finding the ground state of the Hamiltonian via deterministic annealing, we obtain the optimal mask and the BW dots simultaneously. From the spectrum analysis, we find that the BW dots are desirable from the viewpoint of the modulation properties of human vision. We also show that the lower bound of the mean square error for the inverse process of DH is minimized on the Nishimori line, which is well known in the research field of spin glasses.
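The thresholding step described above is easy to sketch. The 4x4 Bayer matrix below is a standard ordered-dither mask, used here as a stand-in for the optimized mask the paper obtains by deterministic annealing.

```python
import numpy as np

# Standard 4x4 Bayer ordered-dither mask, values in [0, 1).
bayer4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def halftone(coverage, mask=bayer4):
    """coverage: 2-D array of ink coverage in [0, 1] (1 = solid black).
    A pixel gets a black dot (1) if its coverage is greater than or equal
    to the tiled mask value, and stays white (0) otherwise, matching the
    thresholding rule in the abstract."""
    h, w = coverage.shape
    mh, mw = mask.shape
    tiled = np.tile(mask, (h // mh + 1, w // mw + 1))[:h, :w]
    return (coverage >= tiled).astype(int)

# A horizontal coverage gradient: locally, the dot density tracks the coverage.
gradient = np.linspace(0, 1, 64).reshape(1, -1).repeat(64, axis=0)
bw = halftone(gradient)
```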
NASA Astrophysics Data System (ADS)
Louarn, Philippe; Andre, Nicolas; Jackman, Caitriona M.; Kasahara, Satoshi; Kronberg, Elena A.; Vogt, Marissa F.
2015-04-01
We review in situ observations made in Jupiter's and Saturn's magnetospheres that illustrate the possible roles of magnetic reconnection in rapidly-rotating magnetospheres. In the Earth's solar wind-driven magnetosphere, the magnetospheric convection is classically described as a cycle of dayside opening and tail closing reconnection (the Dungey cycle). For the rapidly-rotating Jovian and Kronian magnetospheres, heavily populated by internal plasma sources, the classical concept (the Vasyliunas cycle) is that magnetic reconnection plays a key role in the final stage of the radial plasma transport across the disk. By cutting and closing flux tubes that have been elongated by the rotational stress, the reconnection process would lead to the formation of plasmoids that propagate down the tail, contributing to the final evacuation of the internally produced plasma and allowing the return of the magnetic flux toward the planet. This process has been studied by inspecting possible 'local' signatures of the reconnection, such as magnetic field reversals, plasma flow anisotropies, and energetic particle bursts, and more global consequences on the magnetospheric activity. The investigations made at Jupiter support the concept of an 'average' X-line, extended in the dawn/dusk direction and located at 90-120 Jovian radii (RJ) on the night side. The existence of a similar average X-line has not yet been established at Saturn, perhaps due to a lack of statistics. Both at Jupiter and Saturn, the reconfiguration signatures are consistent with magnetospheric dipolarizations and the formation of plasmoids and flux ropes. In several cases, the reconfigurations also appear to be closely associated with large-scale activations of the magnetosphere, as seen from the radio and auroral emissions. 
Nevertheless, the statistical study also suggests that the reconnection events and the associated plasmoids are not frequent enough to explain a plasma evacuation that matches the mass input rate from the satellites and the rings. Different forms of transport should thus act together to evacuate the plasma, though this still needs to be established. Investigations of reconnection signatures at the magnetopause and of other processes such as the Kelvin-Helmholtz instability are also reviewed. A provisional conclusion would be that dayside reconnection is unlikely to be a crucial process in the overall dynamics. On the small scales, the detailed analysis of one reconnection event at Jupiter shows that the local plasma signatures (field-aligned flows, energetic particle bursts…) are very similar to those observed at Earth, likely with a similar scaling with respect to characteristic kinetic lengths (Larmor radius and inertial scales).
ERIC Educational Resources Information Center
Li, Xiao-qing; Ren, Gui-qin
2012-01-01
An event-related brain potentials (ERP) experiment was carried out to investigate how and when accentuation influences temporally selective attention and subsequent semantic processing during on-line spoken language comprehension, and how the effect of accentuation on attention allocation and semantic processing changed with the degree of…
NASA Technical Reports Server (NTRS)
Poole, L. R.; Huckins, E. K., III
1972-01-01
A general theory on mathematical modeling of elastic parachute suspension lines during the unfurling process was developed. Massless-spring modeling of suspension-line elasticity was evaluated in detail. For this simple model, equations which govern the motion were developed and numerically integrated. The results were compared with flight test data. In most regions, agreement was satisfactory. However, poor agreement was obtained during periods of rapid fluctuations in line tension.
Research on On-Line Modeling of Fed-Batch Fermentation Process Based on v-SVR
NASA Astrophysics Data System (ADS)
Ma, Yongjun
The fermentation process is very complex and non-linear, and many parameters are not easy to measure directly on line, so soft-sensor modeling is a good solution. This paper introduces v-support vector regression (v-SVR) for soft-sensor modeling of the fed-batch fermentation process. v-SVR is a novel type of learning machine that can control the fitting accuracy and prediction error by adjusting the parameter v. An on-line training algorithm is discussed in detail to reduce the training complexity of v-SVR. The experimental results show that v-SVR has a low error rate and good generalization when v is chosen appropriately.
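As a minimal sketch of the soft-sensor idea, the snippet below fits scikit-learn's NuSVR (used here as a stand-in for the paper's v-SVR; the on-line training algorithm itself is not reproduced) to synthetic data: three easy-to-measure inputs predicting one hard-to-measure output.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import NuSVR

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(300, 3))   # hypothetical on-line measurements
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.05 * rng.normal(size=300)

# nu upper-bounds the fraction of margin errors and lower-bounds the fraction
# of support vectors, which is how v-SVR trades accuracy against sparsity.
model = make_pipeline(StandardScaler(), NuSVR(nu=0.5, C=10.0, kernel="rbf"))
model.fit(X[:200], y[:200])
r2 = model.score(X[200:], y[200:])      # held-out coefficient of determination
```

In a soft-sensor setting the model would be refit (or incrementally updated) as new labeled samples arrive from laboratory analyses.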
NASA Astrophysics Data System (ADS)
Steiner, Matthias
A statistically validated injection molding technique for the series production of ceramic components was developed for the construction of engines and gas turbines. The flow behavior of silicon injection-molding materials was characterized and improved. Hot-isostatic-pressing reaction-bonded silicon nitride (HIPRBSN) was developed. A nondestructive component evaluation method was developed. An injection molding line for the HIPRBSN engine components (precombustion chamber, flame spreader, and valve guide) was developed. This line allows the production of small series for engine tests.
The effect of welding line heat-affected-zone on the formability of tube hydroforming process
NASA Astrophysics Data System (ADS)
ChiuHuang, Cheng-Kai; Hsu, Cheng-En; Lee, Ping-Kun
2016-08-01
Tube hydroforming has been used as a lightweight design approach to reduce CO2 emissions in the automotive industry. For high strength steel tubes, the strength and quality of the welding line are very important for a successful tube hydroforming process. This paper investigates the effect of the welding line's strength and the width of the heat-affected zone on tube wall thinning during the hydroforming process. The simulation results show that both factors play an important role in the thickness distribution during tube expansion.
Digitalizing historical high resolution water level data: Challenges and opportunities
NASA Astrophysics Data System (ADS)
Holinde, Lars; Hein, Hartmut; Barjenbruch, Ulrich
2017-04-01
Historical tide-gauge data offer the opportunity to determine variations in key characteristics of water level data and to analyse past extreme events (storm surges). This information is important for calculating future trends and scenarios, but there are challenges involved due to the extensive effort needed to digitalize gauge sheets and to quality-control the resulting historical data. Under these conditions, two main sources of inaccuracies in historical time series can be identified. First are several challenges arising from the digitalization of the historical data, e.g. the general quality of the sheets, multiple crossing lines of the observed water levels, and additional comments on the sheet describing problems or giving extra information about the measurements. Second are problems during the measurements themselves. These can include incorrect positioning of the sheets, malfunctions of the tide gauge, and maintenance work. Errors resulting from these problems can be, e.g., flat lines, discontinuities, and outliers. In particular, the characterization of outliers has to be conducted carefully to distinguish between real outliers and genuine extreme events. Methods for the quality control process involve the use of statistics, machine learning, and neural networks. These will be described and applied to three time series from tide gauge stations on the coast of Lower Saxony, Germany. Resulting difficulties and outcomes of the quality control process will be presented and explained. Furthermore, we will present a first glance at analyses of these time series.
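Two of the screening checks mentioned above (flat lines from a stuck gauge, and outliers flagged against a robust local baseline) can be sketched as follows; the synthetic series, thresholds, and window lengths are illustrative, not the operational ones.

```python
import numpy as np

# Synthetic water-level record: a tide-like oscillation plus noise, with
# a stuck-gauge flat line and a single-point outlier injected.
rng = np.random.default_rng(4)
t = np.arange(2000)
level = np.sin(2 * np.pi * t / 745) + 0.05 * rng.normal(size=t.size)
level[800:860] = level[800]    # stuck gauge: 60 repeated readings
level[1500] += 2.5             # single-point outlier

# Check 1: flat lines show up as runs of (near-)zero first differences.
flat = np.abs(np.diff(level)) < 1e-9

# Check 2: outliers deviate strongly from a rolling median, scaled by the MAD.
win = 25
med = np.array([np.median(level[max(0, i - win):i + win]) for i in range(level.size)])
resid = level - med
mad = np.median(np.abs(resid - np.median(resid)))
outlier = np.abs(resid) > 6 * 1.4826 * mad   # 1.4826 rescales MAD to a std-dev
```

A genuine storm surge would also exceed such a threshold, which is why the abstract stresses that flagged points must still be distinguished from real extreme events, e.g. by cross-checking neighbouring gauges.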
Statistical EMC: A new dimension electromagnetic compatibility of digital electronic systems
NASA Astrophysics Data System (ADS)
Tsaliovich, Anatoly
Electromagnetic compatibility compliance test results are used as a database for addressing three classes of electromagnetic compatibility (EMC) related problems: statistical EMC profiles of digital electronic systems, the effect of equipment-under-test (EUT) parameters on the electromagnetic emission characteristics, and EMC measurement specifics. Open area test site (OATS) and absorber-lined shielded room (AR) results are compared for the highest radiated emissions of the equipment under test. The suggested statistical evaluation methodology can be utilized to correlate the results of different EMC test techniques, characterize the EMC performance of electronic systems and components, and develop recommendations for optimal EMC design of electronic products.
FY2017 Report on NISC Measurements and Detector Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Madison Theresa; Meierbachtol, Krista Cruse; Jordan, Tyler Alexander
FY17 work focused on automation of both the measurement analysis and the comparison with simulations. The experimental apparatus was relocated, and weeks of continuous measurements of the spontaneous fission source 252Cf were performed. Programs were developed to automate the conversion of measurements into ROOT data framework files with a simple terminal input. The complete analysis of the measurements (which includes energy calibration and the identification of correlated counts) can now be completed with a documented process that likewise involves a single execution line. Finally, the hurdle of slow MCNP simulations resulting in low simulation statistics has been overcome with the generation of multi-run suites which make use of the high-performance computing resources at LANL. Preliminary comparisons of measurements and simulations have been performed and will be the focus of FY18 work.
Influence of damage and basal friction on the grounding line dynamics
NASA Astrophysics Data System (ADS)
Brondex, Julien; Gagliardini, Olivier; Gillet-Chaulet, Fabien; Durand, Gael
2016-04-01
Understanding grounding line dynamics is a major issue in the prediction of future sea level rise due to ice released from polar ice sheets into the ocean. These dynamics are complex and significantly affected by several physical processes not always adequately accounted for in current ice flow models. Among those processes, our study focuses on ice damage and evolving basal friction conditions. Softening of the ice due to damage processes is known to have a strong impact on its rheology by reducing its viscosity and therefore promoting flow acceleration. Damage develops where shear stresses are high enough, which is usually the case at shear margins and in the vicinity of pinning points in contact with ice shelves. Those areas are known to have a buttressing effect on ice shelves, contributing to stabilizing the grounding line. We aim at evaluating the extent to which this stabilizing effect is hampered by damage processes. Several friction laws have been proposed by various authors to model the contact between grounded ice and bedrock. Among them, Coulomb-type friction laws make it possible to account for the reduced friction associated with low effective pressure (the ice pressure minus the water pressure). Combining such a friction law with a parametrization of the effective pressure that accounts for the fact that the area upstream of the grounding line is connected to the ocean is expected to have a significant impact on grounding line dynamics. Using the finite-element code Elmer/Ice, within which the Coulomb-type friction law, the effective pressure parametrization, and the damage model have all been implemented, the goal of this study is to investigate the sensitivity of the grounding line dynamics to damage and to evolving basal friction. The relative importance of those two processes for the grounding line dynamics is addressed as well.
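The effective pressure invoked above (ice pressure minus water pressure) is commonly parametrized by assuming full hydrological connection of the bed to the ocean; the sketch below illustrates that textbook form, with illustrative constants and a hypothetical function name, and is not the parametrization implemented in Elmer/Ice.

```python
RHO_ICE, RHO_SEA, G = 917.0, 1028.0, 9.81  # kg/m^3, kg/m^3, m/s^2 (illustrative values)

def effective_pressure(thickness, bed_elevation):
    """Effective pressure N assuming full ocean connectivity:
    ice overburden minus sea-water pressure at a submarine bed,
    clamped at zero where the ice reaches flotation."""
    p_ice = RHO_ICE * G * thickness
    p_water = RHO_SEA * G * max(0.0, -bed_elevation)  # only below sea level
    return max(0.0, p_ice - p_water)
```

A Coulomb-type basal shear stress would then scale with this N, so friction vanishes smoothly as the grounding line (flotation) is approached.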
On-line IR analyzer system to monitor cephamycin C loading on ion-exchange resin
NASA Astrophysics Data System (ADS)
Shank, Sheldon; Russ, Warren; Gravatt, Douglas; Lee, Wesley; Donahue, Steven M.
1992-08-01
An on-line infrared analyzer is being developed for monitoring cephamycin C loading on ion exchange resin. Accurate measurement of product loading offers productivity improvements with direct savings from product loss avoidance, minimized raw material cost, and reduced off-line laboratory testing. Ultrafiltered fermentation broth is fed onto ion exchange columns under conditions which adsorb the product, cephamycin C, to the resin while allowing impurities to pass unretained. Product loading is stopped when the on-line analyzer determines that resin capacity for adsorbing product is nearly exhausted. Infrared spectroscopy has been shown capable of quantifying cephamycin C in the process matrix at concentrations that support process control decisions. Process-to-analyzer interface challenges have been resolved, including sample conditioning requirements. Analyzer requirements have been defined. The sample conditioning station is under design.
Statistical and observational research of solar flare for total spectra and geometrical features
NASA Astrophysics Data System (ADS)
Nishimoto, S.; Watanabe, K.; Imada, S.; Kawate, T.; Lee, K. S.
2017-12-01
Impulsive energy release phenomena such as solar flares sometimes affect the solar-terrestrial environment. Usually, we use the soft X-ray flux (GOES class) as the index of flare scale. However, the magnitude of the effect on the solar-terrestrial environment is not proportional to that scale. To identify the relationship between solar flare phenomena and their influence on the solar-terrestrial environment, we need to understand the full spectrum of solar flares. The Flare Irradiance Spectral Model (FISM) (Chamberlin et al., 2006, 2007, 2008) can estimate solar flare spectra with high wavelength resolution. However, this model cannot express the time evolution of the emitted plasma during a solar flare, and it has low accuracy at the short wavelengths that strongly affect and/or control the total flare spectrum. For the purpose of obtaining the time evolution of total solar flare spectra, we are performing a statistical analysis of the electromagnetic data of solar flares. In this study, we select solar flare events larger than M-class from the Hinode flare catalogue (Watanabe et al., 2012). First, we focus on the EUV emission observed by SDO/EVE. We examined the intensities and time evolutions of five EUV lines for 55 flare events. As a result, we found a positive correlation between the soft X-ray flux and the EUV peak flux for all EUV lines. Moreover, we found that hot lines peak earlier than cool lines in the EUV light curves. We also examined the hard X-ray data obtained by RHESSI. When we analyzed 163 events, we found a good correlation between the hard X-ray intensity and the soft X-ray flux. Because the geometrical features of solar flares appear to affect those time evolutions, we also looked into flare ribbons observed by SDO/AIA. We examined 21 flare events and found a positive correlation between the GOES duration and the ribbon length.
We also found a positive correlation between the ribbon length and the ribbon distance; however, there was no remarkable correlation with the ribbon width. To understand the physical process of flare emission, we performed a numerical simulation (Imada et al., 2015) and compared it with the observational flare model. We also discuss the flare numerical model that can be fitted to the observational flare model.
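The positive correlations reported above (e.g. soft X-ray flux versus EUV peak flux, GOES duration versus ribbon length) are presumably linear Pearson-type correlations; a minimal sketch of the coefficient, with illustrative data, is:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

Values near +1 correspond to the positive correlations quoted, while values near 0 correspond to the absence of correlation found for the ribbon width.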
NASA Astrophysics Data System (ADS)
Annila, Arto
2016-02-01
The principle of increasing entropy is derived from the statistical physics of open systems, assuming that quanta of action, as undividable basic building blocks, embody everything. According to this tenet, all systems evolve from one state to another either by acquiring quanta from their surroundings or by discarding quanta to the surroundings in order to attain energetic balance in least time. These natural processes result in ubiquitous scale-free patterns: skewed distributions that accumulate in a sigmoid manner and hence span log-log scales mostly as straight lines. Moreover, the equation for least-time motions reveals that evolution is by nature a non-deterministic process. Although the insight obtained into thermodynamics from the notion of quanta in motion yields nothing new, it accentuates that contemporary comprehension is impaired when modeling evolution as a computable process by imposing conservation of energy and thereby ignoring that quanta of action are the carriers of energy from the system to its surroundings.
NASA Technical Reports Server (NTRS)
Ong, K. M.; Macdoran, P. F.; Thomas, J. B.; Fliegel, H. F.; Skjerve, L. J.; Spitzmesser, D. J.; Batelaan, P. D.; Paine, S. R.; Newsted, M. G.
1976-01-01
A precision geodetic measurement system (Aries, for Astronomical Radio Interferometric Earth Surveying) based on the technique of very long base line interferometry has been designed and implemented through the use of a 9-m transportable antenna and the NASA 64-m antenna of the Deep Space Communications Complex at Goldstone, California. A series of experiments designed to demonstrate the inherent accuracy of a transportable interferometer was performed on a 307-m base line during the period from December 1973 to June 1974. This short base line was chosen in order to obtain a comparison with a conventional survey with a few-centimeter accuracy and to minimize Aries errors due to transmission media effects, source locations, and earth orientation parameters. The base-line vector derived from a weighted average of the measurements, representing approximately 24 h of data, possessed a formal uncertainty of about 3 cm in all components. This average interferometry base-line vector was in good agreement with the conventional survey vector within the statistical range allowed by the combined uncertainties (3-4 cm) of the two techniques.
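The "weighted average of the measurements" with a formal uncertainty quoted above can be sketched as a standard inverse-variance weighted mean (a generic illustration of the statistic, not the Aries reduction code):

```python
def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its formal 1-sigma uncertainty."""
    weights = [1.0 / s ** 2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    return mean, wsum ** -0.5  # formal uncertainty shrinks as data accumulate
```

Averaging many interferometric determinations of one base-line component in this way is what drives the formal uncertainty down to the few-centimeter level.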
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and the stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles makes the achievement of six sigma-capable processes possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment allows selection of critical quality attributes from among the quality control parameters.
Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are the candidates for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
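The control-chart and capability steps described above can be sketched generically (a Shewhart individuals chart and a Cpk index; the chart constants and specification limits are textbook values, not taken from the study):

```python
import statistics

def control_limits(samples):
    """Shewhart individuals chart: centre line and mean +/- 3 sigma limits,
    with sigma estimated from the average moving range (d2 = 1.128 for n = 2)."""
    mr = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = statistics.mean(mr) / 1.128
    centre = statistics.mean(samples)
    return centre - 3 * sigma, centre, centre + 3 * sigma

def cpk(samples, lsl, usl):
    """Process capability index against lower/upper specification limits."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min(usl - mu, mu - lsl) / (3 * sigma)
```

Points inside the control limits indicate statistical stability; a Cpk above about 1.33 is the usual rule-of-thumb for a capable process, and six sigma capability corresponds to still higher values.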
Detection of Interstellar Urea
NASA Astrophysics Data System (ADS)
Kuo, Hsin-Lun; Remijan, Anthony J.; Snyder, Lewis E.; Looney, Leslie W.; Friedel, Douglas N.; Lovas, Francis J.; McCall, Benjamin J.; Hollis, Jan M.
2010-11-01
Urea, a molecule discovered in human urine by H. M. Rouelle in 1773, has a significant role in prebiotic chemistry. Previous BIMA observations have suggested that interstellar urea [(NH2)2CO] is a compact hot core molecule, like other large molecules (e.g. methyl formate and acetic acid). We have conducted an extensive search for urea toward the high-mass hot molecular core Sgr B2(N-LMH) using BIMA, CARMA, and the IRAM 30 m. Because the spectral lines of heavy molecules like urea tend to be weak and hot cores display lines from a wide range of molecules, it is necessary to detect a number of urea lines and apply sophisticated statistical tests before having confidence in an identification. The 1 mm resolution of CARMA enables favorable coupling of the source size and synthesized beam size, which was found to be essential for the detection of weak signals. We have detected a total of 65 spectral lines (32 molecular transitions and 33 unidentified transitions), most of which are narrower than in the SEST survey (Nummelin et al. 1998) owing to the small synthesized beam (2.5" x 2") of CARMA, which significantly resolves out contamination by extended emission and reveals the eight weak urea lines that were previously blended with nearby transitions. Our analysis indicates that these lines are likely to be urea, since the observed line frequencies coincide with a set of overlapping connecting urea lines and the observed line intensities are consistent with the expected line strengths of urea. In addition, we have developed a new statistical approach to examine the spatial correlation between the observed lines by applying the Student's t test to the high-resolution channel maps obtained from CARMA. The t test shows consistent spatial distributions for all eight candidate lines, suggesting a common molecular origin, urea. 
Our t test method could have a broad impact on the next generation of arrays, such as ALMA, because the new arrays will require a method to systematically determine the credibility of detections of weaker signals from new and larger interstellar molecules.
Research on the Influence of Perceived Risk in Consumer On-line Purchasing Decision
NASA Astrophysics Data System (ADS)
Hong, Zhao; Yi, Li
Perceived risk is an important factor affecting consumers' on-line purchasing decisions. Through perceived-risk theory, consumers can identify which steps of the whole shopping process carry higher risk and learn how to guard against them; this also strengthens consumer confidence by lowering the perceived risk. The essay therefore has practical significance for the further expansion of electronic commerce. First, questionnaire information was collected, organized, and analyzed to obtain the primary data. Then, from these data and our own analysis, we attempt to identify the influence of perceived risk on each stage of the purchasing decision during the consumer's on-line shopping process. The paper complements existing perceived-risk theories. The results show that the main perceived risks felt by consumers during on-line shopping are, in order: financial risk, performance risk, and service risk.
The impact of profitability of hospital admissions on mortality.
Lindrooth, Richard C; Konetzka, R Tamara; Navathe, Amol S; Zhu, Jingsan; Chen, Wei; Volpp, Kevin
2013-04-01
Fiscal constraints faced by Medicare are leading to policies designed to reduce expenditures. Evidence of the effect of reduced reimbursement on the mortality of Medicare patients discharged from all major hospital service lines is limited. We modeled risk-adjusted 30-day mortality of patients discharged from 21 hospital service lines as a function of service line profitability, service line time trends, and hospital service line and year-fixed effects. We simulated the effect of alternative revenue-neutral reimbursement policies on mortality. Our sample included all Medicare discharges from PPS-eligible hospitals (1997, 2001, and 2005). The results reveal a statistically significant inverse relationship between changes in profitability and mortality. A $0.19 average reduction in profit per $1.00 of costs led to a 0.010-0.020 percentage-point increase in mortality rates (p < .001). Mortality in newly unprofitable service lines is significantly more sensitive to reduced payment generosity than in service lines that remain profitable. Policy simulations that target service line inequities in payment generosity result in lower mortality rates, roughly 700-13,000 fewer deaths nationally. The policy simulations raise questions about the trade-offs implicit in universal reductions in reimbursement. The effect of reduced payment generosity on mortality could be mitigated by targeting highly profitable services only for lower reimbursement. © Health Research and Educational Trust.
The Impact of Profitability of Hospital Admissions on Mortality
Lindrooth, Richard C; Konetzka, R Tamara; Navathe, Amol S; Zhu, Jingsan; Chen, Wei; Volpp, Kevin
2013-01-01
Background Fiscal constraints faced by Medicare are leading to policies designed to reduce expenditures. Evidence of the effect of reduced reimbursement on the mortality of Medicare patients discharged from all major hospital service lines is limited. Methods We modeled risk-adjusted 30-day mortality of patients discharged from 21 hospital service lines as a function of service line profitability, service line time trends, and hospital service line and year-fixed effects. We simulated the effect of alternative revenue-neutral reimbursement policies on mortality. Our sample included all Medicare discharges from PPS-eligible hospitals (1997, 2001, and 2005). Results The results reveal a statistically significant inverse relationship between changes in profitability and mortality. A $0.19 average reduction in profit per $1.00 of costs led to a 0.010–0.020 percentage-point increase in mortality rates (p < .001). Mortality in newly unprofitable service lines is significantly more sensitive to reduced payment generosity than in service lines that remain profitable. Policy simulations that target service line inequities in payment generosity result in lower mortality rates, roughly 700–13,000 fewer deaths nationally. Conclusions The policy simulations raise questions about the trade-offs implicit in universal reductions in reimbursement. The effect of reduced payment generosity on mortality could be mitigated by targeting highly profitable services only for lower reimbursement. PMID:23346946
NASA Technical Reports Server (NTRS)
Lo, C. F.; Wu, K.; Whitehead, B. A.
1993-01-01
Statistical and neural network methods have been applied to investigate the feasibility of detecting anomalies in the turbopump vibration of the SSME. Anomalies are detected based on the amplitudes of peaks at the fundamental and harmonic frequencies in the power spectral density. These data are reduced to the proper format from sensor data measured by strain gauges and accelerometers. Both methods are feasible for detecting the vibration anomalies. The statistical method requires sufficient data points to establish a reasonable statistical distribution data bank; this method is applicable for on-line operation. The neural network method likewise needs enough data to train the networks. The testing procedure can be utilized at any time as long as the characteristics of the components remain unchanged.
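The statistical branch of the approach (building a distribution data bank of nominal peak amplitudes and flagging departures) can be sketched as a simple mean plus/minus k-sigma test; the threshold, data, and function names are illustrative assumptions, not the SSME analysis code:

```python
import statistics

def train_baseline(nominal_peaks):
    """Baseline statistics of one spectral-peak amplitude over nominal runs."""
    return statistics.mean(nominal_peaks), statistics.stdev(nominal_peaks)

def is_anomalous(amplitude, baseline, k=3.0):
    """Flag an amplitude more than k standard deviations from the baseline mean."""
    mu, sigma = baseline
    return abs(amplitude - mu) > k * sigma
```

Because scoring a new peak amplitude is a constant-time comparison once the baseline exists, a test of this kind is naturally suited to on-line operation, as the abstract notes.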
NASA Astrophysics Data System (ADS)
Fujimoto, K.; Yanagisawa, T.; Uetsuhara, M.
Automated detection and tracking of faint objects in optical, or bearing-only, sensor imagery is a topic of immense interest in space surveillance. Robust methods in this realm will lead to better space situational awareness (SSA) while reducing the cost of sensors and optics. They are especially relevant in the search for high area-to-mass ratio (HAMR) objects, as their apparent brightness can change significantly over time. A track-before-detect (TBD) approach has been shown to be suitable for faint, low signal-to-noise ratio (SNR) images of resident space objects (RSOs). TBD does not rely upon the extraction of feature points within the image based on some thresholding criteria, but rather directly takes as input the intensity information from the image file. Not only is all of the available information from the image used, TBD avoids the computational intractability of the conventional feature-based line detection (i.e., "string of pearls") approach to track detection for low SNR data. Implementation of TBD rooted in finite set statistics (FISST) theory has been proposed recently by Vo, et al. Compared to other TBD methods applied so far to SSA, such as the stacking method or multi-pass multi-period denoising, the FISST approach is statistically rigorous and has been shown to be more computationally efficient, thus paving the path toward on-line processing. In this paper, we intend to apply a multi-Bernoulli filter to actual CCD imagery of RSOs. The multi-Bernoulli filter can explicitly account for the birth and death of multiple targets in a measurement arc. TBD is achieved via a sequential Monte Carlo implementation. Preliminary results with simulated single-target data indicate that a Bernoulli filter can successfully track and detect objects with measurement SNR as low as 2.4. 
Although the advent of fast-cadence scientific CMOS sensors has made the automation of faint object detection a realistic goal, it is nonetheless a difficult one, as measurement arcs in space surveillance are often both short and sparse. FISST methodologies have been applied to the general problem of SSA by many authors, but they generally focus on tracking scenarios with long arcs or assume that line detection is tractable. We will instead focus this work on estimating the sensor-level kinematics of RSOs for low-SNR, too-short arc observations. Once such an estimate is available, track association and simultaneous initial orbit determination may be achieved via any number of proposed solutions to the too-short arc problem, such as those incorporating the admissible region. We show that the benefit of combining FISST-based TBD with too-short arc association goes both ways; i.e., the former provides consistent statistics regarding bearing-only measurements, whereas the latter makes better use of the precise dynamical models nominally applicable to RSOs in orbit determination.
Real-Time Measurement of Width and Height of Weld Beads in GMAW Processes
Pinto-Lopera, Jesús Emilio; S. T. Motta, José Mauricio; Absi Alfaro, Sadek Crisostomo
2016-01-01
Associated with weld quality, the weld bead geometry is one of the most important parameters in welding processes. It is a significant requirement in a welding project, especially in automatic welding systems where a specific width, height, or penetration of the weld bead is needed. This paper presents a novel technique for real-time measurement of the width and height of weld beads in gas metal arc welding (GMAW) using a single high-speed camera and a long-pass optical filter in a passive vision system. The measuring method is based on digital image processing techniques, and the image calibration process is based on projective transformations. The measurement process takes less than 3 milliseconds per image, which allows a transfer rate of more than 300 frames per second. The proposed methodology can be used in any metal transfer mode of a gas metal arc welding process and does not have occlusion problems. The responses of the measurement system presented here are in good agreement with off-line data collected by a common laser-based 3D scanner. Each measurement is compared using a statistical Welch's t-test of the null hypothesis, which, in every case, does not exceed the threshold of significance level α = 0.01, validating the results and the performance of the proposed vision system. PMID:27649198
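The Welch's t-test used to compare the vision system with the laser scanner can be sketched directly from its definition (the t statistic plus the Welch-Satterthwaite degrees of freedom); this is a generic implementation, not the authors' code:

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom
    for two samples with possibly unequal variances."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb                      # squared standard error of the difference
    t = (ma - mb) / se2 ** 0.5
    dof = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, dof
```

Comparing |t| against the critical value at α = 0.01 for the computed degrees of freedom reproduces the kind of equivalence check reported in the abstract.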
Sun, Fei; Xu, Bing; Zhang, Yi; Dai, Shengyun; Yang, Chan; Cui, Xianglong; Shi, Xinyuan; Qiao, Yanjiang
2016-01-01
The quality of Chinese herbal medicine tablets suffers from batch-to-batch variability due to a lack of manufacturing process understanding. In this paper, the Panax notoginseng saponins (PNS) immediate release tablet was taken as the research subject. By defining the dissolution of five active pharmaceutical ingredients and the tablet tensile strength as critical quality attributes (CQAs), influences of both the manipulated process parameters introduced by an orthogonal experiment design and the intermediate granules' properties on the CQAs were fully investigated by different chemometric methods, such as the partial least squares, the orthogonal projection to latent structures, and the multiblock partial least squares (MBPLS). By analyzing the loadings plots and variable importance in the projection indexes, the granule particle sizes and the minimal punch tip separation distance in tableting were identified as critical process parameters. Additionally, the MBPLS model suggested that the lubrication time in the final blending was also important in predicting tablet quality attributes. From the calculated block importance in the projection indexes, the tableting unit was confirmed to be the critical process unit of the manufacturing line. The results demonstrated that the combinatorial use of different multivariate modeling methods could help in understanding the complex process relationships as a whole. The output of this study can then be used to define a control strategy to improve the quality of the PNS immediate release tablet.
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared (χ²) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet, and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and matched these tools with specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997.
It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES, and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings, including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
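Of the "simple nonparametric procedures" mentioned above, the two-sample Kolmogorov-Smirnov statistic reduces to the largest gap between the two empirical CDFs; a minimal sketch (a brute-force illustration, not optimized consulting code) is:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical cumulative distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)
    grid = sorted(set(a) | set(b))   # CDF gaps can only change at data points
    d = 0.0
    for x in grid:
        fa = sum(v <= x for v in a) / len(a)
        fb = sum(v <= x for v in b) / len(b)
        d = max(d, abs(fa - fb))
    return d
```

The statistic is 0 for identical samples and 1 for fully separated ones; converting it to a p-value requires the Kolmogorov distribution, which library implementations provide.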
NASA Astrophysics Data System (ADS)
Park, Jong Ho; Ahn, Byung Tae
2003-01-01
A failure model for electromigration, based on the "failure unit model", is presented for the prediction of lifetime in metal lines. The failure unit model, which consists of failure units in parallel and in series, can predict both the median time to failure (MTTF) and the deviation in the time to failure (DTTF) in Al metal lines, whereas earlier models could describe them only qualitatively. In our model, the probability functions of the failure unit in both single-grain segments and polygrain segments are considered, instead of in polygrain segments alone. Based on our model, we calculated the MTTF, DTTF, and activation energy for different median grain sizes, grain size distributions, linewidths, line lengths, current densities, and temperatures. Comparisons between our results and published experimental data showed good agreement, and our model could explain previously unexplained phenomena. Our advanced failure unit model might be further applied to other electromigration characteristics of metal lines.
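A Monte Carlo reading of the parallel/series failure-unit idea can be sketched as follows. The lognormal unit lifetimes, the assumption that a segment survives until all of its parallel units have failed, and every parameter value are illustrative choices for the sketch, not the paper's formulation:

```python
import random
import statistics

def line_lifetime(n_segments, units_per_segment, mu=1.0, sigma=0.5, rng=random):
    """One Monte Carlo draw of a metal line's lifetime.

    A segment fails when all of its parallel failure units have failed (max);
    the line fails at its first segment failure (min over the series).
    """
    return min(
        max(rng.lognormvariate(mu, sigma) for _ in range(units_per_segment))
        for _ in range(n_segments)
    )

def mttf(n_segments, units_per_segment, trials=2000, seed=1):
    """Median time to failure estimated over repeated draws."""
    rng = random.Random(seed)
    draws = [line_lifetime(n_segments, units_per_segment, rng=rng)
             for _ in range(trials)]
    return statistics.median(draws)
```

Even this toy version reproduces the qualitative trends the model quantifies: longer lines (more series segments) have shorter MTTF, while more redundancy within a segment extends it.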