Regular FPGA based on regular fabric
NASA Astrophysics Data System (ADS)
Xun, Chen; Jianwen, Zhu; Minxuan, Zhang
2011-08-01
In the sub-wavelength regime, design for manufacturability (DFM) becomes increasingly important for field programmable gate arrays (FPGAs). In this paper, an automated tile generation flow targeting a micro-regular fabric is reported. Using a publicly accessible, well-documented academic FPGA as a case study, we found that compared to previously reported tile generators, our generated micro-regular tile incurs less than 10% area overhead, which could potentially be recovered by process window optimization thanks to its superior printability. In addition, we demonstrate that at the 45 nm technology node, the generated FPGA tile reduces lithography-induced process variation by 33% and the probability of failure by 21.2%. If a further 10% area overhead can be recovered by enhanced resolution, we can achieve a variation reduction of 93.8% and reduce the probability of failure by 16.2%.
How Humanae vitae has advanced reproductive health
Doroski, Derek M.
2014-01-01
By encouraging doctors and scientists to improve the regulation of births through the observation of natural fertility rhythms, Humanae vitae promoted the development of natural family planning (NFP). The study of NFP has led to NFP-based methodologies in reproductive healthcare that are promoting advances in the treatment of infertility, miscarriage, and a number of reproductive health disorders. In contrast, the contraceptive mentality has stunted the development of reproductive healthcare. Humanae vitae has provided a great gift to science and reproductive healthcare that all Catholics should be proud of. PMID:25249708
Regular gravitational lagrangians
NASA Astrophysics Data System (ADS)
Dragon, Norbert
1992-02-01
The Einstein action with vanishing cosmological constant is, for appropriate field content, the unique local action which is regular at the fixed point of affine coordinate transformations. Imposing this regularity requirement, one also excludes Wess-Zumino counterterms which trade gravitational anomalies for Lorentz anomalies. One has to expect dilatational and SL(D) anomalies. If these anomalies are absent and if the regularity of the quantum vertex functional can be controlled, then Einstein gravity is renormalizable. On leave of absence from Institut für Theoretische Physik, Universität Hannover, W-3000 Hannover 1, FRG.
Illusory Liberalism in "Atlas de Geografía Humana"
ERIC Educational Resources Information Center
Ryan, Lorraine
2014-01-01
"Atlas de Geografía Humana" constitutes a critique of the much vaunted notion of a progressive Spain that has rectified the gender inequalities of the Francoist era, as one of the highly educated and successful protagonists, Fran, unwittingly adopts her mother's alignment with patriarchal norms. This novel elucidates the…
Regularized Structural Equation Modeling
Jacobucci, Ross; Grimm, Kevin J.; McArdle, John J.
2016-01-01
A new method is proposed that extends the use of regularization in both lasso and ridge regression to structural equation models. The method is termed regularized structural equation modeling (RegSEM). RegSEM penalizes specific parameters in structural equation models, with the goal of creating simpler, easier-to-understand models. Although regularization has gained wide adoption in regression, very little has transferred to models with latent variables. By adding penalties to specific parameters in a structural equation model, researchers gain a high level of flexibility in reducing model complexity, overcoming poorly fitting models, and creating models that are more likely to generalize to new samples. The proposed method was evaluated through a simulation study, two illustrative examples involving a measurement model, and one empirical example involving the structural part of the model to demonstrate RegSEM's utility. PMID:27398019
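As a minimal, hedged illustration of the kind of penalty RegSEM applies: the sketch below solves a plain lasso regression by proximal gradient descent (ISTA). It is not the authors' SEM implementation; the function name `lasso` and the synthetic data are invented for the example. The soft-thresholding step is what drives small parameters exactly to zero, which is how such penalties simplify a model.

```python
import numpy as np

def lasso(X, y, lam, n_iter=500):
    # Proximal gradient (ISTA) for: min_b ||y - X b||^2 / (2n) + lam * ||b||_1
    n, p = X.shape
    b = np.zeros(p)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        b = b - step * grad
        # Soft-threshold: shrinks coefficients and zeroes out small ones.
        b = np.sign(b) * np.maximum(np.abs(b) - step * lam, 0.0)
    return b

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))
beta = np.array([2.0, 0.0, 0.0, -1.5, 0.0])   # two active, three inactive parameters
y = X @ beta + 0.1 * rng.standard_normal(100)
b_hat = lasso(X, y, lam=0.1)
print(np.round(b_hat, 2))  # near-zero entries for the inactive coefficients
```

RegSEM applies the same idea to selected parameters of a structural equation model rather than to regression slopes.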
NASA Astrophysics Data System (ADS)
Forghan, B.; Takook, M. V.; Zarei, A.
2012-09-01
In this paper, the electron self-energy, photon self-energy and vertex functions are explicitly calculated in Krein space quantization including quantum metric fluctuation. The results are automatically regularized or finite. The magnetic anomaly and Lamb shift are also calculated in the one loop approximation in this method. Finally, the obtained results are compared to conventional QED results.
Geometry of spinor regularization
NASA Technical Reports Server (NTRS)
Hestenes, D.; Lounesto, P.
1983-01-01
The Kustaanheimo theory of spinor regularization is given a new formulation in terms of geometric algebra. The Kustaanheimo-Stiefel matrix and its subsidiary condition are put in a spinor form directly related to the geometry of the orbit in physical space. A physically significant alternative to the KS subsidiary condition is discussed. Derivations are carried out without using coordinates.
The impact of 25 years of "Humanae Vitae".
1993-08-01
In 1968, Pope Paul VI reinforced the Catholic Church's position forbidding contraception in an encyclical known as "Humanae Vitae." The current Pope, John Paul II, has reiterated this position and stated that use of condoms is forbidden, even to prevent HIV transmission. Several public figures, including the IPPF President and the British Overseas Development Minister, have sought to initiate a dialogue with the Pope to express concern about population growth in developing countries. The Catholic Church is currently opposing efforts on the part of the Presidents of Peru and the Philippines to expand access to family planning (FP) programs. Many of the 500,000 women who die each year of pregnancy complications would have used contraceptives and lived, and an estimated 300 million developing world couples have no access to modern contraception, yet desire no more children. Many Catholics disagree with the Church's stance; 87% of US Catholics surveyed in 1992 stated that FP is the couple's choice. After 25 years of "Humanae Vitae," it is time for the Church to enter into dialogue and join efforts to improve life on earth in the next century. PMID:12345157
Regularized Generalized Canonical Correlation Analysis
ERIC Educational Resources Information Center
Tenenhaus, Arthur; Tenenhaus, Michel
2011-01-01
Regularized generalized canonical correlation analysis (RGCCA) is a generalization of regularized canonical correlation analysis to three or more sets of variables. It constitutes a general framework for many multi-block data analysis methods. It combines the power of multi-block data analysis methods (maximization of well identified criteria) and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-02
... CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). DATE AND TIME: The meeting of the Board will be held at the offices of the Farm Credit Administration in...
Regularly timed events amid chaos.
Blakely, Jonathan N; Cooper, Roy M; Corron, Ned J
2015-11-01
We show rigorously that the solutions of a class of chaotic oscillators are characterized by regularly timed events in which the derivative of the solution is instantaneously zero. The perfect regularity of these events is in stark contrast with the well-known unpredictability of chaos. We explore some consequences of these regularly timed events through experiments using chaotic electronic circuits. First, we show that a feedback loop can be implemented to phase lock the regularly timed events to a periodic external signal. In this arrangement the external signal regulates the timing of the chaotic signal but does not strictly lock its phase. That is, phase slips of the chaotic oscillation persist without disturbing the timing of the regular events. Second, we couple the regularly timed events of one chaotic oscillator to those of another. A state of synchronization is observed where the oscillators exhibit synchronized regular events while their chaotic amplitudes and phases evolve independently. Finally, we add additional coupling to synchronize the amplitudes as well, though in the opposite direction, illustrating the independence of the amplitudes from the regularly timed events. PMID:26651759
Natural selection and mechanistic regularity.
DesAutels, Lane
2016-06-01
In this article, I address the question of whether natural selection operates regularly enough to qualify as a mechanism of the sort characterized by Machamer, Darden, and Craver (2000). Contrary to an influential critique by Skipper and Millstein (2005), I argue that natural selection can be seen to be regular enough to qualify as an MDC mechanism just fine, as long as we pay careful attention to some important distinctions regarding mechanistic regularity and abstraction. Specifically, I suggest that when we distinguish between process vs. product regularity, mechanism-internal vs. mechanism-external sources of irregularity, and abstract vs. concrete regularity, we can see that natural selection is only irregular in senses that are unthreatening to its status as an MDC mechanism. PMID:26921876
NONCONVEX REGULARIZATION FOR SHAPE PRESERVATION
CHARTRAND, RICK
2007-01-16
The authors show that using a nonconvex penalty term to regularize image reconstruction can substantially improve the preservation of object shapes. The commonly used total-variation regularization, ∫|∇u|, penalizes the length of the object edges. They show that ∫|∇u|^p, 0 < p < 1, only penalizes edges of dimension at least 2-p, and thus finite-length edges not at all. They give numerical examples showing the resulting improvement in shape preservation.
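The penalty ∫|∇u|^p can be approximated on a pixel grid with finite differences. A minimal sketch (the name `tv_p_penalty` and the test image are invented for illustration, not the authors' code) showing that, for the same sharp edge, the nonconvex p < 1 penalty is smaller than total variation:

```python
import numpy as np

def tv_p_penalty(u, p=1.0):
    # Discrete approximation of the integral of |grad u|^p via forward differences.
    gx = np.diff(u, axis=1)[:-1, :]   # horizontal differences, cropped to a common shape
    gy = np.diff(u, axis=0)[:, :-1]   # vertical differences
    grad_mag = np.sqrt(gx**2 + gy**2)
    return float(np.sum(grad_mag**p))

# A sharp-edged square of height 4: p < 1 penalizes the edge less,
# which is why nonconvex penalties preserve object shapes better.
u = np.zeros((64, 64))
u[16:48, 16:48] = 4.0
print(tv_p_penalty(u, p=1.0))   # total-variation penalty
print(tv_p_penalty(u, p=0.5))   # nonconvex penalty, smaller for the same edge
```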
Geometric continuum regularization of quantum field theory
Halpern, M.B.
1989-11-08
An overview of the continuum regularization program is given. The program is traced from its roots in stochastic quantization, with emphasis on the examples of regularized gauge theory, the regularized general nonlinear sigma model and regularized quantum gravity. In its coordinate-invariant form, the regularization is seen as entirely geometric: only the supermetric on field deformations is regularized, and the prescription provides universal nonperturbative invariant continuum regularization across all quantum field theory. 54 refs.
Regular patterns stabilize auditory streams.
Bendixen, Alexandra; Denham, Susan L; Gyimesi, Kinga; Winkler, István
2010-12-01
The auditory system continuously parses the acoustic environment into auditory objects, usually representing separate sound sources. Sound sources typically show characteristic emission patterns. These regular temporal sound patterns are possible cues for distinguishing sound sources. The present study was designed to test whether regular patterns are used as cues for source distinction and to specify the role that detecting these regularities may play in the process of auditory stream segregation. Participants were presented with tone sequences, and they were asked to continuously indicate whether they perceived the tones in terms of a single coherent sequence of sounds (integrated) or as two concurrent sound streams (segregated). Unknown to the participant, in some stimulus conditions, regular patterns were present in one or both putative streams. In all stimulus conditions, participants' perception switched back and forth between the two sound organizations. Importantly, regular patterns occurring in either one or both streams prolonged the mean duration of two-stream percepts, whereas the duration of one-stream percepts was unaffected. These results suggest that temporal regularities are utilized in auditory scene analysis. It appears that the role of this cue lies in stabilizing streams once they have been formed on the basis of simpler acoustic cues. PMID:21218898
Extended Locus of Regular Nuclei
Amon, L.; Casten, R. F.
2007-04-23
A new family of IBM Hamiltonians, characterized by certain parameter values, was found about 15 years ago by Alhassid and Whelan to display almost regular dynamics, and yet these solutions to the IBM do not belong to any of the known dynamical symmetry limits (vibrational, rotational and γ-unstable). Rather, they comprise an 'Arc of Regularity' cutting through the interior of the symmetry triangle from U(5) to SU(3), along which there is a sudden decrease in chaoticity and a significant increase in regularity. A few years ago, the first set of nuclei lying along this arc was discovered. The purpose of the present work is to search more broadly in the nuclear chart, covering all nuclei from Z = 40 to 100, for other examples of such 'regular' nuclei. Using a unique signature for such nuclei involving energy differences of certain excited states, we have identified an additional set of 12 nuclei lying near or along the arc. Some of these nuclei are known to have low-lying intruder states, so care must be taken in judging their structure. The regularity exhibited by nuclei near the arc presumably reflects the validity or partial validity of some new, as yet unknown, quantum number describing these systems and giving rise to the regularity found for them.
Regularization Analysis of SAR Superresolution
DELAURENTIS,JOHN M.; DICKEY,FRED M.
2002-04-01
Superresolution concepts offer the potential of resolution beyond the classical limit. This great promise has not generally been realized. In this study we investigate the potential application of superresolution concepts to synthetic aperture radar. The analytical basis for superresolution theory is discussed. In a previous report the application of the concept to synthetic aperture radar was investigated as an operator inversion problem. Generally, the operator inversion problem is ill posed. This work treats the problem from the standpoint of regularization. Both the operator inversion approach and the regularization approach show that the ability to superresolve SAR imagery is severely limited by system noise.
Dimensional regularization in configuration space
Bollini, C.G.; Giambiagi, J.J.
1996-05-01
Dimensional regularization is introduced in configuration space by Fourier transforming in ν dimensions the perturbative momentum-space Green functions. For this transformation, the Bochner theorem is used; no extra parameters, such as those of Feynman or Bogoliubov and Shirkov, are needed for convolutions. The regularized causal functions in x space have ν-dependent moderated singularities at the origin. They can be multiplied together and Fourier transformed (Bochner) without divergence problems. The usual ultraviolet divergences appear as poles of the resultant analytic functions of ν. Several examples are discussed. © 1996 The American Physical Society.
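As a hedged illustration of the mechanism this abstract describes (a standard ν-dimensional Fourier pair for a radial power, not necessarily the authors' exact formulas), the Bochner-type transform behaves as

$$\int d^{\nu}k\;\frac{e^{ik\cdot x}}{(k^{2})^{\lambda}}
= \frac{2^{\nu-2\lambda}\,\pi^{\nu/2}\,\Gamma\!\left(\tfrac{\nu}{2}-\lambda\right)}{\Gamma(\lambda)}\,
\frac{1}{(x^{2})^{\nu/2-\lambda}},$$

so the x-space function indeed carries a ν-dependent moderated singularity at the origin, and the poles of $\Gamma(\nu/2-\lambda)$ in ν reproduce the usual ultraviolet divergences.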
Rotations of the Regular Polyhedra
ERIC Educational Resources Information Center
Jones, MaryClara; Soto-Johnson, Hortensia
2006-01-01
The study of the rotational symmetries of the regular polyhedra is important in the classroom for many reasons. Besides giving the students an opportunity to visualize in three dimensions, it is also an opportunity to relate two-dimensional and three-dimensional concepts. For example, rotations in R[superscript 2] require a point and an angle of…
Regularized Generalized Structured Component Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun
2009-01-01
Generalized structured component analysis (GSCA) has been proposed as a component-based approach to structural equation modeling. In practice, GSCA may suffer from multi-collinearity, i.e., high correlations among exogenous variables. GSCA has yet no remedy for this problem. Thus, a regularized extension of GSCA is proposed that integrates a ridge…
Academic Improvement through Regular Assessment
ERIC Educational Resources Information Center
Wolf, Patrick J.
2007-01-01
Media reports are rife with claims that students in the United States are overtested and that they and their education are suffering as a result. Here I argue the opposite: that students would benefit in numerous ways from more frequent assessment, especially diagnostic testing. The regular assessment of students serves critical educational and…
Operator regularization and quantum gravity
NASA Astrophysics Data System (ADS)
Mann, R. B.; Tarasov, L.; Mckeon, D. G. C.; Steele, T.
1989-01-01
Operator regularization has been shown to be a symmetry-preserving means of computing Green functions in gauge symmetric and supersymmetric theories which avoids the explicit occurrence of divergences. In this paper we examine how this technique can be applied to computing quantities in non-renormalizable theories in general and quantum gravity in particular. Specifically, we consider various processes to one- and two-loop order in φ⁴ theory in N dimensions for N > 4, for which the theory is non-renormalizable. We then apply operator regularization to determine the one-loop graviton correction to the spinor propagator. The effective action for quantum scalars in a background gravitational field is evaluated in operator regularization using both the weak-field method and the normal coordinate expansion. This latter case yields a new derivation of the Schwinger-de Witt expansion which avoids the use of recursion relations. Finally we consider quantum gravity coupled to scalar fields in n dimensions, evaluating those parts of the effective action that (in other methods) diverge as n → 4. We recover the same divergence structure as is found using dimensional regularization if n ≠ 4, but if n = 4 at the outset no divergence arises at any stage of the calculation. The non-renormalizability of such theories manifests itself in the scale-dependence at one-loop order of terms that do not appear in the original lagrangian. In all cases our regularization procedure does not break any invariances present in the theory and avoids the occurrence of explicit divergences.
Temporal regularity in speech perception: Is regularity beneficial or deleterious?
Geiser, Eveline; Shattuck-Hufnagel, Stefanie
2012-01-01
Speech rhythm has been proposed to be of crucial importance for correct speech perception and language learning. This study investigated the influence of speech rhythm in second-language processing. German pseudo-sentences were presented to participants in two conditions: 'naturally regular speech rhythm' and an 'emphasized regular rhythm'. Nine expert English speakers with 3.5±1.6 years of German training repeated each sentence after hearing it once over headphones. Responses were transcribed using the International Phonetic Alphabet and analyzed for the number of correct, false and missing consonants as well as for consonant additions. The overall number of correct reproductions of consonants did not differ between the two experimental conditions. However, speech rhythmicization significantly affected the serial position curve of correctly reproduced syllables. The results of this pilot study are consistent with the view that speech rhythm is important for speech perception. PMID:22701753
Distributional Stress Regularity: A Corpus Study
ERIC Educational Resources Information Center
Temperley, David
2009-01-01
The regularity of stress patterns in a language depends on "distributional stress regularity", which arises from the pattern of stressed and unstressed syllables, and "durational stress regularity", which arises from the timing of syllables. Here we focus on distributional regularity, which depends on three factors. "Lexical stress patterning"…
Adaptive regularization of earthquake slip distribution inversion
NASA Astrophysics Data System (ADS)
Wang, Chisheng; Ding, Xiaoli; Li, Qingquan; Shan, Xinjian; Zhu, Jiasong; Guo, Bo; Liu, Peng
2016-04-01
Regularization is a routine approach used in earthquake slip distribution inversion to avoid numerically abnormal solutions. To date, most slip-inversion studies have imposed uniform regularization on all fault patches. However, adaptive regularization, in which each retrieved parameter is regularized differently, has exhibited better performance in other research fields such as image restoration. In this paper, we investigate adaptive regularization for earthquake slip distribution inversion. We find that adaptive regularization can achieve a significantly smaller mean square error (MSE) than uniform regularization if it is set properly. We propose an adaptive regularization method based on weighted total least squares (WTLS). This approach assumes that errors exist in both the regularization matrix and the observation, and an iterative algorithm is used to obtain the solution. A weight coefficient balances the regularization-matrix residual and the observation residual. An experiment using four slip patterns was carried out to validate the proposed method. The results show that the proposed method yields a smaller MSE than uniform regularization and resolution-based adaptive regularization, and the improvement in MSE is more significant for slip patterns with low-resolution slip patches. We then apply the proposed method to study the slip distribution of the 2011 Mw 9.0 Tohoku earthquake. The retrieved slip distribution is less smooth and more detailed than the one retrieved with uniform regularization, and is closer to the existing slip model from joint inversion of the geodetic and seismic data.
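A loose sketch of the uniform-vs-adaptive contrast on a toy inversion. This uses ordinary weighted ridge (zeroth-order Tikhonov) regularization, not the authors' WTLS algorithm; the kernel, the "slip" model, and the reweighting heuristic are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-posed inversion: smooth (Gaussian) kernel A, spiky "slip" model x_true.
n = 50
idx = np.arange(n)
A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 3.0) ** 2)
x_true = np.zeros(n)
x_true[10] = 1.0
x_true[30:35] = 0.5
b = A @ x_true + 0.01 * rng.standard_normal(n)

def tikhonov(A, b, weights, lam=1e-2):
    # Ridge-type solution with one regularization weight per parameter;
    # uniform regularization is the special case of all-equal weights.
    W = np.diag(weights)
    return np.linalg.solve(A.T @ A + lam * W.T @ W, A.T @ b)

x_uni = tikhonov(A, b, np.ones(n))
# Crude "adaptive" pass: relax the penalty where a first, uniform inversion
# suggests large slip, and tighten it elsewhere.
w = 1.0 / (np.abs(x_uni) + 1e-2)
x_ada = tikhonov(A, b, w / w.mean())
print(np.mean((x_uni - x_true) ** 2), np.mean((x_ada - x_true) ** 2))
```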
Knowledge and regularity in planning
NASA Technical Reports Server (NTRS)
Allen, John A.; Langley, Pat; Matwin, Stan
1992-01-01
The field of planning has focused on several methods of using domain-specific knowledge. The three most common methods, use of search control, use of macro-operators, and analogy, are part of a continuum of techniques differing in the amount of reused plan information. This paper describes TALUS, a planner that exploits this continuum, and is used for comparing the relative utility of these methods. We present results showing how search control, macro-operators, and analogy are affected by domain regularity and the amount of stored knowledge.
RES: Regularized Stochastic BFGS Algorithm
NASA Astrophysics Data System (ADS)
Mokhtari, Aryan; Ribeiro, Alejandro
2014-12-01
RES, a regularized stochastic version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method, is proposed to solve convex optimization problems with stochastic objectives. The use of stochastic gradient descent algorithms is widespread, but the number of iterations required to approximate optimal arguments can be prohibitive in high-dimensional problems. Application of second-order methods, on the other hand, is impracticable because computation of objective function Hessian inverses incurs excessive computational cost. BFGS modifies gradient descent by introducing a Hessian approximation matrix computed from finite gradient differences. RES utilizes stochastic gradients in lieu of deterministic gradients for both the determination of descent directions and the approximation of the objective function's curvature. Since stochastic gradients can be computed at manageable computational cost, RES is realizable and retains the convergence rate advantages of its deterministic counterparts. Convergence results show that lower and upper bounds on the eigenvalues of the sample functions' Hessians are sufficient to guarantee convergence to optimal arguments. Numerical experiments showcase reductions in convergence time relative to stochastic gradient descent algorithms and non-regularized stochastic versions of BFGS. An application of RES to the implementation of support vector machines is developed.
Tessellating the Sphere with Regular Polygons
ERIC Educational Resources Information Center
Soto-Johnson, Hortensia; Bechthold, Dawn
2004-01-01
Tessellations in the Euclidean plane and regular polygons that tessellate the sphere are reviewed. The regular polygons that can possibly tessellate the sphere are spherical triangles, squares and pentagons.
Some Cosine Relations and the Regular Heptagon
ERIC Educational Resources Information Center
Osler, Thomas J.; Heng, Phongthong
2007-01-01
The ancient Greek mathematicians sought to construct, by use of straight edge and compass only, all regular polygons. They had no difficulty with regular polygons having 3, 4, 5 and 6 sides, but the 7-sided heptagon eluded all their attempts. In this article, the authors discuss some cosine relations and the regular heptagon. (Contains 1 figure.)
Regular Pentagons and the Fibonacci Sequence.
ERIC Educational Resources Information Center
French, Doug
1989-01-01
Illustrates how to draw a regular pentagon. Shows the sequence of a succession of regular pentagons formed by extending the sides. Calculates the general formula of the Lucas and Fibonacci sequences. Presents a regular icosahedron as an example of the golden ratio. (YP)
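The link between the pentagon and the Fibonacci sequence runs through the golden ratio: ratios of consecutive Fibonacci numbers converge to φ, which is also the diagonal-to-side ratio of a regular pentagon (equal to 2·cos 36°). A short sketch verifying both facts numerically:

```python
import math

phi = (1 + math.sqrt(5)) / 2        # golden ratio

# Ratios of consecutive Fibonacci numbers converge to phi.
a, b = 1, 1
for _ in range(20):
    a, b = b, a + b
print(b / a)                        # ≈ 1.618...

# Diagonal/side of a regular pentagon is 2*cos(36 degrees) = phi.
print(2 * math.cos(math.pi / 5))
```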
22 CFR 120.39 - Regular employee.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 22 Foreign Relations 1 2013-04-01 2013-04-01 false Regular employee. 120.39 Section 120.39 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS PURPOSE AND DEFINITIONS § 120.39 Regular employee. (a) A regular employee means for purposes of this subchapter: (1) An...
22 CFR 120.39 - Regular employee.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 22 Foreign Relations 1 2012-04-01 2012-04-01 false Regular employee. 120.39 Section 120.39 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS PURPOSE AND DEFINITIONS § 120.39 Regular employee. (a) A regular employee means for purposes of this subchapter: (1) An...
22 CFR 120.39 - Regular employee.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 22 Foreign Relations 1 2014-04-01 2014-04-01 false Regular employee. 120.39 Section 120.39 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS PURPOSE AND DEFINITIONS § 120.39 Regular employee. (a) A regular employee means for purposes of this subchapter: (1) An...
Natural frequency of regular basins
NASA Astrophysics Data System (ADS)
Tjandra, Sugih S.; Pudjaprasetya, S. R.
2014-03-01
Similar to the vibration of a guitar string or an elastic membrane, water waves in an enclosed basin undergo standing oscillatory waves, also known as seiches. The resonant (eigen) periods of seiches are determined by the water depth and the geometry of the basin. For regular basins, explicit formulas are available. Resonance occurs when the dominant frequency of the external force matches an eigenfrequency of the basin. In this paper, we implement a conservative finite volume scheme for the 2D shallow water equations to simulate resonance in closed basins. Further, we would like to use this scheme, together with the energy spectra of recorded signals, to extract the resonant periods of arbitrary basins. Here we first test the procedure by computing the resonant periods of a square closed basin. The numerical resonant periods that we obtain are comparable with those from analytical formulas.
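For a closed rectangular basin of uniform depth, the explicit formula alluded to above is Merian's formula, T_mn = 2 / (√(gh) · √((m/L_x)² + (n/L_y)²)). A small sketch (the function name is invented; this is the textbook formula, not the paper's numerical scheme):

```python
import math

def seiche_period(L_x, L_y, depth, m, n, g=9.81):
    # Merian's formula for a closed rectangular basin of uniform depth:
    #   T_mn = 2 / ( sqrt(g*h) * sqrt((m/L_x)^2 + (n/L_y)^2) ),  (m, n) not both zero
    c = math.sqrt(g * depth)                       # shallow-water wave speed
    k = math.sqrt((m / L_x) ** 2 + (n / L_y) ** 2)
    return 2.0 / (c * k)

# Fundamental longitudinal mode of a 100 m x 100 m basin, 10 m deep:
print(round(seiche_period(100.0, 100.0, 10.0, 1, 0), 2))  # → 20.19 (seconds)
```

A numerical simulation like the one described in the abstract should show spectral peaks near these analytical periods.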
Pairing effect and misleading regularity
NASA Astrophysics Data System (ADS)
Al-Sayed, A.
2015-11-01
We study the nearest neighbor spacing distribution of energy levels of even-even nuclei classified according to their reduced electric quadrupole transition probability B(E2)↑, using the available experimental data. We compare the Brody and Abul-Magd distributions, which quantify the degree of chaoticity within nuclear dynamics. The results show that the Abul-Magd parameter f can represent the chaotic behavior in a more acceptable way than Brody's, especially if a statistically significant study is desired. A smooth transition from chaos to order is observed as B(E2)↑ increases. An apparent regularity is located in the second interval, namely at 0.05 ≤ B(E2) < 0.1 in e²b² units and at 10 ≤ B(E2) < 15 in Weisskopf units. Finally, the chaotic behavior parameterized in terms of B(E2)↑ does not depend on the unit used.
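The Brody distribution mentioned above interpolates between Poisson spacing statistics (q = 0, regular dynamics) and Wigner statistics (q = 1, chaotic dynamics). A hedged numerical sketch (not the authors' analysis code) of its standard form, checking unit normalization and unit mean spacing by crude quadrature:

```python
import math

def brody(s, q):
    # Brody nearest-neighbor spacing distribution:
    #   P(s) = (q+1) * a * s^q * exp(-a * s^(q+1)),
    # with a chosen so that the mean spacing is 1.
    a = math.gamma((q + 2) / (q + 1)) ** (q + 1)
    return (q + 1) * a * s**q * math.exp(-a * s ** (q + 1))

# Check normalization and mean spacing for an intermediate q = 0.5
# by a simple Riemann sum over s in (0, 20].
ds = 1e-4
ss = [i * ds for i in range(1, 200000)]
norm = sum(brody(s, 0.5) for s in ss) * ds
mean = sum(s * brody(s, 0.5) for s in ss) * ds
print(round(norm, 3), round(mean, 3))  # both ≈ 1.0
```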
Wavelet Regularization Per Nullspace Shuttle
NASA Astrophysics Data System (ADS)
Charléty, J.; Nolet, G.; Sigloch, K.; Voronin, S.; Loris, I.; Simons, F. J.; Daubechies, I.; Judd, S.
2010-12-01
Wavelet decomposition of models in an over-parameterized Earth and L1-norm minimization in wavelet space is a promising strategy to deal with the very heterogeneous data coverage in the Earth without sacrificing detail in the solution where this is resolved (see Loris et al., abstract this session). However, L1-norm minimizations are nonlinear, and pose problems of convergence speed when applied to large data sets. In an effort to speed up computations we investigate the application of the nullspace shuttle (Deal and Nolet, GJI 1996). The nullspace shuttle is a filter that adds components from the nullspace to the minimum norm solution so as to have the model satisfy additional conditions not imposed by the data. In our case, the nullspace shuttle projects the model on a truncated basis of wavelets. The convergence of this strategy is unproven, in contrast to algorithms using Landweber iteration or one of its variants, but initial computations using a very large data base give reason for optimism. We invert 430,554 P delay times measured by cross-correlation in different frequency windows. The data are dominated by observations with US Array, leading to a major discrepancy in the resolution beneath North America and the rest of the world. This is a subset of the data set inverted by Sigloch et al (Nature Geosci, 2008), excluding only a small number of ISC delays at short distance and all amplitude data. The model is a cubed Earth model with 3,637,248 voxels spanning mantle and crust, with a resolution everywhere better than 70 km, to which 1912 event corrections are added. In each iteration we determine the optimal solution by a least squares inversion with minimal damping, after which we regularize the model in wavelet space. We then compute the residual data vector (after an intermediate scaling step), and solve for a model correction until a satisfactory chi-square fit for the truncated model is obtained. We present our final results on convergence as well as a
12 CFR 725.3 - Regular membership.
Code of Federal Regulations, 2013 CFR
2013-01-01
... advances without approval of the NCUA Board for a period of six months after becoming a member. This subsection shall not apply to any credit union which becomes a Regular member of the Facility within six... member of the Facility at any time within six months prior to becoming a Regular member of the Facility....
Transport Code for Regular Triangular Geometry
Energy Science and Technology Software Center (ESTSC)
1993-06-09
DIAMANT2 solves the two-dimensional static multigroup neutron transport equation in planar regular triangular geometry. Both regular and adjoint, inhomogeneous and homogeneous problems subject to vacuum, reflective or input specified boundary flux conditions are solved. Anisotropy is allowed for the scattering source. Volume and surface sources are allowed for inhomogeneous problems.
Continuum regularization of quantum field theory
Bern, Z.
1986-04-01
Possible nonperturbative continuum regularization schemes for quantum field theory are discussed which are based upon the Langevin equation of Parisi and Wu. Breit, Gupta and Zaks made the first proposal for new gauge invariant nonperturbative regularization. The scheme is based on smearing in the ''fifth-time'' of the Langevin equation. An analysis of their stochastic regularization scheme for the case of scalar electrodynamics with the standard covariant gauge fixing is given. Their scheme is shown to preserve the masslessness of the photon and the tensor structure of the photon vacuum polarization at the one-loop level. Although stochastic regularization is viable in one-loop electrodynamics, two difficulties arise which, in general, ruin the scheme. One problem is that the superficial quadratic divergences force a bottomless action for the noise. Another difficulty is that stochastic regularization by fifth-time smearing is incompatible with Zwanziger's gauge fixing, which is the only known nonperturbative covariant gauge fixing for nonabelian gauge theories. Finally, a successful covariant derivative scheme is discussed which avoids the difficulties encountered with the earlier stochastic regularization by fifth-time smearing. For QCD the regularized formulation is manifestly Lorentz invariant, gauge invariant, ghost free and finite to all orders. A vanishing gluon mass is explicitly verified at one loop. The method is designed to respect relevant symmetries, and is expected to provide suitable regularization for any theory of interest. Hopefully, the scheme will lend itself to nonperturbative analysis. 44 refs., 16 figs.
Regular Decompositions for H(div) Spaces
Kolev, Tzanio; Vassilevski, Panayot
2012-01-01
We study regular decompositions for H(div) spaces. In particular, we show that such regular decompositions are closely related to a previously studied “inf-sup” condition for parameter-dependent Stokes problems, for which we provide an alternative, more direct, proof.
On regularizations of the Dirac delta distribution
NASA Astrophysics Data System (ADS)
Hosseini, Bamdad; Nigam, Nilima; Stockie, John M.
2016-01-01
In this article we consider regularizations of the Dirac delta distribution with applications to prototypical elliptic and hyperbolic partial differential equations (PDEs). We study the convergence of a sequence of distributions S_H to a singular term S as a parameter H (associated with the support size of S_H) shrinks to zero. We characterize this convergence in both the weak-* topology of distributions and a weighted Sobolev norm. These notions motivate a framework for constructing regularizations of the delta distribution that includes a large class of existing methods in the literature. This framework allows different regularizations to be compared. The convergence of solutions of PDEs with these regularized source terms is then studied in various topologies, such as pointwise convergence on a deleted neighborhood and weighted Sobolev norms. We also examine the lack of symmetry in tensor product regularizations and the effects of dissipative error in hyperbolic problems.
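As a concrete illustration of the general idea (not the authors' specific construction), a Gaussian mollifier delta_H(x) = exp(-x^2 / (2 H^2)) / (H sqrt(2 pi)) converges weakly-* to the delta distribution: pairing it with a smooth test function approaches the function's value at 0 as the support-size parameter H shrinks. A minimal numerical sketch, with an arbitrarily chosen test function:

```python
import math

def delta_H(x, H):
    # Gaussian regularization of the Dirac delta; H plays the role of the
    # support-size parameter from the abstract.
    return math.exp(-x * x / (2 * H * H)) / (H * math.sqrt(2 * math.pi))

def pair(phi, H, a=-1.0, b=1.0, n=20000):
    # <delta_H, phi>: trapezoid-rule integration of delta_H * phi over [a, b]
    h = (b - a) / n
    s = 0.5 * (delta_H(a, H) * phi(a) + delta_H(b, H) * phi(b))
    for i in range(1, n):
        x = a + i * h
        s += delta_H(x, H) * phi(x)
    return s * h

phi = math.cos  # smooth test function with phi(0) = 1
for H in (0.1, 0.01):
    print(H, pair(phi, H))  # approaches phi(0) = 1 as H shrinks
```

The same harness can be used to compare other regularizations (hat functions, cosine bumps) in the spirit of the framework the abstract describes.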
Quantitative regularities in floodplain formation
NASA Astrophysics Data System (ADS)
Nevidimova, O.
2009-04-01
Modern methods from the theory of complex systems make it possible to build mathematical models of systems whose self-organization is largely governed by nonlinear effects and feedback. However, some factors that strongly influence the dynamics of geomorphic systems can hardly be expressed adequately in the language of mathematical models. Conceptual modeling allows us to overcome this difficulty. It is based on the methods of synergetics, which, together with the theory of dynamical systems and classical geomorphology, can describe the dynamics of geomorphological systems. The most suitable concept for mathematical modeling of such complex systems is that of model dynamics based on equilibrium: a tendency toward dynamic equilibrium is observed in the evolution of all geomorphic systems. As an objective law, it is revealed in the evolution of fluvial relief in general, and in river channel processes in particular, demonstrating the capacity of these systems for self-organization. The channel process is expressed in the formation of river reaches, riffles, meanders and the floodplain. Since the floodplain is a surface flooded periodically during high water, it naturally connects the river channel with the slopes, and is one of the boundary expressions of the stream's activity. Floodplain dynamics are inseparable from channel dynamics: the floodplain forms under simultaneous horizontal and vertical displacement of the river channel, that is, Y = Y(x, y), where x and y are the horizontal and vertical coordinates and Y is the floodplain height. When dy/dt = 0 (for a channel that is not incising), the river, migrating in the horizontal plane, leaves behind a low surface whose flooding during high water (total duration of flooding) decreases from a maximum at the initial moment t0 to zero at the moment tn. The total amount of material accumulated on the floodplain surface changes in a similar manner.
Functional MRI Using Regularized Parallel Imaging Acquisition
Lin, Fa-Hsuan; Huang, Teng-Yi; Chen, Nan-Kuei; Wang, Fu-Nien; Stufflebeam, Steven M.; Belliveau, John W.; Wald, Lawrence L.; Kwong, Kenneth K.
2013-01-01
Parallel MRI techniques reconstruct full-FOV images from undersampled k-space data by using the uncorrelated information from RF array coil elements. One disadvantage of parallel MRI is that the image signal-to-noise ratio (SNR) is degraded because of the reduced number of data samples and the spatially correlated nature of multiple RF receivers. Regularization has been proposed to mitigate the SNR loss arising from the latter. Since regularization requires a static prior, the dynamic contrast-to-noise ratio (CNR) in parallel MRI will be affected. In this paper we investigate the CNR of regularized sensitivity encoding (SENSE) acquisitions. We propose to implement regularized parallel MRI acquisitions in functional MRI (fMRI) experiments by incorporating the prior from a combined segmented echo-planar imaging (EPI) acquisition into SENSE reconstructions. We investigated the impact of regularization on the CNR by performing parametric simulations at various BOLD contrasts, acceleration rates, and sizes of the active brain areas. As quantified by receiver operating characteristic (ROC) analysis, the simulations suggest that the detection power of SENSE fMRI can be improved by regularized reconstructions, compared to unregularized reconstructions. Human motor and visual fMRI data acquired at different field strengths and with different array coils also demonstrate that regularized SENSE improves the detection of functionally active brain regions. PMID:16032694
Partitioning of regular computation on multiprocessor systems
NASA Technical Reports Server (NTRS)
Lee, Fung Fung
1988-01-01
Problem partitioning of regular computation over two dimensional meshes on multiprocessor systems is examined. The regular computation model considered involves repetitive evaluation of values at each mesh point with local communication. The computational workload and the communication pattern are the same at each mesh point. The regular computation model arises in numerical solutions of partial differential equations and simulations of cellular automata. Given a communication pattern, a systematic way to generate a family of partitions is presented. The influence of various partitioning schemes on performance is compared on the basis of computation to communication ratio.
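The computation-to-communication trade-off described above can be made concrete for a 5-point-stencil pattern: a p x q block partition of an N x M mesh gives each processor N*M/(p*q) points of work and roughly 2(N/p + M/q) halo points to exchange. A toy comparison of partition shapes (a simplified model that ignores corner points and assumes exact divisibility; it is not the paper's partition generator):

```python
def comp_to_comm(N, M, p, q):
    # Work per processor: mesh points owned; communication: halo boundary
    # points exchanged with neighbors under a 5-point stencil.
    work = (N // p) * (M // q)
    halo = 2 * (N // p) + 2 * (M // q)
    return work / halo

# 1024 x 1024 mesh on 16 processors: 1D strips vs 2D square blocks.
print(comp_to_comm(1024, 1024, 16, 1))  # strip partition
print(comp_to_comm(1024, 1024, 4, 4))   # block partition: higher ratio
```

For equal processor counts, squarer blocks minimize the halo perimeter per unit of work, which is one reason a systematic family of partitions is worth comparing on the computation-to-communication ratio.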
Oseledets Regularity Functions for Anosov Flows
NASA Astrophysics Data System (ADS)
Simić, Slobodan N.
2011-07-01
Oseledets regularity functions quantify the deviation of the growth associated with a dynamical system along its Lyapunov bundles from the corresponding uniform exponential growth. The precise degree of regularity of these functions is unknown. We show that for every invariant Lyapunov bundle of a volume-preserving Anosov flow on a closed smooth Riemannian manifold, the corresponding Oseledets regularity functions are in L^p(m) for some p > 0, where m is the probability measure defined by the volume form. We prove an analogous result for essentially bounded cocycles over volume-preserving Anosov flows.
Continuum regularization of gauge theory with fermions
Chan, H.S.
1987-03-01
The continuum regularization program is discussed in the case of d-dimensional gauge theory coupled to fermions in an arbitrary representation. Two physically equivalent formulations are given. First, a Grassmann formulation is presented, which is based on the two-noise Langevin equations of Sakita, Ishikawa and Alfaro and Gavela. Second, a non-Grassmann formulation is obtained by regularized integration of the matter fields within the regularized Grassmann system. Explicit perturbation expansions are studied in both formulations, and considerable simplification is found in the integrated non-Grassmann formalism.
The Volume of the Regular Octahedron
ERIC Educational Resources Information Center
Trigg, Charles W.
1974-01-01
Five methods are given for computing the volume of a regular octahedron. It is suggested that students first construct an octahedron, as this will aid space visualization. Six further extensions are left for the reader to try. (LS)
Regular Exercise May Boost Prostate Cancer Survival
A study found that sticking to a moderate or intense exercise regimen may improve a man's odds of surviving prostate cancer (HealthDay News).
Regular Exercise: Antidote for Deadly Diseases?
High levels of physical activity may reduce the risk of five common diseases, according to a report (HealthDay News, Aug. 9, 2016).
Scaling behavior of regularized bosonic strings
NASA Astrophysics Data System (ADS)
Ambjørn, J.; Makeenko, Y.
2016-03-01
We implement a proper-time UV regularization of the Nambu-Goto string, introducing an independent metric tensor and the corresponding Lagrange multiplier, and treating them in the mean-field approximation justified for long strings and/or when the dimension of space-time is large. We compute the regularized determinant of the 2D Laplacian for the closed string winding around a compact dimension, obtaining in this way the effective action, whose minimization determines the energy of the string ground state in the mean-field approximation. We discuss the existence of two scaling limits when the cutoff is taken to infinity. One scaling limit reproduces the results obtained by the hypercubic regularization of the Nambu-Goto string as well as by the use of the dynamical triangulation regularization of the Polyakov string. The other scaling limit reproduces the results obtained by canonical quantization of the Nambu-Goto string.
Nonminimal black holes with regular electric field
NASA Astrophysics Data System (ADS)
Balakin, Alexander B.; Zayats, Alexei E.
2015-05-01
We discuss the problem of identifying the coupling constants that describe interactions between photons and spacetime curvature, using exact regular solutions to the extended equations of the nonminimal Einstein-Maxwell theory. We develop the idea that the three nonminimal coupling constants in this theory can be reduced to a single guiding parameter, which plays the role of a nonminimal radius. We base our consideration on two examples of exact solutions obtained earlier in our works: the first describes a nonminimal spherically symmetric object (star or black hole) with a regular radial electric field; the second represents a nonminimal Dirac-type object (monopole or black hole) with a regular metric. We demonstrate that one of the inflexion points of the regular metric function identifies a specific nonminimal radius, thus marking the domain of dominance of nonminimal interactions.
Parallelization of irregularly coupled regular meshes
NASA Technical Reports Server (NTRS)
Chase, Craig; Crowley, Kay; Saltz, Joel; Reeves, Anthony
1992-01-01
Regular meshes are frequently used for modeling physical phenomena on both serial and parallel computers. One advantage of regular meshes is that efficient discretization schemes can be implemented in a straightforward manner. However, geometrically complex objects, such as aircraft, cannot be easily described using a single regular mesh. Multiple interacting regular meshes are frequently used to describe complex geometries, with each mesh modeling a subregion of the physical domain. The meshes, or subdomains, can be processed in parallel, with periodic updates carried out to move information between the coupled meshes. In many cases there are a relatively small number (one to a few dozen) of subdomains, so that each subdomain may also be partitioned among several processors. We outline a composite run-time/compile-time approach for supporting these problems efficiently on distributed-memory machines. These methods are described in the context of a multiblock fluid dynamics problem developed at LaRC.
Blind Poissonian images deconvolution with framelet regularization.
Fang, Houzhang; Yan, Luxin; Liu, Hai; Chang, Yi
2013-02-15
We propose a maximum a posteriori approach to blind deconvolution of Poissonian images, with framelet regularization for the image and total variation (TV) regularization for the point spread function. Compared with TV-based methods, our algorithm not only suppresses noise effectively but also recovers edges and detailed information. Moreover, the split Bregman method is exploited to solve the resulting minimization problem. Comparative results on both simulated and real images are reported. PMID:23455078
Regularized CT reconstruction on unstructured grid
NASA Astrophysics Data System (ADS)
Chen, Yun; Lu, Yao; Ma, Xiangyuan; Xu, Yuesheng
2016-04-01
Computed tomography (CT) reconstruction is an ill-posed problem. Reconstruction on an unstructured grid reduces the computational cost and alleviates the ill-posedness by decreasing the dimension of the solution space. However, there has been no systematic study of edge-preserving regularization methods for CT reconstruction on unstructured grids. In this work, we propose a novel regularization method for CT reconstruction on unstructured grids, such as triangular or tetrahedral meshes generated from initial images reconstructed via an analytical method (e.g., filtered back-projection). The proposed method is modeled as a three-term optimization problem, containing a weighted least-squares fidelity term motivated by the simultaneous algebraic reconstruction technique (SART). The cost function contains two non-differentiable terms, which complicate the development of a fast solver. A fixed-point proximity algorithm combined with SART is developed to solve the optimization problem and accelerate convergence. Finally, we compare the regularized reconstruction method against SART with different regularization methods. Numerical experiments demonstrate that the proposed regularization on an unstructured grid effectively suppresses noise while preserving edge features.
Continuum regularization of quantum field theory
Bern, Z.
1986-01-01
Breit, Gupta, and Zaks made the first proposal for a new gauge-invariant nonperturbative regularization. The scheme is based on smearing in the fifth time of the Langevin equation. An analysis of their stochastic regularization scheme for the case of scalar electrodynamics with the standard covariant gauge fixing is given. Their scheme is shown to preserve the masslessness of the photon and the tensor structure of the photon vacuum polarization at the one-loop level. Although stochastic regularization is viable in one-loop electrodynamics, difficulties arise which, in general, ruin the scheme. A successful covariant derivative scheme is discussed which avoids the difficulties encountered with the earlier stochastic regularization by fifth-time smearing. For QCD the regularized formulation is manifestly Lorentz invariant, gauge invariant, ghost free and finite to all orders. A vanishing gluon mass is explicitly verified at one loop. The method is designed to respect relevant symmetries, and is expected to provide suitable regularization for any theory of interest.
Usual Source of Care in Preventive Service Use: A Regular Doctor versus a Regular Site
Xu, K Tom
2002-01-01
Objective To compare the effects of having a regular doctor and having a regular site on five preventive services, controlling for the endogeneity of having a usual source of care. Data Source The Medical Expenditure Panel Survey 1996 conducted by the Agency for Healthcare Research and Quality and the National Center for Health Statistics. Study Design Mammograms, pap smears, blood pressure checkups, cholesterol level checkups, and flu shots were examined. A modified behavioral model framework was presented, which controlled for the endogeneity of having a usual source of care. Based on this framework, a two-equation empirical model was established to predict the probabilities of having a regular doctor and having a regular site, and the use of each type of preventive service. Principal Findings Having a regular doctor was found to have a greater impact than having a regular site on discretional preventive services, such as blood pressure and cholesterol level checkups. No statistically significant differences were found between the effects of having a regular doctor and having a regular site on the use of flu shots, pap smears, and mammograms. Among the five preventive services, having a usual source of care had the greatest impact on cholesterol level checkups and pap smears. Conclusions Promoting a stable physician–patient relationship can improve patients' timely receipt of clinical prevention. For certain preventive services, having a regular doctor is more effective than having a regular site. PMID:12546284
Improvements in GRACE Gravity Fields Using Regularization
NASA Astrophysics Data System (ADS)
Save, H.; Bettadpur, S.; Tapley, B. D.
2008-12-01
The unconstrained global gravity field models derived from GRACE are susceptible to systematic errors that show up as broad "stripes" aligned in a North-South direction on global maps of mass flux. These errors are believed to be a consequence of both systematic and random errors in the data that are amplified by the nature of the gravity field inverse problem. They impede scientific exploitation of the GRACE data products and limit the realizable spatial resolution of the GRACE global gravity fields in certain regions. We use regularization techniques to reduce these "stripe" errors in the gravity field products. The regularization criteria are designed such that there is no attenuation of the signal and the solutions fit the observations as well as an unconstrained solution does. We have used a computationally inexpensive method, normally referred to as the "L-ribbon", to find the regularization parameter. This paper discusses the characteristics and statistics of a 5-year time series of regularized gravity field solutions. The solutions show markedly reduced stripes, are of uniformly good quality over time, and leave little or no systematic observation residuals (a frequent consequence of signal suppression under regularization). Up to degree 14, the signal in the regularized solutions shows correlation greater than 0.8 with the unregularized CSR Release-04 (RL04) solutions. Signals from events of large amplitude and small spatial extent, such as the Great Sumatra-Andaman Earthquake of 2004, are visible in the global solutions without the special post-facto error reduction techniques employed previously in the literature. Hydrological signals as small as 5 cm water-layer equivalent in small river basins, such as the Indus and Nile, are clearly evident, in contrast to the noisy estimates from RL04. The residual variability over the oceans relative to a seasonal fit is small except at higher latitudes, and is evident without the need for de-striping or
Oxygen saturation resolution influences regularity measurements.
Garde, Ainara; Karlen, Walter; Dehkordi, Parastoo; Ansermino, J Mark; Dumont, Guy A
2014-01-01
The measurement of regularity in the oxygen saturation (SpO2) signal has been suggested for use in identifying subjects with sleep disordered breathing (SDB). Previous work has shown that children with SDB have lower SpO2 regularity than subjects without SDB (NonSDB). Regularity was measured using non-linear methods such as approximate entropy (ApEn), sample entropy (SampEn) and Lempel-Ziv (LZ) complexity. Different manufacturers' pulse oximeters provide SpO2 at various resolutions, and the effect of this resolution difference on SpO2 regularity has not been studied. To investigate this effect, we used the SpO2 signal of children with and without SDB, recorded from the Phone Oximeter (0.1% resolution), and the same SpO2 signal rounded to the nearest integer (artificial 1% resolution). To further validate the effect of rounding, we also used the SpO2 signal (1% resolution) recorded simultaneously during polysomnography (PSG) as a control signal. We estimated SpO2 regularity by computing ApEn, SampEn and LZ complexity in a 5-min sliding window, and showed that different resolutions provided significantly different results. The regularity calculated at the 0.1% SpO2 resolution showed no significant differences between SDB and NonSDB. However, the artificial 1% resolution SpO2 did show significant differences between SDB and NonSDB, with a more random SpO2 pattern (lower SpO2 regularity) in SDB children, as suggested in the past. Similar results were obtained with the SpO2 recorded from PSG (1% resolution), which further validated that this change in regularity was due to the rounding effect. Therefore, SpO2 resolution has a great influence on regularity measurements such as ApEn, SampEn and LZ complexity, and should be considered when studying the SpO2 pattern in children with SDB. PMID:25570437
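The resolution effect the study describes is easy to reproduce on synthetic data: rounding a slowly varying signal to a coarser grid changes its entropy estimate. Below is a minimal sketch of sample entropy in its generic textbook form, applied to an invented SpO2-like trace and its integer-rounded copy; the signal, tolerance r, and template length m are arbitrary toy choices, not the study's pipeline:

```python
import math
import random

def sample_entropy(x, m=2, r=0.2):
    # SampEn = -ln(A/B), where B counts pairs of length-m templates matching
    # within Chebyshev distance r, and A counts length-(m+1) matches.
    # Self-matches are excluded by pairing only i < j.
    def count(mm):
        n = len(x) - mm + 1
        c = 0
        for i in range(n):
            for j in range(i + 1, n):
                if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r:
                    c += 1
        return c
    B, A = count(m), count(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

# Toy SpO2-like trace: small fluctuations around 97%
random.seed(0)
spo2 = [97.0 + 0.4 * math.sin(i / 5.0) + random.gauss(0, 0.1) for i in range(300)]
rounded = [float(round(v)) for v in spo2]  # artificial 1% resolution

print(sample_entropy(spo2), sample_entropy(rounded))  # rounding changes the estimate
```

Because rounding collapses sub-integer fluctuations, the matched-template counts change, shifting the entropy estimate in the direction the abstract reports.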
Modified sparse regularization for electrical impedance tomography.
Fan, Wenru; Wang, Huaxiang; Xue, Qian; Cui, Ziqiang; Sun, Benyuan; Wang, Qi
2016-03-01
Electrical impedance tomography (EIT) aims to estimate the electrical properties of the interior of an object from current-voltage measurements on its boundary. It has been widely investigated due to its advantages of low cost, absence of radiation, non-invasiveness, and high speed. Image reconstruction in EIT is a nonlinear and ill-posed inverse problem, so regularization techniques such as Tikhonov regularization are used to solve it. A sparse regularization based on the L1 norm is superior at preserving boundary information at sharp changes or discontinuous areas in the image. However, its main limitation is the computational time required to solve the problem. To further improve the calculation speed of sparse regularization, a modified method based on a separable approximation algorithm is proposed, using an adaptive step size and a preconditioning technique. Both simulation and experimental results show the effectiveness of the proposed method in improving image quality and real-time performance in the presence of different noise intensities and conductivity contrasts. PMID:27036798
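The generic form of such L1-regularized reconstruction (not the authors' adaptive-step, preconditioned variant) can be sketched with the classic iterative shrinkage-thresholding algorithm (ISTA), whose soft-thresholding step drives small coefficients exactly to zero; the matrix sizes, sparsity pattern, and penalty weight below are arbitrary toy choices:

```python
import numpy as np

def soft(v, thresh):
    # Soft thresholding: the proximal operator of the l1 norm
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

def ista(A, b, lam, iters=3000):
    # ISTA for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    t = 1.0 / np.linalg.norm(A, 2) ** 2   # step size 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft(x - t * A.T @ (A @ x - b), t * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))        # underdetermined forward operator
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]    # sparse "image"
b = A @ x_true

x_hat = ista(A, b, lam=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.1))  # indices of large coefficients
```

The fixed step size 1/L is what an adaptive-step scheme like the one in the paper would replace, trading this conservative guaranteed-convergent choice for faster practical progress.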
Assessment of regularization techniques for electrocardiographic imaging
Milanič, Matija; Jazbinšek, Vojko; MacLeod, Robert S.; Brooks, Dana H.; Hren, Rok
2014-01-01
A widely used approach to solving the inverse problem in electrocardiography involves computing potentials on the epicardium from electrocardiograms (ECGs) measured on the torso surface. The main challenge of solving this electrocardiographic imaging (ECGI) problem lies in its intrinsic ill-posedness. While many regularization techniques have been developed to control wild oscillations of the solution, the choice of proper regularization methods for obtaining clinically acceptable solutions is still a subject of ongoing research, and there has been little rigorous comparison across methods proposed by different groups. This study systematically compared various regularization techniques for solving the ECGI problem under a unified simulation framework, consisting of both 1) progressively more complex idealized source models (from a single dipole to a triplet of dipoles), and 2) an electrolytic human torso tank containing a live canine heart, with the cardiac source modeled by potentials measured on a cylindrical cage placed around the heart. We tested 13 different regularization techniques for the inverse problem of recovering epicardial potentials, and found that non-quadratic methods (total variation algorithms) and first-order and second-order Tikhonov regularizations outperformed the other methodologies, yielding similar average reconstruction errors. PMID:24369741
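The Tikhonov variants compared in such studies differ only in the operator penalized in min ||Ax - b||^2 + lambda^2 ||Lx||^2: zeroth order penalizes the solution itself (L = I), first and second order penalize its finite differences. A generic sketch of the closed-form solution via the normal equations, on an invented 1-D smoothing problem (the forward operator, noise level, and lambda here are illustrative assumptions, not the ECGI setup):

```python
import numpy as np

def diff_matrix(n, order):
    # L: identity (order 0), first-difference (order 1), second-difference (2)
    L = np.eye(n)
    for _ in range(order):
        L = np.diff(L, axis=0)   # each pass maps rows to consecutive differences
    return L

def tikhonov(A, b, lam, order=0):
    # Solve the normal equations (A^T A + lam^2 L^T L) x = A^T b
    L = diff_matrix(A.shape[1], order)
    return np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ b)

# Severely ill-conditioned toy forward operator: discrete Gaussian blur
n = 50
A = np.array([[np.exp(-((i - j) / 3.0) ** 2) for j in range(n)] for i in range(n)])
x_true = np.sin(np.linspace(0, np.pi, n))
rng = np.random.default_rng(1)
b = A @ x_true + 0.01 * rng.standard_normal(n)

for order in (0, 1, 2):
    x = tikhonov(A, b, lam=0.05, order=order)
    print(order, np.linalg.norm(x - x_true))
```

With an ill-conditioned A, shrinking lambda toward zero lets the noise dominate the recovered solution, which is the wild-oscillation behavior regularization is meant to control.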
Perturbations in a regular bouncing universe
Battefeld, T.J.; Geshnizjani, G.
2006-03-15
We consider a simple toy model of a regular bouncing universe. The bounce is caused by an extra timelike dimension, which leads to a sign flip of the ρ^2 term in the effective four-dimensional Randall-Sundrum-like description. We find a wide class of possible bounces: big-bang-avoiding ones for regular matter content, and big-rip-avoiding ones for phantom matter. Focusing on radiation as the matter content, we discuss the evolution of scalar, vector and tensor perturbations. We compute a spectral index of n_s = -1 for scalar perturbations and a deep blue index for tensor perturbations after invoking vacuum initial conditions, ruling out such a model as a realistic one. We also find that the spectrum (evaluated at Hubble crossing) is sensitive to the bounce. We conclude that it is challenging, but not impossible, for cyclic/ekpyrotic models to succeed, if one can find a regularized version.
Shadow of rotating regular black holes
NASA Astrophysics Data System (ADS)
Abdujabbarov, Ahmadjon; Amir, Muhammed; Ahmedov, Bobomurat; Ghosh, Sushant G.
2016-05-01
We study the shadows cast by different types of rotating regular black holes, viz. Ayón-Beato-García (ABG), Hayward, and Bardeen. In addition to the total mass (M) and rotation parameter (a), these black holes carry further parameters: electric charge (Q), deviation parameter (g), and magnetic charge (g*). Interestingly, the size of the shadow is affected by these parameters in addition to the rotation parameter. We find that the radius of the shadow in each case decreases monotonically, and the distortion parameter increases, as the values of these parameters increase. A comparison with the standard Kerr case is also presented. We have also studied the influence of a plasma environment around regular black holes on the shadow. The presence of the plasma increases the apparent size of the regular black hole's shadow due to two effects: (i) gravitational redshift of the photons and (ii) the radial dependence of the plasma density.
Strong regularizing effect of integrable systems
Zhou, Xin
1997-11-01
Many time-evolution problems have the so-called strong regularization effect: for any irregular initial data, as soon as t becomes greater than 0 the solution becomes C^∞ in both the spatial and temporal variables. This paper studies 1 x 1 dimensional integrable systems for such a regularizing effect. In the work by Sachs and Kappler [S][K] (see also the earlier works [KFJ] and [Ka]), the strong regularizing effect is proved for KdV with rapidly decaying irregular initial data, using the inverse scattering method. There are two equivalent Gel'fand-Levitan-Marchenko (GLM) equations associated to an inverse scattering problem, one normalized at x = +∞ and the other at x = -∞. The method of [S][K] relies on the fact that the KdV waves propagate only in one direction, and therefore one of the two GLM equations remains normalized and can be differentiated infinitely many times. 15 refs.
Regularized image recovery in scattering media.
Schechner, Yoav Y; Averbuch, Yuval
2007-09-01
When imaging in scattering media, visibility degrades as objects become more distant. Visibility can be significantly restored by computer vision methods that account for physical processes occurring during image formation. Nevertheless, such recovery is prone to noise amplification in pixels corresponding to distant objects, where the medium transmittance is low. We present an adaptive filtering approach that counters the above problems: while significantly improving visibility relative to raw images, it inhibits noise amplification. Essentially, the recovery formulation is regularized, where the regularization adapts to the spatially varying medium transmittance. Thus, this regularization does not blur close objects. We demonstrate the approach in atmospheric and underwater experiments, based on an automatic method for determining the medium transmittance. PMID:17627052
[Why regular physical activity favors longevity].
Pentimone, F; Del Corso, L
1998-06-01
Regular physical exercise is useful at all ages. In the elderly, even a gentle exercise programme of walking, bicycling, or playing golf, if performed consistently, increases longevity by preventing the onset of major diseases or alleviating the handicaps they may have caused. Cardiovascular diseases, the main cause of death in the elderly, and osteoporosis, a disabling disease potentially capable of shortening life expectancy, both benefit from physical exercise, which, if performed regularly well before the start of old age, may help to prevent them. Over the past few years there has been growing evidence of concrete protection against neoplasia and even against the ageing process itself. PMID:9739351
Learning with regularizers in multilayer neural networks
NASA Astrophysics Data System (ADS)
Saad, David; Rattray, Magnus
1998-02-01
We study the effect of regularization in an on-line gradient-descent learning scenario for a general two-layer student network with an arbitrary number of hidden units. Training examples are randomly drawn input vectors labeled by a two-layer teacher network with an arbitrary number of hidden units that may be corrupted by Gaussian output noise. We examine the effect of weight decay regularization on the dynamical evolution of the order parameters and generalization error in various phases of the learning process, in both noiseless and noisy scenarios.
Regular transport dynamics produce chaotic travel times.
Villalobos, Jorge; Muñoz, Víctor; Rogan, José; Zarama, Roberto; Johnson, Neil F; Toledo, Benjamín; Valdivia, Juan Alejandro
2014-06-01
In the hope of making passenger travel times shorter and more reliable, many cities are introducing dedicated bus lanes (e.g., Bogota, London, Miami). Here we show that chaotic travel times are actually a natural consequence of individual bus function, and hence of public transport systems more generally, i.e., chaotic dynamics emerge even when the route is empty and straight, stops and lights are equidistant and regular, and loading times are negligible. More generally, our findings provide a novel example of chaotic dynamics emerging from a single object following Newton's laws of motion in a regularized one-dimensional system. PMID:25019866
Demosaicing as the problem of regularization
NASA Astrophysics Data System (ADS)
Kunina, Irina; Volkov, Aleksey; Gladilin, Sergey; Nikolaev, Dmitry
2015-12-01
Demosaicing is the process of reconstructing a full-color image from the Bayer mosaic used in digital cameras for image formation. This problem is usually considered an interpolation problem. In this paper, we propose to consider demosaicing as the problem of solving an underdetermined system of algebraic equations using regularization methods. We consider regularization with standard l1/2-, l1- and l2-norms and its effect on image reconstruction quality. The experimental results showed that the proposed technique can both be used in existing methods and serve as the basis for new ones.
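The regularized view described above amounts to solving an underdetermined linear system. A minimal Tikhonov (l2) sketch on an invented toy system, standing in for the mosaic sampling operator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Underdetermined system A x = b (fewer measurements than unknowns),
# as in demosaicing where each sensor pixel observes one color channel.
m, n = 8, 16
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true

# Tikhonov (l2) regularization: minimize ||Ax - b||^2 + lam ||x||^2,
# with the closed form x = (A^T A + lam I)^{-1} A^T b.
lam = 1e-3
x_l2 = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

residual = np.linalg.norm(A @ x_l2 - b)
```

With lam small, the solution approaches the minimum-norm solution, so the data residual stays near zero while the regularizer selects among the infinitely many consistent solutions.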
REGULAR VERSUS DIFFUSIVE PHOTOSPHERIC FLUX CANCELLATION
Litvinenko, Yuri E.
2011-04-20
Observations of photospheric flux cancellation on the Sun imply that cancellation can be a diffusive rather than regular process. A criterion is derived, which quantifies the parameter range in which diffusive photospheric cancellation should occur. Numerical estimates show that regular cancellation models should be expected to give a quantitatively accurate description of photospheric cancellation. The estimates rely on a recently suggested scaling for a turbulent magnetic diffusivity, which is consistent with the diffusivity measurements on spatial scales varying by almost two orders of magnitude. Application of the turbulent diffusivity to large-scale dispersal of the photospheric magnetic flux is discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-21
... Notice of Determination was published in the Federal Register on August 30, 2010 (75 FR 52986). Workers... Employment and Training Administration Humana Insurance Company, a Division Of Carenetwork, Inc., Green Bay..., Inc., Green Bay, Wisconsin was based on the findings that the subject firm did not, during the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-13
... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF JUSTICE Antitrust Division United States v. Humana Inc. and Arcadian Management Services, Inc.; Public Comment and Response on Proposed Final Judgment Pursuant to the Antitrust Procedures and Penalties Act, 15 U.S.C. 16(b)-(h), the United States hereby...
12 CFR 725.3 - Regular membership.
Code of Federal Regulations, 2014 CFR
2014-01-01
... and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS NATIONAL CREDIT UNION ADMINISTRATION CENTRAL LIQUIDITY FACILITY § 725.3 Regular membership. (a) A natural person credit... the credit union's paid-in and unimpaired capital and surplus, as determined in accordance with §...
12 CFR 725.3 - Regular membership.
Code of Federal Regulations, 2011 CFR
2011-01-01
... and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS NATIONAL CREDIT UNION ADMINISTRATION CENTRAL LIQUIDITY FACILITY § 725.3 Regular membership. (a) A natural person credit... the credit union's paid-in and unimpaired capital and surplus, as determined in accordance with §...
12 CFR 725.3 - Regular membership.
Code of Federal Regulations, 2010 CFR
2010-01-01
... and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS NATIONAL CREDIT UNION ADMINISTRATION CENTRAL LIQUIDITY FACILITY § 725.3 Regular membership. (a) A natural person credit... the credit union's paid-in and unimpaired capital and surplus, as determined in accordance with §...
12 CFR 725.3 - Regular membership.
Code of Federal Regulations, 2012 CFR
2012-01-01
... and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS NATIONAL CREDIT UNION ADMINISTRATION CENTRAL LIQUIDITY FACILITY § 725.3 Regular membership. (a) A natural person credit... the credit union's paid-in and unimpaired capital and surplus, as determined in accordance with §...
Commitment and Dependence Upon Regular Running.
ERIC Educational Resources Information Center
Sachs, Michael L.; Pargman, David
The linear relationship between intellectual commitment to running and psychobiological dependence upon running is examined. A sample of 540 regular runners (running frequency greater than three days per week for the past year for the majority) was surveyed with a questionnaire. Measures of commitment and dependence on running, as well as…
RBOOST: RIEMANNIAN DISTANCE BASED REGULARIZED BOOSTING.
Liu, Meizhu; Vemuri, Baba C
2011-03-30
Boosting is a versatile machine learning technique that has numerous applications, including but not limited to image processing, computer vision, and data mining. It is based on the premise that the classification performance of a set of weak learners can be boosted by some weighted combination of them. There have been a number of boosting methods proposed in the literature, such as AdaBoost, LPBoost, SoftBoost and their variations. However, the learning update strategies used in these methods usually lead to overfitting and instabilities in the classification accuracy. Improved boosting methods via regularization can overcome such difficulties. In this paper, we propose a Riemannian distance regularized LPBoost, dubbed RBoost. RBoost uses the Riemannian distance between two square-root densities (in closed form) - used to represent the distribution over the training data and the classification error, respectively - to regularize the error distribution in an iterative update formula. Since this distance is in closed form, RBoost requires much less computational cost compared to other regularized boosting algorithms. We present several experimental results depicting the performance of our algorithm in comparison to recently published methods, LPBoost and CAVIAR, on a variety of datasets including the publicly available OASIS database, a home-grown epilepsy database and the well-known UCI repository. Results show that the RBoost algorithm performs better than the competing methods in terms of accuracy and efficiency. PMID:21927643
Generalisation of Regular and Irregular Morphological Patterns.
ERIC Educational Resources Information Center
Prasada, Sandeep; Pinker, Steven
1993-01-01
When it comes to explaining English verbs' patterns of regular and irregular generalization, single-network theories have difficulty with the former, and rule-only theories with the latter. Linguistic and psycholinguistic evidence, based on observation during experiments and simulations in morphological pattern generation, independently call…
Observing Special and Regular Education Classrooms.
ERIC Educational Resources Information Center
Hersh, Susan B.
The paper describes an observation instrument originally developed as a research tool to assess both the special setting and the regular classroom. The instrument can also be used in determining appropriate placement for students with learning disabilities and for programming the transfer of skills learned in the special setting to the regular…
Starting flow in regular polygonal ducts
NASA Astrophysics Data System (ADS)
Wang, C. Y.
2016-06-01
The starting flows in regular polygonal ducts of S = 3, 4, 5, 6, 8 sides are determined by the method of eigenfunction superposition. The necessary S-fold symmetric eigenfunctions and eigenvalues of the Helmholtz equation are found either exactly or by boundary point match. The results show the starting time is governed by the first eigenvalue.
28 CFR 540.44 - Regular visitors.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PERSONS IN THE COMMUNITY Visiting Regulations § 540.44 Regular visitors. An inmate desiring to have... ordinarily will be extended to friends and associates having an established relationship with the inmate... of the institution. Exceptions to the prior relationship rule may be made, particularly for...
Regular Classroom Teachers' Perceptions of Mainstreaming Effects.
ERIC Educational Resources Information Center
Ringlaben, Ravic P.; Price, Jay R.
To assess regular classroom teachers' perceptions of mainstreaming, a 22 item questionnaire was completed by 117 teachers (K through 12). Among results were that nearly half of the Ss indicated a lack of preparation for implementing mainstreaming; 47% tended to be very willing to accept mainstreamed students; 42% said mainstreaming was working…
Regularizing cosmological singularities by varying physical constants
Dąbrowski, Mariusz P.; Marosek, Konrad
2013-02-01
Varying physical constant cosmologies were claimed to solve standard cosmological problems such as the horizon, the flatness and the Λ-problem. In this paper, we suggest yet another possible application of these theories: solving the singularity problem. By specifying some examples we show that various cosmological singularities may be regularized provided the physical constants evolve in time in an appropriate way.
Exploring the structural regularities in networks
NASA Astrophysics Data System (ADS)
Shen, Hua-Wei; Cheng, Xue-Qi; Guo, Jia-Feng
2011-11-01
In this paper, we consider the problem of exploring structural regularities of networks by dividing the nodes of a network into groups such that the members of each group have similar patterns of connections to other groups. Specifically, we propose a general statistical model to describe network structure. In this model, a group is viewed as a hidden or unobserved quantity and it is learned by fitting the observed network data using the expectation-maximization algorithm. Compared with existing models, the most prominent strength of our model is the high flexibility. This strength enables it to possess the advantages of existing models and to overcome their shortcomings in a unified way. As a result, not only can broad types of structure be detected without prior knowledge of the type of intrinsic regularities existing in the target network, but also the type of identified structure can be directly learned from the network. Moreover, by differentiating outgoing edges from incoming edges, our model can detect several types of structural regularities beyond competing models. Tests on a number of real world and artificial networks demonstrate that our model outperforms the state-of-the-art model in shedding light on the structural regularities of networks, including the overlapping community structure, multipartite structure, and several other types of structure, which are beyond the capability of existing models.
Dyslexia in Regular Orthographies: Manifestation and Causation
ERIC Educational Resources Information Center
Wimmer, Heinz; Schurz, Matthias
2010-01-01
This article summarizes our research on the manifestation of dyslexia in German and on cognitive deficits, which may account for the severe reading speed deficit and the poor orthographic spelling performance that characterize dyslexia in regular orthographies. An only limited causal role of phonological deficits (phonological awareness,…
Regularities in Spearman's Law of Diminishing Returns.
ERIC Educational Resources Information Center
Jensen, Arthur R.
2003-01-01
Examined the assumption that Spearman's law acts unsystematically and approximately uniformly for various subtests of cognitive ability in an IQ test battery when high- and low-ability IQ groups are selected. Data from national standardization samples for Wechsler adult and child IQ tests affirm regularities in Spearman's "Law of Diminishing…
Fast Image Reconstruction with L2-Regularization
Bilgic, Berkin; Chatnuntawech, Itthi; Fan, Audrey P.; Setsompop, Kawin; Cauley, Stephen F.; Wald, Lawrence L.; Adalsteinsson, Elfar
2014-01-01
Purpose: We introduce L2-regularized reconstruction algorithms with closed-form solutions that achieve dramatic computational speed-up relative to state-of-the-art L1- and L2-based iterative algorithms while maintaining similar image quality for various applications in MRI reconstruction. Materials and Methods: We compare fast L2-based methods to state-of-the-art algorithms employing iterative L1- and L2-regularization in numerical phantom and in vivo data in three applications: (1) fast Quantitative Susceptibility Mapping (QSM), (2) lipid artifact suppression in Magnetic Resonance Spectroscopic Imaging (MRSI), and (3) Diffusion Spectrum Imaging (DSI). In all cases, the proposed L2-based methods are compared with the state-of-the-art algorithms, and a two-to-three-order-of-magnitude speed-up is demonstrated with similar reconstruction quality. Results: The closed-form solution developed for regularized QSM allows processing of a 3D volume in under 5 seconds, the proposed lipid suppression algorithm takes under 1 second to reconstruct single-slice MRSI data, while the PCA-based DSI algorithm estimates diffusion propagators from undersampled q-space for a single slice in under 30 seconds, all running in Matlab on a standard workstation. Conclusion: For the applications considered herein, closed-form L2-regularization can be a faster alternative to its iterative counterpart or L1-based iterative algorithms, without compromising image quality. PMID:24395184
Handicapped Children in the Regular Classroom.
ERIC Educational Resources Information Center
Fountain Valley School District, CA.
Reported was a project in which 60 educable mentally retarded (EMR) and 30 educationally handicapped (EH) elementary school students were placed in regular classrooms to determine whether they could be effectively educated in those settings. Effective education was defined in terms of improvement in reading, mathematics, student and teacher…
Learning regular expressions for clinical text classification
Bui, Duy Duc An; Zeng-Treitler, Qing
2014-01-01
Objectives Natural language processing (NLP) applications typically use regular expressions that have been developed manually by human experts. Our goal is to automate both the creation and utilization of regular expressions in text classification. Methods We designed a novel regular expression discovery (RED) algorithm and implemented two text classifiers based on RED. The RED+ALIGN classifier combines RED with an alignment algorithm, and RED+SVM combines RED with a support vector machine (SVM) classifier. Two clinical datasets were used for testing and evaluation: the SMOKE dataset, containing 1091 text snippets describing smoking status; and the PAIN dataset, containing 702 snippets describing pain status. We performed 10-fold cross-validation to calculate accuracy, precision, recall, and F-measure metrics. In the evaluation, an SVM classifier was trained as the control. Results The two RED classifiers achieved 80.9–83.0% in overall accuracy on the two datasets, which is 1.3–3% higher than SVM's accuracy (p<0.001). Similarly, small but consistent improvements have been observed in precision, recall, and F-measure when RED classifiers are compared with SVM alone. More significantly, RED+ALIGN correctly classified many instances that were misclassified by the SVM classifier (8.1–10.3% of the total instances and 43.8–53.0% of SVM's misclassifications). Conclusions Machine-generated regular expressions can be effectively used in clinical text classification. The regular expression-based classifier can be combined with other classifiers, like SVM, to improve classification performance. PMID:24578357
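A toy illustration of regex-based snippet classification in the spirit of the abstract. The patterns below are hand-invented stand-ins, not output of the RED algorithm:

```python
import re

# Hypothetical hand-written patterns standing in for machine-discovered
# regular expressions (the paper's RED algorithm learns such patterns from data).
patterns = {
    "smoker": [r"\bsmokes\b", r"\bcurrent smoker\b", r"\bpacks? per day\b"],
    "non-smoker": [r"\bdenies smoking\b", r"\bnever smoked\b", r"\bquit smoking\b"],
}

def classify(snippet: str) -> str:
    """Return the first label whose patterns match the snippet, else 'unknown'."""
    text = snippet.lower()
    for label, regexes in patterns.items():
        if any(re.search(p, text) for p in regexes):
            return label
    return "unknown"
```

Under these toy patterns, classify("Patient smokes two packs per day.") returns "smoker", while a snippet matching no pattern falls through to "unknown"; a learned-pattern system replaces the hand-written lists with discovered expressions.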
NASA Astrophysics Data System (ADS)
Lanteri, Henri; Roche, Muriel; Cuevas, Olga; Aime, Claude
1999-12-01
We propose regularized versions of Maximum Likelihood algorithms for Poisson processes with a non-negativity constraint. For such processes, the best-known (non-regularized) algorithm is that of Richardson-Lucy, extensively used for astronomical applications. Regularization is necessary to prevent an amplification of the noise during the iterative reconstruction; this can be done either by limiting the iteration number or by introducing a penalty term. In this Communication, we focus our attention on explicit regularization using Tikhonov (identity and Laplacian operator) or entropy terms (Kullback-Leibler and Csiszar divergences). The algorithms are established from the Kuhn-Tucker first-order optimality conditions for the minimization of the Lagrange function and from the method of successive substitutions. The algorithms may be written in a `product form'. Numerical illustrations are given for simulated images corrupted by photon noise. The effects of the regularization are shown in the Fourier plane. The tests we have made indicate that a noticeable improvement of the results may be obtained for some of these explicitly regularized algorithms. We also show that a comparison with a Wiener filter can give the optimal regularizing conditions (operator and strength).
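A minimal sketch of a product-form, Tikhonov-regularized Richardson-Lucy iteration on a toy 1D deconvolution problem. All values are invented; this illustrates the general idea, not the authors' specific algorithms:

```python
import numpy as np

# 1D deconvolution: blur a sparse positive signal with a Gaussian kernel,
# then restore it with Richardson-Lucy damped by a Tikhonov (identity) penalty.
n = 64
x_true = np.zeros(n)
x_true[[15, 40]] = [5.0, 3.0]

kernel = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2)
kernel /= kernel.sum()                     # normalized PSF

def blur(v):
    return np.convolve(v, kernel, mode="same")

y = blur(x_true)                           # noiseless observation for simplicity

lam = 1e-3                                 # regularization strength
x = np.ones(n)                             # positive initial guess
for _ in range(200):
    ratio = y / np.maximum(blur(x), 1e-12)
    # Product-form update: multiplicative RL step damped by the penalty term.
    # The kernel is symmetric, so blur() also serves as the adjoint (up to edges).
    x = x * blur(ratio) / (1.0 + lam * x)

restored_error = np.linalg.norm(blur(x) - y)
```

The multiplicative form preserves non-negativity automatically, which is the constraint the abstract emphasizes; the 1 + lam * x denominator is the extra damping contributed by the identity-operator penalty.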
42 CFR 61.3 - Purpose of regular fellowships.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Purpose of regular fellowships. 61.3 Section 61.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.3 Purpose of regular fellowships. Regular fellowships...
42 CFR 61.3 - Purpose of regular fellowships.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 1 2011-10-01 2011-10-01 false Purpose of regular fellowships. 61.3 Section 61.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.3 Purpose of regular fellowships. Regular fellowships...
42 CFR 61.3 - Purpose of regular fellowships.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 1 2014-10-01 2014-10-01 false Purpose of regular fellowships. 61.3 Section 61.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.3 Purpose of regular fellowships. Regular fellowships...
42 CFR 61.3 - Purpose of regular fellowships.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 1 2012-10-01 2012-10-01 false Purpose of regular fellowships. 61.3 Section 61.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.3 Purpose of regular fellowships. Regular fellowships...
42 CFR 61.3 - Purpose of regular fellowships.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 1 2013-10-01 2013-10-01 false Purpose of regular fellowships. 61.3 Section 61.3 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.3 Purpose of regular fellowships. Regular fellowships...
Modeling Regular Replacement for String Constraint Solving
NASA Technical Reports Server (NTRS)
Fu, Xiang; Li, Chung-Chih
2010-01-01
Bugs in user input sanitization of software systems often lead to vulnerabilities. Many of them are caused by improper use of regular replacement. This paper presents a precise modeling of various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both the forward and backward image computation in model checking and symbolic execution of text processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FSTs. A compact representation of FSTs is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications.
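The greedy versus reluctant replacement semantics mentioned above can be observed directly in an ordinary regex engine (a plain illustration, unrelated to the paper's FST encoding):

```python
import re

s = "aaa"

# Greedy 'a+' consumes the whole run in one match, so one replacement happens.
greedy = re.sub(r"a+", "X", s)
# Reluctant 'a+?' matches as little as possible, so each 'a' is replaced separately.
reluctant = re.sub(r"a+?", "X", s)
```

The two calls produce different outputs on the same input, which is exactly why a precise model of the chosen semantics matters when reasoning about sanitization code.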
Generalized Higher Degree Total Variation (HDTV) Regularization
Hu, Yue; Ongie, Greg; Ramani, Sathish; Jacob, Mathews
2015-01-01
We introduce a family of novel image regularization penalties called generalized higher degree total variation (HDTV). These penalties further extend our previously introduced HDTV penalties, which generalize the popular total variation (TV) penalty to incorporate higher degree image derivatives. We show that many of the proposed second degree extensions of TV are special cases or are closely approximated by a generalized HDTV penalty. Additionally, we propose a novel fast alternating minimization algorithm for solving image recovery problems with HDTV and generalized HDTV regularization. The new algorithm enjoys a ten-fold speed up compared to the iteratively reweighted majorize minimize algorithm proposed in a previous work. Numerical experiments on 3D magnetic resonance images and 3D microscopy images show that HDTV and generalized HDTV improve the image quality significantly compared with TV. PMID:24710832
Convex nonnegative matrix factorization with manifold regularization.
Hu, Wenjun; Choi, Kup-Sze; Wang, Peiliang; Jiang, Yunliang; Wang, Shitong
2015-03-01
Nonnegative Matrix Factorization (NMF) has been extensively applied in many areas, including computer vision, pattern recognition, text mining, and signal processing. However, nonnegative entries are usually required for the data matrix in NMF, which limits its application. Besides, while the basis and encoding vectors obtained by NMF can represent the original data in low dimension, the representations do not always reflect the intrinsic geometric structure embedded in the data. Motivated by manifold learning and Convex NMF (CNMF), we propose a novel matrix factorization method called Graph Regularized and Convex Nonnegative Matrix Factorization (GCNMF) by introducing a graph regularized term into CNMF. The proposed matrix factorization technique not only inherits the intrinsic low-dimensional manifold structure, but also allows the processing of mixed-sign data matrix. Clustering experiments on nonnegative and mixed-sign real-world data sets are conducted to demonstrate the effectiveness of the proposed method. PMID:25523040
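For context, the multiplicative-update NMF family that GCNMF extends can be sketched as follows. This is plain NMF on invented data; the convex combination and graph-regularized terms of the paper are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)

# Approximate a nonnegative data matrix X by W H with multiplicative updates.
X = rng.random((10, 8))
k = 3
W = rng.random((10, k))
H = rng.random((k, 8))

eps = 1e-9  # guard against division by zero
for _ in range(200):
    # Lee-Seung style updates: ratios of nonnegative terms keep W, H >= 0.
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

Because every factor in the updates is nonnegative, nonnegativity of W and H is preserved without projection; a graph-regularized variant adds a Laplacian-based term to the H update to respect the data manifold.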
Charged fermions tunneling from regular black holes
Sharif, M.; Javed, W.
2012-11-15
We study Hawking radiation of charged fermions as a tunneling process from charged regular black holes, i.e., the Bardeen and ABGB black holes. For this purpose, we apply the semiclassical WKB approximation to the general covariant Dirac equation for charged particles and evaluate the tunneling probabilities. We recover the Hawking temperature corresponding to these charged regular black holes. Further, we consider the back-reaction effects of the emitted spin particles from black holes and calculate their corresponding quantum corrections to the radiation spectrum. We find that this radiation spectrum is not purely thermal due to the energy and charge conservation but has some corrections. In the absence of charge, e = 0, our results are consistent with those already present in the literature.
A regular version of Smilansky model
Barseghyan, Diana; Exner, Pavel
2014-04-15
We discuss a modification of the Smilansky model in which a singular potential “channel” is replaced by a regular potential, unbounded from below, which shrinks as it becomes deeper. We demonstrate that, similarly to the original model, such a system exhibits a spectral transition with respect to the coupling constant, and determine the critical value above which a new spectral branch opens. The result is generalized to situations with multiple potential “channels”.
A regularization approach to hydrofacies delineation
Wohlberg, Brendt; Tartakovsky, Daniel
2009-01-01
We consider an inverse problem of identifying complex internal structures of composite (geological) materials from sparse measurements of system parameters and system states. Two conceptual frameworks for identifying internal boundaries between constitutive materials in a composite are considered. A sequential approach relies on support vector machines, nearest neighbor classifiers, or geostatistics to reconstruct boundaries from measurements of system parameters and then uses system states data to refine the reconstruction. A joint approach inverts the two data sets simultaneously by employing a regularization approach.
Optical tomography by means of regularized MLEM
NASA Astrophysics Data System (ADS)
Majer, Charles L.; Urbanek, Tina; Peter, Jörg
2015-09-01
To solve the inverse problem involved in fluorescence-mediated tomography, a regularized maximum likelihood expectation maximization (MLEM) reconstruction strategy is proposed. This technique has recently been applied to reconstruct galaxy clusters in astronomy and is adopted here. The MLEM algorithm is implemented as a Richardson-Lucy (RL) scheme and includes entropic regularization and a floating default prior. Hence, the strategy is very robust against measurement noise and also avoids converging into noise patterns. Normalized Gaussian filtering with fixed standard deviation is applied for the floating default kernel. The reconstruction strategy is investigated using the XFM-2 homogeneous mouse phantom (Caliper LifeSciences Inc., Hopkinton, MA) with known optical properties. Prior to optical imaging, X-ray CT tomographic data of the phantom were acquired to provide structural context. Phantom inclusions were fitted with various fluorochrome inclusions (Cy5.5), for which optical data at 60 projections over 360 degrees have been acquired. Fluorochrome excitation has been accomplished by scanning laser point illumination in transmission mode (laser opposite to camera). Following data acquisition, a 3D triangulated mesh is derived from the reconstructed CT data, which is then matched with the various optical projection images through 2D linear interpolation, correlation and Fourier transformation in order to assess translational and rotational deviations between the optical and CT imaging systems. Preliminary results indicate that the proposed regularized MLEM algorithm, when driven with a constant initial condition, yields reconstructed images that tend to be smoother in comparison to classical MLEM without regularization. Once the floating default prior is included, this bias was significantly reduced.
Sparse regularization for force identification using dictionaries
NASA Astrophysics Data System (ADS)
Qiao, Baijie; Zhang, Xingwu; Wang, Chenxi; Zhang, Hang; Chen, Xuefeng
2016-04-01
The classical function expansion method based on minimizing the l2-norm of the response residual employs various basis functions to represent the unknown force. Its difficulty lies in determining the optimum number of basis functions. Considering the sparsity of force in the time domain or in other basis spaces, we develop a general sparse regularization method based on minimizing the l1-norm of the coefficient vector of basis functions. The number of basis functions is adaptively determined by minimizing the number of nonzero components in the coefficient vector during the sparse regularization process. First, according to the profile of the unknown force, the dictionary composed of basis functions is determined. Second, a sparsity convex optimization model for force identification is constructed. Third, given the transfer function and the operational response, sparse reconstruction by separable approximation (SpaRSA) is developed to solve the sparse regularization problem of force identification. Finally, experiments including identification of impact and harmonic forces are conducted on a cantilever thin plate structure to illustrate the effectiveness and applicability of SpaRSA. Besides the Dirac dictionary, three other sparse dictionaries, including Db6 wavelets, Sym4 wavelets and cubic B-spline functions, can also accurately identify both the single and double impact forces from highly noisy responses in a sparse representation frame. The discrete cosine functions can also successfully reconstruct the harmonic forces, including the sinusoidal, square and triangular forces. Conversely, the traditional Tikhonov regularization method with the L-curve criterion fails to identify both the impact and harmonic forces in these cases.
Regularization Parameter Selections via Generalized Information Criterion
Zhang, Yiyun; Li, Runze; Tsai, Chih-Ling
2009-01-01
We apply the nonconcave penalized likelihood approach to obtain variable selections as well as shrinkage estimators. This approach relies heavily on the choice of regularization parameter, which controls the model complexity. In this paper, we propose employing the generalized information criterion (GIC), encompassing the commonly used Akaike information criterion (AIC) and Bayesian information criterion (BIC), for selecting the regularization parameter. Our proposal makes a connection between the classical variable selection criteria and the regularization parameter selections for the nonconcave penalized likelihood approaches. We show that the BIC-type selector enables identification of the true model consistently, and the resulting estimator possesses the oracle property in the terminology of Fan and Li (2001). In contrast, however, the AIC-type selector tends to overfit with positive probability. We further show that the AIC-type selector is asymptotically loss efficient, while the BIC-type selector is not. Our simulation results confirm these theoretical findings, and an empirical example is presented. Some technical proofs are given in the online supplementary material. PMID:20676354
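A toy sketch of selecting a regularization parameter by a BIC-type criterion, here for ridge regression with effective degrees of freedom taken as the trace of the hat matrix (illustrative only; the paper treats nonconcave penalized likelihood, and all data here are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Linear model with noise; choose the ridge penalty by a BIC-type criterion:
# BIC(lam) = n * log(RSS/n) + log(n) * df(lam), df = trace of the hat matrix.
n, p = 100, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [1.5, -2.0, 1.0]
y = X @ beta + 0.5 * rng.standard_normal(n)

def bic(lam):
    # Hat matrix of ridge regression for this penalty value.
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    rss = resid @ resid
    df = np.trace(H)          # effective model complexity
    return n * np.log(rss / n) + np.log(n) * df

grid = [0.01, 0.1, 1.0, 10.0, 100.0]
best_lam = min(grid, key=bic)
```

Replacing the log(n) factor with 2 recovers an AIC-type selector, which is the contrast between consistency and overfitting that the abstract analyzes.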
Regularity theory for general stable operators
NASA Astrophysics Data System (ADS)
Ros-Oton, Xavier; Serra, Joaquim
2016-06-01
We establish sharp regularity estimates for solutions to Lu = f in Ω ⊂ R^n, L being the generator of any stable and symmetric Lévy process. Such nonlocal operators L depend on a finite measure on S^{n-1}, called the spectral measure. First, we study the interior regularity of solutions to Lu = f in B_1. We prove that if f is C^α then u belongs to C^{α+2s} whenever α + 2s is not an integer. In case f ∈ L^∞, we show that the solution u is C^{2s} when s ≠ 1/2, and C^{2s-ε} for all ε > 0 when s = 1/2. Then, we study the boundary regularity of solutions to Lu = f in Ω, u = 0 in R^n ∖ Ω, in C^{1,1} domains Ω. We show that solutions u satisfy u/d^s ∈ C^{s-ε}(Ω̄) for all ε > 0, where d is the distance to ∂Ω. Finally, we show that our results are sharp by constructing two counterexamples.
Regular language constrained sequence alignment revisited.
Kucherov, Gregory; Pinhas, Tamar; Ziv-Ukelson, Michal
2011-05-01
Imposing constraints in the form of a finite automaton or a regular expression is an effective way to incorporate additional a priori knowledge into sequence alignment procedures. With this motivation, the Regular Expression Constrained Sequence Alignment Problem was introduced, along with an O(n²t⁴) time and O(n²t²) space algorithm for solving it, where n is the length of the input strings and t is the number of states in the input non-deterministic automaton. A faster O(n²t³) time algorithm for the same problem was subsequently proposed. In this article, we further speed up the algorithms for Regular Language Constrained Sequence Alignment by reducing their worst case time complexity bound to O(n²t³/log t). This is done by establishing an optimal bound on the size of Straight-Line Programs solving the maxima computation subproblem of the basic dynamic programming algorithm. We also study another solution based on a Steiner Tree computation. While it does not improve the worst case, our simulations show that both approaches are efficient in practice, especially when the input automata are dense. PMID:21554020
Regular surface layer of Azotobacter vinelandii.
Bingle, W H; Doran, J L; Page, W J
1984-01-01
Washing Azotobacter vinelandii UW1 with Burk buffer or heating cells at 42 degrees C exposed a regular surface layer which was effectively visualized by freeze-etch electron microscopy. This layer was composed of tetragonally arranged subunits separated by a center-to-center spacing of approximately 10 nm. Cells washed with distilled water to remove an acidic major outer membrane protein with a molecular weight of 65,000 did not possess the regular surface layer. This protein, designated the S protein, specifically reattached to the surface of distilled-water-washed cells in the presence of the divalent calcium, magnesium, strontium, or beryllium cations. All of these cations except beryllium supported reassembly of the S protein into a regular tetragonal array. Although the surface localization of the S protein has been demonstrated, radioiodination of exposed envelope proteins in whole cells did not confirm this. The labeling behavior of the S protein could be explained on the basis of varying accessibilities of different tyrosine residues to iodination. PMID:6735982
Discovering Structural Regularity in 3D Geometry
Pauly, Mark; Mitra, Niloy J.; Wallner, Johannes; Pottmann, Helmut; Guibas, Leonidas J.
2010-01-01
We introduce a computational framework for discovering regular or repeated geometric structures in 3D shapes. We describe and classify possible regular structures and present an effective algorithm for detecting such repeated geometric patterns in point- or mesh-based models. Our method assumes no prior knowledge of the geometry or spatial location of the individual elements that define the pattern. Structure discovery is made possible by a careful analysis of pairwise similarity transformations that reveals prominent lattice structures in a suitable model of transformation space. We introduce an optimization method for detecting such uniform grids specifically designed to deal with outliers and missing elements. This yields a robust algorithm that successfully discovers complex regular structures amidst clutter, noise, and missing geometry. The accuracy of the extracted generating transformations is further improved using a novel simultaneous registration method in the spatial domain. We demonstrate the effectiveness of our algorithm on a variety of examples and show applications to compression, model repair, and geometry synthesis. PMID:21170292
Weighted power counting and chiral dimensional regularization
NASA Astrophysics Data System (ADS)
Anselmi, Damiano
2014-06-01
We define a modified dimensional-regularization technique that overcomes several difficulties of the ordinary technique, and is specially designed to work efficiently in chiral and parity violating quantum field theories, in arbitrary dimensions greater than 2. When the dimension of spacetime is continued to complex values, spinors, vectors and tensors keep the components they have in the physical dimension; therefore, the γ matrices are the standard ones. Propagators are regularized with the help of evanescent higher-derivative kinetic terms, which are of the Majorana type in the case of chiral fermions. If the new terms are organized in a clever way, weighted power counting provides an efficient control on the renormalization of the theory, and allows us to show that the resulting chiral dimensional regularization is consistent to all orders. The new technique considerably simplifies the proofs of properties that hold to all orders, and makes them suitable to be generalized to wider classes of models. Typical examples are the renormalizability of chiral gauge theories and the Adler-Bardeen theorem. The difficulty of explicit computations, on the other hand, may increase.
Automatic detection of regularly repeating vocalizations
NASA Astrophysics Data System (ADS)
Mellinger, David
2005-09-01
Many animal species produce repetitive sounds at regular intervals. This regularity can be used for automatic recognition of the sounds, providing improved detection at a given signal-to-noise ratio. Here, the detection of sperm whale sounds is examined. Sperm whales produce highly repetitive ``regular clicks'' at periods of about 0.2-2 s, and faster click trains in certain behavioral contexts. The following detection procedure was tested: a spectrogram was computed; values within a certain frequency band were summed; time windowing was applied; each windowed segment was autocorrelated; and the maximum of the autocorrelation within a certain periodicity range was chosen. This procedure was tested on sets of recordings containing sperm whale sounds and interfering sounds, both low-frequency recordings from autonomous hydrophones and high-frequency ones from towed hydrophone arrays. An optimization procedure iteratively varies detection parameters (spectrogram frame length and frequency range, window length, periodicity range, etc.). Performance of various sets of parameters was measured by setting a standard level of allowable missed calls, and the resulting optimum parameters are described. Performance is also compared to that of a neural network trained using the data sets. The method is also demonstrated for sounds of blue whales, minke whales, and seismic airguns. [Funding from ONR.]
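The detection procedure described above (band-limited spectrogram energy, autocorrelation, peak search within a periodicity range) can be sketched as follows; all parameter values and the synthetic click train are illustrative assumptions, not the optimized settings reported:

```python
import numpy as np

def periodicity_score(x, fs, frame=256, hop=128, band=(2000.0, 8000.0),
                      period_range=(0.2, 2.0)):
    # Framewise magnitude spectrogram.
    n_frames = 1 + (len(x) - frame) // hop
    win = np.hanning(frame)
    spec = np.abs(np.array([np.fft.rfft(win * x[i*hop:i*hop+frame])
                            for i in range(n_frames)]))
    # Sum spectral energy within the chosen frequency band.
    freqs = np.fft.rfftfreq(frame, d=1.0/fs)
    env = spec[:, (freqs >= band[0]) & (freqs <= band[1])].sum(axis=1)
    env -= env.mean()
    # Autocorrelate the band-energy envelope.
    ac = np.correlate(env, env, mode="full")[len(env)-1:]
    ac /= ac[0] + 1e-12
    # Pick the autocorrelation peak within the allowed periodicity range.
    dt = hop / fs
    lo = max(1, int(period_range[0] / dt))
    hi = min(len(ac) - 1, int(period_range[1] / dt))
    lag = lo + int(np.argmax(ac[lo:hi+1]))
    return ac[lag], lag * dt   # peak strength, estimated click period (s)

# Synthetic "click train": 5 kHz bursts every 0.5 s in mild noise.
fs = 32000
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(8 * fs)
for c in np.arange(0.25, 8.0, 0.5):
    i = int(c * fs)
    x[i:i+64] += np.sin(2 * np.pi * 5000 * np.arange(64) / fs)
score, period = periodicity_score(x, fs)
print(score, period)
```

Thresholding the returned peak strength then yields a detector whose free parameters (frame length, band, periodicity range) are exactly those the abstract's optimization procedure tunes.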
Sparsity regularization for parameter identification problems
NASA Astrophysics Data System (ADS)
Jin, Bangti; Maass, Peter
2012-12-01
The investigation of regularization schemes with sparsity promoting penalty terms has been one of the dominant topics in the field of inverse problems over the last years, and Tikhonov functionals with ℓp-penalty terms for 1 ⩽ p ⩽ 2 have been studied extensively. The first investigations focused on regularization properties of the minimizers of such functionals with linear operators and on iteration schemes for approximating the minimizers. These results were quickly transferred to nonlinear operator equations, including nonsmooth operators and more general function space settings. The latest results on regularization properties additionally assume a sparse representation of the true solution as well as generalized source conditions, which yield some surprising and optimal convergence rates. The regularization theory with ℓp sparsity constraints is relatively complete in this setting; see the first part of this review. In contrast, the development of efficient numerical schemes for approximating minimizers of Tikhonov functionals with sparsity constraints for nonlinear operators is still ongoing. The basic iterated soft shrinkage approach has been extended in several directions and semi-smooth Newton methods are becoming applicable in this field. In particular, the extension to more general non-convex, non-differentiable functionals by variational principles leads to a variety of generalized iteration schemes. We focus on such iteration schemes in the second part of this review. A major part of this survey is devoted to applying sparsity constrained regularization techniques to parameter identification problems for partial differential equations, which we regard as the prototypical setting for nonlinear inverse problems. Parameter identification problems exhibit different levels of complexity and we aim at characterizing a hierarchy of such problems. The operator defining these inverse problems is the parameter-to-state mapping. We first summarize some
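As a concrete instance of the basic iterated soft-shrinkage approach mentioned above, here is a minimal ISTA sketch for the linear case with an ℓ1 penalty; the operator, step-size rule, and penalty weight are illustrative assumptions:

```python
import numpy as np

def soft_shrink(v, t):
    # Proximal map of t * ||.||_1 (componentwise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, alpha, n_iter=1000):
    # Minimize 0.5 * ||A x - y||^2 + alpha * ||x||_1 by iterated shrinkage.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_shrink(x - grad / L, alpha / L)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100))
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [2.0, -3.0, 1.5]
y = A @ x_true                             # noiseless sparse-recovery toy problem
x_hat = ista(A, y, alpha=0.1)
print(np.flatnonzero(np.abs(x_hat) > 0.5))
```

The nonlinear extensions surveyed in the review replace A with a linearization of the operator at the current iterate, which is why step-size control and semi-smooth Newton variants become important there.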
The regular state in higher order gravity
NASA Astrophysics Data System (ADS)
Cotsakis, Spiros; Kadry, Seifedine; Trachilis, Dimitrios
2016-08-01
We consider the higher-order gravity theory derived from the quadratic Lagrangian R + εR² in vacuum as a first-order (ADM-type) system with constraints, and build time developments of solutions of an initial value formulation of the theory. We show that all such solutions, if analytic, contain the right number of free functions to qualify as general solutions of the theory. We further show that any regular analytic solution which satisfies the constraints and the evolution equations can be given in the form of an asymptotic formal power series expansion.
Regularization ambiguities in loop quantum gravity
NASA Astrophysics Data System (ADS)
Perez, Alejandro
2006-02-01
One of the main achievements of loop quantum gravity is the consistent quantization of the analog of the Wheeler-DeWitt equation which is free of ultraviolet divergences. However, ambiguities associated to the intermediate regularization procedure lead to an apparently infinite set of possible theories. The absence of an UV problem—the existence of well-behaved regularization of the constraints—is intimately linked with the ambiguities arising in the quantum theory. Among these ambiguities is the one associated to the SU(2) unitary representation used in the diffeomorphism covariant “point-splitting” regularization of the nonlinear functionals of the connection. This ambiguity is labeled by a half-integer m and, here, it is referred to as the m ambiguity. The aim of this paper is to investigate the important implications of this ambiguity. We first study 2+1 gravity (and more generally BF theory) quantized in the canonical formulation of loop quantum gravity. Only when the regularization of the quantum constraints is performed in terms of the fundamental representation of the gauge group does one obtain the usual topological quantum field theory as a result. In all other cases unphysical local degrees of freedom arise at the level of the regulated theory that conspire against the existence of the continuum limit. This shows that there is a clear-cut choice in the quantization of the constraints in 2+1 loop quantum gravity. We then analyze the effects of the ambiguity in 3+1 gravity exhibiting the existence of spurious solutions for higher representation quantizations of the Hamiltonian constraint. Although the analysis is not complete in 3+1 dimensions—due to the difficulties associated to the definition of the physical inner product—it provides evidence supporting the definition of the quantum dynamics of loop quantum gravity in terms of the fundamental representation of the gauge group as the only consistent possibility. If the gauge group is SO(3) we
Total-variation regularization with bound constraints
Chartrand, Rick; Wohlberg, Brendt
2009-01-01
We present a new algorithm for bound-constrained total-variation (TV) regularization that in comparison with its predecessors is simple, fast, and flexible. We use a splitting approach to decouple TV minimization from enforcing the constraints. Consequently, existing TV solvers can be employed with minimal alteration. This also makes the approach straightforward to generalize to any situation where TV can be applied. We consider deblurring of images with Gaussian or salt-and-pepper noise, as well as Abel inversion of radiographs with Poisson noise. We incorporate previous iterative reweighting algorithms to solve the TV portion.
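The splitting idea above — decoupling TV minimization from the data-fit term — can be illustrated in one dimension with a split-Bregman-style iteration; this is a generic sketch under assumed weights, not the authors' algorithm:

```python
import numpy as np

def tv_denoise_1d(f, lam=10.0, mu=1.0, n_iter=300):
    # Split-Bregman-style iteration for min_u  mu/2 ||u - f||^2 + ||D u||_1,
    # with an auxiliary variable d ~= D u decoupling TV from the data fit.
    n = len(f)
    D = np.diff(np.eye(n), axis=0)              # forward-difference operator
    d = np.zeros(n - 1)
    b = np.zeros(n - 1)
    M = mu * np.eye(n) + lam * D.T @ D          # u-subproblem is linear
    u = f.copy()
    for _ in range(n_iter):
        u = np.linalg.solve(M, mu * f + lam * D.T @ (d - b))
        v = D @ u + b
        d = np.sign(v) * np.maximum(np.abs(v) - 1.0 / lam, 0.0)  # shrinkage
        b = b + D @ u - d                       # Bregman variable update
    return u

rng = np.random.default_rng(1)
step = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant truth
noisy = step + 0.1 * rng.standard_normal(100)
clean = tv_denoise_1d(noisy)
print(np.abs(clean - step).mean() < np.abs(noisy - step).mean())
```

Because the TV term is isolated in the shrinkage step, the u-subproblem is an ordinary linear solve, which is what lets existing solvers and bound constraints be plugged in with minimal alteration.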
Multichannel image regularization using anisotropic geodesic filtering
Grazzini, Jacopo A
2010-01-01
This paper extends a recent image-dependent regularization approach aimed at edge-preserving smoothing. For that purpose, geodesic distances equipped with a Riemannian metric need to be estimated in local neighbourhoods. By deriving an appropriate metric from the gradient structure tensor, the associated geodesic paths are constrained to follow salient features in images. Building on this, we design a generalized anisotropic geodesic filter, incorporating not only a measure of the edge strength, as in the original method, but also further directional information about the image structures. The proposed filter is particularly efficient at smoothing heterogeneous areas while preserving relevant structures in multichannel images.
Promoting regular physical activity in pulmonary rehabilitation.
Garcia-Aymerich, Judith; Pitta, Fabio
2014-06-01
Patients with chronic respiratory diseases are usually physically inactive, which is an important negative prognostic factor. Therefore, promoting regular physical activity is of key importance in reducing morbidity and mortality and improving the quality of life in this population. A current challenge to pulmonary rehabilitation is the need to develop strategies that induce or facilitate the enhancement of daily levels of physical activity. Because exercise training alone, despite improving exercise capacity, does not consistently generate similar improvements in physical activity in daily life, there is also a need to develop behavioral interventions that help to promote activity. PMID:24874131
Regularized Grad equations for multicomponent plasmas
NASA Astrophysics Data System (ADS)
Magin, Thierry E.; Martins, Gérald; Torrilhon, Manuel
2011-05-01
The moment method of Grad is used to derive macroscopic conservation equations for multicomponent plasmas for small and moderate Knudsen numbers, accounting for the electromagnetic field influence and thermal nonequilibrium. In the low Knudsen number limit, the equations derived are fully consistent with those obtained by means of the Chapman-Enskog method. In particular, we have retrieved the Kolesnikov effect coupling electrons and heavy particles in the case of the Boltzmann moment systems. Finally, a regularization procedure is proposed to achieve continuous shock structures at all Mach numbers.
Spectral action with zeta function regularization
NASA Astrophysics Data System (ADS)
Kurkov, Maxim A.; Lizzi, Fedele; Sakellariadou, Mairi; Watcharangkool, Apimook
2015-03-01
In this paper we propose a novel definition of the bosonic spectral action using zeta function regularization, in order to address the issues of renormalizability and spectral dimensions. We compare the zeta spectral action with the usual (cutoff-based) spectral action and discuss its origin and predictive power, stressing the importance of the issue of the three dimensionful fundamental constants, namely the cosmological constant, the Higgs vacuum expectation value, and the gravitational constant. We emphasize the fundamental role of the neutrino Majorana mass term for the structure of the bosonic action.
Dense Regular Packings of Irregular Nonconvex Particles
NASA Astrophysics Data System (ADS)
de Graaf, Joost; van Roij, René; Dijkstra, Marjolein
2011-10-01
We present a new numerical scheme to study systems of nonconvex, irregular, and punctured particles in an efficient manner. We employ this method to analyze regular packings of odd-shaped bodies, both from a nanoparticle and from a computational geometry perspective. Besides determining close-packed structures for 17 irregular shapes, we confirm several conjectures for the packings of a large set of 142 convex polyhedra and extend upon these. We also prove that we have obtained the densest packing for both rhombicuboctahedra and rhombic enneacontahedra and we have improved upon the packing of enneagons and truncated tetrahedra.
Accretion onto some well-known regular black holes
NASA Astrophysics Data System (ADS)
Jawad, Abdul; Shahzad, M. Umair
2016-03-01
In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sfetsos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes.
Accelerating Large Data Analysis By Exploiting Regularities
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Ellsworth, David
2003-01-01
We present techniques for discovering and exploiting regularity in large curvilinear data sets. The data can be based on a single mesh or a mesh composed of multiple submeshes (also known as zones). Multi-zone data are typical of Computational Fluid Dynamics (CFD) simulations. Regularities include axis-aligned rectilinear and cylindrical meshes as well as cases where one zone is equivalent to a rigid-body transformation of another. Our algorithms can also discover rigid-body motion of meshes in time-series data. Next, we describe a data model where we can utilize the results from the discovery process in order to accelerate large data visualizations. Where possible, we replace general curvilinear zones with rectilinear or cylindrical zones. In rigid-body motion cases we replace a time-series of meshes with a transformed mesh object where a reference mesh is dynamically transformed based on a given time value in order to satisfy geometry requests, on demand. The data model enables us to make these substitutions and dynamic transformations transparently with respect to the visualization algorithms. We present results with large data sets where we combine our mesh replacement and transformation techniques with out-of-core paging in order to achieve significant speed-ups in analysis.
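One ingredient of such a discovery process — testing whether one zone is a rigid-body transformation of another — can be sketched with the standard SVD-based (Kabsch) fit, assuming known point correspondence; the data here are synthetic:

```python
import numpy as np

def rigid_fit(P, Q):
    # Kabsch/Procrustes: best rotation R and translation t with Q ~= P @ R.T + t.
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

rng = np.random.default_rng(2)
P = rng.standard_normal((40, 3))             # one "zone" as a point set
theta = 0.7                                  # synthetic rigid motion
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_fit(P, Q)
residual = np.abs(P @ R.T + t - Q).max()     # small residual => rigid equivalence
print(residual < 1e-9)
```

A near-zero residual licenses replacing the second mesh with a (reference mesh, transform) pair, which is the substitution the data model above makes transparently.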
Supporting Regularized Logistic Regression Privately and Efficiently.
Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei
2016-01-01
As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as in the form of research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc. PMID:27271738
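For reference, the underlying (non-private) model is ordinary ℓ2-regularized logistic regression; a minimal gradient-descent sketch follows, with illustrative hyperparameters and none of the paper's cryptographic machinery:

```python
import numpy as np

def fit_logreg(X, y, lam=0.1, lr=0.1, n_iter=2000):
    # Gradient descent on the L2-regularized logistic loss.
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # predicted probabilities
        grad = X.T @ (p - y) / n + lam * w     # data term + ridge penalty
        w -= lr * grad
    return w

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 2))
y = (X[:, 0] - X[:, 1] > 0).astype(float)      # synthetic linearly separable labels
w = fit_logreg(X, y)
acc = float(((X @ w > 0) == (y == 1)).mean())
print(acc)
```

Because each gradient step is a sum of per-record contributions, it is the natural unit that multi-institution schemes distribute and protect cryptographically.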
Nonlinear regularization techniques for seismic tomography
Loris, I.; Douma, H.; Nolet, G.; Regone, C.
2010-02-01
The effects of several nonlinear regularization techniques are discussed in the framework of 3D seismic tomography. Traditional, linear, ℓ2 penalties are compared to so-called sparsity promoting ℓ1 and ℓ0 penalties, and a total variation penalty. Which of these algorithms is judged optimal depends on the specific requirements of the scientific experiment. If the correct reproduction of model amplitudes is important, classical damping towards a smooth model using an ℓ2 norm works almost as well as minimizing the total variation but is much more efficient. If gradients (edges of anomalies) should be resolved with a minimum of distortion, we prefer ℓ1 damping of Daubechies-4 wavelet coefficients. It has the additional advantage of yielding a noiseless reconstruction, contrary to simple ℓ2 minimization ('Tikhonov regularization') which should be avoided. In some of our examples, the ℓ0 method produced notable artifacts. In addition we show how nonlinear ℓ1 methods for finding sparse models can be competitive in speed with the widely used ℓ2 methods, certainly under noisy conditions, so that there is no need to shun ℓ1 penalizations.
Tomographic laser absorption spectroscopy using Tikhonov regularization.
Guha, Avishek; Schoegl, Ingmar
2014-12-01
The application of tunable diode laser absorption spectroscopy (TDLAS) to flames with nonhomogeneous temperature and concentration fields is an area where only a few studies exist. Experimental work explores the performance of tomographic reconstructions of species concentration and temperature profiles from wavelength-modulated TDLAS measurements within the plume of an axisymmetric McKenna burner. Water vapor transitions at 1391.67 and 1442.67 nm are probed using calibration-free wavelength modulation spectroscopy with second harmonic detection (WMS-2f). A single collimated laser beam is swept parallel to the burner surface, where scans yield pairs of line-of-sight (LOS) data at multiple radial locations. Radial profiles of absorption data are reconstructed using Tikhonov regularized Abel inversion, which suppresses the amplification of experimental noise that is typically observed for reconstructions with high spatial resolution. Based on spectral data reconstructions, temperatures and mole fractions are calculated point-by-point. Here, a least-squares approach addresses difficulties due to modulation depths that cannot be universally optimized due to a nonuniform domain. Experimental results show successful reconstructions of temperature and mole fraction profiles based on two-transition, nonoptimally modulated WMS-2f and Tikhonov regularized Abel inversion, and thus validate the technique as a viable diagnostic tool for flame measurements. PMID:25607968
Regularized Semiparametric Estimation for Ordinary Differential Equations
Li, Yun; Zhu, Ji; Wang, Naisyin
2015-01-01
Ordinary differential equations (ODEs) are widely used in modeling dynamic systems and have ample applications in the fields of physics, engineering, economics and biological sciences. The ODE parameters often possess physiological meanings and can help scientists gain better understanding of the system. One key interest is thus to estimate these parameters well. Ideally, constant parameters are preferred due to their easy interpretation. In reality, however, constant parameters can be too restrictive such that even after incorporating error terms, there could still be unknown sources of disturbance that lead to poor agreement between observed data and the estimated ODE system. In this paper, we address this issue and accommodate short-term interferences by allowing parameters to vary with time. We propose a new regularized estimation procedure on the time-varying parameters of an ODE system so that these parameters could change with time during transitions but remain constants within stable stages. We found, through simulation studies, that the proposed method performs well and tends to have less variation in comparison to the non-regularized approach. On the theoretical front, we derive finite-sample estimation error bounds for the proposed method. Applications of the proposed method to modeling the hare-lynx relationship and the measles incidence dynamic in Ontario, Canada lead to satisfactory and meaningful results. PMID:26392639
The Essential Special Education Guide for the Regular Education Teacher
ERIC Educational Resources Information Center
Burns, Edward
2007-01-01
The Individuals with Disabilities Education Act (IDEA) of 2004 has placed a renewed emphasis on the importance of the regular classroom, the regular classroom teacher and the general curriculum as the primary focus of special education. This book contains over 100 topics that deal with real issues and concerns regarding the regular classroom and…
Delayed Acquisition of Non-Adjacent Vocalic Distributional Regularities
ERIC Educational Resources Information Center
Gonzalez-Gomez, Nayeli; Nazzi, Thierry
2016-01-01
The ability to compute non-adjacent regularities is key in the acquisition of a new language. In the domain of phonology/phonotactics, sensitivity to non-adjacent regularities between consonants has been found to appear between 7 and 10 months. The present study focuses on the emergence of a posterior-anterior (PA) bias, a regularity involving two…
The Regular Education Initiative: Patent Medicine for Behavioral Disorders.
ERIC Educational Resources Information Center
Braaten, Sheldon; And Others
1988-01-01
Implications of the regular education initiative for behaviorally disordered students are examined in the context of integration and right to treatment. These students are underserved, often cannot be appropriately served in regular classrooms, are not welcomed by most regular classroom teachers, and have treatment rights the initiative does not…
On Regularity Criteria for the 2D Generalized MHD System
NASA Astrophysics Data System (ADS)
Jiang, Zaihong; Wang, Yanan; Zhou, Yong
2016-06-01
This paper deals with the problem of regularity criteria for the 2D generalized MHD system with fractional dissipative terms −Λ^{2α}u for the velocity field and −Λ^{2β}b for the magnetic field respectively. Various regularity criteria are established to guarantee smoothness of solutions. It turns out that our regularity criteria imply previous global existence results naturally.
29 CFR 778.408 - The specified regular rate.
Code of Federal Regulations, 2011 CFR
2011-07-01
... regular rate. (a) To qualify under section 7(f), the contract must specify “a regular rate of pay of not... section 7(f), must specify a “regular rate,” indicates that this criterion of these two cases is...
Recognition Memory for Novel Stimuli: The Structural Regularity Hypothesis
ERIC Educational Resources Information Center
Cleary, Anne M.; Morris, Alison L.; Langley, Moses M.
2007-01-01
Early studies of human memory suggest that adherence to a known structural regularity (e.g., orthographic regularity) benefits memory for an otherwise novel stimulus (e.g., G. A. Miller, 1958). However, a more recent study suggests that structural regularity can lead to an increase in false-positive responses on recognition memory tests (B. W. A.…
39 CFR 6.1 - Regular meetings, annual meeting.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 39 Postal Service 1 2010-07-01 2010-07-01 false Regular meetings, annual meeting. 6.1 Section 6.1 Postal Service UNITED STATES POSTAL SERVICE THE BOARD OF GOVERNORS OF THE U.S. POSTAL SERVICE MEETINGS (ARTICLE VI) § 6.1 Regular meetings, annual meeting. The Board shall meet regularly on a...
ERIC Educational Resources Information Center
Hawaii Univ., Honolulu. Community Coll. System.
Fall 1978 enrollment data for Hawaii's community colleges and data on selected characteristics of students enrolled in regular credit programs are presented. Of the 27,880 registrants, 74% were regular students, 1% were early admittees, 6% were registered in non-credit apprenticeship programs, and 18% were in special programs. Regular student…
Local orientational mobility in regular hyperbranched polymers
NASA Astrophysics Data System (ADS)
Dolgushev, Maxim; Markelov, Denis A.; Fürstenberg, Florian; Guérin, Thomas
2016-07-01
We study the dynamics of local bond orientation in regular hyperbranched polymers modeled by Vicsek fractals. The local dynamics is investigated through the temporal autocorrelation functions of single bonds and the corresponding relaxation forms of the complex dielectric susceptibility. We show that the dynamic behavior of single segments depends on their remoteness from the periphery rather than on the size of the whole macromolecule. Remarkably, the dynamics of the core segments (which are most remote from the periphery) shows a scaling behavior that differs from the dynamics obtained after structural average. We analyze the most relevant processes of single segment motion and provide an analytic approximation for the corresponding relaxation times. Furthermore, we describe an iterative method to calculate the orientational dynamics in the case of very large macromolecular sizes.
Features of the regular F2-layer
NASA Astrophysics Data System (ADS)
Besprozvannaia, A. S.
1987-10-01
Results of the empirical modeling of cyclic and seasonal variations of the daytime regular F2-layer are presented. It is shown that the formation of the seasonal anomaly in years of high solar activity is determined mainly by a summer anomaly. This summer anomaly is connected with an increase in the content of molecular nitrogen in the polar ionosphere during summer months due to additional heating and turbulent mixing in connection with intense dissipation of the three-dimensional current system under high-conductivity conditions. In solar-minimum years the seasonal anomaly is determined mainly by seasonal variations of the composition of the neutral atmosphere in the transition from winter to summer.
Generalized equations of state and regular universes
NASA Astrophysics Data System (ADS)
Contreras, F.; Cruz, N.; González, E.
2016-05-01
We found nonsingular solutions for universes filled with a fluid which obeys a generalized equation of state of the form P(ρ) = −Aρ + γρ^λ. An emergent universe is obtained if A = 1 and λ = 1/2. If the matter source is reinterpreted as that of a scalar matter field with some potential, the corresponding potential is derived. For a closed universe, an exact bounce solution is found for A = 1/3 and the same λ. We also explore how the composition of these universes can be interpreted in terms of known fluids. It is of interest to note that accelerated solutions previously found for the late time evolution also represent regular solutions at early times.
Dyslexia in regular orthographies: manifestation and causation.
Wimmer, Heinz; Schurz, Matthias
2010-11-01
This article summarizes our research on the manifestation of dyslexia in German and on cognitive deficits, which may account for the severe reading speed deficit and the poor orthographic spelling performance that characterize dyslexia in regular orthographies. An only limited causal role of phonological deficits (phonological awareness, phonological STM, and rapid naming) for the emergence of reading fluency and spelling deficits is inferred from two large longitudinal studies with assessments of phonology before learning to read. A review of our cross-sectional studies provides no support for several cognitive deficits (visual-attention deficit, magnocellular dysfunction, skill automatization deficit, and visual-sequential memory deficit), which were proposed as alternatives to the phonological deficit account. Finally, a revised version of the phonological deficit account in terms of a dysfunction in orthographic-phonological connectivity is proposed. PMID:20957684
Black hole mimickers: Regular versus singular behavior
Lemos, Jose P. S.; Zaslavskii, Oleg B.
2008-07-15
Black hole mimickers are possible alternatives to black holes; they would look observationally almost like black holes but would have no horizon. The properties in the near-horizon region where gravity is strong can be quite different for the two types of objects, but at infinity it could be difficult to discern black holes from their mimickers. To disentangle this possible confusion, we examine the near-horizon properties, and their connection with far away asymptotic properties, of some candidates for black hole mimickers. We study spherically symmetric uncharged or charged but nonextremal objects, as well as spherically symmetric charged extremal objects. Within the uncharged or charged but nonextremal black hole mimickers, we study nonextremal ε-wormholes on the threshold of the formation of an event horizon, of which a subclass are called black foils, and gravastars. Within the charged extremal black hole mimickers we study extremal ε-wormholes on the threshold of the formation of an event horizon, quasi-black holes, and wormholes on the basis of quasi-black holes from Bonnor stars. We elucidate whether or not the objects belonging to these two classes remain regular in the near-horizon limit. The requirement of full regularity, i.e., finite curvature and absence of naked behavior, up to an arbitrary neighborhood of the gravitational radius of the object enables one to rule out potential mimickers in most of the cases. A list ranking black hole mimickers from best to worst, both nonextremal and extremal, is as follows: wormholes on the basis of extremal black holes or on the basis of quasi-black holes, quasi-black holes, wormholes on the basis of nonextremal black holes (black foils), and gravastars. Since in observational astrophysics it is difficult to find extremal configurations (the best mimickers in the ranking), whereas nonextremal configurations are really bad mimickers, the task of distinguishing black holes from their mimickers seems to be a difficult one.
Regularization of Instantaneous Frequency Attribute Computations
NASA Astrophysics Data System (ADS)
Yedlin, M. J.; Margrave, G. F.; Van Vorst, D. G.; Ben Horin, Y.
2014-12-01
We compare two different methods of computation of a temporally local frequency: (1) a stabilized instantaneous frequency using the theory of the analytic signal; (2) a temporally variant centroid (or dominant) frequency estimated from a time-frequency decomposition. The first method derives from Taner et al. (1979) as modified by Fomel (2007) and utilizes the derivative of the instantaneous phase of the analytic signal. The second method computes the power centroid (Cohen, 1995) of the time-frequency spectrum, obtained using either the Gabor or the Stockwell transform. Common to both methods is the necessity of division by a diagonal matrix, which requires appropriate regularization. We modify Fomel's (2007) method by explicitly penalizing the roughness of the estimate. Following Farquharson and Oldenburg (2004), we employ both the L-curve and GCV methods to obtain the smoothest model that fits the data in the L2 norm. Using synthetic data, quarry blasts, earthquakes, and the DPRK tests, our results suggest that the optimal method depends on the data. One of the main applications of this work is the discrimination between blast events and earthquakes.
References:
Fomel, Sergey. "Local seismic attributes." Geophysics 72.3 (2007): A29-A33.
Cohen, Leon. Time-Frequency Analysis: Theory and Applications. USA: Prentice Hall, 1995.
Farquharson, Colin G., and Douglas W. Oldenburg. "A comparison of automatic techniques for estimating the regularization parameter in non-linear inverse problems." Geophysical Journal International 156.3 (2004): 411-425.
Taner, M. Turhan, Fulton Koehler, and R. E. Sheriff. "Complex seismic trace analysis." Geophysics 44.6 (1979): 1041-1063.
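The stabilized division at the heart of the first method can be sketched minimally. Here the analytic signal c(t) is assumed given (a synthetic complex exponential stands in for the Hilbert-transformed trace), and a simple epsilon term stands in for the regularization; the paper's roughness-penalized smoothing is more elaborate.

```python
import cmath

# Sketch: stabilized instantaneous-frequency estimate from an analytic
# signal c(t), via the regularized division
#     omega(t) ~ Im( conj(c) * dc/dt ) / ( |c|^2 + eps ).
# Illustrative only; eps plays the role of the regularization discussed above.

def instantaneous_frequency(c, dt, eps=1e-8):
    """Angular-frequency estimates (rad/s) at samples 0..len(c)-2."""
    omega = []
    for k in range(len(c) - 1):
        dc = (c[k + 1] - c[k]) / dt           # forward-difference derivative
        num = (c[k].conjugate() * dc).imag    # Im(conj(c) * c')
        den = abs(c[k]) ** 2 + eps            # regularized denominator
        omega.append(num / den)
    return omega

# Synthetic analytic signal with constant angular frequency 2.0 rad/s.
dt = 1e-4
c = [cmath.exp(2.0j * (k * dt)) for k in range(1000)]
freqs = instantaneous_frequency(c, dt)  # values close to 2.0 throughout
```

Without the eps term, samples where |c| is small would blow up the estimate, which is exactly why the division must be regularized.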
Cates, Christopher J; Lasserson, Toby J
2014-01-01
Background An increase in serious adverse events with both regular formoterol and regular salmeterol in chronic asthma has been demonstrated in previous Cochrane reviews. Objectives We set out to compare the risks of mortality and non-fatal serious adverse events in trials which have randomised patients with chronic asthma to regular formoterol versus regular salmeterol. Search methods We identified trials using the Cochrane Airways Group Specialised Register of trials. We checked manufacturers’ websites of clinical trial registers for unpublished trial data and also checked Food and Drug Administration (FDA) submissions in relation to formoterol and salmeterol. The date of the most recent search was January 2012. Selection criteria We included controlled, parallel-design clinical trials on patients of any age and with any severity of asthma if they randomised patients to treatment with regular formoterol versus regular salmeterol (without randomised inhaled corticosteroids), and were of at least 12 weeks’ duration. Data collection and analysis Two authors independently selected trials for inclusion in the review and extracted outcome data. We sought unpublished data on mortality and serious adverse events from the sponsors and authors. Main results The review included four studies (involving 1116 adults and 156 children). All studies were open label and recruited patients who were already taking inhaled corticosteroids for their asthma, and all studies contributed data on serious adverse events. All studies compared formoterol 12 μg versus salmeterol 50 μg twice daily. The adult studies were all comparing Foradil Aerolizer with Serevent Diskus, and the children’s study compared Oxis Turbohaler to Serevent Accuhaler. There was only one death in an adult (which was unrelated to asthma) and none in children, and there were no significant differences in non-fatal serious adverse events comparing formoterol to salmeterol in adults (Peto odds ratio (OR) 0.77; 95
Preparation of Regular Specimens for Atom Probes
NASA Technical Reports Server (NTRS)
Kuhlman, Kim; Wishard, James
2003-01-01
A method of preparation of specimens of non-electropolishable materials for analysis by atom probes is being developed as a superior alternative to a prior method. In comparison with the prior method, the present method involves less processing time. Also, whereas the prior method yields irregularly shaped and sized specimens, the present developmental method offers the potential to prepare specimens of regular shape and size. The prior method is called the method of sharp shards because it involves crushing the material of interest and selecting microscopic sharp shards of the material for use as specimens. Each selected shard is oriented with its sharp tip facing away from the tip of a stainless-steel pin and is glued to the tip of the pin by use of silver epoxy. Then the shard is milled by use of a focused ion beam (FIB) to make the shard very thin (relative to its length) and to make its tip sharp enough for atom-probe analysis. The method of sharp shards is extremely time-consuming because the selection of shards must be performed with the help of a microscope, the shards must be positioned on the pins by use of micromanipulators, and the irregularity of size and shape necessitates many hours of FIB milling to sharpen each shard. In the present method, a flat slab of the material of interest (e.g., a polished sample of rock or a coated semiconductor wafer) is mounted in the sample holder of a dicing saw of the type conventionally used to cut individual integrated circuits out of the wafers on which they are fabricated in batches. A saw blade appropriate to the material of interest is selected. The depth of cut and the distance between successive parallel cuts are made such that what is left after the cuts is a series of thin, parallel ridges on a solid base. Then the workpiece is rotated 90° and the pattern of cuts is repeated, leaving behind a square array of square posts on the solid base. The posts can be made regular, long, and thin, as required for samples
NASA Astrophysics Data System (ADS)
Larios, Adam; Titi, Edriss S.
2013-05-01
We prove existence, uniqueness, and higher-order global regularity of strong solutions to a particular Voigt-regularization of the three-dimensional inviscid resistive magnetohydrodynamic (MHD) equations. Specifically, the coupling of a resistive magnetic field to the Euler-Voigt model is introduced to form an inviscid regularization of the inviscid resistive MHD system. The results hold in both the whole space ℝ³ and in the context of periodic boundary conditions. Weak solutions for this regularized model are also considered, and proven to exist globally in time, but the question of uniqueness for weak solutions is still open. Furthermore, we show that the solutions of the Voigt regularized system converge, as the regularization parameter α → 0, to strong solutions of the original inviscid resistive MHD, on the corresponding time interval of existence of the latter. Moreover, we also establish a new criterion for blow-up of solutions to the original MHD system inspired by this Voigt regularization.
Mapping algorithms on regular parallel architectures
Lee, P.
1989-01-01
It is significant that many time-intensive scientific algorithms are formulated as nested loops, which are inherently regularly structured. In this dissertation the relations between the mathematical structure of nested loop algorithms and the architectural capabilities required for their parallel execution are studied. The architectural model considered in depth is that of an arbitrary dimensional systolic array. The mathematical structure of the algorithm is characterized by classifying its data-dependence vectors according to the new ZERO-ONE-INFINITE property introduced. Using this classification, the first complete set of necessary and sufficient conditions for correct transformation of a nested loop algorithm onto a given systolic array of an arbitrary dimension by means of linear mappings is derived. Practical methods to derive optimal or suboptimal systolic array implementations are also provided. The techniques developed are used constructively to develop families of implementations satisfying various optimization criteria and to design programmable arrays efficiently executing classes of algorithms. In addition, a Computer-Aided Design system running on SUN workstations has been implemented to help in the design. The methodology, which deals with general algorithms, is illustrated by synthesizing linear and planar systolic array algorithms for matrix multiplication, a reindexed Warshall-Floyd transitive closure algorithm, and the longest common subsequence algorithm.
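The flavor of the "necessary and sufficient conditions" for a linear mapping can be sketched with the standard feasibility check for a linear schedule: a schedule vector tau must delay every data-dependence vector by at least one time step. The dependence vectors below are the usual ones for the three-level matrix-multiplication nest; both they and the condition are illustrative textbook assumptions, not the dissertation's exact formalism.

```python
# Sketch: validity check for a linear systolic schedule tau, requiring
# tau . d >= 1 for every data-dependence vector d of the loop nest.
# (Illustrative; optimality and array-mapping conditions are omitted.)

def is_valid_schedule(tau, deps):
    """True iff every dependence is delayed by at least one time step."""
    return all(sum(t * di for t, di in zip(tau, d)) >= 1 for d in deps)

# Dependence/propagation vectors for C[i,j] += A[i,k] * B[k,j]
# over the iteration space (i, j, k).
matmul_deps = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
```

For example, tau = (1, 1, 1) is a valid schedule, while tau = (1, 1, 0) is not, because it fails to delay the accumulation dependence along k.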
Color correction optimization with hue regularization
NASA Astrophysics Data System (ADS)
Zhang, Heng; Liu, Huaping; Quan, Shuxue
2011-01-01
Previous work has suggested that observers are capable of judging the quality of an image without any knowledge of the original scene. When no reference is available, observers can extract the apparent objects in an image and compare them with the typical colors of similar objects recalled from their memories. Some generally agreed upon research results indicate that although perfect colorimetric rendering is not conspicuous and color errors can be well tolerated, the appropriate rendition of certain memory colors such as skin, grass, and sky is an important factor in the overall perceived image quality. These colors are appreciated in a fairly consistent manner and are memorized with slightly different hues and higher color saturation. The aim of color correction for a digital color pipeline is to transform the image data from a device dependent color space to a target color space, usually through a color correction matrix which in its most basic form is optimized through linear regressions between the two sets of data in two color spaces in the sense of minimized Euclidean color error. Unfortunately, this method could result in objectionable distortions if the color error biased certain colors undesirably. In this paper, we propose a color correction optimization method with preferred color reproduction in mind through hue regularization and present some experimental results.
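The baseline linear-regression step described above can be sketched as a plain least-squares fit of a 3x3 matrix mapping device RGB to target RGB. The hue-regularization penalty on memory colors is deliberately omitted for brevity, and all names and values here are hypothetical.

```python
# Sketch: fit a 3x3 color-correction matrix M minimizing sum ||M s - t||^2
# over training patches (s = source color, t = target color). A
# hue-regularized version would add penalty terms for memory colors;
# that term is omitted here. (Illustrative, not the paper's full method.)

def solve3(a, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def fit_color_matrix(src, dst):
    """Each row of M solves the normal equations (A^T A) x = A^T b."""
    ata = [[sum(s[i] * s[j] for s in src) for j in range(3)] for i in range(3)]
    rows = []
    for ch in range(3):
        atb = [sum(s[i] * d[ch] for s, d in zip(src, dst)) for i in range(3)]
        rows.append(solve3(ata, atb))
    return rows
```

Given enough training patches, the fit recovers the true transform exactly in the noise-free case; the paper's point is that minimizing Euclidean error alone can bias hues, motivating the added regularization.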
Determinants of Scanpath Regularity in Reading.
von der Malsburg, Titus; Kliegl, Reinhold; Vasishth, Shravan
2015-09-01
Scanpaths have played an important role in classic research on reading behavior. Nevertheless, they have largely been neglected in later research, perhaps due to a lack of suitable analytical tools. Recently, von der Malsburg and Vasishth (2011) proposed a new measure for quantifying differences between scanpaths and demonstrated that this measure can recover effects that were missed with the traditional eyetracking measures. However, the sentences used in that study were difficult to process, and scanpath effects were accordingly strong. The purpose of the present study was to test the validity, sensitivity, and scope of applicability of the scanpath measure, using simple sentences that are typically read from left to right. We derived predictions for the regularity of scanpaths from the literature on oculomotor control, sentence processing, and cognitive aging and tested these predictions using the scanpath measure and a large database of eye movements. All predictions were confirmed: Sentences with short words and syntactically more difficult sentences elicited more irregular scanpaths. Also, older readers produced more irregular scanpaths than younger readers. In addition, we found an effect that was not reported earlier: Syntax had a smaller influence on the eye movements of older readers than on those of young readers. We discuss this interaction of syntactic parsing cost with age in terms of shifts in processing strategies and a decline of executive control as readers age. Overall, our results demonstrate the validity and sensitivity of the scanpath measure and thus establish it as a productive and versatile tool for reading research. PMID:25530253
Identifying Cognitive States Using Regularity Partitions
2015-01-01
Functional Magnetic Resonance Imaging (fMRI) data can be used to depict functional connectivity of the brain. Standard techniques have been developed to construct brain networks from this data; typically nodes are considered as voxels or sets of voxels with weighted edges between them representing measures of correlation. Identifying cognitive states based on fMRI data is connected with recording voxel activity over a certain time interval. Using this information, network and machine learning techniques can be applied to discriminate the cognitive states of the subjects by exploring different features of data. In this work we wish to describe and understand the organization of brain connectivity networks under cognitive tasks. In particular, we use a regularity partitioning algorithm that finds clusters of vertices such that they all behave with each other almost like random bipartite graphs. Based on the random approximation of the graph, we calculate a lower bound on the number of triangles as well as the expectation of the distribution of the edges in each subject and state. We investigate the results by comparing them to state-of-the-art algorithms for exploring connectivity, and we argue that during epochs in which the subject is exposed to a stimulus, the inspected part of the brain is organized in an efficient way that enables enhanced functionality. PMID:26317983
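The triangle-counting idea can be illustrated generically: count triangles in the observed graph and compare with the number expected if edges were placed at random with the same density. This is only in the spirit of using a random approximation of the graph; the regularity-partition machinery itself is far more involved.

```python
from itertools import combinations

# Sketch: observed triangle count vs. the expectation under a random
# (Erdos-Renyi-style) graph of the same empirical edge density.
# (Illustrative comparison, not the paper's regularity-based bound.)

def triangle_count(n, edges):
    """Exact number of triangles in an undirected graph on nodes 0..n-1."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return sum(1 for a, b, c in combinations(range(n), 3)
               if b in adj[a] and c in adj[a] and c in adj[b])

def expected_triangles(n, edges):
    """Expected triangles if each edge appeared independently with prob. p."""
    p = len(edges) / (n * (n - 1) // 2)   # empirical edge density
    return (n * (n - 1) * (n - 2) / 6) * p ** 3
```

A large excess of observed triangles over the random expectation is one simple signature of the kind of efficient, clustered organization the abstract argues for.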
Wave dynamics of regular and chaotic rays
McDonald, S.W.
1983-09-01
In order to investigate general relationships between waves and rays in chaotic systems, I study the eigenfunctions and spectrum of a simple model, the two-dimensional Helmholtz equation in a stadium boundary, for which the rays are ergodic. Statistical measurements are performed so that the apparent randomness of the stadium modes can be quantitatively contrasted with the familiar regularities observed for the modes in a circular boundary (with integrable rays). The local spatial autocorrelation of the eigenfunctions is constructed in order to indirectly test theoretical predictions for the nature of the Wigner distribution corresponding to chaotic waves. A portion of the large-eigenvalue spectrum is computed and reported in an appendix; the probability distribution of successive level spacings is analyzed and compared with theoretical predictions. The two principal conclusions are: 1) waves associated with chaotic rays may exhibit randomly situated localized regions of high intensity; 2) the Wigner function for these waves may depart significantly from being uniformly distributed over the surface of constant frequency in the ray phase space.
Grouping pursuit through a regularization solution surface
Shen, Xiaotong; Huang, Hsin-Cheng
2010-01-01
Summary: Extracting grouping structure or identifying homogeneous subgroups of predictors in regression is crucial for high-dimensional data analysis. One low-dimensional structure in particular, grouping, when captured in a regression model, enables one to enhance predictive performance and to facilitate a model's interpretability. Grouping pursuit extracts homogeneous subgroups of predictors most responsible for outcomes of a response. This is the case in gene network analysis, where grouping reveals gene functionalities with regard to progression of a disease. To address challenges in grouping pursuit, we introduce a novel homotopy method for computing an entire solution surface through regularization involving a piecewise linear penalty. This nonconvex and overcomplete penalty permits adaptive grouping and nearly unbiased estimation, which is treated with a novel concept of grouped subdifferentials and difference convex programming for efficient computation. Finally, the proposed method not only achieves high performance as suggested by numerical analysis, but also has the desired optimality with regard to grouping pursuit and prediction as shown by our theoretical results. PMID:20689721
Compression and regularization with the information bottleneck
NASA Astrophysics Data System (ADS)
Strouse, Dj; Schwab, David
Compression fundamentally involves a decision about what is relevant and what is not. The information bottleneck (IB) by Tishby, Pereira, and Bialek formalized this notion as an information-theoretic optimization problem and proposed an optimal tradeoff between throwing away as many bits as possible, and selectively keeping those that are most important. The IB has also recently been proposed as a theory of sensory gating and predictive computation in the retina by Palmer et al. Here, we introduce an alternative formulation of the IB, the deterministic information bottleneck (DIB), that we argue better captures the notion of compression, including that done by the brain. As suggested by its name, the solution to the DIB problem is a deterministic encoder, as opposed to the stochastic encoder that is optimal under the IB. We then compare the IB and DIB on synthetic data, showing that the IB and DIB perform similarly in terms of the IB cost function, but that the DIB vastly outperforms the IB in terms of the DIB cost function. Our derivation of the DIB also provides a family of models which interpolates between the DIB and IB by adding noise of a particular form. We discuss the role of this noise as a regularizer.
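The IB and DIB cost functions are built from mutual informations such as the compression term I(X;T). A minimal helper for evaluating it for a given encoder q(t|x) is sketched below; it is only the bookkeeping, not the IB/DIB optimization itself, and the distributions are illustrative.

```python
from math import log

# Sketch: the compression term I(X;T) of the IB cost, for a source p(x)
# and a stochastic encoder q(t|x). A deterministic encoder (as in the DIB)
# makes each row of q(t|x) one-hot. (Illustrative helper only.)

def mutual_information(p_x, q_t_given_x):
    """I(X;T) in nats, with p_x a list and q_t_given_x a list of rows over t."""
    nt = len(q_t_given_x[0])
    # Marginal q(t) = sum_x p(x) q(t|x).
    q_t = [sum(p_x[x] * q_t_given_x[x][t] for x in range(len(p_x)))
           for t in range(nt)]
    info = 0.0
    for x, px in enumerate(p_x):
        for t in range(nt):
            qtx = q_t_given_x[x][t]
            if px > 0 and qtx > 0:
                info += px * qtx * log(qtx / q_t[t])
    return info
```

A deterministic identity encoder on a uniform binary source keeps all log 2 nats, while a constant encoder compresses everything away to I(X;T) = 0; the IB/DIB tradeoff lives between these extremes.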
Energy Scaling Law for the Regular Cone
NASA Astrophysics Data System (ADS)
Olbermann, Heiner
2016-04-01
We consider a thin elastic sheet in the shape of a disk whose reference metric is that of a singular cone. That is, the reference metric is flat away from the center and has a defect there. We define a geometrically fully nonlinear free elastic energy and investigate the scaling behavior of this energy as the thickness h tends to 0. We work with two simplifying assumptions: Firstly, we think of the deformed sheet as an immersed 2-dimensional Riemannian manifold in Euclidean 3-space and assume that the exponential map at the origin (the center of the sheet) supplies a coordinate chart for the whole manifold. Secondly, the energy functional penalizes the difference between the induced metric and the reference metric in L^∞ (instead of, as is usual, in L^2). Under these assumptions, we show that the elastic energy per unit thickness of the regular cone in the leading order of h is given by C^*h^2|log h|, where the value of C^* is given explicitly.
Regularities in movement of subsurface condensated fluids
Ayre, A.G. )
1990-05-01
Darcy's law is traditionally considered to be the major filtration law. However, molecular and kinetic analyses of fluid movement in a porous medium, with regard for the physical interaction between liquids and rocks, enabled the authors to derive a new, more general law: V = Ko(1 − Jo/J)²J, where V = filtration rate, J = head gradient, Jo = initial filtration gradient, and Ko = V/J for J ≫ Jo, i.e., Darcy's permeability coefficient. With J ≫ Jo, this law reduces to Darcy's law. As J decreases toward Jo, filtration stops as any multi-molecular liquid flow, and with J < Jo, it is transformed into an individual molecular movement called filling. The filling rate is determined using the law V = λJ, where λ is the filling coefficient. The concept of the initial filtration gradient thereby gets a new interpretation: it is now considered the gradient at which pore-liquid movement changes from filtration to filling. These regularities are important in evaluating subsurface fluid movement in the original environment or at some distance from exciting wells. In particular, it is found that pore-liquid flow in a natural environment is of the filling type, and during this process separation of solution ingredients occurs. The final size of the depression cone of a functioning well or mine is controlled by the interaction between water and rock.
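The generalized filtration law quoted above can be transcribed directly; the coefficient values below are made up purely for illustration.

```python
# Sketch: generalized filtration law V = K0*(1 - J0/J)**2 * J for head
# gradients J above the initial gradient J0, with sub-threshold molecular
# "filling" movement V = lam*J otherwise. (Direct transcription of the
# abstract's formulas; K0, J0, lam values are hypothetical.)

def flow_rate(J, K0=1.0, J0=0.2, lam=0.01):
    """Flow rate for head gradient J under the generalized law."""
    if J > J0:
        return K0 * (1.0 - J0 / J) ** 2 * J   # filtration regime
    return lam * J                            # molecular "filling" regime
```

For J much larger than J0 the expression approaches Darcy's law V ≈ K0·J, and the filtration rate vanishes smoothly as J decreases toward J0, matching the regimes described in the abstract.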
Sparsity-Regularized HMAX for Visual Recognition
Hu, Xiaolin; Zhang, Jianwei; Li, Jianmin; Zhang, Bo
2014-01-01
About ten years ago, HMAX was proposed as a simple and biologically feasible model for object recognition, based on how the visual cortex processes information. However, the model does not encompass sparse firing, which is a hallmark of neurons at all stages of the visual pathway. The current paper presents an improved model, called sparse HMAX, which integrates sparse firing. This model is able to learn higher-level features of objects on unlabeled training images. Unlike most other deep learning models that explicitly address global structure of images in every layer, sparse HMAX addresses local to global structure gradually along the hierarchy by applying patch-based learning to the output of the previous layer. As a consequence, the learning method can be standard sparse coding (SSC) or independent component analysis (ICA), two techniques deeply rooted in neuroscience. What makes SSC and ICA applicable at higher levels is the introduction of linear higher-order statistical regularities by max pooling. After training, high-level units display sparse, invariant selectivity for particular individuals or for image categories like those observed in human inferior temporal cortex (ITC) and medial temporal lobe (MTL). Finally, on an image classification benchmark, sparse HMAX outperforms the original HMAX by a large margin, suggesting its great potential for computer vision. PMID:24392078
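Max pooling, which the abstract credits with introducing the linear higher-order statistical regularities that make SSC and ICA applicable at higher layers, can be sketched generically. This is a plain 2x2 pooling over a 2-D array; HMAX's actual pooling ranges and layer structure differ.

```python
# Sketch: non-overlapping patch-wise max pooling of a 2-D feature map.
# (Generic operation; illustrative of the pooling step, not HMAX itself.)

def max_pool(image, size=2):
    """Downsample by taking the max over each size x size block."""
    rows, cols = len(image), len(image[0])
    return [[max(image[r + dr][c + dc]
                 for dr in range(size) for dc in range(size))
             for c in range(0, cols, size)]
            for r in range(0, rows, size)]
```

Each output unit responds to the strongest activation in its block, which yields the local invariance that the hierarchy builds on.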
Temporal Regularity of the Environment Drives Time Perception
2016-01-01
It’s reasonable to assume that a regularly paced sequence should be perceived as regular, but here we show that perceived regularity depends on the context in which the sequence is embedded. We presented one group of participants with perceptually regularly paced sequences, and another group of participants with mostly irregularly paced sequences (75% irregular, 25% regular). The timing of the final stimulus in each sequence could be varied. In one experiment, we asked whether the last stimulus was regular or not. We found that participants exposed to an irregular environment frequently reported perfectly regularly paced stimuli to be irregular. In a second experiment, we asked participants to judge whether the final stimulus was presented before or after a flash. In this way, we were able to determine distortions in temporal perception as changes in the timing necessary for the sound and the flash to be perceived synchronous. We found that within a regular context, the perceived timing of deviant last stimuli changed so that the relative anisochrony appeared to be perceptually decreased. In the irregular context, the perceived timing of irregular stimuli following a regular sequence was not affected. These observations suggest that humans use temporal expectations to evaluate the regularity of sequences and that expectations are combined with sensory stimuli to adapt perceived timing to follow the statistics of the environment. Expectations can be seen as a-priori probabilities on which perceived timing of stimuli depend. PMID:27441686
TRANSIENT LUNAR PHENOMENA: REGULARITY AND REALITY
Crotts, Arlin P. S.
2009-05-20
Transient lunar phenomena (TLPs) have been reported for centuries, but their nature is largely unsettled, and even their existence as a coherent phenomenon is controversial. Nonetheless, TLP data show regularities in the observations; a key question is whether this structure is imposed by processes tied to the lunar surface, or by terrestrial atmospheric or human observer effects. I interrogate an extensive catalog of TLPs to gauge how human factors determine the distribution of TLP reports. The sample is grouped according to variables which should produce differing results if determining factors involve humans, and not reflecting phenomena tied to the lunar surface. Features dependent on human factors can then be excluded. Regardless of how the sample is split, the results are similar: ≈50% of reports originate from near Aristarchus, ≈16% from Plato, ≈6% from recent, major impacts (Copernicus, Kepler, Tycho, and Aristarchus), plus several at Grimaldi. Mare Crisium produces a robust signal in some cases (however, Crisium is too large for a 'feature' as defined). TLP count consistency for these features indicates that ≈80% of these may be real. Some commonly reported sites disappear from the robust averages, including Alphonsus, Ross D, and Gassendi. These reports begin almost exclusively after 1955, when TLPs became widely known and many more (and inexperienced) observers searched for TLPs. In a companion paper, we compare the spatial distribution of robust TLP sites to transient outgassing (seen by Apollo and Lunar Prospector instruments). To a high confidence, robust TLP sites and those of lunar outgassing correlate strongly, further arguing for the reality of TLPs.
Gauge approach to gravitation and regular Big Bang theory
NASA Astrophysics Data System (ADS)
Minkevich, A. V.
2006-03-01
A field-theoretical scheme of a regular Big Bang in 4-dimensional physical space-time, built in the framework of the gauge approach to gravitation, is discussed. The regular bouncing character of homogeneous isotropic cosmological models is ensured by a gravitational repulsion effect at extreme conditions, without quantum gravitational corrections. The most general properties of regular inflationary cosmological models are examined. The developed theory is valid if the energy density of gravitating matter is positive and the energy dominance condition is fulfilled.
5 CFR 532.203 - Structure of regular wage schedules.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Each nonsupervisory and leader regular wage schedule shall have 15 grades, which shall be designated as follows: (1) WG means an appropriated fund nonsupervisory grade; (2) WL means an appropriated fund leader... leader grade. (b) Each supervisory regular wage schedule shall have 19 grades, which shall be...
5 CFR 551.421 - Regular working hours.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Regular working hours. 551.421 Section 551.421 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS PAY... have a regularly scheduled administrative workweek. However, under title 5 United States Code, and...
Endemic infections are always possible on regular networks
NASA Astrophysics Data System (ADS)
Del Genio, Charo I.; House, Thomas
2013-10-01
We study the dependence of the largest component in regular networks on the clustering coefficient, showing that its size changes smoothly without undergoing a phase transition. We explain this behavior via an analytical approach based on the network structure, and provide an exact equation describing the numerical results. Our work indicates that intrinsic structural properties always allow the spread of epidemics on regular networks.
Pairing renormalization and regularization within the local density approximation
Borycki, P.J.; Dobaczewski, J.; Nazarewicz, W.; Stoitsov, M.V.
2006-04-15
We discuss methods used in mean-field theories to treat pairing correlations within the local density approximation. Pairing renormalization and regularization procedures are compared in spherical and deformed nuclei. Both prescriptions give fairly similar results, although the theoretical motivation, simplicity, and stability of the regularization procedure make it a method of choice for future applications.
20 CFR 216.13 - Regular current connection test.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Regular current connection test. 216.13... ELIGIBILITY FOR AN ANNUITY Current Connection With the Railroad Industry § 216.13 Regular current connection test. An employee has a current connection with the railroad industry if he or she meets one of...
Cognitive Aspects of Regularity Exhibit When Neighborhood Disappears
ERIC Educational Resources Information Center
Chen, Sau-Chin; Hu, Jon-Fan
2015-01-01
Although regularity refers to the compatibility between the pronunciation of a character and the sound of its phonetic component, it has been suggested to be part of consistency, which is defined by neighborhood characteristics. Two experiments demonstrate how the regularity effect is amplified or reduced by neighborhood characteristics and reveal the…
Myth 13: The Regular Classroom Teacher Can "Go It Alone"
ERIC Educational Resources Information Center
Sisk, Dorothy
2009-01-01
With most gifted students being educated in a mainstream model of education, the prevailing myth that the regular classroom teacher can "go it alone" and the companion myth that the teacher can provide for the education of gifted students through differentiation are alive and well. In reality, the regular classroom teacher is too often concerned…
The Inclusion of Differently Abled Students in the Regular Classroom.
ERIC Educational Resources Information Center
Lewis, Angela
This study sought to evaluate the implementation of a program to foster the inclusion of differently abled students into a regular elementary school classroom. The report is based on interviews with eight regular and two special education teachers, as well as the school principal, along with classroom materials and information on inclusion…
29 CFR 778.500 - Artificial regular rates.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 3 2011-07-01 2011-07-01 false Artificial regular rates. 778.500 Section 778.500 Labor... Circumvent the Act Devices to Evade the Overtime Requirements § 778.500 Artificial regular rates. (a) Since... of his compensation. Payment for overtime on the basis of an artificial “regular” rate will...
77 FR 76078 - Regular Board of Directors Sunshine Act Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... From the Federal Register Online via the Government Publishing Office NEIGHBORHOOD REINVESTMENT CORPORATION Regular Board of Directors Sunshine Act Meeting TIME & DATE: 2:00 p.m., Wednesday, January 9, 2013.... Call to Order II. Executive Session III. Approval of the Regular Board of Directors Meeting Minutes...
47 CFR 76.614 - Cable television system regular monitoring.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 4 2012-10-01 2012-10-01 false Cable television system regular monitoring. 76... SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Technical Standards § 76.614 Cable television system regular monitoring. Cable television operators transmitting carriers in the frequency bands...
Analysis of regularized Navier-Stokes equations, 2
NASA Technical Reports Server (NTRS)
Ou, Yuh-Roung; Sritharan, S. S.
1989-01-01
A practically important regularization of the Navier-Stokes equations was analyzed. As a continuation of the previous work, the structure of the attractors characterizing the solutions was studied. Local as well as global invariant manifolds were found, and the regularity properties of these manifolds were analyzed.
Fundamental and Regular Elementary Schools: Do Differences Exist?
ERIC Educational Resources Information Center
Weber, Larry J.; And Others
This study compared the academic achievement and other outcomes of three public fundamental elementary schools with three regular elementary schools in a metropolitan school district. Modeled after the John Marshall Fundamental School in Pasadena, California, which opened in the fall of 1973, fundamental schools differ from regular schools in that…
Inclusion Professional Development Model and Regular Middle School Educators
ERIC Educational Resources Information Center
Royster, Otelia; Reglin, Gary L.; Losike-Sedimo, Nonofo
2014-01-01
The purpose of this study was to determine the impact of a professional development model on regular education middle school teachers' knowledge of best practices for teaching inclusive classes and attitudes toward teaching these classes. There were 19 regular education teachers who taught the core subjects. Findings for Research Question 1…
29 CFR 553.233 - “Regular rate” defined.
Code of Federal Regulations, 2010 CFR
2010-07-01
... OF THE FAIR LABOR STANDARDS ACT TO EMPLOYEES OF STATE AND LOCAL GOVERNMENTS Fire Protection and Law Enforcement Employees of Public Agencies Overtime Compensation Rules § 553.233 “Regular rate” defined. The rules for computing an employee's “regular rate”, for purposes of the Act's overtime pay...
Regular expression order-sorted unification and matching
Kutsia, Temur; Marin, Mircea
2015-01-01
We extend order-sorted unification by permitting regular expression sorts for variables and in the domains of function symbols. The obtained signature corresponds to a finite bottom-up unranked tree automaton. We prove that regular expression order-sorted (REOS) unification is of type infinitary and decidable. The unification problem we present generalizes some known problems, such as order-sorted unification for ranked terms, sequence unification, and word unification with regular constraints. Decidability of REOS unification implies that sequence unification with regular hedge language constraints is decidable, generalizing the decidability result of word unification with regular constraints to terms. A sort weakening algorithm helps to construct a minimal complete set of REOS unifiers from the solutions of sequence unification problems. Moreover, we design a complete algorithm for REOS matching, and show that this problem is NP-complete and the corresponding counting problem is #P-complete. PMID:26523088
Two hybrid regularization frameworks for solving the electrocardiography inverse problem
NASA Astrophysics Data System (ADS)
Jiang, Mingfeng; Xia, Ling; Shou, Guofa; Liu, Feng; Crozier, Stuart
2008-09-01
In this paper, two hybrid regularization frameworks, LSQR-Tik and Tik-LSQR, which integrate the properties of the direct regularization method (Tikhonov) and the iterative regularization method (LSQR), have been proposed and investigated for solving ECG inverse problems. The LSQR-Tik method is based on the Lanczos process, which yields a sequence of small bidiagonal systems to approximate the original ill-posed problem; Tikhonov regularization is then applied to stabilize the projected problem. The Tik-LSQR method is formulated as an iterative LSQR inverse augmented with a Tikhonov-like prior-information term. The performances of these two hybrid methods are evaluated using a realistic heart-torso model simulation protocol, in which the heart surface source method is employed to calculate the simulated epicardial potentials (EPs) from the action potentials (APs), and the acquired EPs are then used to calculate simulated body surface potentials (BSPs). The results show that the regularized solutions obtained by the LSQR-Tik method are close to those of the Tikhonov method, while the computational cost of the LSQR-Tik method is much lower. Moreover, the Tik-LSQR scheme reconstructs the epicardial potential distribution more accurately, especially for BSPs with large noise. This investigation suggests that hybrid regularization methods may be more effective than separate regularization approaches for ECG inverse problems.
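As a point of reference for the two hybrid schemes above, the direct Tikhonov solve they build on can be sketched in a few lines. This is a generic zeroth-order illustration, not the paper's LSQR-Tik implementation (which applies the same stabilization to a small projected bidiagonal system); the forward matrix `A`, data `b`, and parameter `lam` are hypothetical.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min ||A x - b||^2 + lam^2 ||x||^2 via the normal equations.

    A standard zeroth-order Tikhonov solve: the lam^2 * I term shifts the
    small singular values of A and stabilizes the ill-posed inversion.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Ill-conditioned toy forward model: a Vandermonde polynomial fit
A = np.vander(np.linspace(0, 1, 20), 8, increasing=True)
x_true = np.ones(8)
b = A @ x_true + 1e-4 * np.random.default_rng(0).normal(size=20)
x_reg = tikhonov(A, b, lam=1e-3)
```

In the LSQR-Tik scheme described above, this solve is applied not to `A` itself but to the small bidiagonal system produced by the Lanczos process, which is where the cost saving comes from.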
Semisupervised Support Vector Machines With Tangent Space Intrinsic Manifold Regularization.
Sun, Shiliang; Xie, Xijiong
2016-09-01
Semisupervised learning has been an active research topic in machine learning and data mining. One main reason is that labeling examples is expensive and time-consuming, while there are large numbers of unlabeled examples available in many practical problems. So far, Laplacian regularization has been widely used in semisupervised learning. In this paper, we propose a new regularization method called tangent space intrinsic manifold regularization. It is intrinsic to data manifold and favors linear functions on the manifold. Fundamental elements involved in the formulation of the regularization are local tangent space representations, which are estimated by local principal component analysis, and the connections that relate adjacent tangent spaces. Simultaneously, we explore its application to semisupervised classification and propose two new learning algorithms called tangent space intrinsic manifold regularized support vector machines (TiSVMs) and tangent space intrinsic manifold regularized twin SVMs (TiTSVMs). They effectively integrate the tangent space intrinsic manifold regularization consideration. The optimization of TiSVMs can be solved by a standard quadratic programming, while the optimization of TiTSVMs can be solved by a pair of standard quadratic programmings. The experimental results of semisupervised classification problems show the effectiveness of the proposed semisupervised learning algorithms. PMID:26277005
A local-order regularization for geophysical inverse problems
NASA Astrophysics Data System (ADS)
Gheymasi, H. Mohammadi; Gholami, A.
2013-11-01
Different types of regularization have been developed to obtain stable solutions to linear inverse problems. Among these, total variation (TV) is known as an edge-preserving method, which leads to piecewise-constant solutions and has received much attention for solving inverse problems arising in geophysical studies. However, the method shows staircase effects and is not suitable for models that include smooth regions. To overcome the staircase effect, we present a method that employs a local-order difference operator in the regularization term. The method is performed in two steps: first, we apply a pre-processing step to find the edge locations in the regularized solution using a properly defined minmod limiter, where the edges are determined by comparing the solutions obtained with different-order regularizations of the TV type. Then, we construct a local-order difference operator based on the edge-location information obtained in the pre-processing step, which is subsequently used as the regularization operator in the final sparsity-promoting regularization. Experimental results from synthetic and real seismic traveltime tomography show that the proposed inversion method is able to retain the smooth regions of the regularized solution while preserving the sharp transitions present in it.
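The minmod limiter mentioned in the pre-processing step is a standard construction from shock-capturing schemes; a minimal sketch of its usual form follows (the paper's exact edge-detection variant, which compares solutions from different-order TV regularizations, may differ in detail).

```python
import numpy as np

def minmod(*slopes):
    """Classic minmod limiter: where all candidate slopes share a sign,
    return the smallest-magnitude one; elsewhere return 0, which flags
    a sign change (an edge or extremum)."""
    s = np.stack(slopes)
    pos = np.all(s > 0, axis=0)   # all candidates positive
    neg = np.all(s < 0, axis=0)   # all candidates negative
    out = np.zeros(s.shape[1:])
    out[pos] = s[:, pos].min(axis=0)   # smallest positive slope
    out[neg] = s[:, neg].max(axis=0)   # smallest-magnitude negative slope
    return out
```

The zeros returned at sign changes are what make the limiter usable as an edge detector: locations where minmod vanishes are candidates for sharp transitions.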
Structural source identification using a generalized Tikhonov regularization
NASA Astrophysics Data System (ADS)
Aucejo, M.
2014-10-01
This paper addresses the problem of identifying mechanical exciting forces from vibration measurements. The proposed approach is based on a generalized Tikhonov regularization that allows taking into account prior information on the measurement noise as well as on the main characteristics of the sources to be identified, such as their sparsity or regularity. To solve such a regularization problem efficiently, a Generalized Iteratively Reweighted Least-Squares (GIRLS) algorithm is introduced. Numerical and experimental validations reveal the crucial role of prior information in the quality of the source identification and the performance of the GIRLS algorithm.
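The abstract does not specify the GIRLS algorithm itself, but the iteratively reweighted least-squares idea underlying such schemes can be sketched generically. All names, parameters, and the l_p penalty below are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def irls(A, b, lam=1e-2, p=1.0, iters=50, eps=1e-8):
    """Iteratively Reweighted Least Squares for min ||Ax - b||^2 + lam ||x||_p^p.

    Each pass solves a weighted Tikhonov problem whose weights
    w_i = (x_i^2 + eps)^(p/2 - 1) locally reproduce the l_p penalty;
    with p = 1 this promotes sparse (e.g. localized-force) solutions.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # unregularized start
    for _ in range(iters):
        w = (x**2 + eps) ** (p / 2 - 1)
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
    return x

# Toy problem: recover a sparse excitation from noiseless measurements
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))
x_true = np.zeros(10)
x_true[2], x_true[7] = 1.0, -1.0
x_hat = irls(A, A @ x_true)
```

Swapping the penalty (and hence the weight formula) is what makes reweighted schemes "generalized": smooth priors and sparse priors fit the same iteration.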
Regularity criterion for the 3D Hall-magneto-hydrodynamics
NASA Astrophysics Data System (ADS)
Dai, Mimi
2016-07-01
This paper studies the regularity problem for the 3D incompressible resistive viscous Hall-magneto-hydrodynamic (Hall-MHD) system. The Kolmogorov 41 phenomenological theory of turbulence [14] predicts that there exists a critical wavenumber above which the high frequency part is dominated by the dissipation term in the fluid equation. Inspired by this idea, we apply an approach of splitting the wavenumber combined with an estimate of the energy flux to obtain a new regularity criterion. The regularity condition presented here is weaker than conditions in the existing criteria (Prodi-Serrin type criteria) for the 3D Hall-MHD system.
Regular modes in a mixed-dynamics-based optical fiber.
Michel, C; Allgaier, M; Doya, V
2016-02-01
A multimode optical fiber with a truncated transverse cross section acts as a powerful versatile support to investigate the wave features of complex ray dynamics. In this paper, we concentrate on the case of a geometry inducing mixed dynamics. We highlight that regular modes associated with stable periodic orbits present an enhanced spatial intensity localization. We report the statistics of the inverse participation ratio whose features are analogous to those of Anderson localized modes. Our study is supported by both numerical and experimental results on the spatial localization and spectral regularity of the regular modes. PMID:26986325
Exploring the spectrum of regularized bosonic string theory
Ambjørn, J.; Makeenko, Y.
2015-03-15
We implement a UV regularization of the bosonic string by truncating its mode expansion and keeping the regularized theory “as diffeomorphism invariant as possible.” We compute the regularized determinant of the 2d Laplacian for the closed string winding around a compact dimension, obtaining the effective action in this way. The minimization of the effective action reliably determines the energy of the string ground state for a long string and/or for a large number of space-time dimensions. We discuss the possibility of a scaling limit when the cutoff is taken to infinity.
Some results on the spectra of strongly regular graphs
NASA Astrophysics Data System (ADS)
Vieira, Luís António de Almeida; Mano, Vasco Moço
2016-06-01
Let G be a strongly regular graph whose adjacency matrix is A. We associate a real finite dimensional Euclidean Jordan algebra 𝒱, of rank three to the strongly regular graph G, spanned by I and the natural powers of A, endowed with the Jordan product of matrices and with the inner product as being the usual trace of matrices. Finally, by the analysis of the binomial Hadamard series of an element of 𝒱, we establish some inequalities on the parameters and on the spectrum of a strongly regular graph like those established in theorems 3 and 4.
Blind image deblurring with edge enhancing total variation regularization
NASA Astrophysics Data System (ADS)
Shi, Yu; Hong, Hanyu; Song, Jie; Hua, Xia
2015-04-01
Blind image deblurring is an important issue. In this paper, we focus on solving this issue by constrained regularization method. Motivated by the importance of edges to visual perception, the edge-enhancing indicator is introduced to constrain the total variation regularization, and the bilateral filter is used for edge-preserving smoothing. The proposed edge enhancing regularization method aims to smooth preferably within each region and preserve edges. Experiments on simulated and real motion blurred images show that the proposed method is competitive with recent state-of-the-art total variation methods.
Loop Invariants, Exploration of Regularities, and Mathematical Games.
ERIC Educational Resources Information Center
Ginat, David
2001-01-01
Presents an approach for illustrating, on an intuitive level, the significance of loop invariants for algorithm design and analysis. The illustration is based on mathematical games that require the exploration of regularities via problem-solving heuristics. (Author/MM)
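A classic loop-invariant example in the spirit of the article (our own illustration, not taken from it) is fast exponentiation, where the invariant `r * b**e == base**exp` holds on every iteration and certifies correctness when the loop exits.

```python
def power(base, exp):
    """Fast exponentiation by squaring.

    Loop invariant: r * b**e == base**exp.  It holds before the loop
    (r=1, b=base, e=exp), is preserved by both branches, and at exit
    (e == 0) it reduces to r == base**exp, proving the result correct.
    """
    r, b, e = 1, base, exp
    while e > 0:
        assert r * b**e == base**exp  # the invariant, checked at run time
        if e % 2 == 1:
            r *= b        # fold one factor of b into the result
        b *= b            # square the base
        e //= 2           # halve the exponent
    return r
```

Stating the invariant first, then deriving the loop body that preserves it, is exactly the design-by-invariant heuristic the article advocates.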
Regularized Chapman-Enskog expansion for scalar conservation laws
NASA Technical Reports Server (NTRS)
Schochet, Steven; Tadmor, Eitan
1990-01-01
Rosenau has recently proposed a regularized version of the Chapman-Enskog expansion of hydrodynamics. This regularized expansion resembles the usual Navier-Stokes viscosity terms at low wave-numbers, but unlike the latter, it has the advantage of being a bounded macroscopic approximation to the linearized collision operator. The behavior of Rosenau's regularization of the Chapman-Enskog expansion (RCE) is studied in the context of scalar conservation laws. It is shown that the RCE model retains the essential properties of the usual viscosity approximation, e.g., existence of traveling waves, monotonicity, upper-Lipschitz continuity..., and at the same time it sharpens the standard viscous shock layers. It is proved that the regularized RCE approximation converges to the underlying inviscid entropy solution as its mean free path epsilon approaches 0, and the convergence rate is estimated.
Vectorial total variation-based regularization for variational image registration.
Chumchob, Noppadol
2013-11-01
To use interdependence between the primary components of the deformation field for smooth and non-smooth registration problems, the channel-by-channel total variation- or standard vectorial total variation (SVTV)-based regularization has been extended to a more flexible and efficient technique, allowing high quality regularization procedures. Based on this method, this paper proposes a fast nonlinear multigrid (NMG) method for solving the underlying Euler-Lagrange system of two coupled second-order nonlinear partial differential equations. Numerical experiments using both synthetic and realistic images not only confirm that the recommended VTV-based regularization yields better registration qualities for a wide range of applications than those of the SVTV-based regularization, but also that the proposed NMG method is fast, accurate, and reliable in delivering visually-pleasing registration results. PMID:23893729
Regular Doctor Visits Can Help Spot Colon Cancer
Early detection improves likelihood of survival, researchers … increases the odds you'll be screened for colon cancer, a new study says. Colon cancer is …
Generic quantum walks with memory on regular graphs
NASA Astrophysics Data System (ADS)
Li, Dan; Mc Gettrick, Michael; Gao, Fei; Xu, Jie; Wen, Qiao-Yan
2016-04-01
Quantum walks with memory (QWM) are a type of modified quantum walk that records the walker's latest path. To date, only two kinds of QWM have been presented, and designing further ones is desirable so that the potential of QWM can be explored. In this work, by presenting the one-to-one correspondence between QWM on a regular graph and quantum walks without memory (QWoM) on the line digraph of that regular graph, we construct a generic model of QWM on regular graphs. This construction gives a general scheme for building all possible standard QWM on regular graphs and makes it possible to study the properties of different kinds of QWM. Taking the simplest example, QWM with one memory on the line, we analyze properties of QWM such as variance, occupancy rate, and localization.
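The line-digraph construction underlying the QWM/QWoM correspondence can be sketched for a small regular digraph. This is a generic combinatorial illustration only; the coin and shift operators of the actual quantum walk are omitted, and the example graph is our own choice.

```python
def line_digraph(adj):
    """Line digraph L(G): one vertex per arc (u, v) of G, and an arc
    (u, v) -> (v, w) for every successor w of v.  A walk with one step
    of memory on G corresponds to a memoryless walk on L(G)."""
    arcs = [(u, v) for u, nbrs in adj.items() for v in nbrs]
    return {arc: [(arc[1], w) for w in adj[arc[1]]] for arc in arcs}

# 2-regular example: the complete digraph on three vertices (no loops)
K3 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
LK3 = line_digraph(K3)
```

For a d-regular graph with n vertices, L(G) has n*d vertices and remains d-regular, which is why the memoryless walk on L(G) can carry the same coin dimension as the walk with memory on G.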
Analytic regularization for landmark-based image registration
NASA Astrophysics Data System (ADS)
Shusharina, Nadezhda; Sharp, Gregory
2012-03-01
Landmark-based registration using radial basis functions (RBF) is an efficient and mathematically transparent method for the registration of medical images. To ensure invertibility and diffeomorphism of the RBF-based vector field, various regularization schemes have been suggested. Here, we report a novel analytic method of RBF regularization and demonstrate its power for Gaussian RBF. Our analytic formula can be used to obtain a regularized vector field from the solution of a system of linear equations, exactly as in traditional RBF, and can be generalized to any RBF with infinite support. We statistically validate the method on global registration of synthetic and pulmonary images. Furthermore, we present several clinical examples of multistage intensity/landmark-based registrations, where regularized Gaussian RBF are successful in correcting locally misregistered areas resulting from automatic B-spline registration. The intended ultimate application of our method is rapid, interactive local correction of deformable registration with a small number of mouse clicks.
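For orientation, the traditional unregularized Gaussian-RBF landmark interpolation that the paper builds on can be sketched as follows. The landmark coordinates and kernel width `sigma` are illustrative; the paper's contribution, the analytic regularization of this linear system, is not reproduced here.

```python
import numpy as np

def rbf_displacement(landmarks_fixed, landmarks_moving, sigma=20.0):
    """Fit Gaussian-RBF coefficients so the displacement field maps each
    fixed landmark onto its moving counterpart (classic unregularized
    form: one linear solve against the kernel matrix per dimension)."""
    d = landmarks_moving - landmarks_fixed                    # (N, dim) targets
    diff = landmarks_fixed[:, None, :] - landmarks_fixed[None, :, :]
    K = np.exp(-(diff**2).sum(-1) / (2 * sigma**2))           # (N, N) kernel
    coeffs = np.linalg.solve(K, d)

    def field(points):
        dp = points[:, None, :] - landmarks_fixed[None, :, :]
        return np.exp(-(dp**2).sum(-1) / (2 * sigma**2)) @ coeffs

    return field

# Hypothetical 2D landmark pairs
fixed = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
moving = fixed + np.array([[2.0, 1.0], [0.0, 0.0], [1.0, -1.0], [0.0, 2.0]])
u = rbf_displacement(fixed, moving)
```

Because the solve interpolates the landmarks exactly, nothing constrains the field between them; that is the gap the regularization schemes discussed above are designed to close.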
32 CFR 901.14 - Regular airmen category.
Code of Federal Regulations, 2010 CFR
2010-07-01
... status when appointed as cadets. (b) Regular category applicants must arrange to have their high school.... Applicants not selected are reassigned on Academy notification to the CBPO. Applicants to technical...
5 CFR 550.1307 - Authority to regularize paychecks.
Code of Federal Regulations, 2010 CFR
2010-01-01
... caused by work scheduling cycles that result in varying hours in the firefighters' tours of duty from pay... for regular tours of duty over the firefighter's entire work scheduling cycle must, to the...
Are Pupils in Special Education Too "Special" for Regular Education?
NASA Astrophysics Data System (ADS)
Pijl, Ysbrand J.; Pijl, Sip J.
1998-01-01
In the Netherlands special needs pupils are often referred to separate schools for the Educable Mentally Retarded (EMR) or the Learning Disabled (LD). There is an ongoing debate on how to reduce the growing numbers of special education placements. One of the main issues in this debate concerns the size of the difference in cognitive abilities between pupils in regular education and those eligible for LD or EMR education. In this study meta-analysis techniques were used to synthesize the findings from 31 studies on differences between pupils in regular primary education and those in special education in the Netherlands. Studies were grouped into three categories according to the type of measurements used: achievement, general intelligence and neuropsychological tests. It was found that pupils in regular education and those in special education differ in achievement and general intelligence. Pupils in schools for the educable mentally retarded in particular perform at a much lower level than is common in regular Dutch primary education.
Robust destriping method with unidirectional total variation and framelet regularization.
Chang, Yi; Fang, Houzhang; Yan, Luxin; Liu, Hai
2013-10-01
Multidetector imaging systems often suffer from the problem of stripe noise and random noise, which greatly degrade the imaging quality. In this paper, we propose a variational destriping method that combines unidirectional total variation and framelet regularization. Total-variation-based regularizations are considered effective in removing different kinds of stripe noise, and framelet regularization can efficiently preserve the detail information. In essence, these two regularizations are complementary to each other. Moreover, the proposed method can also efficiently suppress random noise. The split Bregman iteration method is employed to solve the resulting minimization problem. Comparative results demonstrate that the proposed method significantly outperforms state-of-the-art destriping methods on both qualitative and quantitative assessments. PMID:24104244
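The unidirectional total-variation term at the heart of the destriping method can be illustrated in isolation (a toy sketch with an assumed stripe orientation; the full split Bregman solver and the framelet term are beyond a few lines).

```python
import numpy as np

def unidirectional_tv(img):
    """Anisotropic TV taken along a single axis: stripes running along
    rows leave row-wise differences untouched, so penalizing only the
    across-stripe (vertical) differences targets stripe noise while
    sparing genuine horizontal structure."""
    return np.abs(np.diff(img, axis=0)).sum()

# Smooth image vs. the same image with alternating row offsets (stripes)
clean = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
striped = clean + np.array([0.0, 0.5] * 4)[:, None]
```

The stripe-free image has zero unidirectional TV while the striped one does not, which is exactly the asymmetry the regularizer exploits; the framelet term then protects fine detail that a pure TV penalty would flatten.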
Aggregation of regularized solutions from multiple observation models
NASA Astrophysics Data System (ADS)
Chen, Jieyang; Pereverzyev, Sergiy, Jr.; Xu, Yuesheng
2015-07-01
Joint inversion of multiple observation models has important applications in many disciplines including geoscience, image processing and computational biology. One of the methodologies for joint inversion of ill-posed observation equations naturally leads to multi-parameter regularization, which has been intensively studied over the last several years. However, problems such as the choice of multiple regularization parameters remain unsolved. In the present study, we discuss a rather general approach to the regularization of multiple observation models, based on the idea of the linear aggregation of approximations corresponding to different values of the regularization parameters. We show how the well-known linear functional strategy can be used for such an aggregation and prove that the error of a constructive aggregator differs from the ideal error value by a quantity of an order higher than the best guaranteed accuracy from the most trustable observation model. The theoretical analysis is illustrated by numerical experiments with simulated data.
Automatic Constraint Detection for 2D Layout Regularization.
Jiang, Haiyong; Nan, Liangliang; Yan, Dong-Ming; Dong, Weiming; Zhang, Xiaopeng; Wonka, Peter
2016-08-01
In this paper, we address the problem of constraint detection for layout regularization. The layout we consider is a set of two-dimensional elements where each element is represented by its bounding box. Layout regularization is important in digitizing plans or images, such as floor plans and facade images, and in the improvement of user-created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements. Similar to previous work, we formulate layout regularization as a quadratic programming problem. In addition, we propose a novel optimization algorithm that automatically detects constraints. We evaluate the proposed framework using a variety of input layouts from different applications. Our results demonstrate that our method has superior performance to the state of the art. PMID:26394426
A novel regularized edge-preserving super-resolution algorithm
NASA Astrophysics Data System (ADS)
Yu, Hui; Chen, Fu-sheng; Zhang, Zhi-jie; Wang, Chen-sheng
2013-09-01
Super-resolution (SR) is a good approach to obtaining high-resolution infrared images. However, image super-resolution reconstruction is essentially an ill-posed problem, so it is important to design an effective regularization term (image prior). A Gaussian prior is widely used in the regularization term, but it leaves the reconstructed SR image over-smooth. Here, a novel regularization term called the non-local means (NLM) term is derived, based on the assumption that natural image content is likely to repeat itself within some neighborhood. In the proposed framework, the estimated high-resolution image is obtained by minimizing a cost function; an iterative method is applied to solve the optimization problem, and as the iteration progresses the regularization term is adaptively updated. The proposed algorithm has been tested in several experiments. The experimental results show that the proposed approach is robust and can reconstruct higher-quality images in both quantitative terms and perceptual effect.
Regular structure in the inner Cassini Division of Saturn's rings
NASA Technical Reports Server (NTRS)
Flynn, Brian C.; Cuzzi, Jeffrey N.
1989-01-01
Voyager imaging, radio occultation, and stellar occultation data for the regular structure of Saturn's inner Cassini Division are analyzed. The regular optical depth variation observed in the radio occultation scan and the feature noted in Voyager images are the same structure, namely the gravitational wakes of two 10-km-radius satellites orbiting within the division. The structure is azimuthally symmetric, which is judged to rule out the possibility that large moonlets are responsible for the observed structure.
Regularization methods for Nuclear Lattice Effective Field Theory
NASA Astrophysics Data System (ADS)
Klein, Nico; Lee, Dean; Liu, Weitao; Meißner, Ulf-G.
2015-07-01
We investigate Nuclear Lattice Effective Field Theory for the two-body system at several lattice spacings at lowest order in the pionless as well as the pionful theory. We discuss issues of regularization and predictions for the effective range expansion. In the pionless case, a simple Gaussian smearing allows us to demonstrate lattice-spacing independence over a wide range of lattice spacings. We show that regularization methods known from the continuum formulation are necessary as well as feasible for the pionful approach.
Note on regular black holes in a brane world
NASA Astrophysics Data System (ADS)
Neves, J. C. S.
2015-10-01
In this work, we show that regular black holes in a Randall-Sundrum-type brane world model are generated by the nonlocal bulk influence, expressed by a constant parameter in the brane metric, only in the spherical case. In the axial case (black holes with rotation), this influence forbids regular solutions. A nonconstant bulk influence is necessary to generate regular black holes with rotation in this context.
Lesions impairing regular versus irregular past tense production☆
Meteyard, Lotte; Price, Cathy J.; Woollams, Anna M.; Aydelott, Jennifer
2013-01-01
We investigated selective impairments in the production of regular and irregular past tense by examining language performance and lesion sites in a sample of twelve stroke patients. A disadvantage in regular past tense production was observed in six patients when phonological complexity was greater for regular than irregular verbs, and in three patients when phonological complexity was closely matched across regularity. These deficits were not consistently related to grammatical difficulties or phonological errors but were consistently related to lesion site. All six patients with a regular past tense disadvantage had damage to the left ventral pars opercularis (in the inferior frontal cortex), an area associated with articulatory sequencing in prior functional imaging studies. In addition, those that maintained a disadvantage for regular verbs when phonological complexity was controlled had damage to the left ventral supramarginal gyrus (in the inferior parietal lobe), an area associated with phonological short-term memory. When these frontal and parietal regions were spared in patients who had damage to subcortical (n = 2) or posterior temporo-parietal regions (n = 3), past tense production was relatively unimpaired for both regular and irregular forms. The remaining (12th) patient was impaired in producing regular past tense but was significantly less accurate when producing irregular past tense. This patient had frontal, parietal, subcortical and posterior temporo-parietal damage, but was distinguished from the other patients by damage to the left anterior temporal cortex, an area associated with semantic processing. We consider how our lesion site and behavioral observations have implications for theoretical accounts of past tense production. PMID:24273726
New solutions of charged regular black holes and their stability
NASA Astrophysics Data System (ADS)
Uchikata, Nami; Yoshida, Shijun; Futamase, Toshifumi
2012-10-01
We construct new regular black hole solutions by matching the de Sitter solution and the Reissner-Nordström solution with a timelike thin shell. The thin shell is assumed to have mass but no pressure and obeys an equation of motion derived from Israel's junction conditions. By investigating the equation of motion for the shell, we obtain stationary solutions of charged regular black holes and examine the stability of the solutions. Stationary solutions are found in the limited range 0.87L ≤ m ≤ 1.99L, and they are stable against small radial displacement of the shell with fixed values of m, M, and Q if M > 0, where L is the de Sitter horizon radius, m the black hole mass, M the proper mass of the shell, and Q the black hole charge. All the solutions obtained are highly charged in the sense of Q/m > √3/2 ≈ 0.866. By taking the massless limit of the shell in the present regular black hole solutions, we obtain the charged regular black hole with a massless shell obtained by Lemos and Zanchin and investigate the stability of the solutions. It is found that Lemos and Zanchin's regular black hole solutions given by the massless limit of the present regular black hole solutions permit stable solutions, which are obtained by the limit M → 0.
The relationship between lifestyle regularity and subjective sleep quality
NASA Technical Reports Server (NTRS)
Monk, Timothy H.; Reynolds, Charles F 3rd; Buysse, Daniel J.; DeGrazia, Jean M.; Kupfer, David J.
2003-01-01
In previous work we have developed a diary instrument (the Social Rhythm Metric, or SRM), which allows the assessment of lifestyle regularity, and a questionnaire instrument (the Pittsburgh Sleep Quality Index, or PSQI), which allows the assessment of subjective sleep quality. The aim of the present study was to explore the relationship between lifestyle regularity and subjective sleep quality. Lifestyle regularity was assessed by both standard (SRM-17) and shortened (SRM-5) metrics; subjective sleep quality was assessed by the PSQI. We hypothesized that high lifestyle regularity would be conducive to better sleep. Both instruments were given to a sample of 100 healthy subjects who were studied as part of a variety of different experiments spanning a 9-yr time frame. Ages ranged from 19 to 49 yr (mean age: 31.2 yr, s.d.: 7.8 yr); there were 48 women and 52 men. SRM scores were derived from a two-week diary. The hypothesis was confirmed. There was a significant (rho = -0.4, p < 0.001) correlation between SRM (both metrics) and PSQI, indicating that subjects with higher levels of lifestyle regularity reported fewer sleep problems. This relationship was also supported by a categorical analysis, where the proportion of "poor sleepers" was doubled in the "irregular types" group as compared with the "non-irregular types" group. Thus, there appears to be an association between lifestyle regularity and good sleep, though the direction of causality remains to be tested.
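The core analysis above is a rank-order correlation between two scores. A minimal sketch in Python, using synthetic data (not the study's): the variable names, effect size, and noise level here are invented purely to illustrate the computation.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Synthetic illustration only: higher SRM (more regular lifestyle)
# paired with lower PSQI (fewer sleep complaints), plus noise.
srm = rng.uniform(1, 7, size=100)                 # lifestyle regularity score
psqi = 12 - 1.2 * srm + rng.normal(0, 1.5, 100)   # subjective sleep quality

rho, p = spearmanr(srm, psqi)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")   # negative rho, as in the study
```

A negative rho with a small p-value reproduces the direction of the reported association; the magnitude here depends entirely on the invented noise level.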
Nonlocal means-based regularizations for statistical CT reconstruction
NASA Astrophysics Data System (ADS)
Zhang, Hao; Ma, Jianhua; Liu, Yan; Han, Hao; Li, Lihong; Wang, Jing; Liang, Zhengrong
2014-03-01
Statistical iterative reconstruction (SIR) methods have shown remarkable gains over the conventional filtered backprojection (FBP) method in improving image quality for low-dose computed tomography (CT). They reconstruct the CT images by maximizing/minimizing a cost function in a statistical sense, where the cost function usually consists of two terms: the data-fidelity term modeling the statistics of the measured data, and the regularization term reflecting prior information. The regularization term in SIR plays a critical role for successful image reconstruction, and an established family of regularizations is based on the Markov random field (MRF) model. Inspired by the success of the nonlocal means (NLM) algorithm in image processing applications, we proposed, in this work, a family of generic and edge-preserving NLM-based regularizations for SIR. We evaluated one of them, in which the potential function takes a quadratic form. Experimental results with both digital and physical phantoms clearly demonstrated that SIR with the proposed regularization can achieve more significant gains than SIR with the widely used Gaussian MRF regularization and the conventional FBP method, in terms of image noise reduction and resolution preservation.
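The quadratic-form NLM regularizer pairs a squared intensity difference with a weight computed from patch similarity, so true edges (dissimilar patches) are barely penalized. A 1-D toy sketch, with invented window sizes and smoothing parameter, not the paper's 2-D implementation:

```python
import numpy as np

def nlm_quadratic_penalty(x, search=3, patch=1, h=0.2):
    """Quadratic-form NLM regularizer R(x) = sum_j sum_k w_jk (x_j - x_k)^2
    for a 1-D signal; the weight w_jk compares small patches around j and k."""
    n = len(x)
    xp = np.pad(x, patch, mode="edge")
    R = 0.0
    for j in range(n):
        pj = xp[j:j + 2 * patch + 1]
        for k in range(max(0, j - search), min(n, j + search + 1)):
            if k == j:
                continue
            pk = xp[k:k + 2 * patch + 1]
            w = np.exp(-np.sum((pj - pk) ** 2) / h ** 2)  # patch similarity
            R += w * (x[j] - x[k]) ** 2
    return R

flat = np.ones(16)
step = np.concatenate([np.zeros(8), np.ones(8)])
print(nlm_quadratic_penalty(flat))   # 0.0: constant signals are not penalized
print(nlm_quadratic_penalty(step))   # near zero: weights vanish across a true edge
```

The edge-preserving behavior is visible directly: the step signal is almost free, whereas a local MRF penalty with the same quadratic potential would charge the full squared jump.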
Regular treatment with salmeterol for chronic asthma: serious adverse events
Cates, Christopher J; Cates, Matthew J
2014-01-01
Background Epidemiological evidence has suggested a link between beta2-agonists and increases in asthma mortality. There has been much debate about possible causal links for this association, and whether regular (daily) long-acting beta2-agonists are safe. Objectives The aim of this review is to assess the risk of fatal and non-fatal serious adverse events in trials that randomised patients with chronic asthma to regular salmeterol versus placebo or regular short-acting beta2-agonists. Search methods We identified trials using the Cochrane Airways Group Specialised Register of trials. We checked websites of clinical trial registers for unpublished trial data and FDA submissions in relation to salmeterol. The date of the most recent search was August 2011. Selection criteria We included controlled parallel design clinical trials on patients of any age and severity of asthma if they randomised patients to treatment with regular salmeterol and were of at least 12 weeks' duration. Concomitant use of inhaled corticosteroids was allowed, as long as this was not part of the randomised treatment regimen. Data collection and analysis Two authors independently selected trials for inclusion in the review. One author extracted outcome data and the second checked them. We sought unpublished data on mortality and serious adverse events. Main results The review includes 26 trials comparing salmeterol with placebo and eight trials comparing it with salbutamol. These included 62,815 participants with asthma (including 2,599 children). In six trials (2,766 patients), no serious adverse event data could be obtained. All-cause mortality was higher with regular salmeterol than placebo but the increase was not significant (Peto odds ratio (OR) 1.33 (95% CI 0.85 to 2.08)). Non-fatal serious adverse events were significantly increased when regular salmeterol was compared with placebo (OR 1.15, 95% CI 1.02 to 1.29). One extra serious adverse event occurred over 28 weeks for every 188 people
An adaptive regularization parameter choice strategy for multispectral bioluminescence tomography
Feng Jinchao; Qin Chenghu; Jia Kebin; Han Dong; Liu Kai; Zhu Shouping; Yang Xin; Tian Jie
2011-11-15
Purpose: Bioluminescence tomography (BLT) provides an effective tool for monitoring physiological and pathological activities in vivo. However, the measured data in bioluminescence imaging are corrupted by noise. Therefore, regularization methods are commonly used to find a regularized solution. Nevertheless, for the quality of the reconstructed bioluminescent source obtained by regularization methods, the choice of the regularization parameters is crucial. To date, the selection of regularization parameters remains challenging. To address these problems, the authors propose a BLT reconstruction algorithm with an adaptive parameter choice rule. Methods: The proposed reconstruction algorithm uses a diffusion equation for modeling the bioluminescent photon transport. The diffusion equation is solved with a finite element method. Computed tomography (CT) images provide anatomical information regarding the geometry of the small animal and its internal organs. To reduce the ill-posedness of BLT, spectral information and the optimal permissible source region are employed. Then, the relationship between the unknown source distribution and the multiview and multispectral boundary measurements is established based on the finite element method and the optimal permissible source region. Since the measured data are noisy, the BLT reconstruction is formulated as an ℓ2 data-fidelity term plus a general regularization term. When choosing the regularization parameters for BLT, an efficient model function approach is proposed, which does not require knowledge of the noise level. This approach requires only the computation of the residual and regularized solution norms. With this knowledge, we construct the model function to approximate the objective function, and the regularization parameter is updated iteratively. Results: First, the micro-CT based mouse phantom was used for simulation verification. Simulation experiments were used to illustrate why multispectral data were used
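The two quantities the model-function rule consumes, the residual norm and the regularized-solution norm, are cheap to compute for any candidate parameter. The sketch below sets up a generic Tikhonov toy (an invented smoothing kernel, not a photon-transport model) and tabulates both norms; it illustrates the ingredients of such noise-level-free rules, not the authors' specific model function.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ill-posed problem: a smooth (near-singular) forward operator A.
n = 40
t = np.linspace(0, 1, n)
A = np.exp(-30.0 * (t[:, None] - t[None, :]) ** 2)   # hypothetical kernel
x_true = np.sin(2 * np.pi * t)
b = A @ x_true + 0.01 * rng.normal(size=n)

def tikhonov(lam):
    """Regularized solution plus the residual norm and solution norm that
    parameter-choice rules (L-curve, model-function approaches) are built from."""
    x = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
    return x, np.linalg.norm(A @ x - b), np.linalg.norm(x)

lams = np.logspace(-8, 0, 30)
res_norms, sol_norms = zip(*[tikhonov(l)[1:] for l in lams])
# As lam grows, the residual norm rises and the solution norm falls;
# adaptive rules iterate toward a balance point between the two.
```

An adaptive rule would update lam iteratively from these two norms instead of scanning a grid; the grid here just makes the monotone trade-off visible.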
Early family regularity protects against later disruptive behavior.
Rijlaarsdam, Jolien; Tiemeier, Henning; Ringoot, Ank P; Ivanova, Masha Y; Jaddoe, Vincent W V; Verhulst, Frank C; Roza, Sabine J
2016-07-01
Infants' temperamental anger or frustration reactions are highly stable, but are also influenced by maturation and experience. It is yet unclear why some infants high in anger or frustration reactions develop disruptive behavior problems whereas others do not. We examined family regularity, conceptualized as the consistency of mealtime and bedtime routines, as a protective factor against the development of oppositional and aggressive behavior. This study used prospectively collected data from 3136 families participating in the Generation R Study. Infant anger or frustration reactions and family regularity were reported by mothers when children were ages 6 months and 2-4 years, respectively. Multiple informants (parents, teachers, and children) and methods (questionnaire and interview) were used in the assessment of children's oppositional and aggressive behavior at age 6. Higher levels of family regularity were associated with lower levels of child aggression independent of temperamental anger or frustration reactions (β = -0.05, p = 0.003). The association between child oppositional behavior and temperamental anger or frustration reactions was moderated by family regularity and child gender (β = 0.11, p = 0.046): family regularity reduced the risk for oppositional behavior among those boys who showed anger or frustration reactions in infancy. In conclusion, family regularity reduced the risk for child aggression and showed a gender-specific protective effect against child oppositional behavior associated with anger or frustration reactions. Families that ensured regularity of mealtime and bedtime routines buffered their infant sons high in anger or frustration reactions from developing oppositional behavior. PMID:26589300
Particle motion and Penrose processes around rotating regular black hole
NASA Astrophysics Data System (ADS)
Abdujabbarov, Ahmadjon
2016-07-01
The neutral particle motion around a rotating regular black hole that was derived from the Ayón-Beato-García (ABG) black hole solution by the Newman-Janis algorithm in the preceding paper (Toshmatov et al., Phys. Rev. D, 89:104017, 2014) has been studied. The dependencies of the ISCO (innermost stable circular orbits along geodesics) and unstable orbits on the value of the electric charge of the rotating regular black hole have been shown. Energy extraction from the rotating regular black hole through various processes has been examined. We have found an expression for the center-of-mass energy of colliding neutral particles coming from infinity, based on the BSW (Bañados-Silk-West) mechanism. The electric charge Q of the rotating regular black hole decreases the potential of the gravitational field as compared to the Kerr black hole, and the particles demonstrate less bound energy at the circular geodesics. This causes an increase in the efficiency of energy extraction through the BSW process in the presence of the electric charge Q from the rotating regular black hole. Furthermore, we have studied the particle emission due to the BSW effect assuming that two neutral particles collide near the horizon of the rotating regular extremal black hole and produce another two particles. We have shown that the efficiency of the energy extraction is less than the value 146.6% valid for the Kerr black hole. It has also been demonstrated that the efficiency of the energy extraction from the rotating regular black hole via the Penrose process decreases with increasing electric charge Q and is smaller in comparison to 20.7%, which is the value for the extreme Kerr black hole with specific angular momentum a = M.
Regular treatment with formoterol for chronic asthma: serious adverse events
Cates, Christopher J; Cates, Matthew J
2014-01-01
Background Epidemiological evidence has suggested a link between beta2-agonists and increases in asthma mortality. There has been much debate about possible causal links for this association, and whether regular (daily) long-acting beta2-agonists are safe. Objectives The aim of this review is to assess the risk of fatal and non-fatal serious adverse events in trials that randomised patients with chronic asthma to regular formoterol versus placebo or regular short-acting beta2-agonists. Search methods We identified trials using the Cochrane Airways Group Specialised Register of trials. We checked websites of clinical trial registers for unpublished trial data and Food and Drug Administration (FDA) submissions in relation to formoterol. The date of the most recent search was January 2012. Selection criteria We included controlled, parallel design clinical trials on patients of any age and severity of asthma if they randomised patients to treatment with regular formoterol and were of at least 12 weeks’ duration. Concomitant use of inhaled corticosteroids was allowed, as long as this was not part of the randomised treatment regimen. Data collection and analysis Two authors independently selected trials for inclusion in the review. One author extracted outcome data and the second author checked them. We sought unpublished data on mortality and serious adverse events. Main results The review includes 22 studies (8032 participants) comparing regular formoterol to placebo and salbutamol. Non-fatal serious adverse event data could be obtained for all participants from published studies comparing formoterol and placebo but only 80% of those comparing formoterol with salbutamol or terbutaline. Three deaths occurred on regular formoterol and none on placebo; this difference was not statistically significant. It was not possible to assess disease-specific mortality in view of the small number of deaths. Non-fatal serious adverse events were significantly increased when
Reducing errors in the GRACE gravity solutions using regularization
NASA Astrophysics Data System (ADS)
Save, Himanshu; Bettadpur, Srinivas; Tapley, Byron D.
2012-09-01
The nature of the gravity field inverse problem amplifies the noise in the GRACE data, which creeps into the mid and high degree and order harmonic coefficients of the Earth's monthly gravity fields provided by GRACE. Due to the use of imperfect background models and data noise, these errors are manifested as north-south striping in the monthly global maps of equivalent water heights. In order to reduce these errors, this study investigates the use of the L-curve method with Tikhonov regularization. The L-curve is a popular aid for determining a suitable value of the regularization parameter when solving linear discrete ill-posed problems using Tikhonov regularization. However, the computational effort required to determine the L-curve is prohibitively high for a large-scale problem like GRACE. This study implements a parameter-choice method using Lanczos bidiagonalization, which is a computationally inexpensive approximation to the L-curve. Lanczos bidiagonalization is implemented with orthogonal transformation in a parallel computing environment and projects the large estimation problem onto a problem about two orders of magnitude smaller for computing the regularization parameter. Errors in the GRACE solution time series have certain characteristics that vary depending on the ground track coverage of the solutions. These errors increase with increasing degree and order. In addition, certain resonant and near-resonant harmonic coefficients have higher errors as compared with the other coefficients. Using the knowledge of these characteristics, this study designs a regularization matrix that provides a constraint on the geopotential coefficients as a function of degree and order. This regularization matrix is then used to compute the appropriate regularization parameter for each monthly solution. A 7-year time series of the candidate regularized solutions (Mar 2003-Feb 2010) shows markedly reduced error stripes compared with the unconstrained GRACE release 4
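The projection step can be sketched with the textbook Golub-Kahan (Lanczos) bidiagonalization recurrence: k steps reduce the large least-squares problem to a (k+1) x k bidiagonal one, on which L-curve-type parameter choices are cheap. The matrix sizes below are invented for illustration; this is not the study's parallel implementation.

```python
import numpy as np

def golub_kahan(A, b, k):
    """k steps of Golub-Kahan (Lanczos) bidiagonalization.
    Returns U (m x k+1), lower-bidiagonal B (k+1 x k), V (n x k)
    satisfying the fundamental relation A V = U B."""
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    B = np.zeros((k + 1, k))
    U[:, 0] = b / np.linalg.norm(b)
    for i in range(k):
        v = A.T @ U[:, i]
        if i > 0:
            v -= B[i, i - 1] * V[:, i - 1]       # subtract previous beta * v
        alpha = np.linalg.norm(v)
        V[:, i] = v / alpha
        B[i, i] = alpha
        u = A @ V[:, i] - alpha * U[:, i]
        beta = np.linalg.norm(u)
        U[:, i + 1] = u / beta
        B[i + 1, i] = beta
    return U, B, V

rng = np.random.default_rng(2)
A = rng.normal(size=(200, 80))                   # invented sizes
b = rng.normal(size=200)
U, B, V = golub_kahan(A, b, 10)
print(np.allclose(A @ V, U @ B))                 # prints True
```

Tikhonov regularization of the projected bidiagonal problem gives nearly the same residual/solution norms as the full problem, which is why the L-curve can be traced at a fraction of the cost.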
On Nonperiodic Euler Flows with Hölder Regularity
NASA Astrophysics Data System (ADS)
Isett, Philip; Oh, Sung-Jin
2016-08-01
In (Isett, Regularity in time along the coarse scale flow for the Euler equations, 2013), the first author proposed a strengthening of Onsager's conjecture on the failure of energy conservation for incompressible Euler flows with Hölder regularity not exceeding 1/3. This stronger form of the conjecture implies that anomalous dissipation will fail for a generic Euler flow with regularity below the Onsager critical space L_t^∞ B_{3,∞}^{1/3} due to low regularity of the energy profile. This paper is the first and main paper in a series of two, the results of which may be viewed as first steps towards establishing the conjectured failure of energy regularity for generic solutions with Hölder exponent less than 1/5. The main result of the present paper shows that any given smooth Euler flow can be perturbed in C^{1/5-ε}_{t,x} on any pre-compact subset of R × R^3 to violate energy conservation. Furthermore, the perturbed solution is no smoother than C^{1/5-ε}_{t,x}. As a corollary of this theorem, we show the existence of nonzero C^{1/5-ε}_{t,x} solutions to Euler with compact space-time support, generalizing previous work of the first author (Isett, Hölder continuous Euler flows in three dimensions with compact support in time, 2012) to the nonperiodic setting.
X-ray computed tomography using curvelet sparse regularization
Wieczorek, Matthias; Vogel, Jakob; Lasser, Tobias; Frikel, Jürgen; Demaret, Laurent; Eggl, Elena; Pfeiffer, Franz; Kopp, Felix; Noël, Peter B.
2015-04-15
Purpose: Reconstruction of x-ray computed tomography (CT) data remains a mathematically challenging problem in medical imaging. Complementing the standard analytical reconstruction methods, sparse regularization is growing in importance, as it allows inclusion of prior knowledge. The paper presents a method for sparse regularization based on the curvelet frame for the application to iterative reconstruction in x-ray computed tomography. Methods: In this work, the authors present an iterative reconstruction approach based on the alternating direction method of multipliers using curvelet sparse regularization. Results: Evaluation of the method is performed on a specifically crafted numerical phantom dataset to highlight the method’s strengths. Additional evaluation is performed on two real datasets from commercial scanners with different noise characteristics, a clinical bone sample acquired in a micro-CT and a human abdomen scanned in a diagnostic CT. The results clearly illustrate that curvelet sparse regularization has characteristic strengths. In particular, it improves the restoration and resolution of highly directional, high contrast features with smooth contrast variations. The authors also compare this approach to the popular technique of total variation and to traditional filtered backprojection. Conclusions: The authors conclude that curvelet sparse regularization is able to improve reconstruction quality by reducing noise while preserving highly directional features.
Regularized image system for Stokes flow outside a solid sphere
NASA Astrophysics Data System (ADS)
Wróbel, Jacek K.; Cortez, Ricardo; Varela, Douglas; Fauci, Lisa
2016-07-01
The image system for a three-dimensional flow generated by regularized forces outside a solid sphere is formulated and implemented as an extension of the method of regularized Stokeslets. The method is based on replacing a point force given by a delta distribution with a smooth localized function and deriving the exact velocity field produced by the forcing. In order to satisfy zero-flow boundary conditions at a solid sphere, the image system for singular Stokeslets is generalized to give exact cancellation of the regularized flow at the surface of the sphere. The regularized image system contains the same elements as the singular counterpart but with coefficients that depend on a regularization parameter. As this parameter vanishes, the expressions reduce to the image system of the singular Stokeslet. The expression relating force and velocity can be inverted to compute the forces that generate a given velocity boundary condition elsewhere in the flow. We present several examples within the context of biological flows at the microscale in order to validate and highlight the usefulness of the image system in computations.
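The building block of the method can be sketched. Below is the free-space regularized Stokeslet for one standard choice of blob function, with regularization parameter eps; the paper's image system adds correction terms to cancel this flow on the sphere, which are omitted here. The numerical check confirms the stated limit: as eps -> 0 the expression reduces to the singular Stokeslet.

```python
import numpy as np

def regularized_stokeslet(x, x0, f, eps, mu=1.0):
    """Velocity at x induced by a regularized point force f located at x0
    (Cortez-type blob). As eps -> 0 this recovers the singular Stokeslet."""
    r = x - x0
    r2 = r @ r
    d = (r2 + eps ** 2) ** 1.5
    return ((r2 + 2 * eps ** 2) * f + (f @ r) * r) / (8 * np.pi * mu * d)

f = np.array([1.0, 0.0, 0.0])
x0 = np.zeros(3)
x = np.array([0.0, 2.0, 0.0])

u_reg = regularized_stokeslet(x, x0, f, eps=1e-6)

# Singular Stokeslet for comparison: u = (f/r + (f.x) x / r^3) / (8 pi mu)
rnorm = np.linalg.norm(x)
u_sing = (f / rnorm + (f @ x) * x / rnorm ** 3) / (8 * np.pi)
print(np.allclose(u_reg, u_sing))   # prints True: small eps matches the singular case
```

In the full method one evaluates sums of such terms (plus their images) at many points, and inverts the resulting linear force-velocity relation, as described in the abstract.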
Incorporating anatomical side information into PET reconstruction using nonlocal regularization.
Nguyen, Van-Giang; Lee, Soo-Jin
2013-10-01
With the introduction of combined positron emission tomography (PET)/computed tomography (CT) or PET/magnetic resonance imaging (MRI) scanners, there is an increasing emphasis on reconstructing PET images with the aid of the anatomical side information obtained from X-ray CT or MRI scanners. In this paper, we propose a new approach to incorporating prior anatomical information into PET reconstruction using the nonlocal regularization method. The nonlocal regularizer developed for this application is designed to selectively consider the anatomical information only when it is reliable. As our proposed nonlocal regularization method does not directly use anatomical edges or boundaries which are often used in conventional methods, it is not only free from additional processes to extract anatomical boundaries or segmented regions, but also more robust to the signal mismatch problem that is caused by the indirect relationship between the PET image and the anatomical image. We perform simulations with digital phantoms. According to our experimental results, compared to the conventional method based on the traditional local regularization method, our nonlocal regularization method performs well even with the imperfect prior anatomical information or in the presence of signal mismatch between the PET image and the anatomical image. PMID:23744678
In vivo impedance imaging with total variation regularization.
Borsic, Andrea; Graham, Brad M; Adler, Andy; Lionheart, William R B
2010-01-01
We show that electrical impedance tomography (EIT) image reconstruction algorithms with regularization based on the total variation (TV) functional are suitable for in vivo imaging of physiological data. This reconstruction approach helps to preserve discontinuities in reconstructed profiles, such as step changes in electrical properties at interorgan boundaries, which are typically smoothed by traditional reconstruction algorithms. The use of the TV functional for regularization leads to the minimization of a nondifferentiable objective function in the inverse formulation. This cannot be efficiently solved with traditional optimization techniques such as the Newton method. We explore two implementation methods for regularization with the TV functional: the lagged diffusivity method and the primal-dual interior point method (PD-IPM). First we clarify the implementation details of these algorithms for EIT reconstruction. Next, we analyze the performance of these algorithms on noisy simulated data. Finally, we show reconstructed EIT images of in vivo data for ventilation and gastric emptying studies. In comparison to traditional quadratic regularization, TV regularization shows improved ability to reconstruct sharp contrasts. PMID:20051330
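Of the two implementation methods, lagged diffusivity is the easier to sketch: the nondifferentiable TV term is smoothed by a small beta, and the resulting weights are frozen ("lagged") at each iteration so that every step is a linear solve. The toy below denoises a 1-D step signal (the EIT setting replaces the identity forward map with the full model); parameter values are invented.

```python
import numpy as np

def tv_denoise_lagged(y, lam=1.0, beta=1e-6, iters=50):
    """1-D TV denoising by the lagged diffusivity fixed point:
    minimize 0.5 ||x - y||^2 + lam * sum_i sqrt((Dx)_i^2 + beta).
    Each iteration solves (I + lam * D^T W D) x = y with W frozen."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)              # forward-difference matrix
    x = y.copy()
    for _ in range(iters):
        w = 1.0 / np.sqrt((D @ x) ** 2 + beta)  # lagged diffusivity weights
        x = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)
    return x

rng = np.random.default_rng(3)
step = np.concatenate([np.zeros(50), np.ones(50)])
noisy = step + 0.1 * rng.normal(size=100)
den = tv_denoise_lagged(noisy, lam=0.5)
# The step edge survives while in-plateau noise is flattened,
# which is exactly the edge-preserving behavior claimed for TV.
```

A quadratic (Tikhonov) penalty with comparable strength would blur the jump; TV keeps it sharp at the cost of a small, uniform shrinkage of the contrast.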
SPECT reconstruction using DCT-induced tight framelet regularization
NASA Astrophysics Data System (ADS)
Zhang, Jiahan; Li, Si; Xu, Yuesheng; Schmidtlein, C. R.; Lipson, Edward D.; Feiglin, David H.; Krol, Andrzej
2015-03-01
Wavelet transforms have been successfully applied in many fields of image processing. Yet, to our knowledge, they have never been directly incorporated into the objective function in Emission Computed Tomography (ECT) image reconstruction. Our aim has been to investigate if the ℓ1-norm of non-decimated discrete cosine transform (DCT) coefficients of the estimated radiotracer distribution could be effectively used as the regularization term for the penalized-likelihood (PL) reconstruction, where a regularizer is used to enforce the image smoothness in the reconstruction. In this study, the ℓ1-norm of the 2D DCT wavelet decomposition was used as a regularization term. The Preconditioned Alternating Projection Algorithm (PAPA), which we proposed in earlier work to solve penalized-likelihood (PL) reconstruction with non-differentiable regularizers, was used to solve this optimization problem. The DCT wavelet decompositions were performed on the transaxial reconstructed images. We reconstructed Monte Carlo simulated SPECT data obtained for a numerical phantom with Gaussian blobs as hot lesions and with a warm random lumpy background. Reconstructed images using the proposed method exhibited better noise suppression and improved lesion conspicuity, compared with images reconstructed using the expectation maximization (EM) algorithm with a Gaussian post filter (GPF). Also, the mean square error (MSE) was smaller, compared with EM-GPF. A critical and challenging aspect of this method was the selection of optimal parameters. In summary, our numerical experiments demonstrated that the DCT-induced tight framelet ℓ1-norm regularizer shows promise for SPECT image reconstruction using the PAPA method.
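The two ingredients such a regularizer contributes to a splitting algorithm are (i) the penalty value and (ii) its proximal step, soft-thresholding in the transform domain. The sketch below uses a plain orthogonal 2-D DCT rather than the paper's non-decimated tight frame, and invented sizes and threshold; it illustrates the mechanics, not the PAPA algorithm itself.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_l1(img):
    """Penalty value: l1-norm of the image's DCT coefficients."""
    return np.abs(dctn(img, norm="ortho")).sum()

def dct_shrink(img, tau):
    """Proximal step for tau * dct_l1: soft-threshold DCT coefficients
    (one inner ingredient of splitting algorithms like PAPA)."""
    c = dctn(img, norm="ortho")
    c = np.sign(c) * np.maximum(np.abs(c) - tau, 0.0)
    return idctn(c, norm="ortho")

rng = np.random.default_rng(4)
smooth = np.add.outer(np.linspace(0, 1, 32), np.linspace(0, 1, 32))
noisy = smooth + 0.05 * rng.normal(size=(32, 32))
den = dct_shrink(noisy, tau=0.05)
print(dct_l1(den) <= dct_l1(noisy))   # prints True: shrinkage reduces the penalty
```

Noise spreads thinly over many DCT coefficients and is removed by the threshold, while a smooth image concentrates its energy in a few large coefficients that survive.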
Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery.
Feng, Yunlong; Lv, Shao-Gao; Hang, Hanyuan; Suykens, Johan A K
2016-03-01
Kernelized elastic net regularization (KENReg) is a kernelization of the well-known elastic net regularization (Zou & Hastie, 2005). The kernel in KENReg is not required to be a Mercer kernel since it learns from a kernelized dictionary in the coefficient space. Feng, Yang, Zhao, Lv, and Suykens (2014) showed that KENReg has some nice properties including stability, sparseness, and generalization. In this letter, we continue our study on KENReg by conducting a refined learning theory analysis. This letter makes the following three main contributions. First, we present refined error analysis on the generalization performance of KENReg. The main difficulty of analyzing the generalization error of KENReg lies in characterizing the population version of its empirical target function. We overcome this by introducing a weighted Banach space associated with the elastic net regularization. We are then able to conduct elaborated learning theory analysis and obtain fast convergence rates under proper complexity and regularity assumptions. Second, we study the sparse recovery problem in KENReg with fixed design and show that the kernelization may improve the sparse recovery ability compared to the classical elastic net regularization. Finally, we discuss the interplay among different properties of KENReg that include sparseness, stability, and generalization. We show that the stability of KENReg leads to generalization, and its sparseness confidence can be derived from generalization. Moreover, KENReg is stable and can be simultaneously sparse, which makes it attractive theoretically and practically. PMID:26735744
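For reference, the classical (non-kernelized) elastic net that KENReg kernelizes combines an l1 penalty (sparsity) with an l2 penalty (stability). A minimal proximal-gradient sketch with invented data and penalty weights; sklearn's ElasticNet would be the production route:

```python
import numpy as np

def elastic_net_prox_grad(X, y, lam1=0.1, lam2=0.1, iters=500):
    """Elastic net via proximal gradient (ISTA):
    minimize 0.5/n ||y - Xw||^2 + lam1 ||w||_1 + 0.5 * lam2 ||w||^2."""
    n, p = X.shape
    w = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / n + lam2        # Lipschitz constant
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n + lam2 * w     # smooth part's gradient
        z = w - grad / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam1 / L, 0.0)  # soft threshold
    return w

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]                       # only 3 active features
y = X @ w_true + 0.1 * rng.normal(size=100)

w = elastic_net_prox_grad(X, y)
print((np.abs(w) > 1e-8).sum())   # sparse: only a few coefficients survive
```

The recovered support is sparse (the l1 part) while the surviving coefficients are mildly shrunk toward zero (the l2 part), which is the stability/sparseness trade-off the abstract analyzes in the kernelized setting.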
Fast multislice fluorescence molecular tomography using sparsity-inducing regularization
NASA Astrophysics Data System (ADS)
Hejazi, Sedigheh Marjaneh; Sarkar, Saeed; Darezereshki, Ziba
2016-02-01
Fluorescence molecular tomography (FMT) is a rapidly growing imaging method that facilitates the recovery of small fluorescent targets within biological tissue. The major challenge facing the FMT reconstruction method is the ill-posed nature of the inverse problem. In order to overcome this problem, the acquisition of large FMT datasets and the utilization of a fast FMT reconstruction algorithm with sparsity regularization have been suggested recently. Therefore, the use of a joint L1/total-variation (TV) regularization as a means of solving the ill-posed FMT inverse problem is proposed. A comparative quantified analysis of regularization methods based on L1-norm and TV are performed using simulated datasets, and the results show that the fast composite splitting algorithm regularization method can ensure the accuracy and robustness of the FMT reconstruction. The feasibility of the proposed method is evaluated in an in vivo scenario for the subcutaneous implantation of a fluorescent-dye-filled capillary tube in a mouse, and also using hybrid FMT and x-ray computed tomography data. The results show that the proposed regularization overcomes the difficulties created by the ill-posed inverse problem.
Regularized total least squares approach for nonconvolutional linear inverse problems.
Zhu, W; Wang, Y; Galatsanos, N P; Zhang, J
1999-01-01
In this correspondence, a solution is developed for the regularized total least squares (RTLS) estimate in linear inverse problems where the linear operator is nonconvolutional. Our approach is based on a Rayleigh quotient (RQ) formulation of the TLS problem, and we accomplish regularization by modifying the RQ function to enforce a smooth solution. A conjugate gradient algorithm is used to minimize the modified RQ function. As an example, the proposed approach has been applied to the perturbation equation encountered in optical tomography. Simulation results show that this method provides more stable and accurate solutions than the regularized least squares and a previously reported total least squares approach, also based on the RQ formulation. PMID:18267442
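The unregularized TLS core can be sketched via the SVD of the augmented matrix [A | b]: the solution is read off the right singular vector of the smallest singular value, which is the minimizer of the Rayleigh quotient the correspondence starts from. The RTLS modification (smoothness term added to the RQ, minimized by conjugate gradients) is not reproduced here; the data are invented.

```python
import numpy as np

def tls(A, b):
    """Plain total least squares via the SVD of [A | b]: errors are allowed
    in both the operator A and the data b, unlike ordinary least squares."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]                      # right singular vector, smallest sigma
    return -v[:n] / v[n]

rng = np.random.default_rng(6)
A = rng.normal(size=(60, 4))
x_true = np.array([1.0, -2.0, 0.5, 3.0])
# Perturb BOTH the operator and the data: the situation TLS is designed for.
b = (A + 0.01 * rng.normal(size=A.shape)) @ x_true + 0.01 * rng.normal(size=60)
x_est = tls(A, b)                   # close to x_true
```

In ill-posed settings like optical tomography this bare TLS estimate is unstable, which is exactly why the correspondence adds a regularization term to the Rayleigh quotient.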
Regularity based descriptor computed from local image oscillations.
Trujillo, Leonardo; Olague, Gustavo; Legrand, Pierrick; Lutton, Evelyne
2007-05-14
This work presents a novel local image descriptor based on the concept of pointwise signal regularity. Local image regions are extracted using either an interest point or an interest region detector, and discriminative feature vectors are constructed by uniformly sampling the pointwise Hölderian regularity around each region center. Regularity estimation is performed using local image oscillations, the most straightforward method directly derived from the definition of the Hölder exponent. Furthermore, estimating the Hölder exponent in this manner has proven to be superior, in most cases, when compared to wavelet-based estimation, as was shown in previous work. Our detector shows invariance to illumination change, JPEG compression, image rotation, and scale change. Results show that the proposed descriptor is stable with respect to variations in imaging conditions, and reliable performance metrics prove it to be comparable to, and in some instances better than, SIFT, the state of the art in local descriptors. PMID:19546918
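The oscillation-based estimator follows directly from the definition: the oscillation over a window of radius r scales like C * r^alpha, so alpha is the slope of log(osc) against log(r). A 1-D sketch with invented window radii, validated on a synthetic cusp |t - t0|^alpha whose pointwise exponent at t0 is exactly alpha:

```python
import numpy as np

def holder_exponent(f, i, radii=(1, 2, 4, 8, 16)):
    """Estimate the pointwise Holder exponent of sampled signal f at index i
    from local oscillations: osc_r = max - min over a radius-r window."""
    oscs = []
    for r in radii:
        w = f[max(0, i - r): i + r + 1]
        oscs.append(w.max() - w.min())
    # Slope of log(osc) vs log(r) is the Holder exponent estimate.
    return np.polyfit(np.log(radii), np.log(oscs), 1)[0]

t = np.linspace(0, 1, 4097)
i = 2048
for alpha in (0.3, 0.7):
    f = np.abs(t - t[i]) ** alpha       # cusp with known regularity alpha at t[i]
    print(round(holder_exponent(f, i), 2))   # prints 0.3, then 0.7
```

A descriptor, as in the abstract, would sample such estimates at many points around a region center rather than at a single location.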
Selecting protein families for environmental features based on manifold regularization.
Jiang, Xingpeng; Xu, Weiwei; Park, E K; Li, Guangrong
2014-06-01
Recently, statistical and machine learning methods have been developed to identify functional or taxonomic features associated with environmental conditions or physiological status. Proteins (or other functional and taxonomic entities) that are important for particular environmental features can potentially be used as biosensors. A major challenge is understanding how the distribution of protein and gene functions embodies the adaptation of microbial communities across environments and host habitats. In this paper, we propose a novel regularization method for linear regression to address this challenge. The approach is inspired by local linear embedding (LLE), and we call it manifold-constrained regularization for linear regression (McRe). The novel regularization procedure also has the potential to be used in solving other linear systems. We demonstrate the efficiency and the performance of the approach in both simulation and real data. PMID:24802701
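A generic form of manifold regularization for linear regression adds a graph-Laplacian penalty that keeps predictions smooth over a similarity graph of the samples. The sketch below is in this spirit but is not the McRe formulation itself; the graph construction, data, and all parameter values are invented.

```python
import numpy as np

def manifold_regularized_regression(X, y, W, lam=0.1, gamma=0.1):
    """Linear regression with a graph-Laplacian (manifold) penalty:
    minimize ||y - Xb||^2 + lam ||b||^2 + gamma * (Xb)^T L (Xb),
    where L = D - W is the Laplacian of the sample-similarity graph W."""
    L = np.diag(W.sum(axis=1)) - W
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p) + gamma * X.T @ L @ X,
                           X.T @ y)

rng = np.random.default_rng(7)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, 0.5, 0.0, 0.0, 0.0]) + 0.1 * rng.normal(size=50)

# Hypothetical similarity graph: Gaussian weights on pairwise distances.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2)
np.fill_diagonal(W, 0.0)

b = manifold_regularized_regression(X, y, W)   # close to [1, 0.5, 0, 0, 0]
```

The closed-form solve works because every penalty term is quadratic in b; LLE-style formulations differ mainly in how the graph weights W are learned from local reconstructions rather than fixed by a kernel.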
Breast ultrasound tomography with total-variation regularization
Huang, Lianjie; Li, Cuiping; Duric, Neb
2009-01-01
Breast ultrasound tomography is a rapidly developing imaging modality that has the potential to impact breast cancer screening and diagnosis. A new ultrasound breast imaging device (CURE) with a ring array of transducers has been designed and built at Karmanos Cancer Institute, which acquires both reflection and transmission ultrasound signals. To extract the sound-speed information from the breast data acquired by CURE, we have developed an iterative sound-speed image reconstruction algorithm for breast ultrasound transmission tomography based on total-variation (TV) minimization. We investigate the applicability of the TV tomography algorithm using in vivo ultrasound breast data from 61 patients and compare the results with those obtained using the Tikhonov regularization method. We demonstrate that, compared to the Tikhonov regularization scheme, the TV regularization method significantly improves image quality, resulting in sound-speed tomography images with sharp (preserved) edges of abnormalities and few artifacts.
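The two penalties being compared differ only in how they weigh gradients: Tikhonov penalizes squared differences (smoothing edges away), TV penalizes absolute differences (preserving them). A minimal 1D gradient-descent sketch, not the CURE reconstruction code; the step sizes, λ, and the smoothed-TV gradient are illustrative choices of mine:

```python
import numpy as np

def denoise(f, lam, grad_penalty, steps=500, lr=0.1):
    """Gradient descent on ||u - f||^2 + lam * sum_i penalty(u[i+1] - u[i])."""
    u = f.copy()
    for _ in range(steps):
        g = grad_penalty(np.diff(u))            # penalty'(successive differences)
        # gradient of the penalty term wrt u (discrete negative divergence)
        div = np.concatenate([[-g[0]], g[:-1] - g[1:], [g[-1]]])
        u -= lr * (2.0 * (u - f) + lam * div)
    return u

tikhonov = lambda d: 2.0 * d                        # derivative of d^2
tv = lambda d, eps=1e-2: d / np.sqrt(d * d + eps * eps)  # derivative of smoothed |d|

rng = np.random.default_rng(0)
clean = np.where(np.arange(100) < 50, 0.0, 1.0)     # a sharp "edge"
noisy = clean + 0.1 * rng.standard_normal(100)
u_tik = denoise(noisy, lam=0.3, grad_penalty=tikhonov)
u_tv = denoise(noisy, lam=0.3, grad_penalty=tv)
```

Both results smooth the noise, but the TV result retains the full step height away from the edge, which is the behavior the abstract reports for lesion boundaries.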
Wavelet domain image restoration with adaptive edge-preserving regularization.
Belge, M; Kilmer, M E; Miller, E L
2000-01-01
In this paper, we consider a wavelet based edge-preserving regularization scheme for use in linear image restoration problems. Our efforts build on a collection of mathematical results indicating that wavelets are especially useful for representing functions that contain discontinuities (i.e., edges in two dimensions or jumps in one dimension). We interpret the resulting theory in a statistical signal processing framework and obtain a highly flexible framework for adapting the degree of regularization to the local structure of the underlying image. In particular, we are able to adapt quite easily to scale-varying and orientation-varying features in the image while simultaneously retaining the edge preservation properties of the regularizer. We demonstrate a half-quadratic algorithm for obtaining the restorations from observed data. PMID:18255433
Analysis of the "Learning in Regular Classrooms" movement in China.
Deng, M; Manset, G
2000-04-01
The Learning in Regular Classrooms experiment has evolved in response to China's efforts to educate its large population of students with disabilities who, until the mid-1980s, were denied a free education. In the Learning in Regular Classrooms, students with disabilities (primarily sensory impairments or mild mental retardation) are educated in neighborhood schools in mainstream classrooms. Despite difficulties associated with developing effective inclusive programming, this approach has contributed to a major increase in the enrollment of students with disabilities and increased involvement of schools, teachers, and parents in China's newly developing special education system. Here we describe the development of the Learning in Regular Classroom approach and the challenges associated with educating students with disabilities in China. PMID:10804702
Hybrid regularization image restoration algorithm based on total variation
NASA Astrophysics Data System (ADS)
Zhang, Hongmin; Wang, Yan
2013-09-01
To reduce the noise amplification and ringing (ripple) artifacts produced by the traditional Richardson-Lucy deconvolution method, a novel hybrid regularization image restoration algorithm based on total variation is proposed in this paper. The key idea is that different regularization terms are employed according to the characteristics of different regions of the image itself. At the same time, the threshold between the different regularization terms is selected at the golden-section point, which takes the human visual response into account. Experimental results show that the proposed method outperforms the total-variation Richardson-Lucy algorithm in both PSNR and MSE, and also gives better visual quality.
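The hybrid scheme itself is not fully specified in the abstract, but the baseline it modifies is the classic Richardson-Lucy multiplicative update, which can be sketched in 1D (kernel and signal are illustrative):

```python
import numpy as np

def richardson_lucy(observed, psf, iters=30):
    """Plain Richardson-Lucy: multiplicative update u <- u * K^T(f / K u).
    Estimates stay nonnegative because every factor is nonnegative."""
    psf = psf / psf.sum()                       # normalized point-spread function
    psf_flip = psf[::-1]                        # adjoint of convolution
    u = np.full_like(observed, observed.mean())
    for _ in range(iters):
        blurred = np.convolve(u, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        u = u * np.convolve(ratio, psf_flip, mode="same")
    return u

x_true = np.zeros(64)
x_true[20] = 1.0
x_true[40] = 2.0                                # two point sources
psf = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
observed = np.convolve(x_true, psf, mode="same")
u = richardson_lucy(observed, psf)
```

Iterating sharpens the blurred peaks back toward the point sources; on noisy data the same iteration amplifies noise, which is what the paper's regularization terms are there to suppress.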
Structural characterization of the packings of granular regular polygons.
Wang, Chuncheng; Dong, Kejun; Yu, Aibing
2015-12-01
By using a recently developed method for discrete modeling of nonspherical particles, we simulate the random packings of granular regular polygons with three to 11 edges under gravity. The effects of shape and friction on the packing structures are investigated by various structural parameters, including packing fraction, the radial distribution function, coordination number, Voronoi tessellation, and bond-orientational order. We find that packing fraction is generally higher for geometrically nonfrustrated regular polygons, and can be increased by the increase of edge number and decrease of friction. The changes of packing fraction are linked with those of the microstructures, such as the variations of the translational and orientational orders and local configurations. In particular, the free areas of Voronoi tessellations (which are related to local packing fractions) can be described by log-normal distributions for all polygons. The quantitative analyses establish a clearer picture for the packings of regular polygons. PMID:26764678
Manufacture of Regularly Shaped Sol-Gel Pellets
NASA Technical Reports Server (NTRS)
Leventis, Nicholas; Johnston, James C.; Kinder, James D.
2006-01-01
An extrusion batch process for manufacturing regularly shaped sol-gel pellets has been devised as an improved alternative to a spray process that yields irregularly shaped pellets. The aspect ratio of regularly shaped pellets can be controlled more easily, while regularly shaped pellets pack more efficiently. In the extrusion process, a wet gel is pushed out of a mold and chopped repetitively into short, cylindrical pieces as it emerges from the mold. The pieces are collected and can be either (1) dried at ambient pressure to xerogel, (2) solvent exchanged and dried under ambient pressure to ambigels, or (3) supercritically dried to aerogel. Advantageously, the extruded pellets can be dropped directly in a cross-linking bath, where they develop a conformal polymer coating around the skeletal framework of the wet gel via reaction with the cross linker. These pellets can be dried to mechanically robust X-Aerogel.
Quantum backflow states from eigenstates of the regularized current operator
NASA Astrophysics Data System (ADS)
Halliwell, J. J.; Gillman, E.; Lennon, O.; Patel, M.; Ramirez, I.
2013-11-01
We present an exhaustive class of states with quantum backflow—the phenomenon in which a state consisting entirely of positive momenta has negative current and the probability flows in the opposite direction to the momentum. They are characterized by a general function of momenta subject to very weak conditions. Such a family of states is of interest in the light of a recent experimental proposal to measure backflow. We find one particularly simple state which has surprisingly large backflow—about 41% of the lower bound on flux derived by Bracken and Melloy. We study the eigenstates of a regularized current operator and we show how some of these states, in a certain limit, lead to our class of backflow states. This limit also clarifies the correspondence between the spectrum of the regularized current operator, which has just two non-zero eigenvalues in our chosen regularization, and the usual current operator.
Local conservative regularizations of compressible magnetohydrodynamic and neutral flows
NASA Astrophysics Data System (ADS)
Krishnaswami, Govind S.; Sachdev, Sonakshi; Thyagaraja, A.
2016-02-01
Ideal systems like magnetohydrodynamics (MHD) and Euler flow may develop singularities in vorticity (w = ∇×v). Viscosity and resistivity provide dissipative regularizations of the singularities. In this paper, we propose a minimal, local, conservative, nonlinear, dispersive regularization of compressible flow and ideal MHD, in analogy with the KdV regularization of the 1D kinematic wave equation. This work extends and significantly generalizes earlier work on incompressible Euler and ideal MHD. It involves a micro-scale cutoff length λ which is a function of density, unlike in the incompressible case. In MHD, it can be taken to be of order the electron collisionless skin depth c/ω_pe. Our regularization preserves the symmetries of the original systems and, with appropriate boundary conditions, leads to associated conservation laws. Energy and enstrophy are subject to a priori bounds determined by initial data in contrast to the unregularized systems. A Hamiltonian and Poisson bracket formulation is developed and applied to generalize the constitutive relation to bound higher moments of vorticity. A "swirl" velocity field is identified, and shown to transport w/ρ and B/ρ, generalizing the Kelvin-Helmholtz and Alfvén theorems. The steady regularized equations are used to model a rotating vortex, MHD pinch, and a plane vortex sheet. The proposed regularization could facilitate numerical simulations of fluid/MHD equations and provide a consistent statistical mechanics of vortices/current filaments in 3D, without blowup of enstrophy. Implications for detailed analyses of fluid and plasma dynamic systems arising from our work are briefly discussed.
Processing SPARQL queries with regular expressions in RDF databases
2011-01-01
Background As the Resource Description Framework (RDF) data model is widely used for modeling and sharing many online bioinformatics resources such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL, a W3C-recommended query language for RDF databases, has become an important language for querying bioinformatics knowledge bases. Moreover, given the diversity of users' requests for extracting information from RDF data, and users' frequent lack of knowledge of the exact value of each fact in an RDF database, it is desirable to support SPARQL queries with regular expression patterns over RDF data. To the best of our knowledge, no existing work efficiently supports regular expression processing in SPARQL over RDF databases. Most existing techniques for processing regular expressions are designed for querying a text corpus, or only support matching over the paths in an RDF graph. Results In this paper, we propose a novel framework for supporting regular expression processing in SPARQL queries. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model that allows the proposed framework to be adopted by existing query optimizers. 3) We build a prototype of the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Conclusions Experiments with a full-blown RDF engine show that our framework outperforms existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns. PMID:21489225
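The query semantics being optimized can be illustrated without an RDF engine: SPARQL's FILTER regex(?o, "pattern", "i") keeps solutions whose bound value matches a regular expression. A toy sketch of that selection over an in-memory triple list (the identifiers and labels are hypothetical, and real engines such as the paper's framework do this with index support rather than a linear scan):

```python
import re

# Toy triple store: (subject, predicate, object) tuples, all hypothetical.
triples = [
    ("uniprot:P69905", "rdfs:label", "Hemoglobin subunit alpha"),
    ("uniprot:P68871", "rdfs:label", "Hemoglobin subunit beta"),
    ("uniprot:P00533", "rdfs:label", "Epidermal growth factor receptor"),
]

def filter_regex(triples, predicate, pattern, flags=re.IGNORECASE):
    """Emulate  FILTER regex(?o, pattern, "i")  over a triple list:
    keep triples with the given predicate whose object matches the pattern."""
    rx = re.compile(pattern, flags)
    return [(s, p, o) for (s, p, o) in triples
            if p == predicate and rx.search(o)]

hits = filter_regex(triples, "rdfs:label", "^hemoglobin")
```

The case-insensitive anchored pattern selects the two hemoglobin entries, exactly what the equivalent SPARQL FILTER would bind.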
Regular heartbeat dynamics are associated with cardiac health.
Cysarz, Dirk; Lange, Silke; Matthiessen, Peter F; Leeuwen, Peter van
2007-01-01
The human heartbeat series is more variable and, hence, more complex in healthy subjects than in congestive heart failure (CHF) patients. However, little is known about the complexity of the heart rate variations on a beat-to-beat basis. We present an analysis based on symbolic dynamics that focuses on the dynamic features of such beat-to-beat variations on a small time scale. The sequence of acceleration and deceleration of eight successive heartbeats is represented by a binary sequence consisting of ones and zeros. The regularity of such binary patterns is quantified using approximate entropy (ApEn). Holter electrocardiograms from 30 healthy subjects, 15 patients with CHF, and their surrogate data were analyzed with respect to the regularity of such binary sequences. The results are compared with spectral analysis and ApEn of heart rate variability. Counterintuitively, healthy subjects show a large amount of regular beat-to-beat patterns in addition to a considerable amount of irregular patterns. CHF patients show a predominance of one regular beat-to-beat pattern (alternation of acceleration and deceleration), as well as some irregular patterns similar to the patterns observed in the surrogate data. In healthy subjects, regular beat-to-beat patterns reflect the physiological adaptation to different activities, i.e., sympathetic modulation, whereas irregular patterns may arise from parasympathetic modulation. The patterns observed in CHF patients indicate a largely reduced influence of the autonomic nervous system. In conclusion, analysis of short beat-to-beat patterns with respect to regularity leads to a considerable increase of information compared with spectral analysis or ApEn of heart-rate variations. PMID:16973939
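The binary encoding of beat-to-beat dynamics described above is easy to reproduce. A minimal sketch (the 8-beat word length follows the abstract; the RR-interval series is hypothetical):

```python
def binary_patterns(rr, word_len=8):
    """Encode successive heartbeat (RR-interval) changes as bits
    (1 = deceleration: interval lengthens; 0 = acceleration) and
    collect overlapping words of word_len bits."""
    bits = ["1" if b > a else "0" for a, b in zip(rr, rr[1:])]
    return ["".join(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]

# A strictly alternating rhythm yields only the two phase-shifted copies of
# one pattern, the kind of regularity that dominated in the CHF patients.
rr = [800, 820] * 8                  # ms, hypothetical alternans-like series
patterns = binary_patterns(rr)
```

Feeding real Holter RR series through this encoding and then computing ApEn over the pattern distribution is the kind of analysis the study performs.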
Regularization of languages by adults and children: A mathematical framework.
Rische, Jacquelyn L; Komarova, Natalia L
2016-02-01
The fascinating ability of humans to modify the linguistic input and "create" a language has been widely discussed. In the work of Newport and colleagues, it has been demonstrated that both children and adults have some ability to process inconsistent linguistic input and "improve" it by making it more consistent. In Hudson Kam and Newport (2009), artificial miniature language acquisition from an inconsistent source was studied. It was shown that (i) children are better at language regularization than adults and that (ii) adults can also regularize, depending on the structure of the input. In this paper we create a learning algorithm of the reinforcement-learning type, which exhibits patterns reported in Hudson Kam and Newport (2009) and suggests a way to explain them. It turns out that in order to capture the differences between children's and adults' learning patterns, we need to introduce a certain asymmetry in the learning algorithm. Namely, we have to assume that the reaction of the learners differs depending on whether or not the source's input coincides with the learner's internal hypothesis. We interpret this result in the context of a different reaction of children and adults to implicit, expectation-based evidence, positive or negative. We propose that a possible mechanism that contributes to the children's ability to regularize an inconsistent input is related to their heightened sensitivity to positive evidence rather than the (implicit) negative evidence. In our model, regularization comes naturally as a consequence of a stronger reaction of the children to evidence supporting their preferred hypothesis. In adults, their ability to adequately process implicit negative evidence prevents them from regularizing the inconsistent input, resulting in a weaker degree of regularization. PMID:26580218
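The asymmetry the authors introduce can be illustrated with a mean-field toy model; this is my sketch in the spirit of their reinforcement learner, not their algorithm. Weighting hypothesis-confirming (positive) evidence more than implicit negative evidence drives the learner's production probability past the source frequency (regularization), while symmetric weighting converges to the source frequency (probability matching):

```python
def learn(q, alpha_pos, alpha_neg, steps=5000):
    """Mean-field update for the learner's probability p of producing form A,
    given a source that uses A with probability q. Confirming evidence pushes
    p up at rate alpha_pos; disconfirming evidence pushes it down at alpha_neg."""
    p = 0.5
    for _ in range(steps):
        p += alpha_pos * q * (1 - p) - alpha_neg * (1 - q) * p
    return p

adult = learn(q=0.7, alpha_pos=0.01, alpha_neg=0.01)   # symmetric: matches q
child = learn(q=0.7, alpha_pos=0.01, alpha_neg=0.001)  # weak negative evidence
```

With symmetric rates the fixed point is exactly p = q; shrinking alpha_neg moves the fixed point toward 1, mirroring the children's stronger regularization in the model.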
Zigzag stacks and m-regular linear stacks.
Chen, William Y C; Guo, Qiang-Hui; Sun, Lisa H; Wang, Jian
2014-12-01
The contact map of a protein fold is a graph that represents the patterns of contacts in the fold. It is known that the contact map can be decomposed into stacks and queues. RNA secondary structures are special stacks in which the degree of each vertex is at most one and each arc has length of at least two. Waterman and Smith derived a formula for the number of RNA secondary structures of length n with exactly k arcs. Höner zu Siederdissen et al. developed a folding algorithm for extended RNA secondary structures in which each vertex has maximum degree two. An equation for the generating function of extended RNA secondary structures was obtained by Müller and Nebel by using a context-free grammar approach, which leads to an asymptotic formula. In this article, we consider m-regular linear stacks, where each arc has length at least m and the degree of each vertex is bounded by two. Extended RNA secondary structures are exactly 2-regular linear stacks. For any m ≥ 2, we obtain an equation for the generating function of the m-regular linear stacks. For given m, we deduce a recurrence relation and an asymptotic formula for the number of m-regular linear stacks on n vertices. To establish the equation, we use the reduction operation of Chen, Deng, and Du to transform an m-regular linear stack to an m-reduced zigzag (or alternating) stack. Then we find an equation for m-reduced zigzag stacks leading to an equation for m-regular linear stacks. PMID:25455155
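The Waterman-Smith style recurrence mentioned above can be written directly for ordinary secondary structures (each vertex of degree at most one) with minimum arc length m: the last vertex is either unpaired or paired with some earlier vertex, splitting the structure in two. Note this counts the classic degree-one structures, not the degree-two extended stacks the article generalizes to:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def count_structures(n, m=2):
    """Count RNA secondary structures (vertex degree <= 1) on n vertices
    with every arc of length >= m: last vertex unpaired, or paired with j."""
    if n <= m:
        return 1
    total = count_structures(n - 1, m)        # last vertex unpaired
    for j in range(1, n - m + 1):             # arc (j, n) has length n - j >= m
        total += count_structures(j - 1, m) * count_structures(n - j - 1, m)
    return total
```

For m = 2 this produces the familiar sequence 1, 1, 1, 2, 4, 8, 17, 37, ... of secondary-structure counts.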
Charged scalar perturbations around a regular magnetic black hole
NASA Astrophysics Data System (ADS)
Huang, Yang; Liu, Dao-Jun
2016-05-01
We study charged scalar perturbations in the background of a regular magnetic black hole. In this case, the charged scalar perturbation does not result in superradiance. By using a careful time-domain analysis, we show that the charge of the scalar field can change the real part of the quasinormal frequency, but has little impact on the imaginary part of the quasinormal frequency and the behavior of the late-time tail. Therefore, the regular magnetic black hole may be stable under the perturbations of a charged scalar field at the linear level.
Mixing of regular and chaotic orbits in beams
Courtlandt L. Bohn et al.
2002-09-04
Phase mixing of chaotic orbits exponentially distributes the orbits through their accessible phase space. This phenomenon, commonly called "chaotic mixing", stands in marked contrast to phase mixing of regular orbits, which proceeds as a power law in time. It is inherently irreversible; hence, its associated e-folding time scale sets a condition on any process envisioned for emittance compensation. We numerically investigate phase mixing in the presence of space charge, distinguish between the evolution of regular and chaotic orbits, and discuss how phase mixing potentially influences macroscopic properties of high-brightness beams.
On Vertex Covering Transversal Domination Number of Regular Graphs
Vasanthi, R.; Subramanian, K.
2016-01-01
A simple graph G = (V, E) is said to be r-regular if each vertex of G is of degree r. The vertex covering transversal domination number γvct(G) is the minimum cardinality among all vertex covering transversal dominating sets of G. In this paper, we analyse this parameter on different kinds of regular graphs especially for Qn and H3,n. Also we provide an upper bound for γvct of a connected cubic graph of order n ≥ 8. Then we try to provide a more stronger relationship between γ and γvct. PMID:27119089
The cardiovascular effects of regular and decaffeinated coffee.
Smits, P; Thien, T; Van 't Laar, A
1985-01-01
In a single-blind study the effects of drinking two cups of regular or decaffeinated coffee on blood pressure, heart rate, forearm blood flow and plasma concentrations of caffeine, renin and catecholamines were studied in 12 normotensive subjects. Drinking regular coffee led to a rise of blood pressure, a fall of heart rate and an increase of plasma catecholamines. Decaffeinated coffee induced a smaller increase of diastolic blood pressure without changing other parameters. This study shows that the cardiovascular effects of drinking coffee are mainly the result of its caffeine content. PMID:4027129
Surface-based prostate registration with biomechanical regularization
NASA Astrophysics Data System (ADS)
van de Ven, Wendy J. M.; Hu, Yipeng; Barentsz, Jelle O.; Karssemeijer, Nico; Barratt, Dean; Huisman, Henkjan J.
2013-03-01
Adding MR-derived information to standard transrectal ultrasound (TRUS) images for guiding prostate biopsy is of substantial clinical interest. A tumor visible on MR images can be projected on ultrasound by using MRUS registration. A common approach is to use surface-based registration. We hypothesize that biomechanical modeling will better control deformation inside the prostate than a regular surface-based registration method. We developed a novel method by extending a surface-based registration with finite element (FE) simulation to better predict internal deformation of the prostate. For each of six patients, a tetrahedral mesh was constructed from the manual prostate segmentation. Next, the internal prostate deformation was simulated using the derived radial surface displacement as boundary condition. The deformation field within the gland was calculated using the predicted FE node displacements and thin-plate spline interpolation. We tested our method on MR guided MR biopsy imaging data, as landmarks can easily be identified on MR images. For evaluation of the registration accuracy we used 45 anatomical landmarks located in all regions of the prostate. Our results show that the median target registration error of a surface-based registration with biomechanical regularization is 1.88 mm, which is significantly different from 2.61 mm without biomechanical regularization. We can conclude that biomechanical FE modeling has the potential to improve the accuracy of multimodal prostate registration when comparing it to regular surface-based registration.
5 CFR 550.1307 - Authority to regularize paychecks.
Code of Federal Regulations, 2011 CFR
2011-01-01
... PAY ADMINISTRATION (GENERAL) Firefighter Pay § 550.1307 Authority to regularize paychecks. Upon a... an agency's plan to reduce or eliminate variation in the amount of firefighters' biweekly paychecks caused by work scheduling cycles that result in varying hours in the firefighters' tours of duty from...
Rotating bearings in regular and irregular granular shear packings
NASA Astrophysics Data System (ADS)
Åström, J. A.
2008-01-01
For 2D regular dense packings of solid mono-size non-sliding disks there is a mechanism for bearing formation under shear that can be explained theoretically. There is, however, no easy way to extend this model to include random dense packings which would better describe natural packings. A numerical model that simulates shear deformation for both near-regular and irregular packings is used to demonstrate that rotating bearings appear roughly with the same density in random and regular packings. The main difference appears in the size distribution of the rotating clusters near the jamming threshold. The size distribution is well described by a scaling form with a large-size cut-off that seems to grow without bounds for regular packings at the jamming threshold, while it remains finite for irregular packings. At packing densities above the jamming transition there can be no shear, unless the disks are allowed to break. Breaking of disks induces a large number of small local bearings. Clusters of rotating particles may contribute to e.g. pre-rupture yielding in landslides, snow avalanches and to the formation of aseismic gaps in tectonic fault zones.
Adult Regularization of Inconsistent Input Depends on Pragmatic Factors
ERIC Educational Resources Information Center
Perfors, Amy
2016-01-01
In a variety of domains, adults who are given input that is only partially consistent do not discard the inconsistent portion (regularize) but rather maintain the probability of consistent and inconsistent portions in their behavior (probability match). This research investigates the possibility that adults probability match, at least in part,…
Advance Organizer Strategy for Educable Mentally Retarded and Regular Children.
ERIC Educational Resources Information Center
Chang, Moon K.
The study examined the effects of an advance organizer on the learning and retention of facts and concepts obtained from a sound film by educable mentally retarded (N=30) and regular children (N=30) in a mainstreamed secondary public school class. Also examined was the interaction between the advance organizer and ability levels of the Ss. Results…
Relativistic regular approximations revisited: An infinite-order relativistic approximation
Dyall, K.G.; van Lenthe, E.
1999-07-01
The concept of the regular approximation is presented as the neglect of the energy dependence of the exact Foldy–Wouthuysen transformation of the Dirac Hamiltonian. Expansion of the normalization terms leads immediately to the zeroth-order regular approximation (ZORA) and first-order regular approximation (FORA) Hamiltonians as the zeroth- and first-order terms of the expansion. The expansion may be taken to infinite order by using an un-normalized Foldy–Wouthuysen transformation, which results in the ZORA Hamiltonian and a nonunit metric. This infinite-order regular approximation, IORA, has eigenvalues which differ from the Dirac eigenvalues by order E^3/c^4 for a hydrogen-like system, which is a considerable improvement over the ZORA eigenvalues, and similar to the nonvariational FORA energies. A further perturbation analysis yields a third-order correction to the IORA energies, TIORA. Results are presented for several systems including the neutral U atom. The IORA eigenvalues for all but the 1s spinor of the neutral system are superior even to the scaled ZORA energies, which are exact for the hydrogenic system. The third-order correction reduces the IORA error for the inner orbitals to a very small fraction of the Dirac eigenvalue. © 1999 American Institute of Physics.
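For reference, the standard forms behind this abstract (atomic units, energy relative to the rest mass): eliminating the small component exactly gives an energy-dependent Hamiltonian, and neglecting E in its denominator yields ZORA:

```latex
\begin{aligned}
H_{\mathrm{exact}} &= V
  + \boldsymbol{\sigma}\cdot\mathbf{p}\,
    \frac{c^{2}}{2c^{2}-V+E}\,
    \boldsymbol{\sigma}\cdot\mathbf{p},\\[4pt]
H_{\mathrm{ZORA}} &= V
  + \boldsymbol{\sigma}\cdot\mathbf{p}\,
    \frac{c^{2}}{2c^{2}-V}\,
    \boldsymbol{\sigma}\cdot\mathbf{p}.
\end{aligned}
```

Expanding the neglected E-dependence order by order is what generates the FORA correction and, resummed through the un-normalized transformation, the IORA scheme described above.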
Multiple Learning Strategies Project. Medical Assistant. [Regular Vocational. Vol. 3.
ERIC Educational Resources Information Center
Varney, Beverly; And Others
This instructional package, one of four designed for regular vocational students, focuses on the vocational area of medical assistant. Contained in this document are forty learning modules organized into four units: office surgery; telephoning; bandaging; and medications and treatments. Each module includes these elements: a performance objective…
77 FR 15142 - Regular Board of Directors Meeting; Sunshine Act
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-14
.... Executive Session III. Approval of the Regular Board of Directors Meeting Minutes IV. Approval of the Audit Committee Meeting Minutes V. Approval of the Finance, Budget and Program Committee Meeting Minutes VI. Approval of the Corporate Administration Committee Meeting Minutes VII. Approval of FY 2011 Audit...
47 CFR 76.614 - Cable television system regular monitoring.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 4 2010-10-01 2010-10-01 false Cable television system regular monitoring. 76.614 Section 76.614 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Technical Standards § 76.614 Cable...
Sparsely sampling the sky: Regular vs. random sampling
NASA Astrophysics Data System (ADS)
Paykari, P.; Pires, S.; Starck, J.-L.; Jaffe, A. H.
2015-09-01
Aims: The next generation of galaxy surveys, aiming to observe millions of galaxies, are expensive both in time and money. This raises questions regarding the optimal investment of this time and money for future surveys. In a previous work, we have shown that a sparse sampling strategy could be a powerful substitute for the - usually favoured - contiguous observation of the sky. In our previous paper, regular sparse sampling was investigated, where the sparse observed patches were regularly distributed on the sky. The regularity of the mask introduces a periodic pattern in the window function, which induces periodic correlations at specific scales. Methods: In this paper, we use a Bayesian experimental design to investigate a "random" sparse sampling approach, where the observed patches are randomly distributed over the total sparsely sampled area. Results: We find that in this setting, the induced correlation is evenly distributed amongst all scales as there is no preferred scale in the window function. Conclusions: This is desirable when we are interested in any specific scale in the galaxy power spectrum, such as the matter-radiation equality scale. As the figure of merit shows, however, there is no preference between regular and random sampling to constrain the overall galaxy power spectrum and the cosmological parameters.
75 FR 13598 - Regular Board of Directors Meeting; Sunshine Act
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... From the Federal Register Online via the Government Publishing Office NEIGHBORHOOD REINVESTMENT CORPORATION Regular Board of Directors Meeting; Sunshine Act TIME AND DATE: 10 a.m., Monday, March 22, 2010. PLACE: 1325 G Street, NW., Suite 800 Boardroom, Washington, DC 20005. STATUS: Open. CONTACT PERSON...
78 FR 36794 - Regular Board of Directors Meeting; Sunshine Act
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-19
... From the Federal Register Online via the Government Publishing Office NEIGHBORHOOD REINVESTMENT CORPORATION Regular Board of Directors Meeting; Sunshine Act TIME AND DATE: 9:30 a.m., Tuesday, June 25, 2013. PLACE: 999 North Capitol St NE., Suite 900, Gramlich Boardroom, Washington, DC 20002. STATUS:...
Learning With l1 -Regularizer Based on Markov Resampling.
Gong, Tieliang; Zou, Bin; Xu, Zongben
2016-05-01
Learning with the l1-regularizer has brought about a great deal of research in the learning theory community. Previously known results for learning with the l1-regularizer are based on the assumption that samples are independent and identically distributed (i.i.d.), and the best obtained learning rate for l1-regularization type algorithms is O(1/√m), where m is the sample size. This paper goes beyond the classic i.i.d. framework and investigates the generalization performance of least square regression with l1-regularizer (l1-LSR) based on uniformly ergodic Markov chain (u.e.M.c) samples. On the theoretical side, we prove that the learning rate of l1-LSR for u.e.M.c samples, l1-LSR(M), is of order O(1/m), which is faster than O(1/√m) for the i.i.d. counterpart. On the practical side, we propose an algorithm based on a resampling scheme to generate u.e.M.c samples. We show that the proposed l1-LSR(M) improves on l1-LSR(i.i.d.) in generalization error at the low cost of u.e.M.c resampling. PMID:26011874
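The l1-regularized least-squares problem at the core of l1-LSR is standard; the paper's contribution is the Markov resampling of the training set, not the solver. A minimal ISTA (iterative soft-thresholding) sketch, with data and λ that are illustrative:

```python
import numpy as np

def l1_least_squares(X, y, lam, iters=2000):
    """ISTA for min_w ||Xw - y||^2 / (2m) + lam * ||w||_1:
    a gradient step on the quadratic term, then a soft-threshold."""
    m, d = X.shape
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / m)  # 1 / Lipschitz const of gradient
    w = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / m
        z = w - step * grad
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
w_true = np.array([2.0, 0.0, 0.0, -3.0, 0.0])   # sparse ground truth
y = X @ w_true
w = l1_least_squares(X, y, lam=0.01)
```

With noiseless data and a small λ the iterate recovers the sparse coefficients up to a small shrinkage bias; the u.e.M.c setting changes how the rows of X are sampled, not this optimization.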
A Response to the Regular Education/Special Education Initiative.
ERIC Educational Resources Information Center
McCarthy, Jeanne McCrae
1987-01-01
The position paper of the Division for Learning Disabilities of the Council for Exceptional Children proposes seven components (including the differentiation of learning disabilities from learning problems) of the final policy of the Office of Special Education and Rehabilitative Services concerning the Regular Education/Special Education…
Preverbal Infants Infer Intentional Agents from the Perception of Regularity
ERIC Educational Resources Information Center
Ma, Lili; Xu, Fei
2013-01-01
Human adults have a strong bias to invoke intentional agents in their intuitive explanations of ordered wholes or regular compositions in the world. Less is known about the ontogenetic origin of this bias. In 4 experiments, we found that 9- to 10-month-old infants expected a human hand, but not a mechanical tool with similar affordances, to be the…
New Technologies in Portugal: Regular Middle and High School
ERIC Educational Resources Information Center
Florentino, Teresa; Sanchez, Lucas; Joyanes, Luis
2010-01-01
Purpose: The purpose of this paper is to elaborate upon the relation between information and communication technologies (ICT), particularly web-based resources, and their use, programs and learning in Portuguese middle and high regular public schools. Design/methodology/approach: Adding collected documentation on curriculum, laws and other related…
Regularization in Short-Term Memory for Serial Order
ERIC Educational Resources Information Center
Botvinick, Matthew; Bylsma, Lauren M.
2005-01-01
Previous research has shown that short-term memory for serial order can be influenced by background knowledge concerning regularities of sequential structure. Specifically, it has been shown that recall is superior for sequences that fit well with familiar sequencing constraints. The authors report a corresponding effect pertaining to serial…
Spectral Regularization Algorithms for Learning Large Incomplete Matrices.
Mazumder, Rahul; Hastie, Trevor; Tibshirani, Robert
2010-03-01
We use convex relaxation techniques to provide a sequence of regularized low-rank solutions for large-scale matrix completion problems. Using the nuclear norm as a regularizer, we provide a simple and very efficient convex algorithm for minimizing the reconstruction error subject to a bound on the nuclear norm. Our algorithm Soft-Impute iteratively replaces the missing elements with those obtained from a soft-thresholded SVD. With warm starts this allows us to efficiently compute an entire regularization path of solutions on a grid of values of the regularization parameter. The computationally intensive part of our algorithm is in computing a low-rank SVD of a dense matrix. Exploiting the problem structure, we show that the task can be performed with a complexity linear in the matrix dimensions. Our semidefinite-programming algorithm is readily scalable to large matrices: for example it can obtain a rank-80 approximation of a 10^6 × 10^6 incomplete matrix with 10^5 observed entries in 2.5 hours, and can fit a rank-40 approximation to the full Netflix training set in 6.6 hours. Our methods show very good performance both in training and test error when compared to other competitive state-of-the-art techniques. PMID:21552465
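The Soft-Impute iteration described in this abstract can be sketched in a few lines of NumPy. The regularization weight `lam`, the iteration count, and the synthetic rank-2 test matrix below are illustrative choices, not the authors' settings:

```python
import numpy as np

def soft_impute(X, mask, lam, n_iters=500):
    """Iteratively fill the missing entries of X (where mask is False)
    with values from a soft-thresholded SVD of the current estimate."""
    Z = np.zeros_like(X)
    for _ in range(n_iters):
        # observed entries come from X, missing ones from the current Z
        filled = np.where(mask, X, Z)
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt  # soft-threshold the spectrum
    return Z

# illustrative test: a rank-2 matrix with roughly 60% of entries observed
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
mask = rng.random(M.shape) < 0.6
Z = soft_impute(M, mask, lam=0.1)
```

In the paper this SVD step is the expensive part; the authors' contribution is computing it cheaply by exploiting the sparse-plus-low-rank structure of `filled`, which this dense sketch does not attempt.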
12 CFR 311.5 - Regular procedure for closing meetings.
Code of Federal Regulations, 2010 CFR
2010-01-01
... RULES GOVERNING PUBLIC OBSERVATION OF MEETINGS OF THE CORPORATION'S BOARD OF DIRECTORS § 311.5 Regular... a meeting will be taken only when a majority of the entire Board votes to take such action. In deciding whether to close a meeting or portion of a meeting, the Board will consider whether the...
Regular Class Participation System (RCPS). A Final Report.
ERIC Educational Resources Information Center
Ferguson, Dianne L.; And Others
The Regular Class Participation System (RCPS) project attempted to develop, implement, and validate a system for placing and maintaining students with severe disabilities in general education classrooms, with a particular emphasis on achieving both social and learning outcomes for students. A teacher-based planning strategy was developed and…
The Regular Educator's Role in the Individual Education Plan Process.
ERIC Educational Resources Information Center
Weishaar, Mary Konya
2001-01-01
Presents a case that demonstrates why general educators must be knowledgeable about and involved in the individual education plan for a student with a disability. Describes new regulations in individual education plan processes. Concludes that the overall intent of the changes is to bring special educators and regular educators together for the…
Cost Effectiveness of Premium Versus Regular Gasoline in MCPS Buses.
ERIC Educational Resources Information Center
Baacke, Clifford M.; Frankel, Steven M.
The primary question posed in this study is whether premium or regular gasoline is more cost effective for the Montgomery County Public School (MCPS) bus fleet, as a whole, when miles-per-gallon, cost-per-gallon, and repair costs associated with mileage are considered. On average, both miles-per-gallon, and repair costs-per-mile favor premium…
New vision based navigation clue for a regular colonoscope's tip
NASA Astrophysics Data System (ADS)
Mekaouar, Anouar; Ben Amar, Chokri; Redarce, Tanneguy
2009-02-01
Regular colonoscopy has always been regarded as a complicated procedure requiring a tremendous amount of skill to be performed safely. Indeed, the practitioner must contend with both the tortuousness of the colon and the handling of the colonoscope, taking into account the visual data acquired by the scope's tip while relying largely on common sense and skill to steer it in a way that promotes safe insertion of the device's shaft. In that context, we propose a new navigation clue for the tip of a regular colonoscope to assist surgeons during a colonoscopic examination. First, we consider a patch of the inner colon depicted in a regular colonoscopy frame. We then perform a sketchy 3D reconstruction of the corresponding 2D data, and a navigation trajectory is suggested on the basis of the obtained relief. Both the visible and invisible lumen cases are considered. Owing to its low computational cost, this strategy allows for intraoperative configuration changes and thus mitigates the effect of the colon's non-rigidity. Moreover, it tends to provide a safe navigation trajectory through the whole colon, since the approach aims to keep the extremity of the instrument as far as possible from the colon wall during navigation. To put the process into effect, we replaced the original manual control system of a regular colonoscope with a motorized one allowing automatic pan and tilt motions of the device's tip.
Effect of regular and decaffeinated coffee on serum gastrin levels.
Acquaviva, F; DeFrancesco, A; Andriulli, A; Piantino, P; Arrigoni, A; Massarenti, P; Balzola, F
1986-04-01
We evaluated the hypothesis that the noncaffeine gastric acid stimulant effect of coffee might be by way of serum gastrin release. After 10 healthy volunteers drank 50 ml of coffee solution corresponding to one cup of home-made regular coffee containing 10 g of sugar and 240 mg/100 ml of caffeine, serum total gastrin levels peaked at 10 min and returned to basal values within 30 min; the response was of little significance (1.24 times the median basal value). Drinking 100 ml of sugared water (as control) resulted in occasional random elevations of serum gastrin which were not statistically significant. Drinking 100 ml of regular or decaffeinated coffee resulted in a prompt and lasting elevation of total gastrin; mean integrated outputs after regular or decaffeinated coffee were, respectively, 2.3 and 1.7 times the values in the control test. Regular and decaffeinated coffees share a strong gastrin-releasing property. Neither distension, osmolarity, calcium, nor amino acid content of the coffee solution can account for this property, which should be ascribed to some other unidentified ingredient. This property is at least partially lost during the process of caffeine removal. PMID:3745848
Integration of Dependent Handicapped Classes into the Regular School.
ERIC Educational Resources Information Center
Alberta Dept. of Education, Edmonton.
Guidelines are provided for integrating the dependent handicapped student (DHS) into the regular school in Alberta, Canada. A short overview comprises the introduction. Identified are two types of integration: (1) incidental contact and (2) planned contact for social, recreational, and educational activities with other students. Noted are types of…
Information fusion in regularized inversion of tomographic pumping tests
Bohling, G.C.
2008-01-01
In this chapter we investigate a simple approach to incorporating geophysical information into the analysis of tomographic pumping tests for characterization of the hydraulic conductivity (K) field in an aquifer. A number of authors have suggested a tomographic approach to the analysis of hydraulic tests in aquifers - essentially simultaneous analysis of multiple tests or stresses on the flow system - in order to improve the resolution of the estimated parameter fields. However, even with a large amount of hydraulic data in hand, the inverse problem is still plagued by non-uniqueness and ill-conditioning and the parameter space for the inversion needs to be constrained in some sensible fashion in order to obtain plausible estimates of aquifer properties. For seismic and radar tomography problems, the parameter space is often constrained through the application of regularization terms that impose penalties on deviations of the estimated parameters from a prior or background model, with the tradeoff between data fit and model norm explored through systematic analysis of results for different levels of weighting on the regularization terms. In this study we apply systematic regularized inversion to analysis of tomographic pumping tests in an alluvial aquifer, taking advantage of the steady-shape flow regime exhibited in these tests to expedite the inversion process. In addition, we explore the possibility of incorporating geophysical information into the inversion through a regularization term relating the estimated K distribution to ground penetrating radar velocity and attenuation distributions through a smoothing spline model. © 2008 Springer-Verlag Berlin Heidelberg.
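The data-fit versus model-norm tradeoff described in this abstract is the standard Tikhonov form. A minimal sketch follows; the forward operator `G`, data `d`, and roughness matrix `L` are illustrative stand-ins, not the chapter's groundwater flow model:

```python
import numpy as np

def tikhonov_solve(G, d, L, alpha):
    """Minimize ||G m - d||^2 + alpha^2 ||L m||^2 via the normal equations."""
    A = G.T @ G + alpha**2 * (L.T @ L)
    return np.linalg.solve(A, G.T @ d)

rng = np.random.default_rng(0)
G = rng.standard_normal((30, 10))        # toy forward operator
m_true = rng.standard_normal(10)
d = G @ m_true + 0.05 * rng.standard_normal(30)   # noisy data
L = np.eye(10)                           # zeroth-order (minimum-norm) penalty

# sweeping alpha traces the tradeoff curve: stronger regularization
# shrinks the model norm at the cost of a larger data misfit
models = {a: tikhonov_solve(G, d, L, a) for a in (0.01, 1.0, 100.0)}
```

Repeating the solve over a grid of `alpha` values and plotting misfit against model norm gives the usual L-curve used to pick the weighting level the abstract refers to.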
Regular rotating de Sitter–Kerr black holes and solitons
NASA Astrophysics Data System (ADS)
Dymnikova, Irina; Galaktionov, Evgeny
2016-07-01
We study the basic generic properties of the class of regular rotating solutions that are asymptotically Kerr for a distant observer, obtained using the Gürses–Gürsey algorithm from regular spherically symmetric solutions specified by T^t_t = T^r_r which belong to the Kerr–Schild metrics. All regular solutions obtained with the Newman–Janis complex translation from the known spherical solutions belong to this class. Spherical solutions with T^t_t = T^r_r satisfying the weak energy condition (WEC) have an obligatory de Sitter center. Rotation transforms the de Sitter center into an interior de Sitter vacuum disk. Regular de Sitter–Kerr solutions have at most two horizons and two ergospheres, and two different kinds of interiors. When the original spherical solution satisfies the dominant energy condition, there can exist an interior de Sitter vacuum S-surface which contains the de Sitter disk as a bridge. The WEC is violated in the internal cavities between the S-surface and the disk, which are thus filled with a phantom fluid. When the related spherical solution violates the dominant energy condition, the vacuum interior of a rotating solution reduces to the de Sitter disk only.
47 CFR 76.614 - Cable television system regular monitoring.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 4 2011-10-01 2011-10-01 false Cable television system regular monitoring. 76.614 Section 76.614 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Technical Standards § 76.614 Cable...
Factors Contributing to Regular Smoking in Adolescents in Turkey
ERIC Educational Resources Information Center
Can, Gamze; Topbas, Murat; Oztuna, Funda; Ozgun, Sukru; Can, Emine; Yavuzyilmaz, Asuman
2009-01-01
Purpose: The objectives of this study were to determine the levels of lifetime cigarette use, daily use, and current use among young people (aged 15-19 years) and to examine the risk factors contributing to regular smoking. Methods: The number of students was determined proportionately to the numbers of students in all the high schools in the…
From Numbers to Letters: Feedback Regularization in Visual Word Recognition
ERIC Educational Resources Information Center
Molinaro, Nicola; Dunabeitia, Jon Andoni; Marin-Gutierrez, Alejandro; Carreiras, Manuel
2010-01-01
Word reading in alphabetic languages involves letter identification, independently of the format in which these letters are written. This process of letter "regularization" is sensitive to word context, leading to the recognition of a word even when numbers that resemble letters are inserted among other real letters (e.g., M4TERI4L). The present…
Rhythm's Gonna Get You: Regular Meter Facilitates Semantic Sentence Processing
ERIC Educational Resources Information Center
Rothermich, Kathrin; Schmidt-Kassow, Maren; Kotz, Sonja A.
2012-01-01
Rhythm is a phenomenon that fundamentally affects the perception of events unfolding in time. In language, we define "rhythm" as the temporal structure that underlies the perception and production of utterances, whereas "meter" is defined as the regular occurrence of beats (i.e. stressed syllables). In stress-timed languages such as German, this…
Laplacian Regularized Low-Rank Representation and Its Applications.
Yin, Ming; Gao, Junbin; Lin, Zhouchen
2016-03-01
Low-rank representation (LRR) has recently attracted a great deal of attention due to its pleasing efficacy in exploring low-dimensional subspace structures embedded in data. For a given set of observed data corrupted with sparse errors, LRR aims at learning a lowest-rank representation of all data jointly. LRR has broad applications in pattern recognition, computer vision and signal processing. In the real world, data often reside on low-dimensional manifolds embedded in a high-dimensional ambient space. However, the LRR method does not take into account the non-linear geometric structures within data, so the locality and similarity information among data may be missing in the learning process. To improve LRR in this regard, we propose a general Laplacian regularized low-rank representation framework for data representation into which a hypergraph Laplacian regularizer can be readily introduced, i.e., a non-negative sparse hyper-Laplacian regularized LRR model (NSHLRR). By taking advantage of the graph regularizer, our proposed method can not only represent the global low-dimensional structures, but also capture the intrinsic non-linear geometric information in data. The extensive experimental results on image clustering, semi-supervised image classification and dimensionality reduction tasks demonstrate the effectiveness of the proposed method. PMID:27046494
Identifying and Exploiting Spatial Regularity in Data Memory References
Mohan, T; de Supinski, B R; McKee, S A; Mueller, F; Yoo, A; Schulz, M
2003-07-24
The growing processor/memory performance gap causes the performance of many codes to be limited by memory accesses. If known to exist in an application, strided memory accesses forming streams can be targeted by optimizations such as prefetching, relocation, remapping, and vector loads. Undetected, they can be a significant source of memory stalls in loops. Existing stream-detection mechanisms either require special hardware, which may not gather statistics for subsequent analysis, or are limited to compile-time detection of array accesses in loops. Formally, little treatment has been accorded to the subject; the concept of locality fails to capture the existence of streams in a program's memory accesses. The contributions of this paper are as follows. First, we define spatial regularity as a means to discuss the presence and effects of streams. Second, we develop measures to quantify spatial regularity, and we design and implement an on-line, parallel algorithm to detect streams - and hence regularity - in running applications. Third, we use examples from real codes and common benchmarks to illustrate how derived stream statistics can be used to guide the application of profile-driven optimizations. Overall, we demonstrate the benefits of our novel regularity metric as a low-cost instrument to detect potential for code optimizations affecting memory performance.
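As a toy illustration of the stream detection this abstract describes, the sketch below scans an address trace for runs of constant-stride accesses. The trace and the `min_len` threshold are invented for illustration; the paper's on-line, parallel algorithm and regularity metric are more elaborate:

```python
def detect_streams(addrs, min_len=4):
    """Report (start_address, stride, run_length) for every run of
    constant-stride accesses covering at least min_len addresses."""
    strides = [b - a for a, b in zip(addrs, addrs[1:])]
    streams, i = [], 0
    while i < len(strides):
        j = i
        while j < len(strides) and strides[j] == strides[i]:
            j += 1
        if j - i + 1 >= min_len:   # a run of j-i strides covers j-i+1 addresses
            streams.append((addrs[i], strides[i], j - i + 1))
        i = j
    return streams

# five stride-8 accesses followed by scattered ones
trace = [0, 8, 16, 24, 32, 1000, 52, 7]
print(detect_streams(trace))   # -> [(0, 8, 5)]
```

Statistics such as the fraction of a trace covered by detected streams give a simple regularity measure of the kind the paper uses to guide profile-driven optimizations like prefetching.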
MIT image reconstruction based on edge-preserving regularization.
Casanova, R; Silva, A; Borges, A R
2004-02-01
Tikhonov regularization has been widely used in electrical tomography to deal with the ill-posedness of the inverse problem. However, due to the fact that discontinuities are strongly penalized, this approach tends to produce blurred images. Recently, a lot of interest has been devoted to methods with edge-preserving properties, such as those related to total variation, wavelets and half-quadratic regularization. In the present work, the performance of an edge-preserving regularization method, called ARTUR, is evaluated in the context of magnetic induction tomography (MIT). ARTUR is a deterministic method based on half-quadratic regularization, where complementary a priori information may be introduced in the reconstruction algorithm by the use of a nonnegativity constraint. The method is first tested using an MIT analytical model that generates projection data given the position, the radius and the magnetic permeability of a single nonconductive cylindrical object. It is shown that even in the presence of strong Gaussian additive noise, it is still able to recover the main features of the object. Secondly, reconstructions based on real data for different configurations of conductive nonmagnetic cylindrical objects are presented and some of their parameters estimated. PMID:15005316
Psychological Benefits of Regular Physical Activity: Evidence from Emerging Adults
ERIC Educational Resources Information Center
Cekin, Resul
2015-01-01
Emerging adulthood is a transitional stage between late adolescence and young adulthood in life-span development that requires significant changes in people's lives. Therefore, identifying protective factors for this population is crucial. This study investigated the effects of regular physical activity on self-esteem, optimism, and happiness in…
The Student with Albinism in the Regular Classroom.
ERIC Educational Resources Information Center
Ashley, Julia Robertson
This booklet, intended for regular education teachers who have children with albinism in their classes, begins with an explanation of albinism, then discusses the special needs of the student with albinism in the classroom, and presents information about adaptations and other methods for responding to these needs. Special social and emotional…
Identifying basketball performance indicators in regular season and playoff games.
García, Javier; Ibáñez, Sergio J; De Santos, Raúl Martinez; Leite, Nuno; Sampaio, Jaime
2013-03-01
The aim of the present study was to identify the basketball game performance indicators which best discriminate winners and losers in regular season and playoff games. The sample was composed of 323 games of the ACB Spanish Basketball League, from the regular season (n=306) and from the playoffs (n=17). A previous cluster analysis allowed splitting the sample into balanced (equal to or below 12 points), unbalanced (between 13 and 28 points) and very unbalanced games (above 28 points). A discriminant analysis was used to identify the performance indicators in both regular season and playoff games. In regular season games, the winning teams dominated in assists, defensive rebounds, and successful 2- and 3-point field goals. However, in playoff games the winning teams' superiority lay only in defensive rebounding. In practical applications, these results may help coaches to accurately design training programs that reflect the importance of having different offensive set plays and also have specific conditioning programs to prepare for defensive rebounding. PMID:23717365
12 CFR 311.5 - Regular procedure for closing meetings.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Regular procedure for closing meetings. 311.5 Section 311.5 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION PROCEDURE AND RULES OF PRACTICE RULES GOVERNING PUBLIC OBSERVATION OF MEETINGS OF THE CORPORATION'S BOARD OF DIRECTORS § 311.5...
Image super-resolution via adaptive filtering and regularization
NASA Astrophysics Data System (ADS)
Ren, Jingbo; Wu, Hao; Dong, Weisheng; Shi, Guangming
2014-11-01
Image super-resolution (SR) is widely used in civil and military fields, especially for low-resolution remote sensing images limited by the sensor. Single-image SR refers to the task of restoring a high-resolution (HR) image from a low-resolution image coupled with some prior knowledge as a regularization term. Classic methods regularize the image by total variation (TV) and/or wavelet or some other transform, which can introduce artifacts. To address these shortcomings, a new framework for single-image SR is proposed that applies an adaptive filter before regularization. The key of our model is that the adaptive filter first removes the spatial correlation among pixels, so that only the high-frequency (HF) part, which is sparser in the TV and transform domains, is considered as the regularization term. Concretely, by transforming the original model, the SR problem can be solved through two alternating sub-problems. Before each iteration, the adaptive filter is updated to estimate the initial HF. A high-quality HF part and HR image are obtained by solving the first and second sub-problems, respectively. In the experimental part, a set of remote sensing images captured by Landsat satellites is tested to demonstrate the effectiveness of the proposed framework. Experimental results show the outstanding performance of the proposed method in quantitative evaluation and visual fidelity compared with state-of-the-art methods.
Involving Impaired, Disabled, and Handicapped Persons in Regular Camp Programs.
ERIC Educational Resources Information Center
American Alliance for Health, Physical Education, and Recreation, Washington, DC. Information and Research Utilization Center.
The publication provides some broad guidelines for serving impaired, disabled, and handicapped children in nonspecialized or regular day and residential camps. Part One on the rationale and basis for integrated camping includes three chapters which cover mainstreaming and the normalization principle, the continuum of services (or Cascade System)…
Regular and homeward travel speeds of arctic wolves
Mech, L.D.
1994-01-01
Single wolves (Canis lupus arctos), a pair, and a pack of five habituated to the investigator on an all-terrain vehicle were followed on Ellesmere Island, Northwest Territories, Canada, during summer. Their mean travel speed was measured on barren ground at 8.7 km/h during regular travel and 10.0 km/h when returning to a den.
Nonnative Processing of Verbal Morphology: In Search of Regularity
ERIC Educational Resources Information Center
Gor, Kira; Cook, Svetlana
2010-01-01
There is little agreement on the mechanisms involved in second language (L2) processing of regular and irregular inflectional morphology and on the exact role of age, amount, and type of exposure to L2 resulting in differences in L2 input and use. The article contributes to the ongoing debates by reporting the results of two experiments on Russian…
Multiple Learning Strategies Project. Medical Assistant. [Regular Vocational]. Vol. 4.
ERIC Educational Resources Information Center
Varney, Beverly; And Others
This instructional package, one of four designed for regular vocational students, focuses on the vocational area of medical assistant. Contained in this document are forty-seven learning modules organized into nine units: review for competency; third-party billing; patient teaching; skill building; bookkeeping; interpersonal relationships; medical…
Low-Rank Regularization for Learning Gene Expression Programs
Ye, Guibo; Tang, Mengfan; Cai, Jian-Feng; Nie, Qing; Xie, Xiaohui
2013-01-01
Learning gene expression programs directly from a set of observations is challenging due to the complexity of gene regulation, high noise of experimental measurements, and insufficient number of experimental measurements. Imposing additional constraints with strong and biologically motivated regularizations is critical in developing reliable and effective algorithms for inferring gene expression programs. Here we propose a new form of regularization that constrains the number of independent connectivity patterns between regulators and targets, motivated by the modular design of gene regulatory programs and the belief that the total number of independent regulatory modules should be small. We formulate a multi-target linear regression framework to incorporate this type of regularization, in which the number of independent connectivity patterns is expressed as the rank of the connectivity matrix between regulators and targets. We then generalize the linear framework to nonlinear cases, and prove that the generalized low-rank regularization model is still convex. Efficient algorithms are derived to solve both the linear and nonlinear low-rank regularized problems. Finally, we test the algorithms on three gene expression datasets, and show that the low-rank regularization improves the accuracy of gene expression prediction in these three datasets. PMID:24358148
Distances and isomorphisms in 4-regular circulant graphs
NASA Astrophysics Data System (ADS)
Donno, Alfredo; Iacono, Donatella
2016-06-01
We compute the Wiener index and the Hosoya polynomial of the Cayley graph of some cyclic groups, with all possible generating sets containing four elements, up to isomorphism. We find out that the order 17 is the smallest case providing two non-isomorphic 4-regular circulant graphs with the same Wiener index. Some open problems and questions are listed.
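Because circulant graphs are vertex-transitive, the Wiener index (the sum of distances over all vertex pairs) can be computed from a single breadth-first search. A small sketch follows; the example graphs are illustrative, not the paper's non-isomorphic order-17 pair:

```python
from collections import deque

def circulant_wiener(n, gens):
    """Wiener index of the circulant graph C_n(gens), where vertex i
    is adjacent to i +/- g (mod n) for each generator g."""
    steps = {g % n for g in gens} | {(-g) % n for g in gens}
    dist = {0: 0}                      # BFS from vertex 0 suffices
    q = deque([0])
    while q:
        v = q.popleft()
        for s in steps:
            w = (v + s) % n
            if w not in dist:
                dist[w] = dist[v] + 1
                q.append(w)
    # vertex-transitivity: every vertex has the same distance profile
    return n * sum(dist.values()) // 2

print(circulant_wiener(8, (1, 2)))   # 4-regular C_8(1,2) -> 40
print(circulant_wiener(5, (1,)))     # the 5-cycle C_5 -> 15
```

Running this over all generating pairs of a fixed order (and filtering isomorphic cases) is the kind of enumeration that turns up the coincidence of Wiener indices the abstract reports.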
Elementary Teachers' Perspectives of Inclusion in the Regular Education Classroom
ERIC Educational Resources Information Center
Olinger, Becky Lorraine
2013-01-01
The purpose of this qualitative study was to examine regular education and special education teacher perceptions of inclusion services in an elementary school setting. In this phenomenological study, purposeful sampling techniques and data were used to conduct a study of inclusion in the elementary schools. In-depth one-to-one interviews with 8…
A Unified Approach for Solving Nonlinear Regular Perturbation Problems
ERIC Educational Resources Information Center
Khuri, S. A.
2008-01-01
This article describes a simple alternative unified method of solving nonlinear regular perturbation problems. The procedure is based upon the manipulation of Taylor's approximation for the expansion of the nonlinear term in the perturbed equation. An essential feature of this technique is the relative simplicity used and the associated unified…
Implicit Learning of L2 Word Stress Regularities
ERIC Educational Resources Information Center
Chan, Ricky K. W.; Leung, Janny H. C.
2014-01-01
This article reports an experiment on the implicit learning of second language stress regularities, and presents a methodological innovation on awareness measurement. After practising two-syllable Spanish words, native Cantonese speakers with English as a second language (L2) completed a judgement task. Critical items differed only in placement of…
Acquisition of Formulaic Sequences in Intensive and Regular EFL Programmes
ERIC Educational Resources Information Center
Serrano, Raquel; Stengers, Helene; Housen, Alex
2015-01-01
This paper aims to analyse the role of time concentration of instructional hours on the acquisition of formulaic sequences in English as a foreign language (EFL). Two programme types that offer the same amount of hours of instruction are considered: intensive (110 hours/1 month) and regular (110 hours/7 months). The EFL learners under study are…
A simple way to measure daily lifestyle regularity
NASA Technical Reports Server (NTRS)
Monk, Timothy H.; Frank, Ellen; Potts, Jaime M.; Kupfer, David J.
2002-01-01
A brief diary instrument to quantify daily lifestyle regularity (SRM-5) is developed and compared with a much longer version of the instrument (SRM-17) described and used previously. Three studies are described. In Study 1, SRM-17 scores (2 weeks) were collected from a total of 293 healthy control subjects (both genders) aged between 19 and 92 years. Five items (1) Get out of bed, (2) First contact with another person, (3) Start work, housework or volunteer activities, (4) Have dinner, and (5) Go to bed were then selected from the 17 items and SRM-5 scores calculated as if these five items were the only ones collected. Comparisons were made with SRM-17 scores from the same subject-weeks, looking at correlations between the two SRM measures, and the effects of age and gender on lifestyle regularity as measured by the two instruments. In Study 2 this process was repeated in a group of 27 subjects who were in remission from unipolar depression after treatment with psychotherapy and who completed SRM-17 for at least 20 successive weeks. SRM-5 and SRM-17 scores were then correlated within an individual using time as the random variable, allowing an indication of how successful SRM-5 was in tracking changes in lifestyle regularity (within an individual) over time. In Study 3 an SRM-5 diary instrument was administered to 101 healthy control subjects (both genders, aged 20-59 years) for two successive weeks to obtain normative measures and to test for correlations with age and morningness. Measures of lifestyle regularity from SRM-5 correlated quite well (about 0.8) with those from SRM-17 both between subjects, and within-subjects over time. As a detector of irregularity as defined by SRM-17, the SRM-5 instrument showed acceptable values of kappa (0.69), sensitivity (74%) and specificity (95%). There were, however, differences in mean level, with SRM-5 scores being about 0.9 units [about one standard deviation (SD)] above SRM-17 scores from the same subject-weeks. SRM-5
Pitch strength of regular-interval click trains with different length “runs” of regular intervals
Yost, William A.; Mapes-Riordan, Dan; Shofner, William; Dye, Raymond; Sheft, Stanley
2009-01-01
Click trains were generated with first- and second-order statistics following Kaernbach and Demany [J. Acoust. Soc. Am. 104, 2298–2306 (1998)]. First-order intervals are between successive clicks, while second-order intervals are those between every other click. Click trains were generated with a repeating alternation of fixed and random intervals which produce a pitch at the reciprocal of the duration of the fixed interval. The intervals were then randomly shuffled and compared to the unshuffled, alternating click trains in pitch-strength comparison experiments. In almost all comparisons for the first-order interval stimuli, the shuffled-interval click trains had a stronger pitch strength than the unshuffled-interval click trains. The shuffled-interval click trains only produced stronger pitches for second-order interval stimuli when the click trains were unfiltered. Several experimental conditions and an analysis of runs of regular and random intervals in these click trains suggest that the auditory system is sensitive to runs of regular intervals in a stimulus that contains a mix of regular and random intervals. These results indicate that fine-structure regularity plays a more important role in pitch perception than randomness, and that the long-term autocorrelation function or spectra of these click trains are not good predictors of pitch strength. PMID:15957774
20 CFR 220.17 - Recovery from disability for work in the regular occupation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Work in an Employee's Regular Railroad Occupation § 220.17 Recovery from disability for work in the regular occupation. (a) General. Disability for work in the regular occupation will end if— (1) There is... the duties of his or her regular occupation. The Board provides a trial work period before...
20 CFR 220.17 - Recovery from disability for work in the regular occupation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Work in an Employee's Regular Railroad Occupation § 220.17 Recovery from disability for work in the regular occupation. (a) General. Disability for work in the regular occupation will end if— (1) There is... the duties of his or her regular occupation. The Board provides a trial work period before...
Regularizing the r-mode Problem for Nonbarotropic Relativistic Stars
NASA Technical Reports Server (NTRS)
Lockitch, Keith H.; Andersson, Nils; Watts, Anna L.
2004-01-01
We present results for r-modes of relativistic nonbarotropic stars. We show that the main differential equation, which is formally singular at lowest order in the slow-rotation expansion, can be regularized if one considers the initial value problem rather than the normal mode problem. However, a more physically motivated way to regularize the problem is to include higher order terms. This allows us to develop a practical approach for solving the problem and we provide results that support earlier conclusions obtained for uniform density stars. In particular, we show that there will exist a single r-mode for each permissible combination of l and m. We discuss these results and provide some caveats regarding their usefulness for estimates of gravitational-radiation reaction timescales. The close connection between the seemingly singular relativistic r-mode problem and issues arising because of the presence of co-rotation points in differentially rotating stars is also clarified.
Mechanisms of evolution of avalanches in regular graphs
NASA Astrophysics Data System (ADS)
Handford, Thomas P.; Pérez-Reche, Francisco J.; Taraskin, Sergei N.
2013-06-01
A mapping of avalanches occurring in the zero-temperature random-field Ising model to life periods of a population experiencing immigration is established. Such a mapping allows the microscopic criteria for the occurrence of an infinite avalanche in a q-regular graph to be determined. A key factor for an avalanche of spin flips to become infinite is that it interacts in an optimal way with previously flipped spins. Based on these criteria, we explain why an infinite avalanche can occur in q-regular graphs only for q>3 and suggest that this criterion might be relevant for other systems. The generating function techniques developed for branching processes are applied to obtain analytical expressions for the durations, pulse shapes, and power spectra of the avalanches. The results show that only very long avalanches exhibit a significant degree of universality.
Generalization Bounds Derived IPM-Based Regularization for Domain Adaptation.
Meng, Juan; Hu, Guyu; Li, Dong; Zhang, Yanyan; Pan, Zhisong
2016-01-01
Domain adaptation has received much attention as a major form of transfer learning. One issue that must be considered in domain adaptation is the gap between the source domain and the target domain. To improve the generalization ability of domain adaptation methods, we propose a framework for domain adaptation that combines source and target data with a new regularizer that takes generalization bounds into account. This regularization term uses an integral probability metric (IPM) as the distance between the source domain and the target domain, and thus bounds the test error of an existing predictor. Since the computation of the IPM involves only the two distributions, the regularization term is independent of the specific classifier. With popular learning models, the empirical risk minimization is expressed as a general convex optimization problem and thus can be solved effectively by existing tools. Empirical studies on synthetic data for regression and real-world data for classification show the effectiveness of this method. PMID:26819589
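The idea of an IPM penalty that is independent of the classifier can be sketched in a few lines. The block below is a minimal illustration, not the authors' method: it uses the linear-kernel maximum mean discrepancy (one instance of an IPM) and ridge regression, and all names (`mmd_linear`, `fit_ipm_regularized`, `gamma`) are ours.

```python
import numpy as np

def mmd_linear(Xs, Xt):
    """Linear-kernel maximum mean discrepancy, one instance of an IPM,
    between source and target samples (a deliberate simplification)."""
    return float(np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2))

def fit_ipm_regularized(Xs, ys, Xt, lam=0.1, gamma=1.0):
    """Ridge regression on the source domain plus an additive IPM penalty.
    Because the penalty depends only on the two sample distributions (not
    on w), it shifts the bound/objective without changing the fitted
    predictor -- mirroring the classifier-independence noted above."""
    d = Xs.shape[1]
    w = np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ ys)
    objective = float(np.mean((Xs @ w - ys) ** 2)) + gamma * mmd_linear(Xs, Xt)
    return w, objective
```

A full method would additionally learn a representation that shrinks the IPM term; here it only enters the reported objective.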
Giant regular polyhedra from calixarene carboxylates and uranyl
Pasquale, Sara; Sattin, Sara; Escudero-Adán, Eduardo C.; Martínez-Belmonte, Marta; de Mendoza, Javier
2012-01-01
Self-assembly of large multi-component systems is a common strategy for the bottom-up construction of discrete, well-defined, nanoscopic-sized cages. Icosahedral or pseudospherical viral capsids, built up from hundreds of identical proteins, constitute typical examples of the complexity attained by biological self-assembly. Chemical versions of the so-called 5 Platonic regular or 13 Archimedean semi-regular polyhedra are usually assembled combining molecular platforms with metals with commensurate coordination spheres. Here we report novel, self-assembled cages, using the conical-shaped carboxylic acid derivatives of calix[4]arene and calix[5]arene as ligands, and the uranyl cation UO22+ as a metallic counterpart, which coordinates with three carboxylates at the equatorial plane, giving rise to hexagonal bipyramidal architectures. As a result, octahedral and icosahedral anionic metallocages of nanoscopic dimensions are formed with an unusually small number of components. PMID:22510690
Resolving intravoxel fiber architecture using nonconvex regularized blind compressed sensing
NASA Astrophysics Data System (ADS)
Chu, C. Y.; Huang, J. P.; Sun, C. Y.; Liu, W. Y.; Zhu, Y. M.
2015-03-01
In diffusion magnetic resonance imaging, accurate and reliable estimation of intravoxel fiber architectures is a major prerequisite for tractography algorithms or any other derived statistical analysis. Several methods have been proposed that estimate intravoxel fiber architectures using low angular resolution acquisitions owing to their shorter acquisition time and relatively low b-values. But these methods are highly sensitive to noise. In this work, we propose a nonconvex regularized blind compressed sensing approach to estimate intravoxel fiber architectures in low angular resolution acquisitions. The method models diffusion-weighted (DW) signals as a sparse linear combination of unfixed reconstruction basis functions and introduces a nonconvex regularizer to enhance the noise immunity. We present a general solving framework to simultaneously estimate the sparse coefficients and the reconstruction basis. Experiments on synthetic, phantom, and real human brain DW images demonstrate the superiority of the proposed approach.
Generalization of visual regularities in newly hatched chicks (Gallus gallus).
Santolin, Chiara; Rosa-Salva, Orsola; Regolin, Lucia; Vallortigara, Giorgio
2016-09-01
Evidence of learning and generalization of visual regularities in a newborn organism is provided in the present research. Domestic chicks have been trained to discriminate visual triplets of simultaneously presented shapes, implementing AAB versus ABA (Experiment 1), AAB versus ABB and AAB versus BAA (Experiment 2). Chicks distinguished pattern-following and pattern-violating novel test triplets in all comparisons, showing no preference for repetition-based patterns. The animals generalized to novel instances even when the patterns compared were not discriminable by the presence or absence of reduplicated elements or by symmetry (e.g., AAB vs. ABB). These findings represent the first evidence of learning and generalization of regularities at the onset of life in an animal model, revealing intriguing differences with respect to human newborns and infants. Extensive prior experience seems to be unnecessary to drive the process, suggesting that chicks are predisposed to detect patterns characterizing the visual world. PMID:27287627
Special education and the regular education initiative: basic assumptions.
Jenkins, J R; Pious, C G; Jewell, M
1990-04-01
The regular education initiative (REI) is a thoughtful response to identified problems in our system for educating low-performing children, but it is not a detailed blueprint for changing the system. Educators must achieve consensus on what the REI actually proposes. The authors infer from the REI literature five assumptions regarding the roles and responsibilities of elementary regular classroom teachers, concluding that these teachers and specialists form a partnership, but the classroom teachers are ultimately in charge of the instruction of all children in their classrooms, including those who are not succeeding in the mainstream. A discussion of the target population and of several partnership models further delineates REI issues and concerns. PMID:2185027
Boundary values of the Schwarzian derivative of a regular function
Dubinin, Vladimir N
2011-05-31
Regular functions f in the half-plane Im z > 0 admitting an asymptotic expansion f(z) = c_1 z + c_2 z^2 + c_3 z^3 + γ(z) z^3, where c_1 > 0, Im c_2 = 0 and the angular limit
Regular Expression-Based Learning for METs Value Extraction.
Redd, Douglas; Kuang, Jinqiu; Mohanty, April; Bray, Bruce E; Zeng-Treitler, Qing
2016-01-01
Functional status as measured by exercise capacity is an important clinical variable in the care of patients with cardiovascular diseases. Exercise capacity is commonly reported in terms of Metabolic Equivalents (METs). In the medical records, METs can often be found in a variety of clinical notes. To extract METs values, we adapted a machine-learning algorithm called REDEx to automatically generate regular expressions. Trained and tested on a set of 2701 manually annotated text snippets (i.e. short pieces of text), the regular expressions were able to achieve good accuracy and F-measure of 0.89 and 0.86. This extraction tool will allow us to process the notes of millions of cardiovascular patients and extract METs value for use by researchers and clinicians. PMID:27570673
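The kind of pattern REDEx learns automatically can be illustrated with a hand-written example. The regex below is our own assumption about what such an expression might look like, not actual REDEx output:

```python
import re

# Hypothetical pattern for METs mentions in clinical notes; REDEx would
# induce expressions like this from annotated snippets automatically.
METS_PATTERN = re.compile(
    r"(\d+(?:\.\d+)?)\s*(?:METs?|metabolic equivalents?)",
    re.IGNORECASE,
)

def extract_mets(snippet):
    """Return all METs values found in a text snippet as floats."""
    return [float(m.group(1)) for m in METS_PATTERN.finditer(snippet)]
```

For example, `extract_mets("Patient achieved 7 METs on treadmill.")` yields `[7.0]`.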
The Unique Maximal GF-Regular Submodule of a Module
Abduldaim, Areej M.; Chen, Sheng
2013-01-01
An R-module A is called GF-regular if, for each a ∈ A and r ∈ R, there exist t ∈ R and a positive integer n such that r^n t r^n a = r^n a. We proved that each unitary R-module A contains a unique maximal GF-regular submodule, which we denoted by M_GF(A). Furthermore, the radical properties of A are investigated; we proved that if A is an R-module and K is a submodule of A, then M_GF(K) = K ∩ M_GF(A). Moreover, if A is projective, then M_GF(A) is a G-pure submodule of A and M_GF(A) = M(R) · A. PMID:24163628
Tikhonov regularization-based operational transfer path analysis
NASA Astrophysics Data System (ADS)
Cheng, Wei; Lu, Yingying; Zhang, Zhousuo
2016-06-01
To overcome ill-posed problems in operational transfer path analysis (OTPA) and improve the stability of solutions, this paper proposes a novel OTPA based on Tikhonov regularization, which considers both the fitting degree and the stability of solutions. Firstly, the fundamental theory of Tikhonov regularization-based OTPA is presented, and comparative studies are provided to validate its effectiveness on ill-posed problems. Secondly, transfer path analysis and source contribution evaluations are comparatively studied for numerical case studies on spherical radiating acoustical sources. Finally, transfer path analysis and source contribution evaluations are provided for experimental case studies on a test bed with thin shell structures. This study provides more accurate transfer path analysis for mechanical systems, which can benefit vibration reduction through structural path optimization. Furthermore, with accurate evaluation of source contributions, vibration monitoring and control by actively controlling vibration sources can be carried out effectively.
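OTPA estimates a transfer matrix T relating measured inputs X to responses Y, and Tikhonov regularization stabilizes that inverse problem. A minimal sketch of the standard regularized solve (the paper's specific formulation, weighting, and parameter choice are not reproduced here):

```python
import numpy as np

def otpa_tikhonov(X, Y, lam=1e-2):
    """Estimate the transfer-path matrix T in Y ~ X @ T by minimizing
    ||X T - Y||^2 + lam * ||T||^2. The regularization parameter lam
    trades fitting degree against stability, as described above."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
```

With `lam = 0` this reduces to ordinary least squares, which is exactly what becomes unstable when `X.T @ X` is ill-conditioned.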
Detection of Fukushima plume within regular Slovenian environmental radioactivity surveillance.
Glavič-Cindro, D; Benedik, L; Kožar Logar, J; Vodenik, B; Zorko, B
2013-11-01
After the Fukushima accident, aerosol and rain water samples collected within the regular national monitoring programmes were carefully analysed. In rain water samples, aerosol and iodine filters collected in the second half of March and in April 2011, I-131, Cs-134 and Cs-137 were detected. In May 2011, the activities of I-131 and Cs-134 were close to or below the detection limit, and Cs-137 reached values from the period before the Fukushima accident. Additionally, plutonium and americium activity concentrations in aerosol filters were analysed. These measured data were compared with data measured after the Chernobyl contamination in Slovenia in 1986. We conclude that, with adequate regular monitoring programmes, the influence of radioactive contamination from nuclear accidents worldwide can be properly assessed. PMID:23611815
Partial Regularity for Holonomic Minimisers of Quasiconvex Functionals
NASA Astrophysics Data System (ADS)
Hopper, Christopher P.
2016-05-01
We prove partial regularity for local minimisers of certain strictly quasiconvex integral functionals, over a class of Sobolev mappings into a compact Riemannian manifold, to which such mappings are said to be holonomically constrained. Our approach uses the lifting of Sobolev mappings to the universal covering space, the connectedness of the covering space, an application of Ekeland's variational principle and a certain tangential A -harmonic approximation lemma obtained directly via a Lipschitz approximation argument. This allows regularity to be established directly on the level of the gradient. Several applications to variational problems in condensed matter physics with broken symmetries are also discussed, in particular those concerning the superfluidity of liquid helium-3 and nematic liquid crystals.
Total Variation Regularization Used in Electrical Capacitance Tomography
NASA Astrophysics Data System (ADS)
Wang, Huaxiang; Tang, Lei
2007-06-01
To solve the ill-posed problem and poor resolution in electrical capacitance tomography (ECT), a new image reconstruction algorithm based on total variation (TV) regularization is proposed, and a new self-adaptive mesh refinement strategy is put forward. Compared with conventional Tikhonov regularization, the new algorithm not only stabilizes the reconstruction but also enhances the distinguishability of the reconstructed image in areas with discontinuous medium distribution; it possesses a good edge-preserving property. The self-adaptive mesh generation technique based on this algorithm can refine the mesh automatically in specific areas according to the medium distribution. This strategy keeps the resolution as high as refining all elements over the region would, but reduces the computational load, thereby speeding up the reconstruction. Both simulation and experimental results show that this algorithm has advantages in terms of resolution and real-time performance.
Regular expressions for decoding of neural network outputs.
Strauß, Tobias; Leifert, Gundram; Grüning, Tobias; Labahn, Roger
2016-07-01
This article proposes a convenient tool for decoding the output of neural networks trained by Connectionist Temporal Classification (CTC) for handwritten text recognition. We use regular expressions to describe the complex structures expected in the writing. The corresponding finite automata are employed to build a decoder. We analyze theoretically which calculations are relevant and which can be avoided. A great speed-up results from an approximation. We conclude that the approximation most likely fails if the regular expression does not match the ground truth, which is not harmful for many applications, since the low probability will be underestimated even further. The proposed decoder is very efficient compared to other decoding methods. The variety of applications ranges from information retrieval to full text recognition. We refer to applications where we have integrated the proposed decoder successfully. PMID:27078574
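A drastically simplified version of the idea can be sketched as follows: take the best-path (greedy) CTC decoding and accept it only when it matches a regular expression. The automaton-based decoder of the article integrates the constraint into the search itself; the names and the blank convention below are ours.

```python
import re
import numpy as np

BLANK = 0  # conventional CTC blank index (an assumption here)

def ctc_greedy_decode(logits, alphabet):
    """Best-path CTC decoding: take the argmax label per frame, merge
    consecutive repeats, and drop blanks."""
    best = logits.argmax(axis=1)
    out, prev = [], BLANK
    for s in best:
        if s != BLANK and s != prev:
            out.append(alphabet[s - 1])  # labels 1..len(alphabet)
        prev = s
    return "".join(out)

def decode_with_regex(logits, alphabet, pattern):
    """Accept the greedy decoding only if the whole string matches the
    regular expression; otherwise report failure (None)."""
    text = ctc_greedy_decode(logits, alphabet)
    return text if re.fullmatch(pattern, text) else None
```

Unlike this sketch, a real constrained decoder searches for the best path *through* the automaton, so it can recover a matching string even when the unconstrained best path does not match.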
Simulation Of Attenuation Regularity Of Detonation Wave In Pmma
NASA Astrophysics Data System (ADS)
Lan, Wei; Xiaomian, Hu
2012-03-01
Polymethyl methacrylate (PMMA) is often used as a clapboard or protective medium in measurements of detonation wave propagation parameters. Theoretical and experimental research shows that the pressure of a shock wave in condensed material attenuates exponentially with the propagation distance. Simulation of detonation-produced shock wave propagation in PMMA was conducted using a two-dimensional Lagrangian computational fluid dynamics program, and results were compared with the experimental data. Different charge diameters and different angles between the direction of detonation wave propagation and the normal direction of the confined boundary were considered during the calculation. Results show that the detonation-produced shock wave propagation in PMMA accords with the exponential regularity of shock wave attenuation in condensed material, and several factors are relevant to the attenuation coefficient, such as charge diameter and interface angle.
Simulation of attenuation regularity of detonation wave in PMMA
NASA Astrophysics Data System (ADS)
Lan, Wei; Xiaomian, Hu
2011-06-01
Polymethyl methacrylate (PMMA) is often used as a clapboard or protective medium in measurements of detonation wave propagation parameters, owing to its shock impedance being similar to that of the explosive. Theoretical and experimental research shows that the pressure of a shock wave in condensed material attenuates exponentially with the propagation distance. Simulation of detonation wave propagation in PMMA is conducted using a two-dimensional Lagrangian computational fluid dynamics program, and results are compared with the experimental data. Different charge diameters and different angles between the direction of detonation wave propagation and the normal direction of the confined boundary are considered during the calculation. Results show that the detonation wave propagation in PMMA accords with the exponential regularity of shock wave attenuation in condensed material, and several factors are relevant to the attenuation coefficient, such as charge diameter and interface angle.
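The exponential attenuation law that both studies confirm, p(x) = p0 · exp(−αx), can be fitted to pressure-versus-distance data by linear least squares on the logarithm. A sketch (the symbols p0 and α are ours, not from the papers):

```python
import numpy as np

def fit_exponential_attenuation(x, p):
    """Fit p(x) = p0 * exp(-alpha * x) by least squares on log p:
    log p = log p0 - alpha * x is linear in (log p0, alpha)."""
    A = np.vstack([np.ones_like(x), -x]).T
    coef, *_ = np.linalg.lstsq(A, np.log(p), rcond=None)
    p0, alpha = np.exp(coef[0]), coef[1]
    return p0, alpha
```

The fitted attenuation coefficient α is the quantity the simulations report as depending on charge diameter and interface angle.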
Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms
NASA Astrophysics Data System (ADS)
Lee, Chien-Cheng; Huang, Shin-Sheng; Shih, Cheng-Yuan
2010-12-01
This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.
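The covariance regularization at the heart of RDA can be written compactly. The sketch below follows Friedman's two-parameter scheme, which blends a class covariance with the pooled covariance (λ) and then shrinks toward a scaled identity (γ); the abstract's RDA learner builds on this idea, though its exact parameterization is not given here and the PSO step that tunes (λ, γ) is omitted.

```python
import numpy as np

def rda_covariance(S_k, S_pooled, n_k, n, lam, gamma):
    """Two-parameter RDA covariance estimate (Friedman-style sketch):
    lam interpolates between QDA (class covariance S_k) and LDA (pooled
    covariance S_pooled); gamma shrinks toward a scaled identity, which
    handles small-sample and ill-posed cases."""
    S = ((1 - lam) * n_k * S_k + lam * n * S_pooled) / (
        (1 - lam) * n_k + lam * n
    )
    d = S.shape[0]
    return (1 - gamma) * S + gamma * (np.trace(S) / d) * np.eye(d)
```

Setting (λ, γ) = (0, 0) recovers QDA's class covariance and (1, 0) recovers LDA's pooled covariance, which is the LDA/QDA trade-off described above.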
A photometric study of four semi-regular variable stars
NASA Astrophysics Data System (ADS)
Benitez, P. M.; Vargas, M. J.
2002-12-01
A series of photometric observations of four questionable semi-regular variable stars in the V band of the Johnson system were performed using the 0.4m telescope at Extremadura University (Spain). These observations spanned 66 days for SY Leonis, W Sextantis and TZ Hydrae and 61 days for X Sextantis. Only SY Leonis appears undoubtedly classified as SRb in the GCVS4, while the other three stars are classified as semiregular variables with doubtful subclassification. In this work, the SRb classification for SY Leonis is confirmed, and a preliminary classification for the other three stars, based on the shape of their light curves, spectral types and luminosity classes is given. The study demonstrates that relatively modest local facilities can contribute to the knowledge of semi-regular variable objects.
Proper time regularization at finite quark chemical potential
NASA Astrophysics Data System (ADS)
Zhang, Jin-Li; Shi, Yuan-Mei; Xu, Shu-Sheng; Zong, Hong-Shi
2016-04-01
In this paper, we use the two-flavor Nambu-Jona-Lasinio (NJL) model to study the quantum chromodynamics (QCD) chiral phase transition. To deal with the ultraviolet (UV) divergences, we adopt the popular proper time regularization (PTR), which is commonly used not only in hadron physics but also in studies with magnetic fields. This regularization scheme can introduce an infrared (IR) cutoff to incorporate quark confinement. We generalize the PTR to the case of zero temperature and finite chemical potential using a completely new method, and then study the chiral susceptibility, both in the chiral limit and with finite current quark mass. The chiral phase transition is second-order at μ = 0 and T = 0 and a crossover at μ ≠ 0 and T = 0. Three sets of parameters are used to make sure that the results do not depend on the parameter choice.
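The proper time trick behind PTR can be stated compactly. A sketch of the standard construction (the cutoff conventions below are the common ones; the paper's exact choices are not reproduced here):

```latex
% Schwinger proper-time representation of an inverse propagator power,
% regularized by truncating the tau integral at both ends:
\frac{1}{A^{n}}
  = \frac{1}{(n-1)!}\int_{0}^{\infty} d\tau\, \tau^{\,n-1} e^{-\tau A}
  \;\longrightarrow\;
  \frac{1}{(n-1)!}\int_{\tau_{\mathrm{UV}}}^{\tau_{\mathrm{IR}}} d\tau\, \tau^{\,n-1} e^{-\tau A},
\qquad
\tau_{\mathrm{UV}} = \frac{1}{\Lambda_{\mathrm{UV}}^{2}},\quad
\tau_{\mathrm{IR}} = \frac{1}{\Lambda_{\mathrm{IR}}^{2}}.
```

The lower limit removes the UV divergence, while the finite upper limit suppresses small-momentum contributions and mimics confinement, which is the IR cutoff role mentioned in the abstract.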
Drop impact upon superhydrophobic surfaces with regular and hierarchical roughness
NASA Astrophysics Data System (ADS)
Lv, Cunjing; Hao, Pengfei; Zhang, Xiwen; He, Feng
2016-04-01
Recent studies demonstrate that the roughness and morphology of textures play essential roles in the dynamics of a water drop impacting onto superhydrophobic substrates. In particular, significant reductions of the contact time have attracted great attention. We experimentally investigate drop impact dynamics on three types of superhydrophobic surfaces, consisting of regular micropillars, two-tier textures with nano/micro-scale roughness, and hierarchical textures with random roughness. We show that the contact time is controlled by the Weber number and the roughness of the surface. Compared with drop impact on regular micropillared surfaces, the contact time can be reduced gradually by increasing the Weber number on surfaces with two-tier textures, but is reduced remarkably on surfaces with hierarchical textures as a result of the prompt splash and fragmentation of liquid lamellae. Our study may shed light on textured materials fabrication, allowing rapid drop detachment for broad applications.
Electromechanical Mode Online Estimation using Regularized Robust RLS Methods
Zhou, Ning; Trudnowski, Daniel; Pierre, John W; Mittelstadt, William
2008-11-01
This paper proposes a regularized robust recursive least squares (R3LS) method for on-line estimation of power-system electromechanical modes based on synchronized phasor measurement unit (PMU) data. The proposed method utilizes an autoregressive moving average exogenous (ARMAX) model to account for typical measurement data, which include low-level pseudo-random probing, ambient, and ringdown data. A robust objective function is utilized to reduce the negative influence of non-typical data, which include outliers and missing data. A dynamic regularization method is introduced to help include a priori knowledge about the system and reduce the influence of under-determined problems. Based on a 17-machine simulation model, it is shown through the Monte Carlo method that the proposed R3LS method can estimate and track electromechanical modes by effectively using combined typical and non-typical measurement data.
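The recursive least squares core that R3LS extends can be sketched for a simple autoregressive model. The block below is plain RLS only; the paper's robust objective, dynamic regularization, and ARMAX structure are deliberately omitted, and all names are ours.

```python
import numpy as np

def rls_fit(y, order=2, lam=0.99, delta=100.0):
    """Recursive least squares for an AR(order) model:
    y[t] ~ theta[0]*y[t-1] + ... + theta[order-1]*y[t-order].
    lam is the forgetting factor (lam=1 means no forgetting);
    delta scales the initial inverse-covariance matrix P."""
    theta = np.zeros(order)
    P = delta * np.eye(order)
    for t in range(order, len(y)):
        phi = y[t - order:t][::-1]            # most recent sample first
        k = P @ phi / (lam + phi @ P @ phi)   # Kalman-style gain
        theta = theta + k * (y[t] - phi @ theta)
        P = (P - np.outer(k, phi @ P)) / lam
    return theta
```

Forgetting (lam < 1) is what lets such an estimator *track* slowly drifting modes rather than just fit a fixed model, which matches the on-line tracking goal described above.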
Longitudinal structure of solar activity: Regular and stochastic behavior
NASA Astrophysics Data System (ADS)
Erofeev, D. V.
2015-12-01
The ratio of regular and stochastic components in the behavior of the longitudinal-temporal distribution of solar activity is studied with the use of correlation and spectral analysis of data on sunspot groups for 12 solar cycles. It was found that data samples of about 10 years in length often (in 50% of cases) show the presence of regular structures in the longitudinal distribution of sunspot groups. However, these structures are nonstationary; their characteristic scales and rotation periods vary when changing from one 10-year interval to another. The behavior of the longitudinal structure of sunspot activity is mainly stochastic on a long time scale (50-100 years); it is characterized by a wide spectrum of spatial scales and a continuous spectrum of rotation periods ranging from 25.6 to 28.5 days.
Speckle regularization and miniaturization of computer-generated holographic stereograms.
Takaki, Yasuhiro; Taira, Kengo
2016-03-21
Holographic stereograms produce multiple parallax images that are seen from multiple viewpoints. Because random phase distributions are added to the parallax images to remove areas where images cannot be seen in the viewing area, speckles are generated in the reconstructed images. In this study, virtual viewpoints are inserted between the original viewpoints (real viewpoints) to make the interval of the viewpoints smaller than the pupil diameter of the eyes in order to remove the areas without images. In this case, regular interference patterns appear in the reconstructed images instead of the speckle patterns. The proper phase modulation of the parallax images displayed to the real and virtual viewpoints increases the spatial frequencies of the regular interference patterns on retinas so that the eyes cannot perceive them. The proposed technique was combined with the multiview-based holographic stereogram calculation technique and was experimentally verified. PMID:27136824
Statistical Regularities Attract Attention when Task-Relevant.
Alamia, Andrea; Zénon, Alexandre
2016-01-01
Visual attention seems essential for learning the statistical regularities in our environment, a process known as statistical learning. However, how attention is allocated when exploring a novel visual scene whose statistical structure is unknown remains unclear. In order to address this question, we investigated visual attention allocation during a task in which we manipulated the conditional probability of occurrence of colored stimuli, unbeknown to the subjects. Participants were instructed to detect a target colored dot among two dots moving along separate circular paths. We evaluated implicit statistical learning, i.e., the effect of color predictability on reaction times (RTs), and recorded eye position concurrently. Attention allocation was indexed by comparing the Mahalanobis distance between the position, velocity and acceleration of the eyes and the two colored dots. We found that learning the conditional probabilities occurred very early during the course of the experiment as shown by the fact that, starting already from the first block, predictable stimuli were detected with shorter RT than unpredictable ones. In terms of attentional allocation, we found that the predictive stimulus attracted gaze only when it was informative about the occurrence of the target but not when it predicted the occurrence of a task-irrelevant stimulus. This suggests that attention allocation was influenced by regularities only when they were instrumental in performing the task. Moreover, we found that the attentional bias towards task-relevant predictive stimuli occurred at a very early stage of learning, concomitantly with the first effects of learning on RT. In conclusion, these results show that statistical regularities capture visual attention only after a few occurrences, provided these regularities are instrumental to perform the task. PMID:26903846
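The attention index described above rests on the Mahalanobis distance between gaze kinematics and each dot's trajectory. A minimal sketch of that distance (variable names are ours, not the authors'):

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of an observation x (e.g. gaze position,
    velocity, acceleration stacked into one vector) from a distribution
    with the given mean and covariance."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))
```

With an identity covariance this reduces to the Euclidean distance; a non-spherical covariance down-weights directions in which the tracked trajectory naturally varies more.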
Visual Mismatch Negativity Reveals Automatic Detection of Sequential Regularity Violation
Stefanics, Gábor; Kimura, Motohiro; Czigler, István
2011-01-01
Sequential regularities are abstract rules based on repeating sequences of environmental events, which are useful to make predictions about future events. Here, we tested whether the visual system is capable of detecting sequential regularity in unattended stimulus sequences. The visual mismatch negativity (vMMN) component of the event-related potentials is sensitive to the violation of complex regularities (e.g., object-related characteristics, temporal patterns). We used the vMMN component as an index of violation of conditional (if, then) regularities. In the first experiment, to investigate the emergence of vMMN and other change-related activity to the violation of conditional rules, red and green disk patterns were delivered in pairs. The majority of pairs consisted of disk patterns with identical colors, whereas in deviant pairs the colors were different. The probabilities of the two colors were equal. The second member of the deviant pairs elicited a vMMN with longer latency and more extended spatial distribution to deviants with lower probability (10 vs. 30%). In the second (control) experiment the emergence of vMMN to violation of a simple, feature-related rule was studied using oddball sequences of stimulus pairs where deviant colors were presented with 20% probability. Deviant colored patterns elicited a vMMN, and this component was larger for the second member of the pair, i.e., after a shorter inter-stimulus interval. This result corresponds to the SOA/(v)MMN relationship expected on the basis of a memory-mismatch process. Our results show that the system underlying vMMN is sensitive to abstract, conditional rules. Representation of such rules implicates expectation of a subsequent event; therefore vMMN can be considered a correlate of violated predictions about the characteristics of environmental events. PMID:21629766
Regular Scanning Tunneling Microscope Tips can be Intrinsically Chiral
Tierney, Heather L.; Murphy, Colin J.; Sykes, E. Charles H.
2011-01-07
We report our discovery that regular scanning tunneling microscope tips can themselves be chiral. This chirality leads to differences in electron tunneling efficiencies through left- and right-handed molecules, and, when using the tip to electrically excite molecular rotation, large differences in rotation rate were observed which correlated with molecular chirality. As scanning tunneling microscopy is a widely used technique, this result may have unforeseen consequences for the measurement of asymmetric surface phenomena in a variety of important fields.
Avoidance of Cigarette Pack Health Warnings among Regular Cigarette Smokers
Maynard, Olivia M.; Attwood, Angela; O’Brien, Laura; Brooks, Sabrina; Hedge, Craig; Leonards, Ute; Munafò, Marcus R.
2016-01-01
Background Previous research with adults and adolescents indicates that plain cigarette packs increase visual attention to health warnings among non-smokers and non-regular smokers, but not among regular smokers. This may be because regular smokers: 1) are familiar with the health warnings, 2) preferentially attend to branding, or 3) actively avoid health warnings. We sought to distinguish between these explanations using eye-tracking technology. Method A convenience sample of 30 adult dependent smokers was recruited to participate in an eye-tracking study. Participants viewed branded, plain and blank packs of cigarettes with familiar and unfamiliar health warnings. The number of fixations to health warnings and branding on the different pack types was recorded. Results Analysis of variance indicated that regular smokers were biased towards fixating the branding location rather than the health warning location on all three pack types (p < 0.002). This bias was smaller, but still evident, for blank packs, where smokers preferentially attended to the blank region over the health warnings. Time-course analysis showed that for branded and plain packs, attention was preferentially directed to the branding location for the entire 10 seconds of the stimulus presentation, while for blank packs this occurred for the last 8 seconds of the stimulus presentation. Familiarity with health warnings had no effect on eye gaze location. Conclusion Smokers actively avoid cigarette pack health warnings, and this remains the case even in the absence of salient branding information. Smokers may have learned to divert their attention away from cigarette pack health warnings. These findings have policy implications for the design of health warnings on cigarette packs. PMID:24485554
Manifestly scale-invariant regularization and quantum effective operators
NASA Astrophysics Data System (ADS)
Ghilencea, D. M.
2016-05-01
Scale-invariant theories are often used to address the hierarchy problem. However, the regularization of their quantum corrections introduces a dimensionful coupling (dimensional regularization) or scale (Pauli-Villars, etc.) which breaks this symmetry explicitly. We show how to avoid this problem and study the implications of a manifestly scale-invariant regularization in (classical) scale-invariant theories. We use a dilaton-dependent subtraction function μ(σ) which, after spontaneous breaking of the scale symmetry, generates the usual dimensional regularization subtraction scale μ(⟨σ⟩). One consequence is that "evanescent" interactions generated by scale invariance of the action in d = 4 − 2ε (but vanishing in d = 4) give rise to new, finite quantum corrections. We find a (finite) correction ΔU(ϕ, σ) to the one-loop scalar potential for ϕ and σ, beyond the Coleman-Weinberg term. ΔU is due to an evanescent correction (∝ ε) to the field-dependent masses (of the states in the loop) which multiplies the pole (∝ 1/ε) of the momentum integral to give a finite quantum result. ΔU contains a nonpolynomial operator ∼ ϕ⁶/σ², of known coefficient, and is independent of the dimensionless subtraction parameter. A more general μ(ϕ, σ) is ruled out since, in their classical decoupling limit, the visible sector (of the Higgs ϕ) and hidden sector (dilaton σ) would still interact at the quantum level; thus, the subtraction function must depend on the dilaton only, μ ∼ σ. The method is useful in models where preserving scale symmetry at the quantum level is important.
Holographic Wilson loops, Hamilton-Jacobi equation, and regularizations
NASA Astrophysics Data System (ADS)
Pontello, Diego; Trinchero, Roberto
2016-04-01
The minimal areas of surfaces whose borders are rectangular or circular loops are calculated using the Hamilton-Jacobi (HJ) equation. This amounts to solving the HJ equation for the value of the minimal area, without calculating the shape of the corresponding surface. This is done for bulk geometries that are asymptotically anti-de Sitter (AdS). For the rectangular contour, the HJ equation, which is separable, can be solved exactly. For the circular contour, an expansion in powers of the radius is implemented. The HJ approach naturally leads to a regularization which consists in locating the contour away from the border. The results are compared with the ε-regularization, which leaves the contour at the border and calculates the area of the corresponding minimal surface only up to a diameter smaller than that of the contour at the border. The results for the circular loop do not coincide if the expansion parameter is taken to be the radius of the contour at the border. It is shown that, using this expansion parameter, the ε-regularization leads to incorrect results for certain solvable non-AdS cases. However, if the expansion parameter is taken to be the radius of the minimal surface whose area is computed, then the results coincide with the HJ scheme. This is traced back to the fact that in the HJ case the expansion parameter for the area of a minimal surface is intrinsic to the surface, whereas the radius of the contour at the border depends on how one chooses to regularize the calculation of this area in the ε-scheme.
Regularities of catalytic oxidation of carbon by nitrous oxide
Babenko, V.S.; Buyanov, R.A.
1995-07-01
The main regularities of the catalytic oxidation of various carbon materials by nitrous oxide are studied. Compounds of a series of alkali and alkaline-earth metals are found to be effective catalysts for this process, decreasing the onset temperature of carbon oxidation by approximately 150-200 °C. The activity of the alkali metals increases with increasing atomic mass. The rate of carbon oxidation depends on the nature of the carbon material.
Perfect state transfer over distance-regular spin networks
Jafarizadeh, M. A.; Sufiani, R.
2008-02-15
Christandl et al. have noted that the d-dimensional hypercube can be projected onto a linear chain with d+1 sites so that, by considering fixed but different couplings between the qubits assigned to the sites, perfect state transfer (PST) can be achieved over arbitrarily long distances in the chain [Phys. Rev. Lett. 92, 187902 (2004); Phys. Rev. A 71, 032312 (2005)]. In this work we consider distance-regular graphs as spin networks and note that any such network (not just the hypercube) can be projected onto a linear chain and so can allow PST over long distances. We consider some particular spin Hamiltonians which are extended versions of those of Christandl et al. Then, by using techniques such as stratification of distance-regular graphs and spectral analysis methods, we give a procedure for finding a set of coupling constants in the Hamiltonians so that a particular state initially encoded on one site will evolve freely to the opposite site without any dynamical control, i.e., we show how to derive the parameters of the system so that PST can be achieved. It is seen that PST is allowed only in distance-regular spin networks for which, starting from an arbitrary reference vertex (prepared in the initial state we wish to transfer), the last stratum of the network with respect to the reference vertex contains only one vertex; i.e., the stratification of these networks plays an important role, determining in which networks, and between which vertices, PST is possible. As examples, the cycle network with an even number of vertices and the d-dimensional hypercube are considered in detail, and the method is applied to some important distance-regular networks.
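The linear-chain reduction with engineered couplings can be checked numerically. The sketch below is illustrative only: it uses the standard single-excitation hopping Hamiltonian with Christandl-type couplings J_n = √(n(N−n)) (not the paper's extended Hamiltonians), and the function name is a made-up helper. It verifies that an excitation placed on the first site arrives at the last site with unit fidelity at t = π/2:

```python
import numpy as np

def pst_fidelity(N, t):
    """Transfer amplitude |<N| exp(-iHt) |1>| for an N-site chain with
    Christandl-type couplings J_n = sqrt(n(N - n)).
    Illustrative single-excitation model, not the paper's extended one."""
    n = np.arange(1, N)
    J = np.sqrt(n * (N - n))            # engineered couplings
    H = np.diag(J, 1) + np.diag(J, -1)  # tridiagonal hopping Hamiltonian
    w, V = np.linalg.eigh(H)            # exact diagonalization
    psi0 = np.zeros(N)
    psi0[0] = 1.0                       # excitation on the first site
    psi_t = V @ (np.exp(-1j * w * t) * (V.T @ psi0))
    return abs(psi_t[-1])
```

For any chain length N, the fidelity equals 1 exactly at t = π/2, reflecting the mirror symmetry built into the engineered spectrum.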
Knowing More than One Can Say: The Early Regular Plural
ERIC Educational Resources Information Center
Zapf, Jennifer A.; Smith, Linda B.
2009-01-01
This paper reports on partial knowledge in two-year-old children's learning of the regular English plural. In Experiments 1 and 2, children were presented with one kind and its label and then were either presented with two of that same kind (A→AA) or the initial picture next to a very different thing (A→AB). The children in…
Effects of regular exercise training on skeletal muscle contractile function
NASA Technical Reports Server (NTRS)
Fitts, Robert H.
2003-01-01
Skeletal muscle function is critical to movement and one's ability to perform daily tasks, such as eating and walking. One objective of this article is to review the contractile properties of fast and slow skeletal muscle and single fibers, with particular emphasis on the cellular events that control or rate limit the important mechanical properties. Another important goal of this article is to present the current understanding of how the contractile properties of limb skeletal muscle adapt to programs of regular exercise.
Quantum spin chains with regularly alternating bonds and fields
NASA Astrophysics Data System (ADS)
Derzhko, Oleg
2002-01-01
We consider the spin-1/2 XY chain in a transverse field with regularly varying exchange interactions and on-site fields. In two limiting cases of the isotropic ( XX) and extremely anisotropic (Ising) exchange interaction the thermodynamic quantities are calculated rigorously with the help of continued fractions. We discuss peculiarities of the low-temperature magnetic properties and a possibility of the spin-Peierls instability.
Self-organized formation of regular nanostripes on vicinal surfaces
NASA Astrophysics Data System (ADS)
Yu, Yan-Mei; Liu, Bang-Gui
2004-11-01
We explore the mechanism of self-organized formation of regular arrays of nanostripes on vicinal surfaces by using a phase-field model. Epitaxial growth during deposition usually results in both nanostripes and islands on terraces of a vicinal substrate. Postdeposition annealing at elevated temperatures induces growth of the nanostripes but makes the islands shrink. It is a ripening process of the mixed system of the nanostripes and the islands, being dependent upon the temperature and strain. It is accompanied by a transition from the diffusion-limited regime to the detachment-limited regime induced by the strain at high temperatures. This ripening makes the islands diminish and on the other hand makes the nanostripes smoother. As a result, the islands disappear completely and the regular arrays of nanostripes are formed on the vicinal substrate. This theory can explain the self-organized formation of nanostripes and nanowires on vicinal surfaces, such as the intriguing regular arrays of Fe nanostripes on the vicinal W surfaces.
Manifold regularized non-negative matrix factorization with label information
NASA Astrophysics Data System (ADS)
Li, Huirong; Zhang, Jiangshe; Wang, Changpeng; Liu, Junmin
2016-03-01
Non-negative matrix factorization (NMF), a popular technique for finding parts-based, linear representations of non-negative data, has been successfully applied in a wide range of applications, such as feature learning, dictionary learning, and dimensionality reduction. However, the local manifold structure of the data and the discriminative information of the available labels have not previously been taken into account together in NMF. We propose a new semisupervised matrix decomposition method, called manifold regularized non-negative matrix factorization (MRNMF) with label information, which incorporates manifold regularization and label information into NMF to improve its performance in clustering tasks. We encode the local geometrical structure of the data space by constructing a nearest neighbor graph and enhance the discriminative ability of different classes by effectively using the label information. Experimental comparisons with state-of-the-art methods on the COIL20, PIE, Extended Yale B, and MNIST databases demonstrate the effectiveness of MRNMF.
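The role of the manifold term can be seen in a minimal sketch of graph-regularized NMF with multiplicative updates (in the spirit of GNMF-style methods; the function name and the fully connected toy graph are illustrative, and the paper's label-information term is omitted for brevity):

```python
import numpy as np

def gnmf(X, W, k, lam=0.1, iters=200, seed=0):
    """Sketch of graph-regularized NMF: minimize ||X - U V^T||_F^2
    + lam * Tr(V^T L V), L = D - W the graph Laplacian, via standard
    multiplicative updates. X: (m, n) non-negative data; W: (n, n)
    symmetric adjacency of a nearest-neighbor graph; k: rank."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    D = np.diag(W.sum(axis=1))
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V
```

The multiplicative form keeps both factors non-negative; the lam * (W @ V) term pulls the representations of graph-adjacent samples toward each other, encoding the local geometry of the data.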
Matching effective chiral Lagrangians with dimensional and lattice regularizations
NASA Astrophysics Data System (ADS)
Niedermayer, F.; Weisz, P.
2016-04-01
We compute the free energy in the presence of a chemical potential coupled to a conserved charge in effective O(n) scalar field theory (without explicit symmetry breaking terms) to third order for asymmetric volumes in general dimension d, using dimensional (DR) and lattice regularizations. This yields relations between the 4-derivative couplings appearing in the effective actions for the two regularizations, which in turn allow us to translate results, e.g. the mass gap in a finite periodic box in d = 3 + 1 dimensions, from one regularization to the other. Consistency is found with a new direct computation of the mass gap using DR. For the case n = 4, d = 4 the model is the low-energy effective theory of QCD with N_f = 2 massless quarks. The results can thus be used to obtain estimates of low-energy constants in the effective chiral Lagrangian from measurements of low-energy observables, including the low-lying spectrum of N_f = 2 QCD in the δ-regime using lattice simulations, as proposed by Peter Hasenfratz, or from the susceptibility corresponding to the chemical potential used.
Regularity and predictability of human mobility in personal space.
Austin, Daniel; Cross, Robin M; Hayes, Tamara; Kaye, Jeffrey
2014-01-01
Fundamental laws governing human mobility have many important applications such as forecasting and controlling epidemics or optimizing transportation systems. These mobility patterns, studied in the context of out of home activity during travel or social interactions with observations recorded from cell phone use or diffusion of money, suggest that in extra-personal space humans follow a high degree of temporal and spatial regularity - most often in the form of time-independent universal scaling laws. Here we show that mobility patterns of older individuals in their home also show a high degree of predictability and regularity, although in a different way than has been reported for out-of-home mobility. Studying a data set of almost 15 million observations from 19 adults spanning up to 5 years of unobtrusive longitudinal home activity monitoring, we find that in-home mobility is not well represented by a universal scaling law, but that significant structure (predictability and regularity) is uncovered when explicitly accounting for contextual data in a model of in-home mobility. These results suggest that human mobility in personal space is highly stereotyped, and that monitoring discontinuities in routine room-level mobility patterns may provide an opportunity to predict individual human health and functional status or detect adverse events and trends. PMID:24587302
On constraining pilot point calibration with regularization in PEST
Fienen, M.N.; Muffels, C.T.; Hunt, R.J.
2009-01-01
Ground water model calibration has made great advances in recent years, with practical tools such as PEST being instrumental in making the latest techniques available to practitioners. As models and calibration tools become more sophisticated, however, the power of these tools can be misapplied, resulting in poor parameter estimates and/or nonoptimally calibrated models that do not suit their intended purpose. Here, we focus on an increasingly common technique for calibrating highly parameterized numerical models: pilot point parameterization with Tikhonov regularization. Pilot points are a popular method for spatially parameterizing complex hydrogeologic systems; however, the additional flexibility offered by pilot points can become problematic if not constrained by Tikhonov regularization. The objective of this work is to explain and illustrate the specific roles played by control variables in the PEST software for Tikhonov regularization applied to pilot points. A recent study encountered difficulties implementing this approach, but through examination of that analysis, insight into underlying sources of potential misapplication can be gained and some guidelines for overcoming them developed. © 2009 National Ground Water Association.
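The general mechanics of Tikhonov regularization (independent of PEST's specific control variables, which are the paper's subject) can be sketched as a penalized least-squares solve; the helper name `tikhonov_solve` and the identity operator in the test are illustrative:

```python
import numpy as np

def tikhonov_solve(A, b, L, lam):
    """Solve min_x ||A x - b||^2 + lam * ||L x||^2 via the normal
    equations (A^T A + lam L^T L) x = A^T b. A is the observation
    operator; L encodes the preferred condition (e.g. identity toward
    zero, or a difference operator toward spatial smoothness)."""
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)
```

Increasing lam pulls the estimate toward satisfying the preferred condition, stabilizing an otherwise ill-posed inversion at the cost of data fit; tuning that trade-off is exactly what the regularization control variables govern.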
Hessian-Regularized Co-Training for Social Activity Recognition
Liu, Weifeng; Li, Yang; Lin, Xu; Tao, Dacheng; Wang, Yanjiang
2014-01-01
Co-training is a major multi-view learning paradigm that alternately trains two classifiers on two distinct views and maximizes their mutual agreement on the two-view unlabeled data. Traditional co-training algorithms usually train a learner on each view separately and then force the learners to be consistent across views. Although many co-training variants have been developed, it is quite possible that a learner will receive erroneous labels for unlabeled data when the other learner has only mediocre accuracy. This usually happens in the first rounds of co-training, when there are only a few labeled examples. As a result, co-training algorithms often have unstable performance. In this paper, Hessian-regularized co-training is proposed to overcome these limitations. Specifically, each Hessian is obtained from a particular view of the examples; Hessian regularization is then integrated into the learner training process of each view by penalizing the regression function along the potential manifold. The Hessian can properly exploit the local structure of the underlying data manifold, and Hessian regularization significantly boosts the generalizability of a classifier, especially when there are a small number of labeled examples and a large number of unlabeled examples. To evaluate the proposed method, extensive experiments were conducted on the unstructured social activity attribute (USAA) dataset for social activity recognition. Our results demonstrate that the proposed method outperforms baseline methods, including the traditional co-training and LapCo algorithms. PMID:25259945
Blind inpainting using l0 and total variation regularization.
Afonso, Manya V; Sanches, Joao Miguel Raposo
2015-07-01
In this paper, we address the problem of image reconstruction with missing pixels or corrupted with impulse noise, when the locations of the corrupted pixels are not known. A logarithmic transformation is applied to convert the multiplication between the image and binary mask into an additive problem. The image and mask terms are then estimated iteratively with total variation regularization applied on the image, and l0 regularization on the mask term which imposes sparseness on the support set of the missing pixels. The resulting alternating minimization scheme simultaneously estimates the image and mask, in the same iterative process. The logarithmic transformation also allows the method to be extended to the Rayleigh multiplicative and Poisson observation models. The method can also be extended to impulse noise removal by relaxing the regularizer from the l0 norm to the l1 norm. Experimental results show that the proposed method can deal with a larger fraction of missing pixels than two phase methods, which first estimate the mask and then reconstruct the image. PMID:25826806
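The logarithmic transformation at the heart of the method can be illustrated in a few lines (the values are arbitrary toy intensities; the point is only that the multiplicative observation model becomes additive, after which the image and mask terms can be penalized separately with TV and l0/l1 regularizers):

```python
import numpy as np

# Arbitrary strictly positive toy values so the logarithm is defined.
x = np.array([4.0, 9.0, 16.0])      # image term
m = np.array([1.0, 0.5, 1.0])       # relaxed (non-binary) mask term
y = x * m                           # multiplicative observation y = x * m
log_decomp = np.log(x) + np.log(m)  # additive model: log y = log x + log m
```

In the log domain the two unknowns enter as a sum, which is what allows the alternating minimization to attach a different regularizer to each term.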
3D harmonic phase tracking with anatomical regularization.
Zhou, Yitian; Bernard, Olivier; Saloux, Eric; Manrique, Alain; Allain, Pascal; Makram-Ebeid, Sherif; De Craene, Mathieu
2015-12-01
This paper presents a novel algorithm that extends HARP to handle 3D tagged MRI images. The HARP results were regularized by an original regularization framework defined in an anatomical space of coordinates. In addition, myocardial incompressibility was integrated in order to correct the radial strain, which is reported to be more challenging to recover. Both the tracking and regularization of LV displacements were done on a volumetric mesh to be computationally efficient. Also, a window-weighted regression method was extended to cardiac motion tracking, which helps maintain low complexity even at finer scales. On healthy volunteers, the tracking was found to be as accurate as the best candidates of a recent benchmark. Strain accuracy was evaluated on synthetic data, showing low bias and strain errors under 5% (excluding outliers) for longitudinal and circumferential strains, while the second and third quartiles of the radial strain errors are in the (-5%, 5%) range. In clinical data, strain dispersion was shown to correlate with the extent of transmural fibrosis. Also, reduced deformation values were found inside infarcted segments. PMID:26363844
Global Regularization Method for Planar Restricted Three-body Problem
NASA Astrophysics Data System (ADS)
Sharaf, M. A.; Dwidar, H. R.
2015-12-01
In this paper, a global regularization method for the planar restricted three-body problem is proposed using the transformation z = x + iy = ν cosⁿ(u + iv), where i = √(−1), 0 < ν ≤ 1 and n is a positive integer. The method is developed analytically and computationally. For the analytical developments, analytical solutions in power series of the pseudo-time τ are obtained for positions and velocities (u, v, u′, v′) and (x, y, ẋ, ẏ) in the regularized and physical planes, respectively; the physical time t is also obtained as a power series in τ. Moreover, relations between the coefficients of the power series are obtained for two consecutive values of n. We also developed analytical solutions in power-series form for the inverse problem of finding τ in terms of t. As typical examples, three symbolic expressions for the coefficients of the power series were developed in terms of the initial values. As for the computational developments, the global regularized equations of motion are developed together with their initial values in forms suitable for digital computation using any differential-equation solver. In addition, for the numerical evaluation of the power series, an efficient method based on continued-fraction theory is provided.
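The transformation itself is straightforward to evaluate numerically; a minimal sketch (the helper name and the default values of ν and n are illustrative choices, not from the paper) maps regularized coordinates (u, v) to the physical plane:

```python
import cmath

def to_physical(u, v, nu=1.0, n=2):
    """Evaluate z = x + i y = nu * cos^n(u + i v) for 0 < nu <= 1 and a
    positive integer n (illustrative defaults)."""
    z = nu * cmath.cos(complex(u, v)) ** n
    return z.real, z.imag
```

Because cos of a complex argument mixes circular and hyperbolic functions, a rectangular grid in the (u, v) plane maps onto nested confocal curves in the physical (x, y) plane, which is what absorbs the singularities of the equations of motion.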
Hybrid regularizers-based adaptive anisotropic diffusion for image denoising.
Liu, Kui; Tan, Jieqing; Ai, Liefu
2016-01-01
To eliminate the staircasing effect for total variation filter and synchronously avoid the edges blurring for fourth-order PDE filter, a hybrid regularizers-based adaptive anisotropic diffusion is proposed for image denoising. In the proposed model, the [Formula: see text]-norm is considered as the fidelity term and the regularization term is composed of a total variation regularization and a fourth-order filter. The two filters can be adaptively selected according to the diffusion function. When the pixels locate at the edges, the total variation filter is selected to filter the image, which can preserve the edges. When the pixels belong to the flat regions, the fourth-order filter is adopted to smooth the image, which can eliminate the staircase artifacts. In addition, the split Bregman and relaxation approach are employed in our numerical algorithm to speed up the computation. Experimental results demonstrate that our proposed model outperforms the state-of-the-art models cited in the paper in both the qualitative and quantitative evaluations. PMID:27047730
Regularity vs genericity in the perception of collinearity.
Feldman, J
1996-01-01
The perception of collinearity is investigated, with the focus on the minimal case of three dots. As suggested previously, from the standpoint of probabilistic inference, the observer must classify each dot triplet as having arisen either from a one-dimensional curvilinear process or from a two-dimensional patch. The normative distributions of triplets arising from these two classes are unavailable to the observer, and are in fact somewhat counterintuitive. Hence, in order to classify triplets, the observer invents distributions for each of the two opposed types: 'regular' (collinear) triplets and 'generic' (i.e., not regular) triplets. The collinear prototype is centered at 0 degrees (i.e., perfectly straight), whereas the generic prototype, contrary to the normative statistics, is centered 120 degrees away from straight, apparently because this is the point most distant in triplet space from straight and thus creates the maximum possible contrast between the two prototypes. By default, these two processes are assumed to be equiprobable in the environment. An experiment designed to investigate how subjects' judgments are affected by conspicuous environmental deviations from this assumption is reported. The results suggest that observers react by elevating or depressing the expected probability of the generic prototype relative to the regular one, leaving the prototype structure otherwise intact. PMID:8804096
Identification and sorting of regular textures according to their similarity
NASA Astrophysics Data System (ADS)
Hernández Mesa, Pilar; Anastasiadis, Johannes; Puente León, Fernando
2015-05-01
Regardless of whether mosaics, material surfaces, or skin surfaces are inspected, their texture plays an important role. Texture is a property that is hard to describe in words but easy to convey in pictures. Furthermore, a huge amount of digital images containing visual descriptions of textures already exists. However, this information becomes useless if there are no appropriate methods to browse the data. In addition, depending on the given task, properties like scale, rotation, or intensity invariance are desired. In this paper we propose to analyze texture images according to their characteristic pattern. First, a classification approach is proposed to separate regular from non-regular textures. The second stage focuses on regular textures, suggesting a method to sort them according to their similarity. Different features are extracted from the texture in order to describe its scale, orientation, texel, and the texel's relative position. Depending on the desired invariance of the visual characteristics (such as invariance to the texture's scale or the texel's form), the comparisons of the features between images are weighted and combined to define the degree of similarity between them. Tuning the weighting parameters allows this search algorithm to be easily adapted to the requirements of the desired task. Not only can the total invariance of desired parameters be adjusted; the weighting of the parameters may also be modified to adapt to an application-specific type of similarity. This search method has been evaluated using different textures and similarity criteria, achieving very promising results.
Channeling power across ecological systems: social regularities in community organizing.
Christens, Brian D; Inzeo, Paula Tran; Faust, Victoria
2014-06-01
Relational and social network perspectives provide opportunities for more holistic conceptualizations of phenomena of interest in community psychology, including power and empowerment. In this article, we apply these tools to build on multilevel frameworks of empowerment by proposing that networks of relationships between individuals constitute the connective spaces between ecological systems. Drawing on an example of a model for grassroots community organizing practiced by WISDOM, a statewide federation supporting local community organizing initiatives in Wisconsin, we identify social regularities (i.e., relational and temporal patterns) that promote empowerment and the development and exercise of social power through building and altering relational ties. Through an emphasis on listening-focused one-to-one meetings, reflection, and social analysis, WISDOM organizing initiatives construct and reinforce social regularities that develop social power in the organizing initiatives and advance psychological empowerment among participant leaders in organizing. These patterns are established by organizationally driven brokerage and mobilization of interpersonal ties, some of which span ecological systems. Hence, elements of these power-focused social regularities can be conceptualized as cross-system channels through which micro-level empowerment processes feed into macro-level exercise of social power, and vice versa. We describe examples of these channels in action, and offer recommendations for theory and design of future action research. PMID:24398621
MATLAB toolbox for the regularized surface reconstruction from gradients
NASA Astrophysics Data System (ADS)
Harker, Matthew; O'Leary, Paul
2015-04-01
As Photometric Stereo is a means of measuring the gradient field of a surface, an essential step in the measurement of a surface structure is the reconstruction of the surface from its measured gradient field. Given that the surface normals are subject to noise, straightforward integration does not provide an adequate reconstruction of the surface. In fact, if the noise in the gradient can be considered Gaussian, the optimal reconstruction based on maximum-likelihood principles is obtained by the method of least squares. However, since the reconstruction of a surface from its gradient is an inverse problem, it is usually necessary to introduce some form of regularization of the solution. This paper describes and demonstrates the functionality of a library of MATLAB functions for the regularized reconstruction of a surface from its measured gradient field. The library of functions, entitled "Surface Reconstruction from Gradient Fields: grad2Surf Version 1.0", is available at the MATLAB file exchange: http://www.mathworks.com/matlabcentral/fileexchange/authors/321598. The toolbox is the culmination of a number of papers on the least-squares reconstruction of a surface from its measured gradient field, regularized solutions to the problem, and real-time implementations of the algorithms [1-4].
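The grad2Surf toolbox implements the authors' own least-squares and regularized algorithms; as a language-neutral illustration of the underlying idea only, the sketch below performs least-squares integration of a gradient field in the Fourier domain (the classic Frankot-Chellappa approach, which assumes periodic boundaries and is not the toolbox's algorithm):

```python
import numpy as np

def frankot_chellappa(p, q, h=1.0):
    """Least-squares integration of a gradient field in the Fourier
    domain. p = dz/dx, q = dz/dy sampled on a regular grid with spacing
    h; assumes periodic boundaries. Recovers z up to a constant."""
    rows, cols = p.shape
    wx = 2 * np.pi * np.fft.fftfreq(cols, d=h)   # angular frequencies in x
    wy = 2 * np.pi * np.fft.fftfreq(rows, d=h)   # angular frequencies in y
    WX, WY = np.meshgrid(wx, wy)
    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0            # avoid 0/0 at the DC term
    Z = (-1j * WX * P - 1j * WY * Q) / denom
    Z[0, 0] = 0.0                # the mean height is unrecoverable
    return np.real(np.fft.ifft2(Z))
```

Each Fourier coefficient of z is the least-squares combination of the two measured derivative coefficients, which is why noisy, slightly inconsistent gradient fields still integrate to a well-defined surface.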
Metric and geometric quasiconformality in Ahlfors regular Loewner spaces
NASA Astrophysics Data System (ADS)
Tyson, Jeremy T.
Recent developments in geometry have highlighted the need for abstract formulations of the classical theory of quasiconformal mappings. We modify Pansu's generalized modulus to study quasiconformal geometry in spaces with metric and measure-theoretic properties sufficiently similar to Euclidean space. Our basic objects of study are locally compact metric spaces equipped with a Borel measure which is Ahlfors-David regular of dimension Q>1 , and satisfies the Loewner condition of Heinonen-Koskela. For homeomorphisms between open sets in two such spaces, we prove the equivalence of three conditions: a version of metric quasiconformality, local quasisymmetry and geometric quasiconformality. We derive from these results several corollaries. First, we show that the Loewner condition is a quasisymmetric invariant in locally compact Ahlfors regular spaces. Next, we show that a proper Q -regular Loewner space, Q>1 , is not quasiconformally equivalent to any subdomain. (In the Euclidean case, this result is due to Loewner.) Finally, we characterize products of snowflake curves up to quasisymmetric/bi-Lipschitz equivalence: two such products are bi-Lipschitz equivalent if and only if they are isometric and are quasisymmetrically equivalent if and only if they are conformally equivalent.
Isotropic model for cluster growth on a regular lattice
NASA Astrophysics Data System (ADS)
Yates, Christian A.; Baker, Ruth E.
2013-08-01
There exists a plethora of mathematical models for cluster growth and/or aggregation on regular lattices. Almost all suffer from inherent anisotropy caused by the regular lattice upon which they are grown. We analyze the little-known model for stochastic cluster growth on a regular lattice first introduced by Ferreira Jr. and Alves [J. Stat. Mech.: Theory Exp. (2006) P11007], which produces circular clusters with no discernible anisotropy. We demonstrate that even in the noise-reduced limit the clusters remain circular. We adapt the model by introducing a specific rearrangement algorithm so that, rather than adding elements to the cluster from the outside (corresponding to apical growth), our model uses mitosis-like cell splitting events to increase the cluster size. We analyze the surface scaling properties of our model and compare it to the behavior of more traditional models. In “1+1” dimensions we discover and explore a new, nonmonotonic surface thickness scaling relationship which differs significantly from the Family-Vicsek scaling relationship. This suggests that, for models whose clusters do not grow through particle additions which are solely dependent on surface considerations, the traditional classification into “universality classes” may not be appropriate.
Path integral regularization of pure Yang-Mills theory
Jacquot, J. L.
2009-07-15
By enlarging the field content of pure Yang-Mills theory to a cutoff dependent matrix valued complex scalar field, we construct a vectorial operator, which is by definition invariant with respect to the gauge transformation of the Yang-Mills field and with respect to a Stueckelberg type gauge transformation of the scalar field. This invariant operator converges to the original Yang-Mills field as the cutoff goes to infinity. With the help of cutoff functions, we construct with this invariant a regularized action for the pure Yang-Mills theory. In order to define the kinetic terms of both the gauge and scalar fields, other invariant terms are added to the action. Since the scalar field's flat measure is invariant under the Stueckelberg type gauge transformation, we obtain a regularized gauge-invariant path integral for pure Yang-Mills theory that is mathematically well defined. Moreover, the regularized Ward-Takahashi identities describing the dynamics of the gauge fields are exactly the same as the formal Ward-Takahashi identities of the unregularized theory.
Representation of Maximally Regular Textures in Human Visual Cortex.
Kohler, Peter J; Clarke, Alasdair; Yakovleva, Alexandra; Liu, Yanxi; Norcia, Anthony M
2016-01-20
Naturalistic textures with an intermediate degree of statistical regularity can capture key structural features of natural images (Freeman and Simoncelli, 2011). V2 and later visual areas are sensitive to these features, while primary visual cortex is not (Freeman et al., 2013). Here we expand on this work by investigating a class of textures that have maximal formal regularity, the 17 crystallographic wallpaper groups (Fedorov, 1891). We used texture stimuli from four of the groups that differ in the maximum order of rotation symmetry they contain, and measured neural responses in human participants using functional MRI and high-density EEG. We found that cortical area V3 has a parametric representation of the rotation symmetries in the textures that is not present in either V1 or V2, the first discovery of a stimulus property that differentiates processing in V3 from that of lower-level areas. Parametric responses were also seen in higher-order ventral stream areas V4, VO1, and lateral occipital complex (LOC), but not in dorsal stream areas. The parametric response pattern was replicated in the EEG data, and source localization indicated that responses in V3 and V4 lead responses in LOC, which is consistent with a feedforward mechanism. Finally, we presented our stimuli to four well-developed feedforward models and found that none of them were able to account for our results. Our results highlight structural regularity as an important stimulus dimension for distinguishing the early stages of visual processing, and suggest a previously unrecognized role for V3 in the visual form-processing hierarchy. Significance statement: Hierarchical processing is a fundamental organizing principle in visual neuroscience, with each successive processing stage being sensitive to increasingly complex stimulus properties. Here, we probe the encoding hierarchy in human visual cortex using a class of visual textures--wallpaper patterns--that are maximally regular. Through a
Tomographic inversion using L1-regularization of Wavelet Coefficients
NASA Astrophysics Data System (ADS)
Loris, I.; Nolet, G.; Daubechies, I.; Dahlen, T.
2006-12-01
Like most geophysical inverse problems, the inverse problem in seismic tomography is underdetermined, or at best offers a mix of over- and underdetermined parameters. One usually regularizes the inverse problem by minimizing the norm of the model (∥m∥) or its roughness (∥∇m∥ or ∥∇²m∥) to obtain a solution that is void of unwarranted structural detail. The notion that we seek the 'simplest' model that is in agreement with a given data set is intuitively equivalent to the notion that the model should be describable with a small number of parameters. But clearly, limiting the model to a few Fourier coefficients, or large-scale blocks, does not necessarily lead to a geophysically plausible solution. We investigate if a wavelet basis can serve as a basis with enough flexibility to represent the class of models we seek. We propose a regularization method based on the assumption that the model m is sparse in a wavelet basis, meaning that it can be faithfully represented by a small number of nonzero wavelet coefficients. This allows for models that vary smoothly without sharp boundaries being sacrificed to a smoothing operator used to regularize the inversion. To regularize the inversion, we minimize I = ∥d − Am∥₂² + 2τ∥w∥₁, where w is the vector of wavelet coefficients (w = Wm), τ the damping parameter, d − Am the vector of data residuals, and ∥·∥₁ and ∥·∥₂ the ℓ1 and ℓ2 norms, respectively. The system is solved using the thresholded Landweber iteration w(n+1) = Sτ[WAᵀd + (I − WAᵀAWᵀ)w(n)], where Sτ is the soft-thresholding operator (Sτ(x) = 0 for |x| ≤ τ and Sτ(x) = x − sgn(x)τ otherwise). In synthetic tests on a 2D tomographic model we show that minimizing the ℓ1 norm of a wavelet decomposition of the model leads to tomographic images that are parsimonious in the sense that they represent both smooth and sharp features well without introducing significant blurring or artifacts. The ℓ1 norm performs significantly better than an ℓ2 regularization on either the model or its wavelet
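The thresholded Landweber iteration described above can be sketched in a few lines (a minimal NumPy illustration, not the authors' code; for brevity the wavelet transform W is taken to be the identity, i.e. the model itself is assumed sparse, and the toy matrix and sparsity pattern are invented for the demonstration):

```python
import numpy as np

def soft_threshold(x, tau):
    # S_tau(x): zero on [-tau, tau], shrink toward zero by tau outside
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def thresholded_landweber(A, d, tau, n_iter=500):
    # Minimize ||d - A w||_2^2 + 2*tau*||w||_1 via the iteration
    # w <- S_tau( w + A^T (d - A w) ), valid when the spectral norm of A <= 1.
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        w = soft_threshold(w + A.T @ (d - A @ w), tau)
    return w

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 100))
A /= np.linalg.norm(A, 2)            # rescale so ||A|| <= 1
w_true = np.zeros(100)
w_true[[5, 40, 77]] = [2.0, -1.5, 3.0]  # sparse "wavelet coefficients"
d = A @ w_true                          # noise-free synthetic data
w_hat = thresholded_landweber(A, d, tau=0.01)
```

With a small τ and clean data, the three nonzero coefficients dominate the recovered vector, illustrating how the ℓ1 penalty drives all unneeded coefficients exactly to zero.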
Regular biorthogonal pairs and pseudo-bosonic operators
NASA Astrophysics Data System (ADS)
Inoue, H.; Takakura, M.
2016-08-01
The first purpose of this paper is to show a method of constructing a regular biorthogonal pair based on the commutation rule ab − ba = I for a pair of operators a and b acting on a Hilbert space H with inner product (·|·). Here, sequences {ϕn} and {ψn} in a Hilbert space H are biorthogonal if (ϕn|ψm) = δnm, n, m = 0, 1, …, and they are regular if both Dϕ ≡ Span{ϕn} and Dψ ≡ Span{ψn} are dense in H. Indeed, the assumptions needed to construct the regular biorthogonal pair coincide with the definition of pseudo-bosons as originally given in F. Bagarello ["Pseudobosons, Riesz bases, and coherent states," J. Math. Phys. 51, 023531 (2010)]. Furthermore, we study the connections between the pseudo-bosonic operators a, b, a†, b† and the pseudo-bosonic operators defined by a regular biorthogonal pair ({ϕn}, {ψn}) and an ONB e of H, in the sense of H. Inoue ["General theory of regular biorthogonal pairs and its physical applications," e-print arXiv:1604.01967 [math-ph]]. The second purpose is to define and study the notion of D-pseudo-bosons of F. Bagarello ["More mathematics for pseudo-bosons," J. Math. Phys. 54, 063512 (2013)] and F. Bagarello ["From self-adjoint to non self-adjoint harmonic oscillators: Physical consequences and mathematical pitfalls," Phys. Rev. A 88, 032120 (2013)], and to give a stepwise method of constructing D-pseudo-bosons. It is then shown that for any ONB e = {en} in H and any operators T and T⁻¹ in L†(D), we may construct operators A and B satisfying the definition of D-pseudo-bosons, where D is a dense subspace in a Hilbert space H and L†(D) is the set of all linear operators T from D to D such that T*D ⊂ D, where T* is the adjoint of T. Finally, we give some physical examples of D-pseudo-bosons based on standard bosons, obtained by the method of construction stated above.
Applying molecular immunohaematology to regularly transfused thalassaemic patients in Thailand
Rujirojindakul, Pairaya; Flegel, Willy A.
2014-01-01
Background Red blood cell transfusion is the principal therapy in patients with severe thalassaemias and haemoglobinopathies, which are prevalent in Thailand. Serological red blood cell typing is confounded by chronic transfusion, because of circulating donor red blood cells. We evaluated the concordance of serological phenotypes between a routine and a reference laboratory and with red cell genotyping. Materials and methods Ten consecutive Thai patients with β-thalassaemia major who received regular transfusions were enrolled in Thailand. Phenotypes were tested serologically at Songklanagarind Hospital and at the National Institutes of Health. Red blood cell genotyping was performed with commercially available kits and a platform. Results In only three patients was the red cell genotyping concordant with the serological phenotypes for five antithetical antigen pairs in four blood group systems at the two institutions. At the National Institutes of Health, 32 of the 100 serological tests yielded invalid or discrepant results. The positive predictive value of serology did not reach 1 for any blood group system at either of the two institutions in this set of ten patients. Discussion Within this small study, numerous discrepancies were observed between serological phenotypes at the two institutes; red cell genotyping enabled determination of the blood group when serology failed due to transfused red blood cells. We question the utility of serological tests in regularly transfused paediatric patients and propose relying solely on red cell genotyping, which requires training for laboratory personnel and physicians. Red cell genotyping outperformed red cell serology by an order of magnitude in regularly transfused patients. PMID:24120606
Quantification of fetal heart rate regularity using symbolic dynamics
NASA Astrophysics Data System (ADS)
van Leeuwen, P.; Cysarz, D.; Lange, S.; Geue, D.; Groenemeyer, D.
2007-03-01
Fetal heart rate complexity was examined on the basis of RR interval time series obtained in the second and third trimester of pregnancy. In each fetal RR interval time series, short-term beat-to-beat heart rate changes were coded in 8-bit binary sequences. Redundancies of the 2⁸ = 256 different binary patterns were reduced by two different procedures. The complexity of these sequences was quantified using the approximate entropy (ApEn), resulting in discrete ApEn values which were used for classifying the sequences into 17 pattern sets. Also, the sequences were grouped into 20 pattern classes with respect to identity after rotation or inversion of the binary value. There was a specific, nonuniform distribution of the sequences in the pattern sets and this differed from the distribution found in surrogate data. In the course of gestation, the number of sequences increased in seven pattern sets, decreased in four and remained unchanged in six. Sequences that occurred less often over time, both regular and irregular, were characterized by patterns reflecting frequent beat-to-beat reversals in heart rate. They were also predominant in the surrogate data, suggesting that these patterns are associated with stochastic heart beat trains. Sequences that occurred more frequently over time were relatively rare in the surrogate data. Some of these sequences had a high degree of regularity and corresponded to prolonged heart rate accelerations or decelerations which may be associated with directed fetal activity or movement or baroreflex activity. Application of the pattern classes revealed that those sequences with a high degree of irregularity correspond to heart rate patterns resulting from complex physiological activity such as fetal breathing movements. The results suggest that the development of the autonomic nervous system and the emergence of fetal behavioral states lead to increases in not only irregular but also regular heart rate patterns. Using symbolic dynamics to
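The symbolic coding step described above can be illustrated as follows (a sketch under assumed conventions: each beat-to-beat change is coded 1 for an RR increase and 0 otherwise, and overlapping 8-bit words are tallied; the paper's exact coding, redundancy-reduction, and classification procedures are not reproduced):

```python
import numpy as np
from collections import Counter

def binary_words(rr, width=8):
    # Code each beat-to-beat change as 1 (RR increases) or 0 (otherwise),
    # then collect overlapping 8-symbol words as integers 0..255.
    bits = (np.diff(rr) > 0).astype(int)
    words = [int("".join(map(str, bits[i:i + width])), 2)
             for i in range(len(bits) - width + 1)]
    return Counter(words)

rng = np.random.default_rng(1)
rr = 400 + np.cumsum(rng.standard_normal(300))  # surrogate RR series (ms)
counts = binary_words(rr)
```

The resulting word-frequency distribution is the kind of object whose nonuniformity, relative to surrogate data, carries the physiological signal discussed in the abstract.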
Auditory feedback in error-based learning of motor regularity.
van Vugt, Floris T; Tillmann, Barbara
2015-05-01
Music and speech are skills that require high temporal precision of motor output. A key question is how humans achieve this timing precision given the poor temporal resolution of somatosensory feedback, which is classically considered to drive motor learning. We hypothesise that auditory feedback critically contributes to the learning of timing and that, similarly to visuo-spatial learning models, learning proceeds by correcting a proportion of perceived timing errors. Thirty-six participants learned to tap a sequence regularly in time. For participants in the synchronous-sound group, a tone was presented simultaneously with every keystroke. For the jittered-sound group, the tone was presented after a random delay of 10-190 ms following the keystroke, thus degrading the temporal information that the sound provided about the movement. For the mute group, no keystroke-triggered sound was presented. In line with the model predictions, participants in the synchronous-sound group were able to improve tapping regularity, whereas the jittered-sound and mute groups were not. The improved tapping regularity of the synchronous-sound group also transferred to a novel sequence and was maintained when sound was subsequently removed. The present findings provide evidence that humans engage in auditory feedback error-based learning to improve movement quality (here, reduced variability in sequence tapping). We thus elucidate the mechanism by which high temporal precision of movement can be achieved through sound in a way that may not be possible with less temporally precise somatosensory modalities. Furthermore, the finding that sound-supported learning generalises to novel sequences suggests potential rehabilitation applications. PMID:25721795
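The hypothesised error-correction mechanism can be illustrated with a toy simulation (an assumption-laden sketch, not the authors' model: the timekeeper drift, noise magnitudes, and the correction gain alpha are all invented parameters). An internal interval estimate drifts over taps; informative auditory feedback lets the tapper correct a proportion alpha of each perceived interval error, which keeps produced intervals near the target.

```python
import numpy as np

def simulate_tapping(alpha, n_taps=2000, target=500.0,
                     motor_sd=10.0, drift_sd=5.0, seed=0):
    # Internal interval estimate drifts as a random walk (timekeeper noise);
    # after each tap, a proportion alpha of the perceived interval error
    # is corrected, as in visuo-spatial error-based learning models.
    rng = np.random.default_rng(seed)
    planned = target
    intervals = np.empty(n_taps)
    for i in range(n_taps):
        produced = planned + rng.normal(0.0, motor_sd)
        intervals[i] = produced
        planned += rng.normal(0.0, drift_sd)      # timekeeper drift
        planned -= alpha * (produced - target)    # feedback correction
    return intervals.std()

sd_feedback = simulate_tapping(alpha=0.5)     # informative synchronous sound
sd_no_feedback = simulate_tapping(alpha=0.0)  # no usable feedback (mute/jittered)
```

With alpha = 0 the planned interval random-walks away from the target and tapping variability grows; with alpha > 0 it stays bounded, mirroring the synchronous-sound advantage.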
Characterizing the functional MRI response using Tikhonov regularization.
Vakorin, Vasily A; Borowsky, Ron; Sarty, Gordon E
2007-09-20
The problem of evaluating an averaged functional magnetic resonance imaging (fMRI) response for repeated block design experiments was considered within a semiparametric regression model with autocorrelated residuals. We applied functional data analysis (FDA) techniques that use a least-squares fitting of B-spline expansions with Tikhonov regularization. To deal with the noise autocorrelation, we proposed a regularization parameter selection method based on the idea of combining temporal smoothing with residual whitening. A criterion based on a generalized χ²-test of the residuals for white noise was compared with a generalized cross-validation scheme. We evaluated and compared the performance of the two criteria, based on their effect on the quality of the fMRI response. We found that the regularization parameter can be tuned to improve the noise autocorrelation structure, but the whitening criterion provides too much smoothing when compared with the cross-validation criterion. The ultimate goal of the proposed smoothing techniques is to facilitate the extraction of temporal features in the hemodynamic response for further analysis. In particular, these FDA methods allow us to compute derivatives and integrals of the fMRI signal so that fMRI data may be correlated with behavioral and physiological models. For example, positive and negative hemodynamic responses may be easily and robustly identified on the basis of the first derivative at an early time point in the response. Ultimately, these methods allow us to verify previously reported correlations between the hemodynamic response and the behavioral measures of accuracy and reaction time, showing the potential to recover new information from fMRI data. PMID:17634970
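The Tikhonov-regularized fit can be sketched with a discrete smoother (a simplification: a second-difference penalty stands in for the paper's B-spline expansion, and the regularization parameter is fixed by hand rather than selected by residual whitening or cross-validation; the test signal is invented):

```python
import numpy as np

def tikhonov_smooth(y, lam):
    # Penalized least squares: minimize ||y - f||^2 + lam * ||D2 f||^2,
    # where D2 is the second-difference operator (a Whittaker smoother).
    n = len(y)
    D2 = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * t)           # toy "hemodynamic" waveform
rng = np.random.default_rng(2)
y = signal + rng.normal(0.0, 0.3, t.size)
f = tikhonov_smooth(y, lam=100.0)
```

Because the fitted curve is an explicit smooth function of the data, its derivatives and integrals can be computed stably, which is the property the abstract exploits for feature extraction.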
The statistical difference between bending arcs and regular polar arcs
NASA Astrophysics Data System (ADS)
Kullen, A.; Fear, R. C.; Milan, S. E.; Carter, J. A.; Karlsson, T.
2015-12-01
In this work, the Polar UVI data set by Kullen et al. (2002) of 74 polar arcs is reinvestigated, focusing on bending arcs. Bending arcs are typically faint and form (depending on interplanetary magnetic field (IMF) By direction) on the dawnside or duskside oval with the tip of the arc splitting off the dayside oval. The tip subsequently moves into the polar cap in the antisunward direction, while the arc's nightside end remains attached to the oval, eventually becoming hook-shaped. Our investigation shows that bending arcs appear on the opposite oval side from and farther sunward than most regular polar arcs. They form during By-dominated IMF conditions: typically, the IMF clock angle increases from 60 to 90° about 20 min before the arc forms. Antisunward plasma flows from the oval into the polar cap just poleward of bending arcs are seen in Super Dual Auroral Radar Network data, indicating dayside reconnection. For regular polar arcs, recently reported characteristics are confirmed in contrast to bending arcs. This includes plasma flows along the nightside oval that originate close to the initial arc location and a significant delay in the correlation between IMF By and initial arc location. In our data set, the highest correlations are found with IMF By appearing at least 1-2 h before arc formation. In summary, bending arcs are distinctly different from regular arcs and cannot be explained by existing polar arc models. Instead, these results are consistent with the formation mechanism described in Carter et al. (2015), suggesting that bending arcs are caused by dayside reconnection.
Infants use temporal regularities to chunk objects in memory.
Kibbe, Melissa M; Feigenson, Lisa
2016-01-01
Infants, like adults, can maintain only a few items in working memory, but can overcome this limit by creating more efficient representations, or "chunks." Previous research shows that infants can form chunks using shared features or spatial proximity between objects. Here we asked whether infants also can create chunked representations using regularities that unfold over time. Thirteen-month-old infants first were familiarized with four objects of different shapes and colors, presented in successive pairs. For some infants, the identities of objects in each pair varied randomly across familiarization (Experiment 1). For others, the objects within a pair always co-occurred, either in consistent relative spatial positions (Experiment 2a) or varying spatial positions (Experiment 2b). Following familiarization, infants saw all four objects hidden behind a screen and then saw the screen lifted to reveal either four objects or only three. Infants in Experiment 1, who had been familiarized with random object pairings, failed to look longer at the unexpected 3-object outcome; they showed the same inability to concurrently represent four objects as in other studies of infant working memory. In contrast, infants in Experiments 2a and 2b, who had been familiarized with regularly co-occurring pairs, looked longer at the unexpected outcome. These infants apparently used the co-occurrence between individual objects during familiarization to form chunked representations that were later deployed to track the objects as they were hidden at test. In Experiment 3, we confirmed that the familiarization affected infants' ability to remember the occluded objects rather than merely establishing longer-term memory for object pairs. Following familiarization to consistent pairs, infants who were not shown a hiding event (but merely saw the same test outcomes as in Experiments 2a and b) showed no preference for arrays of three versus four objects. Finally, in Experiments 4 and 5, we asked
On the low regularity of the Benney-Lin equation
NASA Astrophysics Data System (ADS)
Chen, Wengu; Li, Junfeng
2008-03-01
We consider the low regularity of the Benney-Lin equation u_t + u u_x + u_xxx + β(u_xx + u_xxxx) + η u_xxxxx = 0. We establish global well-posedness for the initial value problem of the Benney-Lin equation in the Sobolev spaces H^s for 0 ≥ s > −2, improving the well-posedness result of Biagioni and Linares [H.A. Biagioni, F. Linares, On the Benney-Lin and Kawahara equations, J. Math. Anal. Appl. 211 (1997) 131-152]. For s < −2 we also prove some ill-posedness issues.
Path Integrals, BRST Identities, and Regularization Schemes in Nonstandard Gauges
NASA Astrophysics Data System (ADS)
Ren, Hai-cang
2000-07-01
The path integral of a gauge theory is studied in Coulomb-like gauges. The Christ-Lee terms of operator ordering are reproduced within the path integration framework. In the presence of fermions, a new operator term, in addition to that of Christ and Lee, is discovered. Such terms are found to be instrumental in restoring the invariance of the effective Lagrangian under a field-dependent gauge transformation, which underlies the BRST symmetry. A unitary regularization scheme which maintains manifest BRST symmetry and is free from energy divergences is proposed for a nonabelian gauge field.
Effective regularized algebraic reconstruction technique for computed tomography
Prun, V. E.; Nikolaev, D. P.; Buzmakov, A. V.; Chukalina, M. V.; Asadchikov, V. E.
2013-12-15
A new fast version of the reconstruction algorithm for computed tomography based on the simultaneous algebraic reconstruction technique (SART) is proposed. The algorithm iteration is asymptotically accelerated using the fast Hough transform, from O(n³) to O(n² log n). As in the regularized algebraic reconstruction technique (RegART), which we proposed previously, the regularization operator is applied after each iteration; a bilateral filter plays the role of this operator. The behavior of the algorithm is investigated in a model experiment.
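The iterate-then-regularize structure can be sketched as follows (a toy stand-in, not the authors' implementation: there is no fast Hough transform here, a plain moving average replaces the bilateral filter, and the projection matrix and phantom are invented):

```python
import numpy as np

def regularized_sart(A, d, n_iter=30, lam=0.5):
    # One SART sweep: m += lam * A^T((d - A m)/rowsum) / colsum, followed
    # by a smoothing step applied after each iteration (the moving average
    # stands in for the bilateral filter of the RegART-style scheme).
    m = np.zeros(A.shape[1])
    rowsum = np.abs(A).sum(axis=1)
    colsum = np.abs(A).sum(axis=0)
    for _ in range(n_iter):
        m += lam * (A.T @ ((d - A @ m) / rowsum)) / colsum
        m = np.convolve(m, np.ones(3) / 3.0, mode="same")  # regularization
    return m

rng = np.random.default_rng(3)
A = rng.random((80, 50))                               # toy projection matrix
m_true = np.exp(-((np.arange(50) - 25) / 8.0) ** 2)    # smooth 1D phantom
d = A @ m_true
m_rec = regularized_sart(A, d)
residual = np.linalg.norm(d - A @ m_rec) / np.linalg.norm(d)
```

Interleaving the smoothing operator with the SART sweeps is what distinguishes this family of methods from post-hoc filtering of a finished reconstruction.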
Some possible regularities in the missing mass problem
Bahcall, J.N.; Casertano, S.
1985-06-01
In the present consideration of the missing mass problem, equations are developed which express relatively well determined parameters characterizing the internal properties of all the mass models invoked in a sampling of spiral galaxy cases. The numerical values thus obtained constitute benchmarks, relative to which theories of spiral galaxy formation can be tested. Simple regularities are noted to be exhibited by the unseen matter in the spiral galaxy sample. Attention is given to the possibility that both the visible and invisible matter are baryonic. 23 references.
The Behavior of Regular Satellites During the Planetary Migration
NASA Astrophysics Data System (ADS)
Nogueira, Erica Cristina; Gomes, R. S.; Brasser, R.
2013-05-01
The behavior of the regular satellites of the giant planets during the instability phase of the Nice model needs to be better understood. In order to explain this behavior, we used numerical simulations to investigate the evolution of the regular satellite systems of the ice giants when these two planets experienced encounters with the gas giants. For the initial conditions we placed an ice planet in between Jupiter and Saturn, according to the evolution of Nice model simulations in a ‘jumping Jupiter’ scenario (Brasser et al. 2009). We used the MERCURY integrator (Chambers 1999) and cloned simulations by slightly modifying the changeover parameter of the Hybrid integrator. We obtained 101 successful runs which kept all planets, of which 24 were jumping-Jupiter cases. Subsequently we performed additional numerical integrations in which the ice giant that encountered a gas giant was started on the same orbit but with its regular satellites included. This is done as follows: for each of the 101 basic runs, we save the orbital elements of all objects in the integration at all close-encounter events. Then we performed a backward integration to start the system 100 years before the encounter and re-enacted the forward integration with the regular satellites around the ice giant. These integrations ran for 1000 years. The final orbital elements of the satellites with respect to the ice planet were used to restart the integration for the next planetary encounter (if any). If we assume that Uranus is the ice planet that had encounters with a gas giant, we considered the satellites Miranda, Ariel, Umbriel, Titania and Oberon with their present orbits around the planet. For Neptune we introduced Triton on an orbit with a semi-major axis 15% larger than the actual one, to account for the tidal decay from the LHB to the present time. We also assume that Triton was captured through binary disruption (Agnor and Hamilton 2006, Nogueira et al. 2011) and
Computing the Casimir force using regularized boundary integral equations
NASA Astrophysics Data System (ADS)
Kilen, Isak; Jakobsen, Per Kristen
2014-11-01
In this paper we use a novel regularization procedure to reduce the calculation of the Casimir force for 2D scalar fields between compact objects to the solution of a classical integral equation defined on the boundaries of the objects. The scalar fields are subject to Dirichlet boundary conditions on the object boundaries. We test the integral equation by comparing with what we get for parallel plates, concentric circles and adjacent circles using mode summation and the functional integral method. We show how symmetries in the shapes and configuration of boundaries can easily be incorporated into our method and that it leads to fast evaluation of the Casimir force for symmetric situations.
An inverse method with regularity condition for transonic airfoil design
NASA Technical Reports Server (NTRS)
Zhu, Ziqiang; Xia, Zhixun; Wu, Liyi
1991-01-01
It is known from Lighthill's exact solution of the incompressible inverse problem that in the inverse design problem, the surface pressure distribution and the free stream speed cannot both be prescribed independently. This implies the existence of a constraint on the prescribed pressure distribution. The same constraint exists at compressible speeds. Presented here is an inverse design method for transonic airfoils. In this method, the target pressure distribution contains a free parameter that is adjusted during the computation to satisfy the regularity condition. Some design results are presented in order to demonstrate the capabilities of the method.
Adaptive regularized scheme for remote sensing image fusion
NASA Astrophysics Data System (ADS)
Tang, Sizhang; Shen, Chaomin; Zhang, Guixu
2016-06-01
We propose an adaptive regularized algorithm for remote sensing image fusion based on variational methods. In the algorithm, we integrate the inputs using a "grey world" assumption to achieve visual uniformity. We propose a fusion operator that can automatically select the total variation (TV)-L1 term for edges and L2-terms for non-edges. To implement our algorithm, we use the steepest descent method to solve the corresponding Euler-Lagrange equation. Experimental results show that the proposed algorithm achieves remarkable results.
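The edge-adaptive selection between a TV-like term and plain L2 smoothing can be caricatured with a simple gradient-descent scheme (a toy stand-in for the paper's variational algorithm, not the authors' method: the gradient threshold, damping weight, and test image are all assumptions):

```python
import numpy as np

def adaptive_smooth(u, n_iter=50, dt=0.1, edge_thresh=0.2):
    # Edge-aware descent: where the local gradient magnitude is large
    # (an edge), the update is damped (edge-preserving, TV-like behavior);
    # in flat regions a plain Laplacian (L2) smoothing step is applied.
    u = u.copy()
    for _ in range(n_iter):
        gx, gy = np.gradient(u)
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        weight = np.where(np.hypot(gx, gy) > edge_thresh, 0.1, 1.0)
        u += dt * weight * lap
    return u

rng = np.random.default_rng(4)
clean = np.zeros((40, 60))
clean[:, 30:] = 1.0                                  # step edge
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
smoothed = adaptive_smooth(noisy)
```

Noise in the flat regions is diffused away while the step edge, where the damped update applies, retains most of its contrast, which is the qualitative behavior an adaptive TV/L2 scheme targets.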
Symbol calculus and zeta-function regularized determinants
Kaynak, Burak Tevfik; Turgut, O. Teoman
2007-11-15
In this work, we use the semigroup integral to evaluate zeta-function regularized determinants. This is especially powerful for nonpositive operators such as the Dirac operator. In order to understand fully the quantum effective action, one should know not only the potential term but also the leading kinetic term. For this purpose, we use the Weyl-type symbol calculus to evaluate the determinant as a derivative expansion. The technique is applied both to a spin-0 bosonic operator and to the Dirac operator coupled to a scalar field.
[Study on determination of Chinese medicine flavor and its regularity].
Zhang, Zhuo; Zhang, Wei
2014-02-01
The five flavors are a basic property of Chinese medicine, but the labeling of Chinese medicine flavors has long been chaotic. The Song, Jin and Yuan dynasties were a transformation stage in the labeling of Chinese medicine flavors. In this article the authors argue that the determination of Chinese medicine flavor shifted from direct tasting in the early and middle Northern Song dynasty to categorical and functional analogizing in the late Northern Song dynasty, the latter method flourishing through the Southern Song, Jin and Yuan dynasties. This regularity provides a reference for standardizing Chinese medicine flavors. PMID:24946566
Regularized image reconstruction for continuously self-imaging gratings.
Horisaki, Ryoichi; Piponnier, Martin; Druart, Guillaume; Guérineau, Nicolas; Primot, Jérôme; Goudail, François; Taboury, Jean; Tanida, Jun
2013-06-01
In this paper, we demonstrate two image reconstruction schemes for continuously self-imaging gratings (CSIGs). CSIGs are diffractive optical elements that generate a depth-invariant propagation pattern and sample objects with a sparse spatial frequency spectrum. To compensate for the sparse sampling, we apply two methods with different regularizations for CSIG imaging. The first method employs continuity of the spatial frequency spectrum, and the second one uses sparsity of the intensity pattern. The two methods are demonstrated with simulations and experiments. PMID:23736336
Trigonometric Padé approximants for functions with regularly decreasing Fourier coefficients
Labych, Yuliya A; Starovoitov, Alexander P
2009-08-31
Sufficient conditions describing the regular decrease of the coefficients of a Fourier series f(x) = a₀/2 + Σ aₖ cos kx are found which ensure that the trigonometric Padé approximants π^t_{n,m}(x; f) converge to the function f in the uniform norm at a rate which coincides asymptotically with the highest possible one. The results obtained are applied to problems dealing with finding sharp constants for rational approximations. Bibliography: 31 titles.
Ideality contours and thermodynamic regularities in supercritical molecular fluids
NASA Astrophysics Data System (ADS)
Desgranges, Caroline; Margo, Abigail; Delhommelle, Jerome
2016-08-01
Using Expanded Wang-Landau simulations, we calculate the ideality contours for 3 molecular fluids (SF6, CO2 and H2O). We analyze how the increase in polarity, and thus, in the strength of the intermolecular interactions, impacts the contours and thermodynamic regularities. This effect results in the increase in the Boyle and H parameters, that underlie the Zeno line and the curve of ideal enthalpy. Furthermore, a detailed analysis reveals that dipole-dipole interactions lead to much larger enthalpic contributions to the Gibbs free energy. This accounts for the much higher temperatures and pressures that are necessary for supercritical H2O to achieve ideal-like thermodynamic properties.
Dynamics of propagating front into sand ripples under regular waves.
Lebunetel-Levaslot, J; Jarno-Druaux, A; Ezersky, A B; Marin, F
2010-09-01
The results of an experimental study of pattern formation on sandy bottom under the action of regular harmonic surface waves are reported. It is found that two modes of pattern formation occur: sand ripples form uniformly on the whole bottom or from localized nucleation sites. In the second regime, the ripples appear in isolated regions (patches) increasing in size, and front propagation speed is measured. A simple dynamical model based on the Ginzburg-Landau equation is proposed to explain the characteristics of patches. PMID:21230122
1. PLAN OF MOXHAM, JOHNSTOWN, PENNA. ALL REGULAR LOTS 40 ...
1. PLAN OF MOXHAM, JOHNSTOWN, PENNA. ALL REGULAR LOTS 40 FT BY 120 FT. TRACED FROM DRAWING 10742 (dated February 1, 1892). THE JOHNSON COMPANY, SCALE 1 INCH - 160 FT, SEPT. 19TH 1898. DRAWING NUMBER 29781. Original plan for the Town of Moxham drafted in 1887-88, company archives contain several revised blueprints of the original plan. This revision reflects the subdivision of the Von Lunch Grove into residential lots, but still indicates the 'Moxham Block' on which the original Moxham Estate was built in 1888-89. (Photograph of drawing held at the Johnstown Corporation General Office, Johnstown, PA) - Borough of Moxham, Johnstown, Cambria County, PA
Regular arrays of Al nanoparticles for plasmonic applications
Schade, Martin; Bohley, Christian; Sardana, Neha; Schilling, Jörg; Fuhrmann, Bodo; Schlenker, Sven; Leipner, Hartmut S.
2014-02-28
Optical properties of aluminium nanoparticles deposited on glass substrates are investigated. Laser interference lithography allows rapid fabrication of regular, highly periodic arrays of nanostructures with different sizes and spacings in order to investigate the shift of the surface plasmon resonance for, e.g., photovoltaic, plasmonic or photonic applications. Varying the diameter of cylindrical Al nanoparticles yields a nearly linear shift of the surface plasmon resonance between 400 nm and 950 nm that is independent of the polarization vector of the incident light. Furthermore, particles with square or elliptical base areas are presented, exhibiting more complex, polarization-dependent transmission spectra.
Singular and regular gap solitons between three dispersion curves.
Grimshaw, Roger; Malomed, Boris A; Gottwald, Georg A
2002-06-01
A general model is introduced to describe a wave-envelope system for the situation when the linear dispersion relation has three branches which, in the absence of any coupling terms between these branches, would intersect pairwise in three nearly coincident points. The system contains two waves with a strong linear coupling between them, to which a third wave is then coupled. This model has two gaps in its linear spectrum. As is typical for wave-envelope systems, the model also contains a set of cubic nonlinear terms. Realizations of this model can be made in terms of temporal or spatial evolution of optical fields in, respectively, either a planar waveguide or a bulk-layered medium resembling a photonic-crystal fiber, which carry a triple spatial Bragg grating. Another physical system described by the same general model is a set of three internal wave modes in a density-stratified fluid, whose phase speeds come into close coincidence for a certain wave number. A nonlinear analysis is performed for zero-velocity solitons, that is, solitons having zero velocity in the reference frame in which the third wave has zero group velocity. If one may disregard the self-phase modulation (SPM) term in the equation for the third wave, we find an analytical solution which shows that there simultaneously exist two different families of solitons: regular ones, which may be regarded as a smooth deformation of the usual gap solitons in a two-wave system, and cuspons, which have finite amplitude and energy, but a singularity in the first derivative at their center. Even in the limit when the linear coupling of the third wave to the first two nearly vanishes, the soliton family remains drastically different from that in the uncoupled system; in this limit, regular solitons whose amplitude exceeds a certain critical value are replaced by peakons (whose first derivative is finite at the center, but jumps in value). While the regular solitons, cuspons, and peakons are found in an exact
Characteristics of density currents over regular and irregular rough surfaces
NASA Astrophysics Data System (ADS)
Bhaganagar, K.
2013-12-01
Direct numerical simulation is used as a tool to understand the effect of surface roughness on the propagation of density currents. Simulations have been performed for lock-exchange flow with a gate separating the dense and the lighter fluid. As the lock is released, the dense fluid collapses under the lighter fluid on top, resulting in the formation of a horizontally evolving density current. The talk will focus on the fundamental differences between the propagation of density currents over regular and irregular rough surfaces. The flow statistics and flow structures are discussed. The results reveal that the spacing between the roughness elements is an important factor in classifying density currents. Empirical relations for the front velocity and location for dense and sparse roughness have been evaluated in terms of the roughness height, the spacing between elements, and the initial amount of lock fluid. (Figure: DNS results for a dense current flowing over (a) a smooth and (b) a rough bottom with egg-carton roughness elements in a regular configuration. In these simulations the lock-exchange box is located in the middle of the channel and has two gates, generating two dense currents, one moving to the right and one to the left side of the channel. Note how the dense-current interface presents smaller structures over a rough bottom (right).)
Explicit B-spline regularization in diffeomorphic image registration
Tustison, Nicholas J.; Avants, Brian B.
2013-01-01
Diffeomorphic mappings are central to image registration due largely to their topological properties and success in providing biologically plausible solutions to deformation and morphological estimation problems. Popular diffeomorphic image registration algorithms include those characterized by time-varying and constant velocity fields, and symmetrical considerations. Prior information in the form of regularization is used to enforce transform plausibility taking the form of physics-based constraints or through some approximation thereof, e.g., Gaussian smoothing of the vector fields [a la Thirion's Demons (Thirion, 1998)]. In the context of the original Demons' framework, the so-called directly manipulated free-form deformation (DMFFD) (Tustison et al., 2009) can be viewed as a smoothing alternative in which explicit regularization is achieved through fast B-spline approximation. This characterization can be used to provide B-spline “flavored” diffeomorphic image registration solutions with several advantages. Implementation is open source and available through the Insight Toolkit and our Advanced Normalization Tools (ANTs) repository. A thorough comparative evaluation with the well-known SyN algorithm (Avants et al., 2008), implemented within the same framework, and its B-spline analog is performed using open labeled brain data and open source evaluation tools. PMID:24409140
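As a toy illustration of the "explicit regularization by smoothing" idea above, the sketch below smooths a 1D displacement field with a repeated [1, 2, 1]/4 binomial kernel, whose iterates approximate convolution with a B-spline (and, in the limit, a Gaussian). This is a hedged stand-in for the fast B-spline approximation of DMFFD, not the actual ANTs/ITK implementation; the function name and parameters are assumptions.

```python
def smooth_displacement_field(field, passes=3):
    """Explicitly regularize a 1D displacement field by repeated
    [1, 2, 1]/4 binomial smoothing. Each pass averages every sample with
    its neighbors (edges replicate); iterating approximates a
    B-spline/Gaussian kernel, the spirit of DMFFD-style smoothing.
    """
    f = list(field)
    for _ in range(passes):
        prev = f[:]
        for i in range(len(f)):
            left = prev[i - 1] if i > 0 else prev[i]
            right = prev[i + 1] if i < len(f) - 1 else prev[i]
            f[i] = 0.25 * left + 0.5 * prev[i] + 0.25 * right
    return f
```

A constant field is left unchanged, while a spike spreads into a bell-shaped bump, which is exactly the oscillation-penalizing behavior registration regularizers rely on.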
FPGA-accelerated algorithm for the regular expression matching system
NASA Astrophysics Data System (ADS)
Russek, P.; Wiatr, K.
2015-01-01
This article describes an algorithm to support a regular expression matching system. The goal was to achieve a system with attractive performance and low energy consumption. The basic idea of the algorithm comes from the concept of the Bloom filter. It starts from the extraction of static sub-strings from the strings of the regular expressions. The algorithm is devised to gain from its decomposition into parts which are intended to be executed by custom hardware and by the central processing unit (CPU). A pipelined custom processor architecture is proposed and the software algorithm is explained accordingly. The software part of the algorithm was coded in C and runs on a processor from the ARM family. The hardware architecture was described in VHDL and implemented in a field programmable gate array (FPGA). The performance results and required resources of the above experiments are given. An example target application for the presented solution is computer and network security systems. The idea was tested on nearly 100,000 body-based viruses from the ClamAV virus database. The solution is intended for the emerging technology of clusters of low-energy computing nodes.
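The Bloom-filter prefilter idea above can be sketched in software: static substrings extracted from the regex set are hashed into a bit array (the part that maps naturally onto FPGA logic), and the full regex engine (the CPU part) runs only when the filter fires. This is a minimal illustrative sketch, not the paper's design; the class name, the crude literal-extraction rule, and all parameters are assumptions.

```python
import re

class BloomPrefilter:
    """Bloom-filter prefilter for a regex matching system (sketch).

    n-grams of static substrings from each pattern are hashed into a bit
    array; scanning only falls through to the slow, exact regex engine
    when some window of the input hits the filter. Python's salted str
    hash is consistent within one process, which is all a filter needs.
    """

    def __init__(self, patterns, gram=4, bits=1 << 16):
        self.gram, self.bits = gram, bits
        self.bitmap = bytearray(bits // 8)
        self.patterns = [re.compile(p) for p in patterns]
        for p in patterns:
            # crude static-substring extraction: longest literal run
            literal = max(re.findall(r"[A-Za-z0-9 ]+", p), key=len, default="")
            for i in range(len(literal) - gram + 1):
                self._set(literal[i:i + gram])

    def _hash(self, s):
        return hash(s) % self.bits

    def _set(self, s):
        h = self._hash(s)
        self.bitmap[h >> 3] |= 1 << (h & 7)

    def _test(self, s):
        h = self._hash(s)
        return self.bitmap[h >> 3] & (1 << (h & 7))

    def match(self, text):
        # fast path: does any n-gram of the input hit the filter?
        if any(self._test(text[i:i + self.gram])
               for i in range(len(text) - self.gram + 1)):
            # slow path: confirm with the exact regex engine
            return [p.pattern for p in self.patterns if p.search(text)]
        return []
```

False positives of the filter cost only a wasted slow-path check; false negatives cannot occur for the extracted literals, which is what makes the decomposition safe.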
Influence of a Regular, Standardized Meal on Clinical Chemistry Analytes
Salvagno, Gian Luca; Lippi, Giuseppe; Gelati, Matteo; Montagnana, Martina; Danese, Elisa; Picheth, Geraldo; Guidi, Gian Cesare
2012-01-01
Background Preanalytical variability, including biological variability and patient preparation, is an important source of variability in laboratory testing. In this study, we assessed whether a regular light meal might bias the results of routine clinical chemistry testing. Methods We studied 17 healthy volunteers who consumed light meals containing a standardized amount of carbohydrates, proteins, and lipids. We collected blood for routine clinical chemistry tests before the meal and 1, 2, and 4 hr thereafter. Results One hour after the meal, triglycerides (TG), albumin (ALB), uric acid (UA), alkaline phosphatase (ALP), Ca, Fe, and Na levels significantly increased, whereas blood urea nitrogen (BUN) and P levels decreased. TG, ALB, Ca, Na, P, and total protein (TP) levels varied significantly. Two hours after the meal, TG, ALB, Ca, Fe, and Na levels remained significantly elevated, whereas BUN, P, UA, and total bilirubin (BT) levels decreased. Clinically significant variations were recorded for TG, ALB, ALT, Ca, Fe, Na, P, BT, and direct bilirubin (BD) levels. Four hours after the meal, TG, ALB, Ca, Fe, Na, lactate dehydrogenase (LDH), P, Mg, and K levels significantly increased, whereas UA and BT levels decreased. Clinically significant variations were observed for TG, ALB, ALT, Ca, Na, Mg, K, C-reactive protein (CRP), AST, UA, and BT levels. Conclusions A significant variation in the clinical chemistry parameters after a regular meal shows that fasting time needs to be carefully considered when performing tests to prevent spurious results and reduce laboratory errors, especially in an emergency setting. PMID:22779065
Regularization approach for tomosynthesis X-ray inspection
NASA Astrophysics Data System (ADS)
Tigkos, Konstantinos; Hassler, Ulf; Holub, Wolfgang; Woerlein, Norbert; Rehak, Markus
2014-02-01
X-ray inspection is intended to be used as an escalation technique for the inspection of carbon fiber reinforced plastics (CFRP) in aerospace applications, especially in cases of unclear indications from ultrasonic or other NDT modalities. Due to their large dimensions, most aerospace components cannot be scanned by conventional computed tomography. In such cases, X-ray laminography may be applied, allowing a pseudo-3D slice-by-slice reconstruction of the sample with tomosynthesis. However, due to the limited-angle acquisition geometry, reconstruction artifacts arise, especially at surfaces parallel to the imaging plane. To regularize the tomosynthesis approach, we propose an additional prescan of the object to detect the outer sample surfaces. We recommend the use of contrasted markers which are temporarily attached to the sample surfaces. The depth position of the markers is then derived from that prescan. As long as the sample surface remains simple, few markers are required to fit the respective object surfaces. The knowledge about this surface may then be used to regularize the final tomosynthesis reconstruction, performed with markerless projections. Eventually, it can also serve as prior information for an ART reconstruction or to register a CAD model of the sample. The presented work is carried out within the European FP7 project QUICOM. We demonstrate the proposed approach within a simulation study applying an acquisition geometry suited for CFRP part inspection. A practical verification of the approach is planned later in the project.
Theory of volume transition in polyelectrolyte gels with charge regularization.
Hua, Jing; Mitra, Mithun K; Muthukumar, M
2012-04-01
We present a theory for polyelectrolyte gels that allows the effective charge of the polymer backbone to self-regulate. Using a variational approach, we obtain an expression for the free energy of gels that accounts for the gel elasticity, free energy of mixing, counterion adsorption, local dielectric constant, electrostatic interaction among polymer segments, electrolyte ion correlations, and self-consistent charge regularization on the polymer strands. This free energy is then minimized to predict the behavior of the system as characterized by the gel volume fraction as a function of external variables such as temperature and salt concentration. We present results for the volume transition of polyelectrolyte gels in salt-free solvents, solvents with monovalent salts, and solvents with divalent salts. The results of our theoretical analysis capture the essential features of existing experimental results and also provide predictions for further experimentation. Our analysis highlights the importance of the self-regularization of the effective charge for the volume transition of gels in particular, and for charged polymer systems in general. Our analysis also enables us to identify the dominant free energy contributions for charged polymer networks and provides a framework for further investigation of specific experimental systems. PMID:22482584
Regularizing the divergent structure of light-front currents
Bakker, Bernard L. G.; Choi, Ho-Meoyng; Ji, Chueng-Ryong
2001-04-01
The divergences appearing in the (3+1)-dimensional fermion-loop calculations are often regulated by smearing the vertices in a covariant manner. Performing a parallel light-front calculation, we corroborate the similarity between the vertex-smearing technique and the Pauli-Villars regularization. In the light-front calculation of the electromagnetic meson current, we find that the persistent end-point singularity that appears in the case of point vertices is removed even if the smeared vertex is taken to the limit of the point vertex. Recapitulating the current conservation, we substantiate the finiteness of both valence and nonvalence contributions in all components of the current with the regularized bound-state vertex. However, we stress that each contribution, valence or nonvalence, depends on the reference frame even though the sum is always frame independent. The numerical taxonomy of each contribution including the instantaneous contribution and the zero-mode contribution is presented in the π, K, and D-meson form factors.
Fully automatic perceptual modeling of near regular textures
NASA Astrophysics Data System (ADS)
Menegaz, G.; Franceschetti, A.; Mecocci, A.
2007-02-01
Near-regular textures feature a relatively high degree of regularity. They can be conveniently modeled by the combination of a suitable set of textons and a placement rule. The main issues in this respect are the selection of the minimum set of textons capturing the variability of the basic patterns; the identification and positioning of the generating lattice; and the modeling of the variability in both the texton structure and the deviation from periodicity of the lattice, which captures the naturalness of the considered texture. In this contribution, we provide a fully automatic solution to both the analysis and the synthesis issues, leading to the generation of texture samples that are perceptually indistinguishable from the original ones. The definition of an ad hoc periodicity index allows one to predict the suitability of the model for a given texture. The model is validated through psychovisual experiments providing the conditions for subjective equivalence between the original and synthetic textures, while allowing us to determine the minimum number of textons needed to meet such a requirement for a given texture class. This is of prime importance in model-based coding applications, such as the one we foresee, as it allows the amount of information to be transmitted to the receiver to be minimized.
Wavelet regularization of the 2D incompressible Euler equations
NASA Astrophysics Data System (ADS)
Nguyen van Yen, Romain; Farge, Marie; Schneider, Kai
2009-11-01
We examine the viscosity dependence of the solutions of the two-dimensional Navier-Stokes equations in periodic and wall-bounded domains, for Reynolds numbers varying from 10^3 to 10^7. We compare the Navier-Stokes solutions to those of the regularized two-dimensional Euler equations. The regularization is performed by applying at each time step the wavelet-based CVS filter (Farge et al., Phys. Fluids, 11, 1999), which splits turbulent fluctuations into coherent and incoherent contributions. We find that for Reynolds numbers of 10^5 and above the dissipation of coherent enstrophy tends to become independent of the Reynolds number, while the dissipation of total enstrophy decays to zero logarithmically with the Reynolds number. In the wall-bounded case, we observe an additional production of enstrophy at the wall. As a result, coherent enstrophy diverges when the Reynolds number tends to infinity, but its time derivative seems to remain bounded independently of the Reynolds number. This indicates that a balance may have been established between coherent enstrophy dissipation and coherent enstrophy production at the wall. The Reynolds number for which the dissipation of coherent enstrophy becomes independent of the Reynolds number is proposed to define the onset of the fully turbulent regime.
Neural signature of the conscious processing of auditory regularities
Bekinschtein, Tristan A.; Dehaene, Stanislas; Rohaut, Benjamin; Tadel, François; Cohen, Laurent; Naccache, Lionel
2009-01-01
Can conscious processing be inferred from neurophysiological measurements? Some models stipulate that the active maintenance of perceptual representations across time requires consciousness. Capitalizing on this assumption, we designed an auditory paradigm that evaluates cerebral responses to violations of temporal regularities that are either local in time or global across several seconds. Local violations led to an early response in auditory cortex, independent of attention or the presence of a concurrent visual task, whereas global violations led to a late and spatially distributed response that was only present when subjects were attentive and aware of the violations. We could detect the global effect in individual subjects using functional MRI and both scalp and intracerebral event-related potentials. Recordings from 8 noncommunicating patients with disorders of consciousness confirmed that only conscious individuals presented a global effect. Taken together these observations suggest that the presence of the global effect is a signature of conscious processing, although it can be absent in conscious subjects who are not aware of the global auditory regularities. This simple electrophysiological marker could thus serve as a useful clinical tool. PMID:19164526
Regularities and symmetries in atomic structure and spectra
NASA Astrophysics Data System (ADS)
Pain, Jean-Christophe
2013-09-01
The use of statistical methods for the description of complex quantum systems was primarily motivated by the failure of a line-by-line interpretation of atomic spectra. Such methods reveal regularities and trends in the distributions of levels and lines. In the past, much attention was paid to the distribution of energy levels (Wigner surmise, random-matrix model…). However, information about the distribution of the lines (energy and strength) is lacking. Thirty years ago, Learner found empirically an unexpected law: the logarithm of the number of lines whose intensities lie between 2^k I0 and 2^(k+1) I0, I0 being a reference intensity and k an integer, is a decreasing linear function of k. In the present work, the fractal nature of such an intriguing regularity is outlined and a calculation of its fractal dimension is proposed. Other peculiarities are also presented, such as the fact that the distribution of line strengths follows Benford's law of anomalous numbers, the existence of additional selection rules (PH coupling), the symmetry with respect to a quarter of the subshell in the spin-adapted space (LL coupling) and the odd-even staggering in the distribution of quantum numbers, pointed out by Bauche and Cossé.
Autocorrelation and regularization in digital images. I - Basic theory
NASA Technical Reports Server (NTRS)
Jupp, David L. B.; Strahler, Alan H.; Woodcock, Curtis E.
1988-01-01
Spatial structure occurs in remotely sensed images when the imaged scenes contain discrete objects that are identifiable in that their spectral properties are more homogeneous within than between them and other scene elements. The spatial structure introduced is manifest in statistical measures such as the autocovariance function and variogram associated with the scene, and it is possible to formulate these measures explicitly for scenes composed of simple objects of regular shapes. Digital images result from sensing scenes by an instrument with an associated point spread function (PSF). Since there is averaging over the PSF, the effect, termed regularization, induced in the image data by the instrument will influence the observable autocovariance and variogram functions of the image data. It is shown how the autocovariance or variogram of an image is a composition of the underlying scene covariance convolved with an overlap function, which is itself a convolution of the PSF. The functional form of this relationship provides an analytic basis for scene inference and eventual inversion of scene model parameters from image data.
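The variogram mentioned above is directly computable from data. The sketch below estimates the empirical semivariogram of a 1D transect, gamma(h) = 0.5 * mean((z[i+h] - z[i])^2); under the paper's framework, the instrument PSF ("regularization") would appear as a flattening of gamma at lags shorter than the PSF support. A simplified 1D sketch, not the paper's 2D scene-model formulation.

```python
def semivariogram(values, max_lag):
    """Empirical semivariogram of a 1D transect of pixel values:
        gamma(h) = 0.5 * mean((z[i+h] - z[i])**2)
    Returns {lag: gamma} for lags 1..max_lag.
    """
    gammas = {}
    for h in range(1, max_lag + 1):
        diffs = [(values[i + h] - values[i]) ** 2
                 for i in range(len(values) - h)]
        gammas[h] = 0.5 * sum(diffs) / len(diffs)
    return gammas
```

For a linear ramp the semivariogram grows quadratically with lag (gamma(h) = h^2 / 2), a handy sanity check before applying it to image transects.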
Color Image Restoration Using Nonlocal Mumford-Shah Regularizers
NASA Astrophysics Data System (ADS)
Jung, Miyoun; Bresson, Xavier; Chan, Tony F.; Vese, Luminita A.
We introduce several color image restoration algorithms based on the Mumford-Shah model and nonlocal image information. The standard Ambrosio-Tortorelli and Shah models are defined to work in a small local neighborhood, which is sufficient to denoise smooth regions with sharp boundaries. However, textures are not local in nature and require semi-local/non-local information to be denoised efficiently. Inspired by recent work (NL-means of Buades, Coll, Morel and NL-TV of Gilboa, Osher), we extend the standard Ambrosio-Tortorelli and Shah approximations to Mumford-Shah functionals to work with nonlocal information, for better restoration of fine structures and textures. We present several applications of the proposed nonlocal MS regularizers in image processing such as color image denoising, color image deblurring in the presence of Gaussian or impulse noise, color image inpainting, and color image super-resolution. In the formulation of nonlocal variational models for image deblurring with impulse noise, we propose an efficient preprocessing step for the computation of the weight function w. In all the applications, the proposed nonlocal regularizers produce superior results over the local ones, especially in image inpainting with large missing regions. Experimental results and comparisons between the proposed nonlocal methods and the local ones are shown.
Personalized microbial network inference via co-regularized spectral clustering.
Imangaliyev, Sultan; Keijser, Bart; Crielaard, Wim; Tsivtsivadze, Evgeni
2015-07-15
We use the Human Microbiome Project (HMP) cohort (Peterson et al., 2009) to infer personalized oral microbial networks of healthy individuals. To determine clusters of individuals with similar microbial profiles, a co-regularized spectral clustering algorithm is applied to the dataset. For each cluster we discovered, we compute co-occurrence relationships among the microbial species that determine the microbial network per cluster of individuals. The results of our study suggest that there are several differences in microbial interactions at the personalized network level in healthy oral samples acquired from various niches. Based on the results of co-regularized spectral clustering, we discover two groups of individuals with different topologies of their microbial interaction networks. The results of microbial network inference suggest that niche-wise interactions are different in these two groups. Our study shows that healthy individuals have different microbial clusters according to their oral microbiota. Such personalized microbial networks enable a better understanding of the microbial ecology of healthy oral cavities and new possibilities for future targeted medication. The scripts written in scientific Python and in Matlab, which were used for network visualization, are provided for download on the website http://learning-machines.com/. PMID:25842007
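The per-cluster co-occurrence step can be sketched simply: once individuals are grouped, link two species whose abundance profiles are strongly correlated across the individuals of that cluster. The sketch below uses a Pearson correlation threshold, which is an assumption of this illustration (the paper does not specify this exact measure here); function names and the threshold are hypothetical.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length abundance profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def cooccurrence_network(abundances, species, threshold=0.9):
    """Build a co-occurrence network for one cluster of individuals.

    abundances: one row per individual, one column per species.
    Returns edges (species_a, species_b, r) for strongly correlated pairs.
    """
    cols = list(zip(*abundances))  # abundance profile per species
    edges = []
    for i in range(len(species)):
        for j in range(i + 1, len(species)):
            r = pearson(cols[i], cols[j])
            if abs(r) >= threshold:
                edges.append((species[i], species[j], round(r, 3)))
    return edges
```

Running this separately on each cluster found by the (co-regularized) clustering step yields one network per group, which is the comparison the study performs.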
Nonlocal Mumford-Shah regularizers for color image restoration.
Jung, Miyoun; Bresson, Xavier; Chan, Tony F; Vese, Luminita A
2011-06-01
We propose here a class of restoration algorithms for color images, based upon the Mumford-Shah (MS) model and nonlocal image information. The Ambrosio-Tortorelli and Shah elliptic approximations are defined to work in a small local neighborhood, which is sufficient to denoise smooth regions with sharp boundaries. However, texture is nonlocal in nature and requires semilocal/non-local information for efficient image denoising and restoration. Inspired by recent works (nonlocal means of Buades, Coll, Morel, and nonlocal total variation of Gilboa, Osher), we extend the local Ambrosio-Tortorelli and Shah approximations to the MS functional to novel nonlocal formulations, for better restoration of fine structures and texture. We present several applications of the proposed nonlocal MS regularizers in image processing such as color image denoising, color image deblurring in the presence of Gaussian or impulse noise, color image inpainting, color image super-resolution, and color filter array demosaicing. In all the applications, the proposed nonlocal regularizers produce superior results over the local ones, especially in image inpainting with large missing regions. We also prove several characterizations of minimizers based upon dual norm formulations. PMID:21078579
Regular patterns in subglacial bedforms demonstrate emergent field behaviour
NASA Astrophysics Data System (ADS)
Clark, Chris; Ely, Jeremy; Spagnolo, Matteo; Hahn, Ute; Stokes, Chris; Hughes, Anna
2016-04-01
Somewhat counter-intuitively, ice-sheets abhor flat beds when flowing over soft sedimentary substrates. Instead, they produce an undulated surface, metres in relief and with length-scales of hundreds of metres. The resistive stresses that such bumps impart on ice flow affect the functioning of ice sheets by slowing ice transfer to lower elevations for melting and calving. The most abundant roughness elements are drumlins, streamlined in the direction of ice flow. Understanding their formation has eluded scientific explanation for almost two centuries, with the literature seeking mechanistic explanations for individual bumps. Here we analyse tens of thousands of drumlins and find that they possess a strong regularity in their spatial positioning, which requires interactions between drumlins during their formation. This demonstrates a pattern-forming behaviour that requires explanation at the scale of drumlinised landscapes, beyond that of individual drumlins. Such regularity is expected to arise from interdependence between ice flow, sediment flux and the shape of the bed, with drumlins representing a specific emergent property of these interactions. That bed roughness is found to organise itself into specific, predictable and patterned length-scales might assist the next generation of 'sliding laws' that incorporate ice-bed interactions, thereby improving modelling of ice-sheet flow.
Mesoscopic Higher Regularity and Subadditivity in Elliptic Homogenization
NASA Astrophysics Data System (ADS)
Armstrong, Scott; Kuusi, Tuomo; Mourrat, Jean-Christophe
2016-05-01
We introduce a new method for obtaining quantitative results in stochastic homogenization for linear elliptic equations in divergence form. Unlike previous works on the topic, our method does not use concentration inequalities (such as Poincaré or logarithmic Sobolev inequalities in the probability space) and relies instead on a higher (C^k, k ≥ 1) regularity theory for solutions of the heterogeneous equation, which is valid on length scales larger than a certain specified mesoscopic scale. This regularity theory, which is of independent interest, allows us to, in effect, localize the dependence of the solutions on the coefficients and thereby accelerate the rate of convergence of the expected energy of the cell problem by a bootstrap argument. The fluctuations of the energy are then tightly controlled using subadditivity. The convergence of the energy gives control of the scaling of the spatial averages of gradients and fluxes (that is, it quantifies the weak convergence of these quantities), which yields, by a new "multiscale" Poincaré inequality, quantitative estimates on the sublinearity of the corrector.
Regularized matched-mode processing for source localization.
Collison, N E; Dosso, S E
2000-06-01
This paper develops a new approach to matched-mode processing (MMP) for ocean acoustic source localization. MMP consists of decomposing far-field acoustic data measured at an array of sensors to obtain the excitations of the propagating modes, then matching these with modeled replica excitations computed for a grid of possible source locations. However, modal decomposition can be ill-posed and unstable if the sensor array does not provide an adequate spatial sampling of the acoustic field (i.e., the problem is underdetermined). For such cases, standard decomposition methods yield minimum-norm solutions that are biased towards zero. Although these methods provide a mathematical solution (i.e., a stable solution that fits the data), they may not represent the most physically meaningful solution. The new approach of regularized matched-mode processing (RMMP) carries out an independent modal decomposition prior to comparison with the replica excitations for each grid point, using the replica itself as the a priori estimate in a regularized inversion. For grid points at or near the source location, this should provide a more physically meaningful decomposition; at other points, the procedure provides a stable inversion. In this paper, RMMP is compared to standard MMP and matched-field processing for a series of realistic synthetic test cases, including a variety of noise levels and sensor array configurations, as well as the effects of environmental mismatch. PMID:10875355
Bilateral filter regularized accelerated Demons for improved discontinuity preserving registration.
Demirović, D; Šerifović-Trbalić, A; Prljača, N; Cattin, Ph C
2015-03-01
The classical accelerated Demons algorithm uses Gaussian smoothing to penalize oscillatory motion in the displacement fields during registration. This well-known method uses the L2 norm for regularization. Whereas the L2 norm is known for producing well-behaved smooth deformation fields, it cannot properly deal with discontinuities often seen in the deformation field, as the regularizer cannot differentiate between discontinuities and smooth parts of the motion field. In this paper we propose replacing the Gaussian filter of the accelerated Demons with a bilateral filter. In contrast, the bilateral filter uses information not only from the displacement field but also from the image intensities. In this way, we can smooth the motion field depending on image content, as opposed to classical Gaussian filtering. By proper adjustment of two tunable parameters, one can obtain more realistic deformations in the case of discontinuities. The proposed approach was tested on 2D and 3D datasets and showed significant improvements in the Target Registration Error (TRE) for the well-known POPI dataset. Despite the increased computational complexity, the improved registration result is justified in particular for abdominal data sets, where discontinuities often appear due to sliding organ motion. PMID:25541494
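The edge-preserving behavior described above can be demonstrated in a few lines. The sketch below bilateral-filters a 1D displacement field: each neighbor's weight combines spatial distance with the image-intensity difference, so smoothing stops at intensity edges (e.g. a sliding-organ boundary) instead of blurring across them. A minimal 1D sketch with hypothetical parameter names (sigma_s, sigma_r), not the paper's 2D/3D Demons implementation.

```python
import math

def bilateral_smooth(field, intensity, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Bilateral regularization of a 1D displacement field.

    field:     displacement values to smooth
    intensity: image intensities at the same positions; large intensity
               differences suppress a neighbor's weight, preserving
               discontinuities that a Gaussian filter would blur.
    """
    out = []
    n = len(field)
    for i in range(n):
        wsum = vsum = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = (math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2))
                 * math.exp(-((intensity[i] - intensity[j]) ** 2)
                            / (2 * sigma_r ** 2)))
            wsum += w
            vsum += w * field[j]
        out.append(vsum / wsum)
    return out
```

Across a step in both field and intensity, the two sides barely mix: the left stays near 0 and the right near 1, which is precisely the discontinuity-preserving property the bilateral regularizer buys.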
Mixed noise removal by weighted encoding with sparse nonlocal regularization.
Jiang, Jielin; Zhang, Lei; Yang, Jian
2014-06-01
Mixed noise removal from natural images is a challenging task since the noise distribution usually does not have a parametric model and has a heavy tail. One typical kind of mixed noise is additive white Gaussian noise (AWGN) coupled with impulse noise (IN). Many mixed noise removal methods are detection based methods. They first detect the locations of IN pixels and then remove the mixed noise. However, such methods tend to generate many artifacts when the mixed noise is strong. In this paper, we propose a simple yet effective method, namely weighted encoding with sparse nonlocal regularization (WESNR), for mixed noise removal. In WESNR, there is not an explicit step of impulse pixel detection; instead, soft impulse pixel detection via weighted encoding is used to deal with IN and AWGN simultaneously. Meanwhile, the image sparsity prior and nonlocal self-similarity prior are integrated into a regularization term and introduced into the variational encoding framework. Experimental results show that the proposed WESNR method achieves leading mixed noise removal performance in terms of both quantitative measures and visual quality. PMID:24760906
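The soft impulse detection idea, down-weighting pixels with large coding residuals instead of hard detect-then-remove, can be sketched with iteratively re-weighted least squares. The exponential weight function and one-atom dictionary here are simplifying assumptions for illustration, not the WESNR model itself:

```python
import numpy as np

def weighted_encoding(y, D, iters=10, a=20.0):
    """Toy version of soft impulse detection: encode y over a dictionary D
    by iteratively re-weighted least squares, where pixels with large coding
    residuals (likely impulse noise) get exponentially small weights rather
    than being hard-detected and removed."""
    w = np.ones(len(y))
    for _ in range(iters):
        W = np.diag(w)
        alpha = np.linalg.lstsq(W @ D, W @ y, rcond=None)[0]
        e = y - D @ alpha
        w = np.exp(-a * e ** 2 / np.mean(e ** 2))  # soft impulse weights
    return alpha, w

rng = np.random.default_rng(2)
D = np.ones((100, 1))                  # dictionary: a constant patch
y = 5.0 + 0.05 * rng.standard_normal(100)
y[::10] = 255.0                        # salt impulses on top of AWGN
alpha, w = weighted_encoding(y, D)
assert abs(alpha[0] - 5.0) < 0.1       # impulses no longer bias the code
assert w[0] < 1e-3                     # impulse pixel heavily down-weighted
```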
Equivariant K-theory of regular compactifications: further developments
NASA Astrophysics Data System (ADS)
Uma, V.
2016-04-01
We describe the \widetilde{G} \times \widetilde{G}-equivariant K-ring of X, where \widetilde{G} is a factorial covering of a connected complex reductive algebraic group G, and X is a regular compactification of G. Furthermore, using the description of K_{\widetilde{G} \times \widetilde{G}}(X), we describe the ordinary K-ring K(X) as a free module (whose rank is equal to the cardinality of the Weyl group) over the K-ring of a toric bundle over G/B whose fibre is the toric variety \overline{T}^{+} associated with a smooth subdivision of the positive Weyl chamber. This generalizes our previous work on the wonderful compactification (see [1]). We also give an explicit presentation of K_{\widetilde{G} \times \widetilde{G}}(X) and K(X) as algebras over K_{\widetilde{G} \times \widetilde{G}}(\overline{G_{ad}}) and K(\overline{G_{ad}}) respectively, where \overline{G_{ad}} is the wonderful compactification of the adjoint semisimple group G_{ad}. In the case when X is a regular compactification of G_{ad}, we give a geometric interpretation of these presentations in terms of the equivariant and ordinary Grothendieck rings of a canonical toric bundle over \overline{G_{ad}}.
Choice of regularization weight in basis pursuit reflectivity inversion
NASA Astrophysics Data System (ADS)
Sen, Mrinal K.; Biswas, Reetam
2015-02-01
Seismic inverse problem of estimating P- and S-wave reflectivity from seismic traces has recently been revisited using a basis pursuit denoising inversion (BPI) approach. The BPI uses a wedge dictionary to define model constraints, which has been successful in resolving thin beds. Here we address two fundamental problems associated with BPI, namely, the uniqueness of the estimate and the choice of regularization weight λ to be used in the model norm. We investigated these using very fast simulated re-annealing (VFSR) and gradient projection sparse reconstruction (GPSR) approaches. For a synthetic model with two reflectors separated by one time sample, we are able to demonstrate convergence of VFSR to the true model with different random starting models. Two numerical approaches to estimating the regularization weight were investigated. One uses λ as a hyper-parameter and the other uses this as a temperature-like annealing parameter. In both cases, we were able to obtain λ fairly rapidly. Finally, an analytic formula for λ that is iteration adaptive was also implemented. Successful applications of our approach to synthetic and field data demonstrate validity and robustness.
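The role of the regularization weight λ can be seen in a generic basis pursuit denoising solver. The sketch below uses plain iterative soft thresholding (not the VFSR or GPSR machinery of the paper) and a random dictionary standing in for the wedge dictionary:

```python
import numpy as np

def ista(A, d, lam, iters=500):
    """Iterative soft-thresholding for basis pursuit denoising,
    min_x 0.5 * ||A x - d||^2 + lam * ||x||_1, where lam is the
    regularization weight whose choice the paper addresses."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    for _ in range(iters):
        g = x - (A.T @ (A @ x - d)) / L      # gradient step on the data misfit
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrink
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((50, 200))           # stand-in for the wedge dictionary
x_true = np.zeros(200); x_true[[7, 90]] = [1.0, -0.8]   # sparse reflectivity
d = A @ x_true
x_small = ista(A, d, lam=1e-3)
x_big = ista(A, d, lam=50.0)
assert np.count_nonzero(x_big) < np.count_nonzero(x_small)  # larger lam, sparser x
```

Too small a λ leaves a dense, noise-fitting reflectivity; too large a λ annihilates genuine reflectors, which is why an adaptive or annealed choice of λ matters.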
Effect of Regular Exercise Program on Depression in Hemodialysis Patients
Rezaei, Jahangir; Abdi, Alireza; Rezaei, Mansour; Heydarnezhadian, Jafar; Jalali, Rostam
2015-01-01
Background and Aim. Depression is the most common psychological disorder in hemodialysis patients; it decreases their quality of life and increases mortality. This study was conducted to assess the effect of regular exercise on depression in hemodialysis patients. Methods. In a randomized clinical trial, 51 hemodialysis patients were allocated to two groups. The Beck Depression Inventory (BDI) scale was used to assess the depression rate of participants. The designed program was taught to the case group using posters and face-to-face instruction. The intervention was carried out three times a week for ten weeks. At the beginning and the end of the study, the depression rate of the subjects was assessed. Data were analyzed with SPSS 16 software using descriptive and inferential statistics. Findings. According to the results of this study, there was no difference in depression rate between the case and control groups at the beginning of the study, but there was a significant difference after the intervention (P = 0.016). At the beginning of the study, the mean ± SD of depression in the case group was 23.8 ± 9.29, which fell to 11.07 ± 12.64 at the end (P < 0.001). Conclusion. A regular exercise program can reduce depression in hemodialysis patients; it is therefore suggested that hemodialysis patients be trained in this program. This trial is registered with the Iranian Registry of Clinical Trials (IRCT), number IRCT201205159763N1. PMID:27347502
Image superresolution by midfrequency sparse representation and total variation regularization
NASA Astrophysics Data System (ADS)
Xu, Jian; Chang, Zhiguo; Fan, Jiulun; Zhao, Xiaoqiang; Wu, Xiaomin; Wang, Yanzi
2015-01-01
Machine learning has provided many good tools for superresolution, whereas existing methods still need to be improved in many aspects. On one hand, the memory and time cost should be reduced. On the other hand, the step edges of the results obtained by the existing methods are not clear enough. We do the following work. First, we propose a method to extract the midfrequency features for dictionary learning. This method brings the benefit of a reduction of the memory and time complexity without sacrificing the performance. Second, we propose a detailed wiping-off total variation (DWO-TV) regularization model to reconstruct the sharp step edges. This model adds a novel constraint on the downsampling version of the high-resolution image to wipe off the details and artifacts and sharpen the step edges. Finally, step edges produced by the DWO-TV regularization and the details provided by learning are fused. Experimental results show that the proposed method offers a desirable compromise between low time and memory cost and the reconstruction quality.
Filter ensemble regularized common spatial pattern for EEG classification
NASA Astrophysics Data System (ADS)
Su, Yuxi; Li, Yali; Wang, Shengjin
2015-07-01
Common Spatial Pattern (CSP) is one of the most effective feature extraction algorithms for Brain-Computer Interfaces (BCI). Despite its advantages of wide versatility and high efficiency, CSP is known to be non-robust to noise and prone to overfitting when the number of training samples is limited. To overcome these problems, Regularized Common Spatial Pattern (RCSP) was proposed. RCSP regularizes the covariance matrix estimation with two parameters, which reduces the estimation variance and improves stationarity under small-sample conditions. However, RCSP does not make full use of frequency information. In this paper, we present a filter ensemble technique for RCSP (FERCSP) to further extract frequency information and aggregate all the RCSPs efficiently into an ensemble-based solution. The performance of the proposed algorithm is evaluated on data set IVa of BCI Competition III against five other RCSP-based algorithms. The experimental results show that FERCSP significantly outperforms the existing methods in classification accuracy. FERCSP also outperforms the CSP and R-CSP-A algorithms in all five subjects, with an average improvement of 6% in accuracy.
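The covariance regularization step that RCSP builds on can be sketched as follows. This shows only a single shrinkage parameter and a generic CSP eigendecomposition, a simplification of the two-parameter scheme the abstract refers to:

```python
import numpy as np

def rcsp_filters(cov_a, cov_b, gamma=0.1, n_filters=3):
    """Illustrative RCSP step: shrink each class covariance toward a scaled
    identity so the estimate stays well-conditioned with few trials, then
    solve the CSP problem by whitening the composite covariance and
    diagonalizing class a; filters come from both ends of the spectrum."""
    d = cov_a.shape[0]
    reg = lambda C: (1 - gamma) * C + gamma * (np.trace(C) / d) * np.eye(d)
    Ca, Cb = reg(cov_a), reg(cov_b)
    s, U = np.linalg.eigh(Ca + Cb)
    P = U / np.sqrt(s)                   # whitening: P.T (Ca+Cb) P = I
    w, R = np.linalg.eigh(P.T @ Ca @ P)  # eigenvalues ascending
    V = P @ R
    idx = np.r_[np.arange(n_filters), np.arange(d - n_filters, d)]
    return V[:, idx]

rng = np.random.default_rng(4)
X = rng.standard_normal((8, 8))
cov_a = X @ X.T / 8          # in practice estimated from few EEG trials
cov_b = np.eye(8)
W = rcsp_filters(cov_a, cov_b, gamma=0.2, n_filters=2)
assert W.shape == (8, 4)
```

FERCSP, as described, would run this per frequency band and then aggregate the resulting classifiers into an ensemble.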
Talking Physics to Regular People: The Why and the How
NASA Astrophysics Data System (ADS)
Perkowitz, Sidney
2013-04-01
The huge popular interest in the Higgs boson shows that non-physicists can be fascinated by the ideas of physics, even highly abstract ones. That's one good reason to talk physics to ``regular people.'' A second important reason is that society supports physics and in return, deserves to know what physicists are doing. Another is the need to engage young people who may become physicists. Yet another is that when we translate our work so anyone can grasp it, we ourselves better understand it and what it means outside the lab. Especially in today's climate where funding for science, and science itself, are under threat, it's essential that regular people know us, what we do, and why it is important. That's the ``why'' of talking physics. To discuss the ``how,'' I'll draw on my long and extensive experience in presenting physics, technology and science to non-scientists through books and articles, blogs, videos, lectures, stage and museum works, and media appearances (see http://sidneyperkowitz.net). I'll offer ideas about talking physics to different groups, at different levels, and for different purposes, and about how to use such outreach to enrich your own career in physics while helping the physics community.
Regularities in responding during performance of a complex choice task.
Mercado, Eduardo; Orduña, Vladimir
2015-12-01
Systematic variations in the rate and temporal patterns of responding under a multiple concurrent-chains schedule were quantified using recurrence metrics and self-organizing maps to assess whether individual rats showed consistent or idiosyncratic patterns. The results indicated that (1) the temporal regularity of response patterns varied as a function of number of training sessions, time on task, magnitude of reinforcement, and reinforcement contingencies; (2) individuals showed heterogeneous, stereotyped patterns of responding, despite similarities in matching behavior; (3) the specific trajectories of behavioral variation shown by individuals were less evident in group-level analyses; and (4) reinforcement contingencies within terminal links strongly modulated response patterns within initial links. Temporal regularity in responding was most evident for responses that led to minimally delayed reinforcers of larger magnitude. Models of response production and selection that take into account the time between individual responses, probabilities of transitions between response options, periodicity within response sequences, and individual differences in response dynamics can clarify the mechanisms that drive behavioral adjustments during operant conditioning. PMID:26077440
Investigation of wall-bounded turbulence over regularly distributed roughness
NASA Astrophysics Data System (ADS)
Placidi, Marco; Ganapathisubramani, Bharathram
2012-11-01
The effects of regularly distributed roughness elements on the structure of a turbulent boundary layer are examined by performing a series of Planar (high resolution l+ ~ 30) and Stereoscopic Particle Image Velocimetry (PIV) experiments in a wind tunnel. An adequate description of how to best characterise a rough wall, especially one where the density of roughness elements is sparse, is yet to be developed. In this study, rough surfaces consisting of regularly and uniformly distributed LEGO® blocks are used. Twelve different patterns are adopted in order to systematically examine the effects of frontal solidity (λf, frontal area of the roughness elements per unit wall-parallel area) and plan solidity (λp, plan area of roughness elements per unit wall-parallel area), on the turbulence structure. The Karman number, Reτ , is approximately 4000 across the different cases. Spanwise 3D vector fields at two different wall-normal locations (top of the canopy and within the log-region) are also compared to examine the spanwise homogeneity of the flow across different surfaces. In the talk, a detailed analysis of mean and rms velocity profiles, Reynolds stresses, and quadrant decomposition for the different patterns will be presented.
Suggesting Missing Relations in Biomedical Ontologies Based on Lexical Regularities.
Quesada-Martínez, Manuel; Fernández-Breis, Jesualdo Tomás; Karlsson, Daniel
2016-01-01
The number of biomedical ontologies has increased significantly in recent years. Many such ontologies are the result of efforts by communities of domain experts and ontology engineers. The development and application of quality assurance (QA) methods should help these communities develop ontologies that are useful for both humans and machines. According to previous studies, biomedical ontologies are rich in natural language content, but most are not so rich in axiomatic terms. Here, we are interested in studying the relation between content in natural language and content in axiomatic form. Analysis of class labels makes it possible to identify lexical regularities (LRs), that is, sets of words shared by the labels of different classes. Our assumption is that classes exhibiting an LR should be logically related through axioms; we use this assumption to propose an algorithm for detecting missing relations in an ontology. As a case study, we analyse a lexical regularity of SNOMED CT, congenital stenosis, which is reported as problematic by the SNOMED CT maintenance team. PMID:27577409
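A minimal sketch of the LR idea: index every word sequence occurring in class labels and report those shared by several classes, whose classes then become candidates for a missing axiomatic relation. The labels below are invented examples, not SNOMED CT content:

```python
from collections import defaultdict

def lexical_regularities(labels, min_classes=2):
    """Toy LR detector: a lexical regularity is a word sequence shared by
    the labels of at least min_classes classes. Returns the multi-word
    regularities and the classes exhibiting each one."""
    index = defaultdict(set)
    for cls, label in labels.items():
        words = label.lower().split()
        for i in range(len(words)):
            for j in range(i + 1, len(words) + 1):
                index[" ".join(words[i:j])].add(cls)
    return {lr: cls for lr, cls in index.items()
            if len(cls) >= min_classes and " " in lr}  # multi-word LRs only

labels = {
    "C1": "Congenital stenosis of trachea",
    "C2": "Congenital stenosis of aorta",
    "C3": "Acquired stenosis of aorta",
}
lrs = lexical_regularities(labels)
assert lrs["congenital stenosis"] == {"C1", "C2"}
assert lrs["stenosis of aorta"] == {"C2", "C3"}
```

In the paper's setting the interesting question is then whether, say, C1 and C2 are already linked by an axiom; where they are not, the LR suggests a missing relation for a curator to review.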
Theory of volume transition in polyelectrolyte gels with charge regularization
NASA Astrophysics Data System (ADS)
Hua, Jing; Mitra, Mithun K.; Muthukumar, M.
2012-04-01
We present a theory for polyelectrolyte gels that allows the effective charge of the polymer backbone to self-regulate. Using a variational approach, we obtain an expression for the free energy of gels that accounts for the gel elasticity, free energy of mixing, counterion adsorption, local dielectric constant, electrostatic interaction among polymer segments, electrolyte ion correlations, and self-consistent charge regularization on the polymer strands. This free energy is then minimized to predict the behavior of the system as characterized by the gel volume fraction as a function of external variables such as temperature and salt concentration. We present results for the volume transition of polyelectrolyte gels in salt-free solvents, solvents with monovalent salts, and solvents with divalent salts. The results of our theoretical analysis capture the essential features of existing experimental results and also provide predictions for further experimentation. Our analysis highlights the importance of the self-regularization of the effective charge for the volume transition of gels in particular, and for charged polymer systems in general. Our analysis also enables us to identify the dominant free energy contributions for charged polymer networks and provides a framework for further investigation of specific experimental systems.
75 FR 75674 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-06
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). Date and Time: The...
76 FR 48161 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-08
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
76 FR 40729 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-11
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
76 FR 32360 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-06
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
76 FR 75878 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
76 FR 12356 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
75 FR 23761 - Farm Credit Administration Board; Sunshine Act Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-04
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
75 FR 9414 - Farm Credit Administration Board; Sunshine Act; Regular Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-02
... From the Federal Register Online via the Government Publishing Office FARM CREDIT ADMINISTRATION Farm Credit Administration Board; Sunshine Act; Regular Meeting AGENCY: Farm Credit Administration... the regular meeting of the Farm Credit Administration Board (Board). DATE AND TIME: The...
NASA Astrophysics Data System (ADS)
Sumin, M. I.
2015-06-01
A parametric nonlinear programming problem in a metric space with an operator equality constraint in a Hilbert space is studied assuming that its lower semicontinuous value function at a chosen individual parameter value has certain subdifferentiability properties in the sense of nonlinear (nonsmooth) analysis. Such subdifferentiability can be understood as the existence of a proximal subgradient or a Fréchet subdifferential. In other words, an individual problem has a corresponding generalized Kuhn-Tucker vector. Under this assumption, a stable sequential Kuhn-Tucker theorem in nondifferential iterative form is proved and discussed in terms of minimizing sequences on the basis of the dual regularization method. This theorem provides necessary and sufficient conditions for the stable construction of a minimizing approximate solution in the sense of Warga in the considered problem, whose initial data can be approximately specified. A substantial difference of the proved theorem from its classical same-named analogue is that the former takes into account the possible instability of the problem in the case of perturbed initial data and, as a consequence, allows for the inherited instability of classical optimality conditions. This theorem can be treated as a regularized generalization of the classical Uzawa algorithm to nonlinear programming problems. Finally, the theorem is applied to the "simplest" nonlinear optimal control problem, namely, to a time-optimal control problem.
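For orientation, the classical Uzawa algorithm that the theorem regularizes alternates a primal minimization with a dual ascent step on the multiplier. A minimal sketch on a toy equality-constrained problem (the problem, step size, and iteration count are illustrative assumptions, not the paper's setting):

```python
def uzawa(alpha=0.5, iters=200):
    """Classical Uzawa iteration on min x^2 subject to x = 1, a stand-in for
    the paper's nonlinear programs: minimize the Lagrangian in x, then ascend
    in the multiplier. The paper's theorem regularizes this scheme so it
    remains stable when the problem data are only approximately known."""
    lam = 0.0
    for _ in range(iters):
        x = -lam / 2.0           # argmin_x of x^2 + lam * (x - 1)
        lam += alpha * (x - 1)   # dual ascent on the constraint residual
    return x, lam

x, lam = uzawa()
assert abs(x - 1.0) < 1e-6 and abs(lam + 2.0) < 1e-6  # Kuhn-Tucker point
```

The limit (x, λ) = (1, −2) is exactly the Kuhn-Tucker pair for this toy problem; the paper's contribution is that a suitably regularized version of this construction still produces minimizing approximate solutions under perturbed data, where the classical iteration can be unstable.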
On the Distinction between Regular and Irregular Inflectional Morphology: Evidence from Dinka
ERIC Educational Resources Information Center
Ladd, D. Robert; Remijsen, Bert; Manyang, Caguor Adong
2009-01-01
Discussions of the psycholinguistic significance of regularity in inflectional morphology generally deal with languages in which regular forms can be clearly identified and revolve around whether there are distinct processing mechanisms for regular and irregular forms. We present a detailed description of Dinka's notoriously irregular noun number…
ERIC Educational Resources Information Center
Halvorsen, Ann T.; And Others
This needs assessment instrument was developed as part of the PEERS (Providing Education for Everyone in Regular Schools) Project, a California project to integrate students with severe disabilities who were previously at special centers into services at regular school sites and students who were in special classes in regular schools into general…
El Maestro de Sala Regular de Clases Ante el Proceso de Inclusion del Nino Con Impedimento
ERIC Educational Resources Information Center
Rosa Morales, Awilda
2012-01-01
The purpose of this research was to describe the experiences of regular class elementary school teachers with the Puerto Rico Department of Education who have worked with handicapped children who have been integrated to the regular classroom. Five elementary level regular class teachers were selected in the northwest zone of Puerto Rico who during…
Project S.E.R.T. - Special Education for Regular Teachers.
ERIC Educational Resources Information Center
Hale, Steve; And Others
Evaluated in two field tests with 50 regular teachers was a set of eight instructional modules designed to develop the competencies of regular teachers involved in mainstreaming handicapped children as part of Project SERT (Special Education for Regular Teachers). The following modules were developed: comprehensive special education, formal…
Spatially-Variant Tikhonov Regularization for Double-Difference Waveform Inversion
Lin, Youzuo; Huang, Lianjie; Zhang, Zhigang
2011-01-01
Double-difference waveform inversion is a potential tool for quantitative monitoring of geologic carbon storage. It jointly inverts time-lapse seismic data for changes in reservoir geophysical properties. Because of the ill-posedness of waveform inversion, obtaining reservoir changes accurately and efficiently is a great challenge, particularly when using time-lapse seismic reflection data. Regularization techniques can be used to address this ill-posedness. The regularization parameter controls the smoothness of inversion results. A constant regularization parameter is normally used in waveform inversion, and an optimal value has to be selected. The resulting inversion is then a trade-off among regions with different smoothness or noise levels; the images are over-regularized in some regions and under-regularized in others. In this paper, we employ a spatially-variant parameter in the Tikhonov regularization scheme used in double-difference waveform tomography to improve inversion accuracy and robustness. We compare the results obtained using a spatially-variant parameter with those obtained using a constant regularization parameter and those produced without any regularization. We observe that, with the spatially-variant regularization scheme, the target regions are well reconstructed while noise is reduced elsewhere. We show that the spatially-variant scheme provides the flexibility to regularize local regions based on a priori information without increasing computational cost or memory requirements.
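The spatially-variant scheme amounts to replacing the scalar Tikhonov weight with a per-parameter weight vector. A linear toy version (the operator, weights, and noise level below are illustrative, not a waveform inversion):

```python
import numpy as np

def spatially_variant_tikhonov(A, d, lam):
    """Solve min ||A x - d||^2 + sum_i lam_i * x_i^2, i.e. Tikhonov
    regularization with a per-parameter weight vector lam instead of the
    usual single scalar: x = (A^T A + diag(lam))^{-1} A^T d."""
    return np.linalg.solve(A.T @ A + np.diag(lam), A.T @ d)

rng = np.random.default_rng(5)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40); x_true[10:20] = 1.0        # the "target region"
d = A @ x_true + 0.1 * rng.standard_normal(80)
lam_sv = np.full(40, 10.0); lam_sv[10:20] = 0.1   # regularize target lightly
x_sv = spatially_variant_tikhonov(A, d, lam_sv)
x_c = spatially_variant_tikhonov(A, d, np.full(40, 10.0))
err = lambda x: np.linalg.norm((x - x_true)[10:20])
assert err(x_sv) < err(x_c)   # target region less biased than with constant lam
```

The constant weight that suppresses noise in the quiet regions also shrinks the genuine anomaly; lowering the weight only where a priori information places the target removes that bias at no extra cost per solve.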
20 CFR 220.26 - Disability for any regular employment, defined.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Disability for any regular employment... RAILROAD RETIREMENT ACT DETERMINING DISABILITY Disability Under the Railroad Retirement Act for Any Regular Employment § 220.26 Disability for any regular employment, defined. An employee, widow(er), or child...
20 CFR 220.26 - Disability for any regular employment, defined.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Disability for any regular employment, defined... RETIREMENT ACT DETERMINING DISABILITY Disability Under the Railroad Retirement Act for Any Regular Employment § 220.26 Disability for any regular employment, defined. An employee, widow(er), or child is...
20 CFR 220.26 - Disability for any regular employment, defined.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Disability for any regular employment... RAILROAD RETIREMENT ACT DETERMINING DISABILITY Disability Under the Railroad Retirement Act for Any Regular Employment § 220.26 Disability for any regular employment, defined. An employee, widow(er), or child...
20 CFR 220.10 - Disability for work in an employee's regular railroad occupation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Work in an Employee's Regular Railroad Occupation § 220.10 Disability for work in an employee's regular... be found by the Board to be disabled for work in his or her regular railroad occupation because of a... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Disability for work in an employee's...
20 CFR 220.13 - Establishment of permanent disability for work in regular railroad occupation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... work in regular railroad occupation. 220.13 Section 220.13 Employees' Benefits RAILROAD RETIREMENT... Retirement Act for Work in an Employee's Regular Railroad Occupation § 220.13 Establishment of permanent disability for work in regular railroad occupation. The Board will presume that a claimant who is not...
20 CFR 220.13 - Establishment of permanent disability for work in regular railroad occupation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... work in regular railroad occupation. 220.13 Section 220.13 Employees' Benefits RAILROAD RETIREMENT... Retirement Act for Work in an Employee's Regular Railroad Occupation § 220.13 Establishment of permanent disability for work in regular railroad occupation. The Board will presume that a claimant who is not...
20 CFR 220.10 - Disability for work in an employee's regular railroad occupation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Work in an Employee's Regular Railroad Occupation § 220.10 Disability for work in an employee's regular... be found by the Board to be disabled for work in his or her regular railroad occupation because of a... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Disability for work in an employee's...
Regularity for steady periodic capillary water waves with vorticity.
Henry, David
2012-04-13
In the following, we prove new regularity results for two-dimensional steady periodic capillary water waves with vorticity, in the absence of stagnation points. Firstly, we prove that if the vorticity function has a Hölder-continuous first derivative, then the free surface is a smooth curve and the streamlines beneath the surface will be real analytic. Furthermore, once we assume that the vorticity function is real analytic, it will follow that the wave surface profile is itself also analytic. A particular case of this result includes irrotational fluid flow where the vorticity is zero. The property of the streamlines being analytic allows us to gain physical insight into small-amplitude waves by justifying a power-series approach. PMID:22393112
Constructing a logical, regular axis topology from an irregular topology
Faraj, Daniel A.
2014-07-01
Constructing a logical regular topology from an irregular topology including, for each axial dimension and recursively, for each compute node in a subcommunicator until returning to a first node: adding to a logical line of the axial dimension a neighbor specified in a nearest neighbor list; calling the added compute node; determining, by the called node, whether any neighbor in the node's nearest neighbor list is available to add to the logical line; if a neighbor in the called compute node's nearest neighbor list is available to add to the logical line, adding, by the called compute node to the logical line, any neighbor in the called compute node's nearest neighbor list for the axial dimension not already added to the logical line; and, if no neighbor in the called compute node's nearest neighbor list is available to add to the logical line, returning to the calling compute node.
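The claimed procedure can be sketched for a single axial dimension. The neighbor lists and the one-branch recursion below are an illustrative reading of the claim, not the patented implementation:

```python
def build_logical_line(start, neighbors):
    """Sketch of the claim for one axial dimension: starting from a first
    node, add an available neighbor from the node's nearest neighbor list,
    then "call" the added node so it continues the line; when a called node
    has no available neighbor, control returns to the calling node."""
    line = [start]

    def call(node):
        for nb in neighbors[node]:
            if nb not in line:           # neighbor not already on the line
                line.append(nb)
                call(nb)                 # the added node extends the line
                return
        # no available neighbor: return to the calling compute node

    call(start)
    return line

# Irregular neighbor lists for a 4-node subcommunicator:
neighbors = {0: [1], 1: [2, 0], 2: [3, 1], 3: [2]}
assert build_logical_line(0, neighbors) == [0, 1, 2, 3]
```

The result is a logical line (one axis of the regular topology) threaded through an irregular physical interconnect; repeating this per axial dimension yields the full logical axis topology.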
Regular graphs maximize the variability of random neural networks
NASA Astrophysics Data System (ADS)
Wainrib, Gilles; Galtier, Mathieu
2015-09-01
In this work we study the dynamics of systems composed of numerous interacting elements interconnected through a random weighted directed graph, such as models of random neural networks. We develop an original theoretical approach based on a combination of a classical mean-field theory originally developed in the context of dynamical spin-glass models, and the heterogeneous mean-field theory developed to study epidemic propagation on graphs. Our main result is that, surprisingly, increasing the variance of the in-degree distribution does not result in a more variable dynamical behavior, but on the contrary that the most variable behaviors are obtained in the regular graph setting. We further study how the dynamical complexity of the attractors is influenced by the statistical properties of the in-degree distribution.
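The comparison underlying the main result contrasts in-degree distributions at a fixed mean degree: a regular graph has zero in-degree variance, while an Erdős-Rényi graph of the same mean degree does not. A minimal construction of the two ensembles (sizes and degrees are arbitrary choices, and no dynamics are simulated here):

```python
import numpy as np

rng = np.random.default_rng(6)
n, k = 200, 10

# Regular directed graph: every node receives exactly k in-edges.
A_reg = np.zeros((n, n))
for i in range(n):
    A_reg[i, rng.choice(np.delete(np.arange(n), i), k, replace=False)] = 1

# Erdos-Renyi directed graph with the same mean in-degree k.
A_er = (rng.random((n, n)) < k / n).astype(float)
np.fill_diagonal(A_er, 0)

in_degree_var = lambda A: A.sum(axis=1).var()   # row i sums i's in-edges
assert in_degree_var(A_reg) == 0.0
assert in_degree_var(A_er) > 0.0
```

The paper's counterintuitive finding is that, for random neural network dynamics on such graphs, it is the zero-variance (regular) ensemble that produces the most variable attractor dynamics.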
Uncorrelated regularized local Fisher discriminant analysis for face recognition
NASA Astrophysics Data System (ADS)
Wang, Zhan; Ruan, Qiuqi; An, Gaoyun
2014-07-01
A local Fisher discriminant analysis can work well for a multimodal problem. However, it often suffers from the undersampled problem, which makes the local within-class scatter matrix singular. We develop a supervised discriminant analysis technique called uncorrelated regularized local Fisher discriminant analysis for image feature extraction. In this technique, the local within-class scatter matrix is approximated by a full-rank matrix that not only solves the undersampled problem but also eliminates the poor impact of small and zero eigenvalues. Statistically uncorrelated features are obtained to remove redundancy. A trace ratio criterion and the corresponding iterative algorithm are employed to globally solve the objective function. Experimental results on four famous face databases indicate that our proposed method is effective and outperforms the conventional dimensionality reduction methods.
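One way to build the full-rank approximation of the singular within-class scatter described here is eigenvalue shrinkage toward the mean eigenvalue. The shrinkage form below is an assumption for illustration, not necessarily the authors' exact estimator:

```python
import numpy as np

def regularize_scatter(Sw, alpha=0.05):
    """Full-rank approximation of a singular local within-class scatter
    matrix: shrink every eigenvalue toward the mean eigenvalue, which lifts
    the zero and near-zero eigenvalues that otherwise dominate the inverse
    in undersampled problems."""
    w, V = np.linalg.eigh(Sw)
    w_reg = (1 - alpha) * w + alpha * w.mean()
    return V @ np.diag(w_reg) @ V.T

X = np.random.default_rng(7).standard_normal((5, 20))  # 5 samples, 20 dims
Sw = X.T @ X                      # rank <= 5, hence singular: undersampled
assert np.linalg.matrix_rank(Sw) < 20
Sw_reg = regularize_scatter(Sw)
assert np.linalg.matrix_rank(Sw_reg) == 20   # now invertible
```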
Weak Gravitational Lensing from Regular Bardeen Black Holes
NASA Astrophysics Data System (ADS)
Ghaffarnejad, Hossein; niad, Hassan
2016-03-01
In this article we study weak gravitational lensing by a regular Bardeen black hole with scalar charge g and mass m. We investigate the angular positions and magnifications of the non-relativistic images in two cases, depending on the presence or absence of a photon sphere. Defining the dimensionless charge parameter q = g/2m, we find that the photon sphere disappears for |q| > 24√5/125, in which case the spacetime contains strongly naked singularities. We determine the basic lensing parameters in terms of the scalar charge using a perturbative method and find that the parity of the images differs between the two cases: (a) strongly naked singularities are present in the spacetime; (b) the singularity is weak or absent (the black hole lens).
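For reference, the Bardeen regular black hole usually refers to the static, spherically symmetric geometry below. This standard form is supplied here for context and is not quoted in the abstract:

```latex
ds^{2} = -f(r)\,dt^{2} + f(r)^{-1}\,dr^{2} + r^{2}\,d\Omega^{2},
\qquad
f(r) = 1 - \frac{2 m r^{2}}{\left(r^{2} + g^{2}\right)^{3/2}},
\qquad
q \equiv \frac{g}{2m}.
```

For small g the metric approaches Schwarzschild, while the core at r = 0 remains regular, which is what makes the lensing signatures of the two regimes in (a) and (b) worth contrasting.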
Pauli-Villars Regularization of Non-Abelian Gauge Theories
NASA Astrophysics Data System (ADS)
Hiller, J. R.
2016-04-01
As an extension of earlier work on QED, we construct a BRST-invariant Lagrangian for SU(N) Yang-Mills theory with fundamental matter, regulated by the inclusion of massive Pauli-Villars (PV) gluons and PV quarks. The underlying gauge symmetry for massless PV gluons is generalized to accommodate the PV-index-changing currents that are required by the regularization. Auxiliary adjoint scalars are used, in a mechanism due to Stueckelberg, to attribute mass to the PV gluons and the PV quarks. The addition of Faddeev-Popov ghosts then establishes a residual BRST symmetry. Although there are drawbacks to the approach, in particular the computational load of a large number of PV fields and a nonlocal interaction of the ghost fields, this formulation could provide a foundation for renormalizable nonperturbative solutions of light-front QCD in an arbitrary covariant gauge.
Constructing a logical, regular axis topology from an irregular topology
Faraj, Daniel A.
2014-07-22
Constructing a logical regular topology from an irregular topology including, for each axial dimension and recursively, for each compute node in a subcommunicator until returning to a first node: adding to a logical line of the axial dimension a neighbor specified in a nearest neighbor list; calling the added compute node; determining, by the called node, whether any neighbor in the node's nearest neighbor list is available to add to the logical line; if a neighbor in the called compute node's nearest neighbor list is available to add to the logical line, adding, by the called compute node to the logical line, any neighbor in the called compute node's nearest neighbor list for the axial dimension not already added to the logical line; and, if no neighbor in the called compute node's nearest neighbor list is available to add to the logical line, returning to the calling compute node.
ImERSE (Improving Experience through Regular Shadowing Events)
Calvert, William; Minford, Joanne; Platt, Carol; Chatfield, Catriona
2015-01-01
Systematic operational quality improvement strategies within the NHS are hard to find, although there are numerous published reports of sporadic departmental models and methods resulting in improvements in clinical care. We describe the experience of devising a tool to provide large data collection of patient care experiences by using medical students to shadow patient journeys. This combines patient and family centred care (PFCC) and quality improvement approaches to create a systematic organisational strategy for improving care. The ImERSE (improving experience through regular shadowing events) approach could be applied to any area of health care to generate population specific improvement priorities. It can be used to promote patient and family centred care and provide a unique medical education experience. We describe its evolution in its first year of use and suggest that using the ImERSE approach delivers beneficial characteristics to patients and their families, those undergoing a shadowing experience, and provider organisations. PMID:26734410
Regularized Primal-Dual Subgradient Method for Distributed Constrained Optimization.
Yuan, Deming; Ho, Daniel W C; Xu, Shengyuan
2016-09-01
In this paper, we study the distributed constrained optimization problem where the objective function is the sum of local convex cost functions of distributed nodes in a network, subject to a global inequality constraint. To solve this problem, we propose a consensus-based distributed regularized primal-dual subgradient method. In contrast to existing methods, most of which require projecting the estimates onto the constraint set at every iteration, our method needs only one projection, at the last iteration. We establish the convergence of the method by showing that it achieves an O(K^{-1/4}) convergence rate for general distributed constrained optimization, where K is the iteration counter. Finally, a numerical example is provided to validate the convergence of the proposed method. PMID:26285232
Superior Regularity in Erosion Patterns by Planar Subsurface Channeling
Redinger, Alex; Hansen, Henri; Michely, Thomas; Linke, Udo; Rosandi, Yudi; Urbassek, Herbert M.
2006-03-17
The onset of pattern formation through exposure of Pt(111) to 5 keV Ar+ ions at grazing incidence has been studied at 550 K by scanning tunneling microscopy, supplemented by molecular-dynamics simulations of single ion impacts. A consistent description of pattern formation in terms of atomic-scale mechanisms is given. Most surprisingly, pattern formation depends crucially on the angle of incidence of the ions. As soon as this angle allows subsurface channeling of the ions, pattern regularity and alignment with respect to the ion beam greatly improve. These effects are traced back to the positionally aligned formation of vacancy islands through the damage created by the ions at dechanneling locations.
Regularized Embedded Multiple Kernel Dimensionality Reduction for Mine Signal Processing
Li, Shuang; Liu, Bing; Zhang, Chen
2016-01-01
Traditional multiple kernel dimensionality reduction models are generally based on graph embedding and the manifold assumption. But such an assumption might be invalid for some high-dimensional or sparse data due to the curse of dimensionality, which has a negative influence on the performance of multiple kernel learning. In addition, some models might be ill-posed if the rank of the matrices in their objective functions is not high enough. To address these issues, we extend the traditional graph embedding framework and propose a novel regularized embedded multiple kernel dimensionality reduction method. Different from the conventional convex relaxation technique, the proposed algorithm directly takes advantage of a binary search and an alternating optimization scheme to obtain optimal solutions efficiently. The experimental results demonstrate the effectiveness of the proposed method for supervised, unsupervised, and semisupervised scenarios. PMID:27247562
Affective responses to qigong: a pilot study of regular practitioners.
Johansson, Mattias; Hassmén, Peter
2013-04-01
Single sessions of Qigong have been associated with increased positive affect and emotional benefits. The aim of the present study was to refine this understanding using newly developed research methodologies. Affective reactions were therefore studied in a group performing Qigong through pre-, during-, and post-assessments using a modified version of the short Swedish Core Affect Scale, complemented with open-ended questions. Affect was measured at the group and individual level. The results showed a shift during Qigong toward increased pleasant activated and deactivated affect in the group of 46 women who regularly practice Qigong. Inter-individual responses displayed positive affective responses, which also increased as the bout proceeded for the majority of practitioners. Acknowledging some limitations, these findings have practical implications for the enhancement of positive affect and subjective well-being. PMID:23561864
Lipschitz regularity of solutions for mixed integro-differential equations
NASA Astrophysics Data System (ADS)
Barles, Guy; Chasseigne, Emmanuel; Ciomaga, Adina; Imbert, Cyril
We establish new Hölder and Lipschitz estimates for viscosity solutions of a large class of elliptic and parabolic nonlinear integro-differential equations, by the classical Ishii-Lions method. We thus extend the Hölder regularity results recently obtained by Barles, Chasseigne and Imbert (2011). In addition, we deal with a new class of nonlocal equations that we term mixed integro-differential equations. These equations are particularly interesting, as they are degenerate in both the local and nonlocal terms, but their overall behavior is driven by the local-nonlocal interaction, e.g. the fractional diffusion may give the ellipticity in one direction and the classical diffusion in the complementary one.
Local graph regularized coding for salient object detection
NASA Astrophysics Data System (ADS)
Huo, Lina; Yang, Shuyuan; Jiao, Licheng; Wang, Shuang; Shi, Jiao
2016-07-01
Subspace-segmentation-based salient object detection has received increasing interest in recent years. To preserve the locality and similarity of regions, a grouping effect of representation is introduced to segment the salient object and background in subspace. A new saliency map is then calculated by incorporating this local graph regularizer into coding, which explicitly exploits the data self-representation model and thus locates salient regions more accurately. Moreover, a heuristic object-based dictionary is obtained from background superpixels in the border set, removing the image regions that fall within potential object regions. Experimental results on four large benchmark databases demonstrate that the proposed method performs favorably against eight recent state-of-the-art methods in terms of three evaluation criteria, with reductions in MAE of 19.8% compared with GR and 29.3% compared with CB on the two SED datasets, respectively. Meanwhile, our method also runs faster than the comparative detection approaches.
The effect of regularization on the reconstruction of ACAR data
NASA Astrophysics Data System (ADS)
Weber, J. A.; Ceeh, H.; Hugenschmidt, C.; Leitner, M.; Böni, P.
2014-04-01
The Fermi surface, i.e. the two-dimensional surface separating occupied and unoccupied states in k-space, is the defining property of a metal. Full information about its shape is mandatory for identifying nesting vectors or for validating band structure calculations. With the angular correlation of positron-electron annihilation radiation (ACAR) it is easy to get projections of the Fermi surface. Nevertheless it is claimed to be inexact compared to more common methods like the determination based on quantum oscillations or angle-resolved photoemission spectroscopy. In this article we will present a method for reconstructing the Fermi surface from projections with statistically correct data treatment which is able to increase accuracy by introducing different types of regularization.
Quantitative interferometric microscopy cytometer based on regularized optical flow algorithm
NASA Astrophysics Data System (ADS)
Xue, Liang; Vargas, Javier; Wang, Shouyu; Li, Zhenhua; Liu, Fei
2015-09-01
Cell detection and analysis are important in various fields, such as medical observation and disease diagnosis. In order to analyze cell parameters as well as observe samples directly, in this paper we present an improved quantitative interferometric microscopy cytometer, which can monitor the quantitative phase distributions of bio-samples and compute cellular parameter statistics. The proposed system is able to recover the phase image of biological samples over an expanded field of view via a regularized optical-flow demodulation algorithm. This algorithm reconstructs the phase distribution with high accuracy from only two interferograms acquired at different time points, simplifying the scanning system. Additionally, the method is fully automatic, and is therefore convenient for establishing a quantitative phase cytometer. Moreover, the phase retrieval approach is robust against noise and background. As a demonstration, red blood cells are readily investigated with the quantitative interferometric microscopy cytometer system.
Estimating parameter of influenza transmission using regularized least square
NASA Astrophysics Data System (ADS)
Nuraini, N.; Syukriah, Y.; Indratno, S. W.
2014-02-01
The transmission of influenza can be presented mathematically as a system of non-linear differential equations, in which transmission is determined by the contact rate between infected and susceptible hosts. This parameter is estimated using a regularized least squares method, where the finite element method and the Euler method are used to approximate the solution of the SIR differential equations. New influenza infection data from the CDC are used to assess the effectiveness of the method. The estimated parameter represents the contact-rate proportion of the daily transmission probability, which influences the number of people infected by influenza. The relation between the estimated parameter and the number of infected people is measured by the coefficient of correlation. The numerical results show a positive correlation between the estimated parameters and the number of infected people.
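The regularized least-squares fit at the heart of the estimation can be illustrated generically (a sketch with synthetic data; the paper's actual design matrix comes from a finite element / Euler discretization of the SIR model):

```python
import numpy as np

def regularized_least_squares(A, b, lam=1e-2):
    """Solve min_x ||A x - b||^2 + lam * ||x||^2 via the normal
    equations (A^T A + lam I) x = A^T b. The penalty lam stabilizes
    the estimate when A is ill-conditioned, as is typical when the
    design matrix comes from a discretized ODE model."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy example: recover a single contact-rate-like parameter beta
# from noisy observations y ~ beta * x.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=50)
beta_true = 0.3
y = beta_true * x + 0.01 * rng.normal(size=50)
beta_hat = regularized_least_squares(x[:, None], y, lam=1e-4)[0]
```

With small regularization the estimate stays close to the true parameter while remaining stable for ill-conditioned design matrices.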
Complexity of regular invertible p-adic motions.
Pettigrew, J.; Roberts, J. A. G.; Vivaldi, F.
2001-12-01
We consider issues of computational complexity that arise in the study of quasi-periodic motions (Siegel discs) over the p-adic integers, where p is a prime number. These systems generate regular invertible dynamics over the integers modulo p^k, for all k, and the main questions concern the computation of periods and orbit structure. For a specific family of polynomial maps, we identify conditions under which the cycle structure is determined solely by the number of Siegel discs and two integer parameters for each disc. We conjecture the minimal parametrization needed to achieve, for every odd prime p, a two-disc tessellation with maximal cycle length. We discuss the relevance of Cebotarev's density theorem to the probabilistic description of these dynamical systems. (c) 2001 American Institute of Physics. PMID:12779524
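The period and orbit-structure computations described above can be illustrated for a toy invertible map on the integers modulo p^k (a generic sketch; the paper analyzes a specific family of polynomial maps):

```python
def cycle_lengths(f, p, k):
    """Cycle lengths of the map x -> f(x) mod p^k. For an invertible
    map every point lies on a cycle, so following each unseen start
    until it returns enumerates all cycles exactly once."""
    m = p ** k
    seen = [False] * m
    lengths = []
    for start in range(m):
        if seen[start]:
            continue
        x, count = start, 0
        while not seen[x]:
            seen[x] = True
            count += 1
            x = f(x) % m
        lengths.append(count)  # orbit closed back at start
    return lengths

# Doubling mod 9 (p=3, k=2): cycles {0}, {1,2,4,8,7,5}, {3,6}.
lens = cycle_lengths(lambda x: 2 * x, 3, 2)
```

For non-invertible maps the orbit from an unseen start could merge into a previously found cycle, so this enumeration is only valid for bijections, matching the regular invertible dynamics in the abstract.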
C^{1,1} regularity for degenerate elliptic obstacle problems
NASA Astrophysics Data System (ADS)
Daskalopoulos, Panagiota; Feehan, Paul M. N.
2016-03-01
The Heston stochastic volatility process is a degenerate diffusion process where the degeneracy in the diffusion coefficient is proportional to the square root of the distance to the boundary of the half-plane. The generator of this process with killing, called the elliptic Heston operator, is a second-order, degenerate-elliptic partial differential operator, where the degeneracy in the operator symbol is proportional to the distance to the boundary of the half-plane. In mathematical finance, solutions to the obstacle problem for the elliptic Heston operator correspond to value functions for perpetual American-style options on the underlying asset. With the aid of weighted Sobolev spaces and weighted Hölder spaces, we establish the optimal C^{1,1} regularity (up to the boundary of the half-plane) for solutions to obstacle problems for the elliptic Heston operator when the obstacle functions are sufficiently smooth.
Effect of regular exercise on health and disease.
Karacabey, Kursat
2005-10-01
It has long been known that exercise increases physical fitness, benefits general health, and plays a preventive role against various disease states. To decrease the risk of disease and maintain good health, the natural defense system of the organism needs to be strengthened. It is thought that regular exercise, in addition to increasing the body's resistance to disease by strengthening the immune system, decreases convalescence time, increases work efficiency, and improves the sporting performance of the individual, all of which contribute positively to the national economy. The positive effects of regular aerobic exercise, such as strengthening of the immune system, protection against disease, and improved quality of life, help to emphasize the importance of physical exercise and improve the general view of sports in society. PMID:16264392
Surface tension regularizes the crack singularity of adhesion.
Karpitschka, Stefan; van Wijngaarden, Leen; Snoeijer, Jacco H
2016-05-11
The elastic and adhesive properties of a solid surface can be quantified by indenting it with a rigid sphere. Indentation tests are classically described by the JKR-law when the solid is very stiff, while recent work highlights the importance of surface tension for exceedingly soft materials. Here we show that surface tension plays a crucial role even in stiff solids: Young's wetting angle emerges as a boundary condition and this regularizes the crack-like singularity at the edge of adhesive contacts. We find that the edge region exhibits a universal, self-similar structure that emerges from the balance of surface tension and elasticity. The similarity theory is solved analytically and provides a complete description of adhesive contacts, by which we reconcile global adhesion laws and local contact mechanics. PMID:27087459
Image deblurring based on structural graph and nonlocal similarity regularization
NASA Astrophysics Data System (ADS)
Jiang, Fangfang; Chen, Huahua; Ye, Xueyi
2013-07-01
The distribution of image data points forms its geometrical structure. This structure characterizes the local variation and provides valuable heuristics for regularizing the image restoration process. However, most existing approaches to sparse coding fail to consider this character of the image. In this paper, we address the deblurring problem of image restoration. We analyze the distribution of the input data points and, inspired by the theory of manifold learning, build a k-NN graph to characterize the geometrical structure of the data, so that the local manifold structure of the data can be explicitly taken into account. To enforce the invariance constraint, we introduce a patch-similarity based term into the cost function which penalizes the nonlocal invariance of the image. Experimental results show the effectiveness of the proposed scheme.
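The k-NN graph construction used to capture the local manifold structure can be sketched as follows (a generic illustration, not the authors' code):

```python
import numpy as np

def knn_graph(X, k=3):
    """Return a symmetric adjacency matrix whose (i, j) entry is 1
    when j is among the k nearest neighbors of i (or vice versa),
    using Euclidean distance. Such a graph encodes the local
    geometrical structure that a manifold regularizer penalizes."""
    n = X.shape[0]
    # Pairwise squared distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)       # exclude self-neighbors
    W = np.zeros((n, n))
    idx = np.argsort(d2, axis=1)[:, :k]
    for i in range(n):
        W[i, idx[i]] = 1.0
    return np.maximum(W, W.T)          # symmetrize

# Three nearby points and one outlier
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0]])
W = knn_graph(X, k=2)
```

In a manifold regularizer, W typically weights a Laplacian penalty so that neighboring data points receive similar codes.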
Correlation applied to the recognition of regular geometric figures
NASA Astrophysics Data System (ADS)
Lasso, William; Morales, Yaileth; Vega, Fabio; Díaz, Leonardo; Flórez, Daniel; Torres, Cesar
2013-11-01
We developed a system capable of recognizing regular geometric figures. Images are captured automatically by the software through a process that validates the presence of a figure in front of the camera lens; the digitized image is compared with a database of previously captured images, recognized, and finally identified by spoken words naming the identified figure. The contribution of the proposed system is that data acquisition is done in real time using smart spy glasses with a USB interface, offering a system that is equally effective but much more economical. This tool may be useful as a possible application for visually impaired people to obtain information about their surrounding environment.
Temporal and spatial regularity of mobile-phone data
NASA Astrophysics Data System (ADS)
Hoevel, Philipp; Barabasi, Albert-Laszlo
2012-02-01
Network science is a vibrant, interdisciplinary research area with strong connections to a plethora of different fields. As the number of empirically obtained datasets grows, approaches from network science continue to enhance our understanding, for instance, of human dynamics. The available data often consist of temporal as well as spatial information. In our case they originate from anonymized mobile-phone traces, which include information about the timing of the connections between two mobile phones and also their positions. Thus, the data contain an additional social component. In this study, we evaluate patterns of human behavior, identifying both temporal and spatial regularity. This leads to a detailed mobility analysis on various timescales and contributes to a general theory of synchronization in complex, real-world networks.
Dimensional reduction in numerical relativity: Modified Cartoon formalism and regularization
NASA Astrophysics Data System (ADS)
Cook, William G.; Figueras, Pau; Kunesch, Markus; Sperhake, Ulrich; Tunyasuvunakool, Saran
2016-06-01
We present in detail the Einstein equations in the Baumgarte-Shapiro-Shibata-Nakamura formulation for the case of D-dimensional spacetimes with SO(D - d) isometry based on a method originally introduced in Ref. 1. Regularized expressions are given for a numerical implementation of this method on a vertex centered grid including the origin of the quasi-radial coordinate that covers the extra dimensions with rotational symmetry. Axisymmetry, corresponding to the value d = D - 2, represents a special case with fewer constraints on the vanishing of tensor components and is conveniently implemented in a variation of the general method. The robustness of the scheme is demonstrated for the case of a black-hole head-on collision in D = 7 spacetime dimensions with SO(4) symmetry.
Localized strain field measurement on laminography data with mechanical regularization
NASA Astrophysics Data System (ADS)
Taillandier-Thomas, Thibault; Roux, Stéphane; Morgeneyer, Thilo F.; Hild, François
2014-04-01
For an in-depth understanding of the failure of structural materials, the study of deformation mechanisms in the material bulk is fundamental. In situ synchrotron computed laminography provides 3D images of sheet samples, and digital volume correlation yields the displacement and strain fields between each step of experimental loading by using the natural contrast of the material. Difficulties arise from the lack of data, which is intrinsic to laminography and leads to several artifacts, and from the low absorption contrast in the 3D image texture of the studied aluminum alloy. To lower the uncertainty level and achieve better mechanical admissibility of the measured displacement field, a regularized digital volume correlation procedure is introduced and applied to measure localized displacement and strain fields.
Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation.
Yamane, Ikko; Sasaki, Hiroaki; Sugiyama, Masashi
2016-07-01
Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring nongaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering. PMID:27171983
On d-Regular Schematization of Embedded Paths
NASA Astrophysics Data System (ADS)
Gemsa, Andreas; Nöllenburg, Martin; Pajor, Thomas; Rutter, Ignaz
In the d-regular path schematization problem we are given an embedded path P (e.g., a route in a road network) and an integer d. The goal is to find a d-schematized embedding of P in which the orthogonal order of all vertices in the input is preserved and in which every edge has a slope that is an integer multiple of 90°/d. We show that deciding whether a path can be d-schematized is NP-hard for any integer d. We further model the problem as a mixed-integer linear program. An experimental evaluation indicates that this approach generates reasonable route sketches for real-world data.
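The slope restriction in a d-schematization can be illustrated with a helper that snaps an edge direction to the nearest admissible slope (a toy illustration of the constraint only; the paper formulates the full problem, with orthogonal-order preservation, as a mixed-integer linear program):

```python
def snap_to_d_regular(angle_deg, d):
    """Snap an edge direction (in degrees) to the nearest integer
    multiple of 90/d, the admissible slopes in a d-schematization."""
    step = 90.0 / d
    return (round(angle_deg / step) * step) % 360.0

# With d = 2 the admissible slopes are multiples of 45 degrees.
a = snap_to_d_regular(50.0, 2)
```

The hard part of the problem is not the snapping itself but doing it for all edges simultaneously while preserving the orthogonal order of the vertices, which is what makes the decision problem NP-hard.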
Human behavioral regularity, fractional Brownian motion, and exotic phase transition
NASA Astrophysics Data System (ADS)
Li, Xiaohui; Yang, Guang; An, Kenan; Huang, Jiping
2016-08-01
The mix of competition and cooperation (C&C) is ubiquitous in human society but remains poorly explored due to the lack of a fundamental method. Here, by developing a Janus game for treating C&C between two sides (suppliers and consumers), we present, for the first time, experimental and simulation evidence for human behavioral regularity. This property is shown to be characterized by fractional Brownian motion associated with an exotic transition between periodic and nonperiodic phases. Furthermore, the periodic phase echoes business cycles, which are well known in reality but still far from well understood. Our results imply that the Janus game could be a fundamental method for studying C&C among humans in society, and it provides guidance for predicting human behavioral activity from the perspective of fractional Brownian motion.
Regularity underlies erratic population abundances in marine ecosystems
Sun, Jie; Cornelius, Sean P.; Janssen, John; Gray, Kimberly A.; Motter, Adilson E.
2015-01-01
The abundance of a species' population in an ecosystem is rarely stationary, often exhibiting large fluctuations over time. Using historical data on marine species, we show that the year-to-year fluctuations of population growth rate obey a well-defined double-exponential (Laplace) distribution. This striking regularity allows us to devise a stochastic model despite seemingly irregular variations in population abundances. The model identifies the effect of reduced growth at low population density as a key factor missed in current approaches of population variability analysis and without which extinction risks are severely underestimated. The model also allows us to separate the effect of demographic stochasticity and show that single-species growth rates are dominantly determined by stochasticity common to all species. This dominance—and the implications it has for interspecies correlations, including co-extinctions—emphasizes the need for ecosystem-level management approaches to reduce the extinction risk of the individual species themselves. PMID:25972438
A robust regularization algorithm for polynomial networks for machine learning
NASA Astrophysics Data System (ADS)
Jaenisch, Holger M.; Handley, James W.
2011-06-01
We present an improvement to the fundamental Group Method of Data Handling (GMDH) data modeling algorithm that overcomes the parameter sensitivity to novel cases presented to derived networks. We achieve this result by regularization of the output and by using a genetic weighting that selects intermediate models that do not exhibit divergence. The result is the derivation of multi-nested polynomial networks following the Kolmogorov-Gabor polynomial that are robust to mean estimators as well as novel exemplars for input. The full details of the algorithm are presented. We also introduce a new method for approximating GMDH in a single regression model using F, H, and G terms that automatically exports the answers as ordinary differential equations. The MathCAD 15 source code for all algorithms and results is provided.
Diffuse light tomography to detect blood vessels using Tikhonov regularization
NASA Astrophysics Data System (ADS)
Kazanci, Huseyin O.; Jacques, Steven L.
2016-04-01
Detection of blood vessels within light-scattering tissues involves detection of the subtle shadows cast as blood absorbs light. These shadows are diffuse but measurable by a spatial array of sources and detectors on the tissue surface. The measured shadows can be used to reconstruct the internal position(s) of blood vessels. The tomographic method involves a set of N_s sources and N_d detectors, such that the N_sd = N_s x N_d source-detector pairs produce N_sd measurements, each interrogating the tissue from a unique perspective, i.e., with a unique region of sensitivity to voxels within the tissue. This tutorial report describes the reconstruction of the image of a blood vessel within a soft tissue from such source-detector measurements, by solving a matrix equation using Tikhonov regularization. This is not a novel contribution, but rather a simple introduction to a well-known method, demonstrating its use in mapping blood perfusion.
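The reconstruction step, solving the source-detector measurement equation with Tikhonov regularization, amounts to the following (a minimal sketch with a synthetic sensitivity matrix; variable names are assumptions):

```python
import numpy as np

def tikhonov_reconstruct(J, m, lam=1e-2):
    """Reconstruct voxel absorption x from measurements m = J x,
    where each of the N_sd rows of J holds the sensitivity of one
    source-detector pair to the voxels. Tikhonov regularization
    solves (J^T J + lam I) x = J^T m, stabilizing the inversion
    of the ill-posed tomographic problem."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ m)

# Synthetic example: 20 source-detector pairs, 10 voxels,
# one absorbing "vessel" at voxel 4.
rng = np.random.default_rng(1)
J = rng.uniform(0.0, 1.0, size=(20, 10))
x_true = np.zeros(10)
x_true[4] = 1.0
m = J @ x_true
x_hat = tikhonov_reconstruct(J, m, lam=1e-6)
```

In practice the regularization parameter lam trades off fidelity to the noisy measurements against smoothness of the reconstructed absorption map.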
Chiral Thirring–Wess model with Faddeevian regularization
Rahaman, Anisur
2015-03-15
Replacing the vector-type interaction of the Thirring–Wess model by a chiral one, a new model is presented, termed here the chiral Thirring–Wess model. The ambiguity parameters of regularization are chosen so that the model falls into the Faddeevian class. The resulting Faddeevian class of models does not, in general, possess Lorentz invariance. However, we can exploit the arbitrariness admissible in the ambiguity parameters to relate the quantum-mechanically generated ambiguity parameters to the classical parameter in the mass-like term of the gauge field, which helps maintain physical Lorentz invariance despite the absence of manifest Lorentz covariance of the model. The phase-space structure and the theoretical spectrum of this class of models have been determined through Dirac's method of quantization of constrained systems.
Regular step distribution of the bare Si(553) surface
NASA Astrophysics Data System (ADS)
Kopciuszyński, M.; Dyniec, P.; Zdyb, R.; Jałochowski, M.
2015-06-01
Vicinal Si(111) surfaces are known to undergo faceting when the temperature is lowered below the (1x1) to (7x7) phase transition temperature. Depending on the cutoff angle value and direction with respect to the crystallographic axis, various facets, together with low Miller index terraces, are formed. Here, we report the formation of regularly distributed steps over macroscopic sample regions of the bare Si(553) surface. The surface morphology is studied with scanning tunneling microscopy and reflection high energy electron diffraction techniques. The (111) terraces of 2.88 nm in width, which are separated by double atomic height steps, reveal an unusual reconstruction. However, the electronic structure determined with angle resolved photoemission spectroscopy shows bands very similar to those observed for the Si(111)-(7x7) surface.
PDE regularization for Bayesian reconstruction of emission tomography
NASA Astrophysics Data System (ADS)
Wang, Zhentian; Zhang, Li; Xing, Yuxiang; Zhao, Ziran
2008-03-01
The aim of the present study is to investigate a type of Bayesian reconstruction which utilizes partial differential equation (PDE) image models as regularization. PDE image models are widely used in image restoration and segmentation. In a PDE model, the image can be viewed as the solution of an evolutionary differential equation, and its variation can be regarded as the descent of an energy function, which allows us to use PDE models in Bayesian reconstruction. In this paper, two PDE models called anisotropic diffusion are studied. Both have edge-preserving and denoising characteristics like the popular median root prior (MRP). We use PDE regularization with an ordered-subsets (OS) accelerated Bayesian one-step-late (OSL) reconstruction algorithm for emission tomography; the OS-accelerated OSL algorithm is more practical than a non-accelerated one. The proposed algorithm is called OSEM-PDE. We validated OSEM-PDE using a Zubal phantom in numerical experiments with attenuation correction and quantum noise considered, and compared the results with OSEM and an OS version of MRP (OSEM-MRP) reconstruction. OSEM-PDE gives better results in both bias and variance. The reconstructed images are smoother and have sharper edges, and are thus more suitable for post-processing such as segmentation, which we validate using a k-means segmentation algorithm. The classic OSEM is not convergent, especially in noisy conditions; in our experiments, however, OSEM-PDE benefits from OS acceleration and remains stable and convergent, while OSEM-MRP fails to converge.
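The anisotropic diffusion regularizer can be illustrated with a single Perona-Malik-style update step (a generic sketch of edge-preserving diffusion, not the paper's OSEM-PDE implementation):

```python
import numpy as np

def perona_malik_step(u, dt=0.2, kappa=0.1):
    """One explicit Euler step of Perona-Malik anisotropic diffusion.
    The conductance g = exp(-(grad/kappa)^2) shrinks near strong
    edges, so diffusion smooths flat regions while preserving edges."""
    # One-sided differences toward the four neighbors, zero at borders
    dn = np.roll(u, -1, axis=0) - u; dn[-1] = 0
    ds = np.roll(u,  1, axis=0) - u; ds[0] = 0
    de = np.roll(u, -1, axis=1) - u; de[:, -1] = 0
    dw = np.roll(u,  1, axis=1) - u; dw[:, 0] = 0
    g = lambda d: np.exp(-(d / kappa) ** 2)
    return u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)

# A clean step edge survives the diffusion essentially unchanged,
# because the conductance vanishes across the large jump.
u = np.zeros((8, 8))
u[:, 4:] = 1.0
v = perona_malik_step(u)
```

In a Bayesian one-step-late scheme, such a diffusion term plays the role of the gradient of the prior energy evaluated at the current image estimate.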
Multimodal manifold-regularized transfer learning for MCI conversion prediction.
Cheng, Bo; Liu, Mingxia; Suk, Heung-Il; Shen, Dinggang; Zhang, Daoqiang
2015-12-01
As the early stage of Alzheimer's disease (AD), mild cognitive impairment (MCI) has a high chance of converting to AD. Effective prediction of such conversion from MCI to AD is of great importance for early diagnosis of AD and for evaluating AD risk pre-symptomatically. Unlike most previous methods that use only samples from the target domain to train a classifier, in this paper we propose a novel multimodal manifold-regularized transfer learning (M2TL) method that jointly utilizes samples from another domain (e.g., AD vs. normal controls (NC)) as well as unlabeled samples to boost the performance of MCI conversion prediction. Specifically, the proposed M2TL method includes two key components. The first is a kernel-based maximum mean discrepancy criterion, which helps eliminate the potential negative effect induced by the distributional difference between the auxiliary domain (i.e., AD and NC) and the target domain (i.e., MCI converters (MCI-C) and MCI non-converters (MCI-NC)). The second is a semi-supervised multimodal manifold-regularized least squares classification method, in which the target-domain samples, the auxiliary-domain samples, and the unlabeled samples are jointly used to train our classifier. Furthermore, by integrating a group sparsity constraint into our objective function, the proposed M2TL can select informative samples to build a robust classifier. Experimental results on the Alzheimer's Disease Neuroimaging Initiative (ADNI) database validate the effectiveness of the proposed method, which achieves a classification accuracy of 80.1% for MCI conversion prediction and outperforms state-of-the-art methods. PMID:25702248
Preference for luminance histogram regularities in natural scenes.
Graham, Daniel; Schwarz, Bianca; Chatterjee, Anjan; Leder, Helmut
2016-03-01
Natural scene luminance distributions typically have positive skew, and for single objects, there is evidence that higher skew is a correlate (but not a guarantee) of glossiness. Skewness is also relevant to aesthetics: preference for glossy single objects (with high skew) has been shown even in infants, and skewness is a good predictor of fruit freshness. Given that primate vision appears to efficiently encode natural scene luminance variation, and given evidence that natural scene regularities may be a prerequisite for aesthetic perception in the spatial domain, here we ask whether humans in general prefer natural scenes with more positively skewed luminance distributions. If humans generally prefer images with the higher-order regularities typical of natural scenes and/or shiny objects, we would expect this to be the case. By manipulating luminance distribution skewness (holding mean and variance constant) for individual natural images, we show that in fact preference varies inversely with increasing positive skewness. This finding holds for: artistic landscape images and calibrated natural scenes; scenes with and without glossy surfaces; landscape scenes and close-up objects; and noise images with natural luminance histograms. Across conditions, humans prefer images with skew near zero over higher skew images, and they prefer skew lower than that of the unmodified scenes. These results suggest that humans prefer images with luminances that are distributed relatively evenly about the mean luminance, i.e., images with similar amounts of light and dark. We propose that our results reflect an efficient processing advantage of low-skew images over high-skew images, following evidence from prior brain imaging results. PMID:25872178
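The key manipulation, changing luminance skewness while holding mean and variance constant, can be sketched as follows (a log-transform stand-in, not the authors' histogram-adjustment procedure; the lognormal "luminances" are an illustrative assumption):

```python
import numpy as np

def skewness(x):
    """Sample skewness (third standardized moment)."""
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

def set_mean_var(x, mu, var):
    """Affinely rescale x to a target mean and variance; affine maps
    leave skewness unchanged."""
    z = (x - x.mean()) / x.std()
    return mu + np.sqrt(var) * z

rng = np.random.default_rng(0)
img = rng.lognormal(sigma=0.5, size=10_000)       # positively skewed "luminances"

# Reduce skew with a log transform, then restore the original mean and
# variance, mimicking a skew manipulation with the first two moments fixed.
low_skew = set_mean_var(np.log(img), img.mean(), img.var())

print(round(skewness(img), 2), round(skewness(low_skew), 2))
```

The skew-reduced image has exactly the original mean and variance, so only the higher-order statistic that the study manipulates differs between the two versions.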
Packing regularities in biological structures relate to their dynamics.
Jernigan, Robert L; Kloczkowski, Andrzej
2007-01-01
The high packing density inside proteins leads to certain geometric regularities and is also one of the most important contributors to the high degree of cooperativity that proteins manifest in their cohesive domain motions. The orientations between neighboring nonbonded residues in proteins largely follow similar geometric regularities, regardless of whether the residues are on the surface or buried, a direct result of hydrophobic forces. These orientations are relatively fixed and correspond closely to small deformations from those of the face-centered cubic lattice, which is the way identical spheres pack at the highest density. Packing density is also related to the extent of conservation of residues, and we show this relationship for residue packing densities by averaging over a large sample of residue packings. There are three regimes: (1) over a broad range of packing densities the relationship between sequence entropy and inverse packing density is nearly linear, (2) over a limited range of low packing densities the sequence entropy is nearly constant, and (3) at extremely low packing densities the sequence entropy is highly variable. These packing results provide important justification for the simple elastic network models that have been shown, for a large number of proteins, to represent protein dynamics so successfully, even when the models are extremely coarse grained. Elastic network models for polymeric chains are simple and could be combined with these protein elastic networks to represent partially denatured parts of proteins. Finally, we show results of applications of the elastic network model to study the functional motions of the ribosome, based on its known structure. These results indicate expected correlations among its components for the step-wise processing steps in protein synthesis, and suggest ways to use these elastic network models to develop more detailed mechanisms, an important possibility because most
Regularized reconstruction of wave fields from refracted images of water
NASA Astrophysics Data System (ADS)
Choudhury, K. Roy; O'Sullivan, F.; Samanta, M.; Shrira, V.; Caulliez, G.
2009-04-01
Refractive imaging of wave fields is often used for observation of short gravity and gravity-capillary waves in wave tanks and in the field. A light box placed under the waves emits light of spatially graduated intensity. The refracted light intensity image recorded overhead can be related to the wave slope field using a system of equations derived from the laws of refraction. Previous authors have proposed a two-stage reconstruction strategy for the recovery of wave slope and height fields: (i) estimation of local slope fields; (ii) global reconstruction of height and slope fields using the local estimates. Our statistical analysis of local slope estimates reveals that estimation error variability increases considerably from the bright to the dark ends of the imaging area, with some concomitant bias. The reconstruction problem behaves like an ill-posed inverse problem in the dark areas of the image. Ill-posedness is addressed by a reconstruction method that imposes Tikhonov regularization of directional wave slopes using penalized least squares. Other proposed refinements include (a) bias correction of local slope estimates, (b) spatially weighted reconstruction using the estimated variability of local slope estimates, and (c) more accurate estimates of reference light profiles from time-sequence data. A computationally efficient algorithm that exploits sparsity in the resulting system of equations is employed to evaluate the regularized estimator. Simulation studies show that the refinements can result in substantial improvements in the mean squared error of reconstruction. The algorithm is applied to obtain wave field reconstructions from video recordings. Analysis of various video sequences demonstrates distinct spatial patterns at different wind speed and fetch combinations.
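A toy version of the weighted, Tikhonov-penalized reconstruction can be sketched in one dimension (illustrative only: the actual method reconstructs 2-D slope and height fields, and the signal, the noise ramp mimicking the bright-to-dark variance growth, and the penalty weight `lam` are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
true = np.sin(np.linspace(0, 2 * np.pi, n))        # smooth "slope" profile
sigma = np.linspace(0.05, 0.5, n)                  # noise grows toward the dark end
obs = true + sigma * rng.standard_normal(n)

# First-difference operator: penalizes rough reconstructions.
D = np.diff(np.eye(n), axis=0)
W = np.diag(1.0 / sigma**2)                        # weight by inverse error variance
lam = 20.0

# Weighted Tikhonov-regularized least squares:
#   x_hat = argmin_x (obs - x)' W (obs - x) + lam * ||D x||^2
x_hat = np.linalg.solve(W + lam * D.T @ D, W @ obs)

# Regularization helps most where the data are noisy (the "dark" end).
print(np.linalg.norm(x_hat - true) < np.linalg.norm(obs - true))
```

The inverse-variance weights keep the reconstruction faithful in the well-measured bright region while the smoothness penalty stabilizes the ill-posed dark region, the same division of labour the abstract describes.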
Mathematical strategies for filtering complex systems: Regularly spaced sparse observations
Harlim, J.; Majda, A. J.
2008-05-01
Real-time filtering of noisy turbulent signals through sparse observations on a regularly spaced mesh is a notoriously difficult and important prototype filtering problem. Simpler off-line test criteria are proposed here as guidelines for filter performance for these stiff multi-scale filtering problems in the context of linear stochastic partial differential equations with turbulent solutions. Filtering turbulent solutions of the stochastically forced dissipative advection equation through sparse observations is developed as a stringent test bed for filter performance with sparse regular observations. The standard ensemble transform Kalman filter (ETKF) has poor skill on the test bed and, surprisingly, even suffers from filter divergence at observable times with resonant mean forcing and a decaying energy spectrum in the partially observed signal. Systematic alternative filtering strategies are developed here, including the Fourier Domain Kalman Filter (FDKF) and various reduced filters called the Strongly Damped Approximate Filter (SDAF), Variance Strongly Damped Approximate Filter (VSDAF), and Reduced Fourier Domain Kalman Filter (RFDKF), which operate only on the primary Fourier modes associated with the sparse observation mesh while nevertheless incorporating into the approximate filter various features of the interaction with the remaining modes. It is shown that these much cheaper alternative filters have significant skill on the test bed of turbulent solutions, exceeding that of ETKF and in various regimes often exceeding that of FDKF, provided that the approximate filters are guided by the off-line test criteria. The skill of the various approximate filters depends on the energy spectrum of the turbulent signal and the observation time relative to the decorrelation time of the turbulence at a given spatial scale, in a precise fashion elucidated here.
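The flavour of Fourier-domain Kalman filtering can be conveyed with a scalar sketch: each Fourier mode of a stochastically forced dissipative advection equation behaves like a damped, stochastically driven scalar, which a textbook Kalman filter tracks (the damping `F`, noise levels, and step count below are illustrative assumptions, not values from the paper):

```python
import numpy as np

# One mode evolves as a damped autoregressive process:
#   u_{n+1} = F u_n + sigma * xi_n,   observed as  v_n = u_n + robs * eta_n.
rng = np.random.default_rng(0)
F, sigma, robs, n_steps = 0.9, 0.5, 0.5, 2000

u, truth, obs = 0.0, [], []
for _ in range(n_steps):
    u = F * u + sigma * rng.standard_normal()
    truth.append(u)
    obs.append(u + robs * rng.standard_normal())

m, P = 0.0, 1.0                                     # filter mean and variance
est = []
for v in obs:
    m, P = F * m, F**2 * P + sigma**2               # forecast step
    K = P / (P + robs**2)                           # Kalman gain
    m, P = m + K * (v - m), (1 - K) * P             # analysis step
    est.append(m)

truth, obs, est = map(np.array, (truth, obs, est))
# The filtered estimate beats the raw observations in RMS error.
print(np.sqrt(((est - truth)**2).mean()) < np.sqrt(((obs - truth)**2).mean()))
```

Running one such filter per observed Fourier mode is the basic idea behind FDKF; the reduced filters in the paper further approximate how the unobserved aliased modes feed into these updates.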
Regular Football Practice Improves Autonomic Cardiac Function in Male Children
Fernandes, Luis; Oliveira, Jose; Soares-Miranda, Luisa; Rebelo, Antonio; Brito, Joao
2015-01-01
Background: The role of the autonomic nervous system (ANS) in cardiovascular regulation is of primal importance, since ANS dysfunction has been associated with adverse conditions such as cardiac arrhythmias, sudden death, sleep disorders, hypertension and obesity. Objectives: The present study aimed to investigate the impact of recreational football practice on the autonomic cardiac function of male children, as measured by heart rate variability. Patients and Methods: Forty-seven male children aged 9 - 12 years were selected according to their engagement with football-oriented practice outside the school context. The children were divided into a football group (FG; n = 22) and a control group (CG; n = 25). The FG had regular football practice, with 2 weekly training sessions and occasional weekend matches. The CG was not engaged in any physical activity other than complementary school-based physical education classes. Data on physical activity, physical fitness, and heart rate variability measured in the time and frequency domains were obtained. Results: The anthropometric and body composition characteristics were similar in both groups (P > 0.05). The groups were also similar in time spent daily on moderate-to-vigorous physical activities (FG vs. CG: 114 ± 64 vs. 87 ± 55 minutes; P > 0.05). However, the FG performed better (P < 0.05) in the Yo-Yo intermittent endurance test (1394 ± 558 vs. 778 ± 408 m) and the 15-m sprint test (3.06 ± 0.17 vs. 3.20 ± 0.23 s). The FG also presented enhanced autonomic function: significant differences were detected (P < 0.05) between groups for low-frequency normalized units (38.0 ± 15.2 vs. 47.3 ± 14.2 n.u. (normalized units)), high-frequency normalized units (62.1 ± 15.2 vs. 52.8 ± 14.2 n.u.), and the LF:HF ratio (0.7 ± 0.4 vs. 1.1 ± 0.6). Conclusions: Children engaged in regular football practice presented enhanced physical fitness and autonomic function, through increased vagal tone at rest. PMID:26448848
The regularity of primary and secondary muscle spindle afferent discharges
Matthews, P. B. C.; Stein, R. B.
1969-01-01
1. The patterns of nerve impulses in the afferent fibres from muscle spindles have been studied using the soleus muscle of the decerebrate cat. Impulses from up to five single units were recorded simultaneously on magnetic tape, while the muscle was stretched to a series of different lengths. Various statistics were later determined by computer analysis. 2. After the ventral roots were cut to eliminate any motor outflow to the muscle spindles, both primary and secondary spindle endings discharged very regularly. At frequencies around 30 impulses/sec the coefficient of variation of the interspike interval distributions had a mean value of only 0.02 for the secondary endings and 0.058 for the primary endings. The values obtained for the two kinds of ending did not overlap. 3. When the ventral roots were intact, the 'spontaneous' fusimotor activity considerably increased the variability of both kinds of endings. Secondary endings still discharged much more regularly than primary endings, even when the fusimotor activity increased the frequency of firing equally for the two kinds of endings. At frequencies around 30/sec the average coefficient of variation of the interval distributions was then 0.064 for the secondary endings and 0.25 for the primary endings. 4. When the ventral roots were intact there was usually an inverse relation between the values of successive interspike intervals. The first serial correlation coefficient often had values down to -0.6 for both kinds of ending. Higher order serial correlation coefficients were also computed. 5. Approximate calculations, based on the variability observed when the ventral roots were intact, suggested that when the length of the muscle was constant an observer analysing a 1 sec period of discharge from a single primary ending would only be able to distinguish about six different lengths of the muscle. The corresponding figure for a secondary ending was twenty-five lengths. 6. The increase in variability with
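The paper's regularity statistic, the coefficient of variation of interspike intervals, is straightforward to compute; the toy spike trains below reuse the de-efferented variabilities reported in point 2 (0.02 and 0.058), with Gaussian jitter as an illustrative assumption:

```python
import numpy as np

def cv(intervals):
    """Coefficient of variation (SD / mean) of interspike intervals."""
    intervals = np.asarray(intervals, dtype=float)
    return intervals.std() / intervals.mean()

rng = np.random.default_rng(0)
mean_isi = 1 / 30.0                                 # ~30 impulses/sec
# Toy trains: secondary endings discharge more regularly than primary ones.
secondary = mean_isi * (1 + 0.02 * rng.standard_normal(1000))
primary = mean_isi * (1 + 0.058 * rng.standard_normal(1000))

print(cv(secondary) < cv(primary))
```

A lower CV means a more regular discharge, which is why, in point 5, a downstream observer can resolve many more muscle lengths from a secondary ending than from a primary one.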
Physiological time-series analysis: what does regularity quantify?
NASA Technical Reports Server (NTRS)
Pincus, S. M.; Goldberger, A. L.
1994-01-01
Approximate entropy (ApEn) is a recently developed statistic quantifying regularity and complexity that appears to have potential application to a wide variety of physiological and clinical time-series data. The focus here is to provide a better understanding of ApEn to facilitate its proper utilization, application, and interpretation. After giving the formal mathematical description of ApEn, we provide a multistep description of the algorithm as applied to two contrasting clinical heart rate data sets. We discuss algorithm implementation and interpretation and introduce a general mathematical hypothesis of the dynamics of a wide class of diseases, indicating the utility of ApEn to test this hypothesis. We indicate the relationship of ApEn to variability measures, the Fourier spectrum, and algorithms motivated by study of chaotic dynamics. We discuss further mathematical properties of ApEn, including the choice of input parameters, statistical issues, and modeling considerations, and we conclude with a section on caveats to ensure correct ApEn utilization.
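ApEn(m, r) has a compact standard formulation (following Pincus' definition, with self-matches included); the parameter choices m = 2 and r = 0.2·SD below are common defaults, not prescriptions from this paper:

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)            # common heuristic: r = 20% of the SD
    N = len(x)

    def phi(m):
        # All length-m template vectors.
        templates = np.array([x[i:i + m] for i in range(N - m + 1)])
        # For each template, the fraction of templates within Chebyshev
        # distance r (self-matches included, so counts >= 1).
        counts = np.array([
            np.sum(np.max(np.abs(templates - t), axis=1) <= r)
            for t in templates
        ])
        return np.mean(np.log(counts / (N - m + 1)))

    return phi(m) - phi(m + 1)

# Regular signals score low; irregular (noisy) signals score high.
regular = np.sin(np.linspace(0, 8 * np.pi, 300))
noisy = np.random.default_rng(0).standard_normal(300)
print(approximate_entropy(regular) < approximate_entropy(noisy))
```

The statistic measures how often patterns of length m that match within tolerance r continue to match at length m + 1, so a predictable series yields a value near zero.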
Zeroth-order regular approximation approach to molecular parity violation
Berger, Robert; Langermann, Norbert; Wuellen, Christoph van
2005-04-01
We present an ab initio (quasirelativistic) two-component approach to the computation of molecular parity-violating effects which is based on the zeroth-order regular approximation (ZORA). As a first application, we compute the parity-violating energy differences between various P and M conformations of C2-symmetric molecules belonging to the series H2X2 with X = O, S, Se, Te, Po. The results are compared to previously reported (relativistic) four-component Dirac-Hartree-Fock-Coulomb (DHFC) data. Relative deviations between ZORA and DHFC values are well below 2% for diselane and the heavier homologs, whereas somewhat larger relative deviations are observed for the lighter homologs. The larger deviations for lighter systems are attributed to the (nonlocal) exchange terms coupling large and small components, which have been neglected in the present ZORA implementation. For heavier systems these play a minor role, which explains the good performance of the ZORA approach. An excellent performance, even for lighter systems, is expected for a related density-functional-theory-based ZORA, because then the exchange terms coupling large and small components are absent.
A regularized multivariate regression approach for eQTL analysis
Zhang, Hexin; Zhang, Yuzheng; Hsu, Li; Wang, Pei
2013-01-01
Expression quantitative trait loci (eQTLs) are genomic loci that regulate expression levels of mRNAs or proteins. Understanding these regulatory relationships provides important clues to the biological pathways that underlie diseases. In this paper, we propose a new statistical method, GroupRemMap, for identifying eQTLs. We model the relationship between gene expression and single nucleotide variants (SNVs) through multivariate linear regression models, in which gene expression levels are responses and SNV genotypes are predictors. To handle the high dimensionality as well as to incorporate the intrinsic group structure of SNVs, we introduce a new regularization scheme to (1) control the overall sparsity of the model, (2) encourage the group selection of SNVs from the same gene, and (3) facilitate the detection of trans-hub eQTLs. We apply the proposed method to the colorectal and breast cancer data sets from The Cancer Genome Atlas (TCGA) and identify several biologically interesting eQTLs. These findings may provide insight into the biological processes associated with cancers and generate hypotheses for future studies. PMID:26085849
Structural regularities in the lithosphere of continents and plate tectonics
NASA Astrophysics Data System (ADS)
Pavlenkova, N. I.
1995-03-01
Two fundamental, but competing earth science concepts have been under discussion in Russia. The first one, that of endogenous regimes, is based on the assumption that permanent vertical relationships or long-term interactions between the crust and upper mantle control crustal evolution. Significant horizontal movements of the lithosphere, as required by the second concept, that of global plate tectonics, would destroy these crust-mantle interactions. Certain regular features of the crust and upper mantle support the endogenous regime concept and are difficult to explain in terms of conventional plate tectonics. In particular, the close correlation between near-surface features and deep (> 400 km) mantle inhomogeneities suggests that many geological structures are deeply rooted in the mantle. Moreover, geophysical studies have failed to reveal a well-defined and continuous asthenosphere at relatively shallow depths (~100 km) that would allow lithospheric plates to be transported over large distances, and the rheology of the lithosphere itself is found to be sufficiently inhomogeneous as to cast doubt on the principle of thin rigid plates. In contrast, palaeomagnetic and other data require that horizontal movements of many near-surface geological structures must have taken place. To explain this apparent contradiction, it is suggested here that the crust and its connected deep root are capable of gliding along one of the deep mantle phase transition zones with respect to the inner Earth.
Statistical regularities in the return intervals of volatility
NASA Astrophysics Data System (ADS)
Wang, F.; Weber, P.; Yamasaki, K.; Havlin, S.; Stanley, H. E.
2007-01-01
We discuss recent results concerning statistical regularities in the return intervals of volatility in financial markets. In particular, we show how the analysis of volatility return intervals, defined as the time between two volatilities larger than a given threshold, can help to get a better understanding of the behavior of financial time series. We find scaling in the distribution of return intervals for thresholds ranging over a factor of 25, from 0.6 to 15 standard deviations, and also for various time windows from one minute up to 390 min (an entire trading day). Moreover, these results are universal for different stocks, commodities, interest rates as well as currencies. We also analyze the memory in the return intervals, which relates to the memory in the volatility, and find two scaling regimes, ℓ < ℓ* with α1 = 0.64 ± 0.02 and ℓ > ℓ* with α2 = 0.92 ± 0.04; these exponent values are similar to the results of Liu et al. for the volatility. As an application, we use the scaling and memory properties of the return intervals to suggest a possibly useful method for estimating risk.
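Return intervals as defined in the abstract are simple to extract; the |Gaussian| series below is a toy volatility proxy (an illustrative assumption, since real volatility has the long-range memory this proxy lacks):

```python
import numpy as np

def return_intervals(vol, threshold):
    """Waiting times between successive volatilities exceeding
    `threshold` (given in units of the series' standard deviation)."""
    vol = np.asarray(vol, dtype=float)
    exceed = np.flatnonzero(vol > threshold * vol.std())
    return np.diff(exceed)

rng = np.random.default_rng(0)
vol = np.abs(rng.standard_normal(100_000))   # memoryless toy "volatility"

r1 = return_intervals(vol, 1.0)
r2 = return_intervals(vol, 2.0)
# Raising the threshold lengthens the mean waiting time between events.
print(round(r1.mean(), 1), round(r2.mean(), 1))
```

The scaling result in the abstract says that, for real markets, these interval distributions collapse onto one curve once rescaled by their mean, across a wide range of thresholds.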
Provably optimal parallel transport sweeps on regular grids
Adams, M. P.; Adams, M. L.; Hawkins, W. D.; Smith, T.; Rauchwerger, L.; Amato, N. M.; Bailey, T. S.; Falgout, R. D.
2013-07-01
We have found provably optimal algorithms for full-domain discrete-ordinate transport sweeps on regular grids in 3D Cartesian geometry. We describe these algorithms and sketch a proof that they always execute the full eight-octant sweep in the minimum possible number of stages for a given Px × Py × Pz partitioning. Computational results demonstrate that our optimal scheduling algorithms execute sweeps in the minimum possible stage count. Observed parallel efficiencies agree well with our performance model. An older version of our PDT transport code achieves almost 80% parallel efficiency on 131,072 cores, on a weak-scaling problem with only one energy group, 80 directions, and 4096 cells/core. A newer version is less efficient at present (we are still improving its implementation) but achieves almost 60% parallel efficiency on 393,216 cores. These results conclusively demonstrate that sweeps can perform with high efficiency on core counts approaching 10^6.
Can static regular black holes form from gravitational collapse?
NASA Astrophysics Data System (ADS)
Zhang, Yiyang; Zhu, Yiwei; Modesto, Leonardo; Bambi, Cosimo
2015-02-01
Starting from the Oppenheimer-Snyder model, we know how in classical general relativity the gravitational collapse of matter forms a black hole with a central spacetime singularity. It is widely believed that the singularity must be removed by quantum-gravity effects. Some static quantum-inspired singularity-free black hole solutions have been proposed in the literature, but when one considers simple examples of gravitational collapse the classical singularity is replaced by a bounce, after which the collapsing matter expands forever. We may expect three possible explanations: (i) the static regular black hole solutions are not physical, in the sense that they cannot be realized in Nature; (ii) the final product of the collapse is not unique, but depends on the initial conditions; or (iii) boundary effects play an important role and our simple models miss important physics. In the latter case, after proper adjustment, the bouncing solution would approach the static one. We argue that the "correct answer" may be related to the appearance of a ghost state in de Sitter spacetimes with super-Planckian mass. Our black holes indeed have a de Sitter core, and the ghost would make these configurations unstable. Therefore we believe that these static black hole solutions represent the transient phase of a gravitational collapse but never survive as asymptotic states.
Grouping by closure influences subjective regularity and implicit preference
Makin, Alexis; Pecchinenda, Anna; Bertamini, Marco
2012-01-01
A reflection between a pair of contours is more rapidly detected than a translation, but this effect is stronger when the contours are closed to form a single object compared to when they are closed to form 2 objects with a gap between them. That is, grouping changes the relative salience of different regularities. We tested whether this manipulation would also change preference for reflection or translation. We measured preference for these patterns using the Implicit Association Test (IAT). On some trials, participants saw words that were either positive or negative and had to classify them as quickly as possible. On interleaved trials, they saw reflection or translation patterns and again had to classify them. Participants were faster when 1 button was used for reflection and positive words and another button was used for translation and negative words, compared to when the reverse response mapping was used (translation and positive vs. reflection and negative). This reaction time difference indicates an implicit preference for reflection over translation. However, the size of the implicit preference was significantly reduced in the Two-objects condition. We concluded that factors that affect perceptual sensitivity also systematically affect implicit preference formation. PMID:23145305
Regular branched Macromolecules: Structure of Bottlebrush Polymers in Solution
NASA Astrophysics Data System (ADS)
Pakula, T.; Rathgeber, S.; Matyjaszewski, K.
2001-03-01
The shape and internal structure of bottlebrush (comb) macromolecules under good solvent conditions have been studied using small-angle neutron scattering and computer simulations. The form factor S(Q) was measured at low concentrations in toluene for comb polymers consisting of a p(BPEM) backbone with p(nBA) side chains. The following intramolecular parameters were varied: (1) backbone length, (2) grafting density, and (3) length of the side chains. Using models which have been successfully applied to other regularly branched polymers, we derive the range of the hydrodynamic interaction within the polymer and the particle dimensions, from which we can infer the overall shape of the macromolecular brush. In addition, we determined the radius of gyration of the backbone R_g^bb and of the side chains R_g^sc. These parameters give information about the stiffness of the polymer. Experimental findings are compared with computer simulation results for a single bottlebrush macromolecule obtained with the cooperative motion algorithm. The simulation gives direct access to R_g^bb and R_g^sc and allows an independent determination of S(Q). Good agreement between experiment and simulation has been found.
Markov Boundary Discovery with Ridge Regularized Linear Models
Visweswaran, Shyam
2016-01-01
Ridge regularized linear models (RRLMs), such as ridge regression and the SVM, are a popular group of methods that are used in conjunction with coefficient hypothesis testing to discover explanatory variables with a significant multivariate association to a response. However, many investigators are reluctant to draw causal interpretations of the selected variables due to the incomplete knowledge of the capabilities of RRLMs in causal inference. Under reasonable assumptions, we show that a modified form of RRLMs can get “very close” to identifying a subset of the Markov boundary by providing a worst-case bound on the space of possible solutions. The results hold for any convex loss, even when the underlying functional relationship is nonlinear, and the solution is not unique. Our approach combines ideas in Markov boundary and sufficient dimension reduction theory. Experimental results show that the modified RRLMs are competitive against state-of-the-art algorithms in discovering part of the Markov boundary from gene expression data. PMID:27170915
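A minimal closed-form ridge estimator shows the kind of coefficient behaviour such hypothesis tests rely on (dimensions, signal strengths, and the penalty `lam` are illustrative assumptions; real RRLM applications, as in the paper, often have far more variables than samples):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimator: (X'X + lam*I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                  # only three variables matter
y = X @ beta + 0.1 * rng.standard_normal(n)

coef = ridge(X, y, lam=1.0)
# Relevant coefficients are recovered (slightly shrunk toward zero);
# the rest stay near zero, which is what coefficient tests pick up.
print(np.round(coef, 2))
```

Coefficient hypothesis testing on such fits selects the variables with non-negligible coefficients; the paper's contribution is characterizing how close that selected set comes to the Markov boundary.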
Infants with Williams syndrome detect statistical regularities in continuous speech.
Cashon, Cara H; Ha, Oh-Ryeong; Graf Estes, Katharine; Saffran, Jenny R; Mervis, Carolyn B
2016-09-01
Williams syndrome (WS) is a rare genetic disorder associated with delays in language and cognitive development. The reasons for the language delay are unknown. Statistical learning is a domain-general mechanism recruited for early language acquisition. In the present study, we investigated whether infants with WS were able to detect the statistical structure in continuous speech. Eighteen 8- to 20-month-olds with WS were familiarized with 2 min of a continuous stream of synthesized nonsense words; the statistical structure of the speech was the only cue to word boundaries. They were tested on their ability to discriminate statistically defined "words" and "part-words" (which crossed word boundaries) in the artificial language. Despite significant cognitive and language delays, infants with WS were able to detect the statistical regularities in the speech stream. These findings suggest that an inability to track the statistical properties of speech is unlikely to be the primary basis for the delays in the onset of language observed in infants with WS. These results provide the first evidence of statistical learning by infants with developmental delays. PMID:27299804
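The statistical cue in such streams is the transitional probability between adjacent syllables, which is easy to compute (the three-syllable "words" and syllable names below are made up for illustration, not the study's stimuli):

```python
import numpy as np
from collections import Counter

def transitional_probabilities(syllables):
    """P(next | current) for each adjacent syllable pair in a stream."""
    pairs = Counter(zip(syllables, syllables[1:]))
    firsts = Counter(syllables[:-1])
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

# Hypothetical three-syllable words concatenated into a continuous stream,
# so that transitional probability is the only cue to word boundaries.
words = [["tu", "pi", "ro"], ["go", "la", "bu"], ["da", "ko", "ti"]]
rng = np.random.default_rng(0)
stream = [s for _ in range(300) for s in words[rng.integers(3)]]

tp = transitional_probabilities(stream)
# Within-word transitions are deterministic (TP = 1); across a word
# boundary the next word is one of three, so TP is roughly 1/3.
print(tp[("tu", "pi")], round(tp[("ro", "go")], 2))
```

Dips in transitional probability mark the word boundaries, which is the regularity the infants in the study must have tracked to discriminate "words" from "part-words".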
Regular and chaotic dynamics of a piecewise smooth bouncer
Langer, Cameron K.; Miller, Bruce N.
2015-07-15
The dynamical properties of a particle in a gravitational field colliding with a rigid wall moving with piecewise constant velocity are studied. The linear nature of the wall's motion permits further analytical investigation than is possible for the system's sinusoidal counterpart. We consider three distinct approaches to modeling collisions: (i) elastic, (ii) inelastic with a constant restitution coefficient, and (iii) inelastic with a velocity-dependent restitution function. We confirm the existence of distinct unbounded orbits (Fermi acceleration) in the elastic model, and investigate regular and chaotic behavior in the inelastic cases. We also examine, in the constant restitution model, trajectories wherein the particle experiences an infinite number of collisions in a finite time, i.e., the phenomenon of inelastic collapse. We address these so-called “sticking solutions” and their relation to both the overall dynamics and the phenomenon of self-reanimating chaos. Additionally, we investigate the long-term behavior of the system as a function of both initial conditions and parameter values. We find that the non-smooth nature of the system produces novel bifurcation phenomena not seen in the sinusoidal model, including border-collision bifurcations. The analytical and numerical investigations reveal that although our piecewise linear bouncer is a simplified version of the sinusoidal model, the former not only captures essential features of the latter but also exhibits behavior unique to the discontinuous dynamics.
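Inelastic collapse has a clean closed form in the simplest related setting, a ball bouncing on a static floor: flight times form a geometric series, so infinitely many collisions complete in finite time (this reduced example illustrates the phenomenon only; it is not the paper's moving-wall model):

```python
import math

def collapse_time(h, e, g=9.81):
    """Total time for a ball dropped from height h with restitution
    coefficient e < 1 to complete infinitely many bounces (closed form):
    flight times after the n-th bounce scale as e**n, a geometric series."""
    t0 = math.sqrt(2 * h / g)            # duration of the initial fall
    return t0 * (1 + e) / (1 - e)

def collapse_time_numeric(h, e, g=9.81, n=10_000):
    """Same quantity by summing bounce flight times directly."""
    t0 = math.sqrt(2 * h / g)
    total, flight = t0, 2 * t0 * e       # first up-and-down flight lasts 2*t0*e
    for _ in range(n):
        total += flight
        flight *= e
    return total

print(abs(collapse_time(1.0, 0.5) - collapse_time_numeric(1.0, 0.5)) < 1e-9)
```

The finite total time despite infinitely many impacts is exactly the "sticking" mechanism; in the paper's piecewise-moving-wall system the same effect interacts with the wall's motion to produce self-reanimating chaos.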
Dynamically self-regular quantum harmonic black holes
NASA Astrophysics Data System (ADS)
Spallucci, Euro; Smailagic, Anais
2015-04-01
The recently proposed UV self-complete quantum gravity program is a new and very interesting way to envision Planckian/trans-Planckian physics. In this new framework, high energy scattering is dominated by the creation of micro black holes, and it is experimentally impossible to probe distances shorter than the horizon radius. In this letter we present a model which realizes this idea through the creation of self-regular quantum black holes admitting a minimal size extremal configuration. Their radius provides a dynamically generated minimal length acting as a universal short-distance cutoff. We propose a quantization scheme for this new kind of microscopic object based on a Bohr-like approach, which does not require a detailed knowledge of quantum gravity. The resulting black hole quantum picture resembles the energy spectrum of a quantum harmonic oscillator, with the mass of the extremal configuration playing the role of the zero-point energy. In the limit of large quantum numbers, the classical black hole description is re-established. Finally, we also formulate a "quantum hoop conjecture" which is satisfied by all the mass eigenstates and sustains the existence of quantum black holes sourced by Gaussian matter distributions.
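As a purely illustrative rendering of the analogy stated in the abstract (the paper's exact spectrum may differ), a harmonic-oscillator-like mass ladder with the extremal mass $M_e$ as the zero-point term would read:

```latex
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right)
\quad\longleftrightarrow\quad
M_n = 2M_e\left(n + \tfrac{1}{2}\right) = M_e\,(2n+1),
\qquad M_0 = M_e ,
```

so the lowest eigenstate is the extremal configuration itself, and for large $n$ the evenly spaced levels become quasi-continuous, recovering the classical black hole description.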
Regularized MANOVA (rMANOVA) in untargeted metabolomics.
Engel, J; Blanchet, L; Bloemen, B; van den Heuvel, L P; Engelke, U H F; Wevers, R A; Buydens, L M C
2015-10-29
Many advanced metabolomics experiments currently lead to data where a large number of response variables were measured while one or several factors were changed. Often the number of response variables vastly exceeds the sample size and well-established techniques such as multivariate analysis of variance (MANOVA) cannot be used to analyze the data. ANOVA simultaneous component analysis (ASCA) is an alternative to MANOVA for analysis of metabolomics data from an experimental design. In this paper, we show that ASCA assumes that none of the metabolites are correlated and that they all have the same variance. Because of these assumptions, ASCA may relate the wrong variables to a factor. This reduces the power of the method and hampers interpretation. We propose an improved model that is essentially a weighted average of the ASCA and MANOVA models. The optimal weight is determined in a data-driven fashion. Compared to ASCA, this method assumes that variables can correlate, leading to a more realistic view of the data. Compared to MANOVA, the model is also applicable when the number of samples is (much) smaller than the number of variables. These advantages are demonstrated by means of simulated and real data examples. The source code of the method is available from the first author upon request, and at the following github repository: https://github.com/JasperE/regularized-MANOVA. PMID:26547490
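The weighted-average idea can be sketched as a shrinkage estimate of the within-group covariance, interpolating between the ASCA-like assumption (uncorrelated, equal-variance metabolites) and the full MANOVA covariance. This is an illustrative reading of the method, not the authors' exact estimator, and the weight here is fixed rather than chosen in a data-driven fashion:

```python
import numpy as np

def regularized_within_cov(X, labels, weight):
    """Compromise between the MANOVA within-group covariance (full,
    correlated) and an ASCA-like target (spherical: equal variance,
    no correlation), as a weighted average of the two. `weight` in
    [0, 1]; weight=0 gives the spherical target, weight=1 gives MANOVA."""
    X = np.asarray(X, float)
    # within-group (residual) covariance, as in MANOVA
    resid = np.vstack([X[labels == g] - X[labels == g].mean(0)
                       for g in np.unique(labels)])
    W = resid.T @ resid / len(X)
    # ASCA-like target: spherical covariance with the same total variance
    target = np.eye(X.shape[1]) * np.trace(W) / X.shape[1]
    return (1 - weight) * target + weight * W

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 50))        # n = 12 samples, 50 metabolites (p >> n)
labels = np.repeat([0, 1, 2], 4)     # three experimental groups
S = regularized_within_cov(X, labels, weight=0.3)
print(S.shape)  # (50, 50)
```

For any weight below 1 the estimate stays positive definite even when the number of variables vastly exceeds the sample size, which is exactly the regime where plain MANOVA breaks down.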
Satisfiability Threshold for Random Regular NAE-SAT
NASA Astrophysics Data System (ADS)
Ding, Jian; Sly, Allan; Sun, Nike
2016-01-01
We consider the random regular k-NAE-SAT problem with n variables, each appearing in exactly d clauses. For all k exceeding an absolute constant k_0, we establish explicitly the satisfiability threshold d_* = d_*(k). We prove that for d < d_* the problem is satisfiable with high probability, while for d > d_* the problem is unsatisfiable with high probability. If the threshold d_* lands exactly on an integer, we show that the problem is satisfiable with probability bounded away from both zero and one. This is the first result to locate the exact satisfiability threshold in a random constraint satisfaction problem exhibiting the condensation phenomenon identified by Krzakała et al. [Proc Natl Acad Sci 104(25):10318-10323, 2007]. Our proof verifies the one-step replica symmetry breaking formalism for this model. We expect our methods to be applicable to a broad range of random constraint satisfaction problems and combinatorial problems on random graphs.
Cryptococcal pleuritis developing in a patient on regular hemodialysis.
Kinjo, K; Satake, S; Ohama, T
2009-09-01
A 64-year-old male on regular hemodialysis who was a human T lymphotrophic virus type I (HTLV-I) carrier developed cryptococcal pleuritis. The initial manifestations of the present case were a persistent cough and the accumulation of unilateral pleural effusion. A culture of the patient's pleural fluid grew Cryptococcus neoformans, and a test for antigens against Cryptococcus neoformans in the pleural fluid was also positive; cryptococcal pleuritis was therefore diagnosed. Pleural cryptococcosis per se is rare, and it is extremely rare for a dialysis patient to develop it. To our knowledge, only a few cases of cryptococcal pleuritis have so far been reported in patients on dialysis. Furthermore, an isolated occurrence of cryptococcal pleuritis with no cryptococcal pulmonary parenchymal lesions, as was seen in the present case, is rare because cryptococcal pleuritis is usually associated with underlying cryptococcal pulmonary parenchymal lesions. Patients on chronic dialysis are susceptible to developing pleural effusion from many etiologies such as congestive heart failure, infection (tuberculous, bacterial, viral, parasitic, fungal), collagen vascular disease, drug reaction, metastasis, or uremia itself. Cryptococcal pleuritis developing in a dialysis patient is extremely rare, but physicians should consider cryptococcal infection as a possible cause when pleural effusion develops in a dialysis patient and no other cause is identified, as occurred in the present case. PMID:19761731
Scene recognition by manifold regularized deep learning architecture.
Yuan, Yuan; Mou, Lichao; Lu, Xiaoqiang
2015-10-01
Scene recognition is an important problem in the field of computer vision, because it helps to narrow the gap between computers and human beings in scene understanding. Semantic modeling is a popular technique used to fill the semantic gap in scene recognition. However, most semantic modeling approaches learn shallow, one-layer representations for scene recognition, while ignoring the structural information shared between images, often resulting in poor performance. A manifold regularized deep architecture, modeled after the human visual system and intended to inherit humanlike judgment, is proposed for scene recognition. The proposed deep architecture exploits the structural information of the data, forming a mapping between the visible layer and the hidden layer. With the proposed approach, a deep architecture can be designed to learn high-level features for scene recognition in an unsupervised fashion. Experiments on standard data sets show that our method outperforms state-of-the-art methods for scene recognition. PMID:25622326
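The manifold-regularization ingredient can be illustrated with a graph-Laplacian penalty on hidden representations. This is a generic sketch, not the authors' architecture; the k-NN affinity graph and Gaussian kernel are common choices assumed here:

```python
import numpy as np

def laplacian_regularizer(features, embeddings, k_neighbors=3, sigma=1.0):
    """Manifold regularization term: build a k-NN affinity graph on the
    input features and penalize hidden representations of neighboring
    images that differ, i.e. sum_ij W_ij ||h_i - h_j||^2 = 2 tr(H^T L H)."""
    X, H = np.asarray(features), np.asarray(embeddings)
    d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)   # pairwise squared dists
    W = np.exp(-d2 / (2 * sigma**2))                # Gaussian affinities
    # keep only each sample's k nearest neighbors (symmetrized)
    idx = np.argsort(d2, axis=1)[:, 1:k_neighbors + 1]
    mask = np.zeros_like(W, dtype=bool)
    mask[np.arange(len(X))[:, None], idx] = True
    W = W * (mask | mask.T)
    L = np.diag(W.sum(1)) - W                       # graph Laplacian
    return np.trace(H.T @ L @ H)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 10))    # image features (visible layer)
H = rng.normal(size=(20, 4))     # hidden-layer representations
print(laplacian_regularizer(X, H) >= 0)  # Laplacian is PSD -> True
```

Adding such a term to an unsupervised reconstruction loss encourages the hidden layer to vary smoothly along the data manifold, which is the structural information the abstract says shallow one-layer models ignore.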
Regularized reestimation of stochastic duration models for phone-classification
NASA Astrophysics Data System (ADS)
Russell, Martin J.; Jackson, Philip J. B.
2001-05-01
Recent research has compared the performance of various distributions (uniform, boxcar, exponential, gamma, discrete) for modeling segment (state) durations in hidden semi-Markov models used for phone classification on the TIMIT database. These experiments showed that a gamma distribution is more appropriate than the exponential (which is implicit in first-order Markov models), achieving a 3% relative reduction in phone-classification errors [Jackson, Proc. ICPhS, pp. 1349-1352 (2003)]. The parameters of these duration distributions were estimated once for each model from initial statistics of state occupation (offline), and remained unchanged during subsequent iterations of training. The present work investigates the effect on phone-classification scores of reestimating the duration models during training (online). First, tests were conducted on duration models reestimated directly from statistics gathered in the previous iteration of training. It was found that the boxcar and gamma models were unstable, while the performance of the other models also tended to degrade. Secondary tests, using a scheme of annealed regularization, demonstrated that the losses could be recouped, and a further 1% improvement was obtained. The results from this pilot study imply that similar gains in recognition accuracy deserve investigation, along with further optimization of the duration model reestimation procedure.
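A minimal sketch of the two ingredients: method-of-moments fitting of a gamma duration model, and an annealed (interpolated) reestimate that tempers the raw online update. The linear interpolation schedule is a hypothetical stand-in for the paper's actual annealing scheme:

```python
import math

def gamma_fit_moments(durations):
    """Method-of-moments fit of a gamma distribution (shape a, scale b)
    to observed state durations: mean = a*b, variance = a*b^2."""
    n = len(durations)
    mean = sum(durations) / n
    var = sum((d - mean) ** 2 for d in durations) / n
    return mean ** 2 / var, var / mean          # (shape, scale)

def annealed_update(old, new, tau):
    """Annealed regularization of reestimation: interpolate between the
    previous parameters and the raw reestimate. tau in [0, 1] would be
    ramped towards 1 over training iterations, so early updates are
    conservative and later ones approach the raw statistics."""
    return tuple((1 - tau) * o + tau * m for o, m in zip(old, new))

prior = (2.0, 3.0)                              # initial offline estimate
observed = [4.0, 6.0, 5.0, 7.0, 8.0, 6.0]       # durations from one iteration
raw = gamma_fit_moments(observed)
smoothed = annealed_update(prior, raw, tau=0.25)
print(all(math.isfinite(x) for x in raw + smoothed))  # True
```

Damping the update this way is one plausible reason the instabilities seen with direct reestimation of the boxcar and gamma models could be recouped.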
Regularized path integrals and anomalies: U(1) chiral gauge theory
NASA Astrophysics Data System (ADS)
Kopper, Christoph; Lévêque, Benjamin
2012-02-01
We analyze the origin of the Adler-Bell-Jackiw anomaly of chiral U(1) gauge theory within the framework of regularized path integrals. Momentum or position space regulators allow for mathematically well-defined path integrals but violate local gauge symmetry. It is known how (nonanomalous) gauge symmetry can be recovered in the renormalized theory in this case [Kopper, C. and Müller, V. F., "Renormalization of spontaneously broken SU(2) Yang-Mills theory with flow equations," Rev. Math. Phys. 21, 781 (2009)], 10.1142/S0129055X0900375X. Here we analyze U(1) chiral gauge theory to show how the appearance of anomalies manifests itself in such a context. We show that the three-photon amplitude leads to a violation of the Slavnov-Taylor identities which cannot be restored on taking the UV limit in the renormalized theory. We point out that this fact is related to the nonanalyticity of this amplitude in the infrared region.
NASA Astrophysics Data System (ADS)
Moradi, Hamid; Tang, Shuo; Salcudean, Septimiu E.
2016-03-01
We define a deconvolution-based photoacoustic reconstruction with sparsity regularization (DPARS) algorithm for image restoration from projections. The proposed method is capable of visualizing tissue in the presence of constraints such as the specific directivity of sensors and limited-view photoacoustic tomography (PAT). The directivity effect means that our algorithm treats the optically generated ultrasonic waves according to the direction from which they arrive at the transducer. Most PA image reconstruction methods assume that sensors have an omnidirectional response; in practice, however, sensors show higher sensitivity to ultrasonic waves coming from one specific direction. In DPARS, the sensitivity of the transducer to incoming waves from different directions is taken into account. Thus, the DPARS algorithm considers the relative location of the absorbers with respect to the transducers, and generates a linear system of equations to solve for the distribution of absorbers. The numerical conditioning and computing times are improved by the use of a sparse Discrete Cosine Transform (DCT) representation of the distribution of absorption coefficients. Our simulation results show that DPARS outperforms the conventional Delay-and-Sum (DAS) reconstruction method in terms of contrast-to-noise ratio (CNR) and RMS errors. Experimental results confirm that DPARS provides images with higher resolution than DAS.
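The directivity-weighted linear system and DCT-domain sparsity can be sketched in a 2-D toy problem. This is not the DPARS implementation: the cosine directivity pattern, the geometry, and the truncation level are all assumptions for illustration:

```python
import numpy as np

def dct_basis(n):
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    k, i = np.arange(n)[:, None], np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    C[0] /= np.sqrt(2.0)
    return C

def forward_matrix(absorbers, sensors, axes):
    """Linear PA forward model with per-sensor directivity: each entry
    scales a sensor's response by cos(angle) between the sensor's axis
    and the direction to the absorber (zero for waves from behind) and
    by 1/r geometric spreading. Hypothetical cosine directivity pattern."""
    A = np.zeros((len(sensors), len(absorbers)))
    for s, (pos, axis) in enumerate(zip(sensors, axes)):
        vec = absorbers - pos                 # sensor -> absorber offsets
        r = np.linalg.norm(vec, axis=1)
        cosang = (vec @ axis) / r             # axes are unit vectors
        A[s] = np.clip(cosang, 0.0, None) / r
    return A

rng = np.random.default_rng(1)
absorbers = rng.uniform(0, 1, size=(64, 2))   # absorber positions in medium
x_true = np.zeros(64); x_true[[10, 30]] = 1.0 # two point absorbers
sensors = np.array([[1.5, y] for y in np.linspace(0, 1, 32)])
axes = np.tile([-1.0, 0.0], (32, 1))          # all sensors face the medium
A = forward_matrix(absorbers, sensors, axes)

# sparsity: solve for a truncated set of DCT coefficients only
C = dct_basis(64)[:24]
coef, *_ = np.linalg.lstsq(A @ C.T, A @ x_true, rcond=None)
x_rec = C.T @ coef
print(x_rec.shape)  # (64,)
```

Restricting the unknowns to low-order DCT coefficients shrinks the system from 64 to 24 unknowns, which is the kind of conditioning and runtime benefit the abstract attributes to the sparse DCT representation.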
Regular synchrony lattices for product coupled cell networks
NASA Astrophysics Data System (ADS)
Aguiar, Manuela A. D.; Dias, Ana Paula S.
2015-01-01
There are several ways of constructing (bigger) networks from smaller networks. We consider here the Cartesian and the Kronecker (tensor) product networks. Our main aim is to determine a relation between the lattices of synchrony subspaces for a product network and for the component networks of the product. In this sense, we show how to obtain the lattice of regular synchrony subspaces for a product network from the lattices of synchrony subspaces for the component networks. Specifically, we prove that a tensor of subspaces is a synchrony subspace for the product network if and only if the subspaces involved in the tensor are synchrony subspaces for the component networks of the product. We also show that, in general, there are (irregular) synchrony subspaces for the product network that are not described by the synchrony subspaces for the component networks; thus, in general, it is not possible to obtain the full synchrony lattice for the product network from the corresponding lattices for the component networks. We also remark that the Cartesian and Kronecker products, as graph operations, are quite different, implying that the associated coupled cell systems have distinct structures. Although the kinds of dynamics expected to occur are difficult to compare, we establish an inclusion relation between the lattices of synchrony subspaces for the Cartesian and Kronecker products.
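The main result can be checked numerically via balanced colorings, whose polydiagonals are exactly the synchrony subspaces: a product coloring built from balanced colorings of the components is balanced for the Kronecker product network. The small networks below are hypothetical examples:

```python
import numpy as np
from itertools import product

def is_balanced(A, colors):
    """A coloring of the cells is balanced (i.e. its polydiagonal is a
    synchrony subspace) when any two cells of the same color receive
    the same number of inputs from each color class. Row i of the
    adjacency matrix A lists the inputs of cell i."""
    A = np.asarray(A)
    classes = sorted(set(colors))
    def profile(cell):
        return tuple(sum(A[cell, j] for j in range(len(colors))
                         if colors[j] == c) for c in classes)
    return all(profile(i) == profile(j)
               for i in range(len(colors)) for j in range(len(colors))
               if colors[i] == colors[j])

# two small component networks (adjacency matrices)
A1 = np.array([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])   # all-to-all 3-cell network
A2 = np.array([[0, 1],
               [1, 0]])      # bidirectional 2-cell network
c1 = [0, 0, 1]               # balanced coloring of A1
c2 = [0, 0]                  # trivial coloring of A2

# product coloring on the Kronecker product network
AK = np.kron(A1, A2)
cK = [(a, b) for a, b in product(c1, c2)]
print(is_balanced(A1, c1), is_balanced(A2, c2), is_balanced(AK, cK))
# True True True
```

The tuple coloring `cK` follows the Kronecker cell ordering (cell (i, j) at index i*2 + j), so its polydiagonal is precisely the tensor of the two component synchrony subspaces.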