Fischer, E A J; De Vlas, S J; Richardus, J H; Habbema, J D F
2008-09-01
Microsimulation of infectious diseases requires simulation of many life histories of interacting individuals. In particular, relatively rare infections such as leprosy need to be studied in very large populations. Computation time increases disproportionately with the size of the simulated population. We present a novel method, MUSIDH, an acronym for multiple use of simulated demographic histories, to reduce computation time. Demographic history refers to the processes of birth, death and all other demographic events that should be unrelated to the natural course of an infection; the method therefore applies to non-fatal infections. MUSIDH attaches a fixed number of infection histories to each demographic history, and these infection histories interact as if they were the infection histories of separate individuals. With two examples, mumps and leprosy, we show that the method can give a factor 50 reduction in computation time at the cost of a small loss in precision. The largest reductions are obtained for rare infections with complex demographic histories.
Fatigue loading history reconstruction based on the rain-flow technique
NASA Technical Reports Server (NTRS)
Khosrovaneh, A. K.; Dowling, N. E.
1989-01-01
Methods are considered for reducing a non-random fatigue loading history to a concise description and then for reconstructing a time history similar to the original. In particular, three methods of reconstruction based on a rain-flow cycle counting matrix are presented. A rain-flow matrix consists of the numbers of cycles at various peak and valley combinations. Two methods are based on a two-dimensional rain-flow matrix, and the third on a three-dimensional rain-flow matrix. Histories reconstructed by any of these methods produce a rain-flow matrix identical to that of the original history, so the reconstructed time history is expected to produce a fatigue life similar to that of the original. The procedures described allow lengthy loading histories to be stored in compact form.
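The cycle counting that underlies the rain-flow matrix can be illustrated with a minimal three-point rainflow counter in the spirit of ASTM E1049. This is a generic sketch for intuition, not the paper's reconstruction code; binning the resulting (range, count) pairs by peak and valley would populate the rain-flow matrix the abstract describes.

```python
def rainflow(series):
    """Three-point rainflow count: returns (range, count) pairs,
    count 1.0 for full cycles and 0.5 for residual half cycles."""
    # reduce the series to its turning points (peaks and valleys)
    tp = []
    for x in series:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x            # same direction: extend the excursion
        elif not tp or x != tp[-1]:
            tp.append(x)
    cycles, stack = [], []
    for p in tp:
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # most recent range
            y = abs(stack[-2] - stack[-3])   # previous range
            if x < y:
                break
            if len(stack) == 3:
                cycles.append((y, 0.5))      # range contains the start point
                stack.pop(0)
            else:
                cycles.append((y, 1.0))      # full cycle: remove its peak-valley pair
                last = stack.pop()
                stack.pop(); stack.pop()
                stack.append(last)
    for a, b in zip(stack, stack[1:]):       # residue counted as half cycles
        cycles.append((abs(b - a), 0.5))
    return cycles
```

On the classic ASTM example history [-2, 1, -3, 5, -1, 3, -4, 4, -2] this yields one full cycle of range 4 and half cycles of ranges 3, 4, 8, 9, 8, 6.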
Aircraft model prototypes which have specified handling-quality time histories
NASA Technical Reports Server (NTRS)
Johnson, S. H.
1976-01-01
Several techniques for obtaining linear constant-coefficient airplane models from specified handling-quality time histories are discussed. One technique, the pseudodata method, solves the basic problem, yields specified eigenvalues, and accommodates state-variable transfer-function zero suppression. The method is fully illustrated for a fourth-order stability-axis small-motion model with three lateral handling-quality time histories specified. The FORTRAN program which obtains and verifies the model is included and fully documented.
Development of an algorithm for automatic detection and rating of squeak and rattle events
NASA Astrophysics Data System (ADS)
Chandrika, Unnikrishnan Kuttan; Kim, Jay H.
2010-10-01
A new algorithm for automatic detection and rating of squeak and rattle (S&R) events was developed. The algorithm utilizes the perceived transient loudness (PTL) that approximates the human perception of a transient noise. At first, instantaneous specific loudness time histories are calculated over the 1-24 bark range by applying the analytic wavelet transform and Zwicker loudness transform to the recorded noise. Transient specific loudness time histories are then obtained by removing estimated contributions of the background noise from the instantaneous specific loudness time histories. These transient specific loudness time histories are summed to obtain the transient loudness time history. Finally, the PTL time history is obtained by applying Glasberg and Moore temporal integration to the transient loudness time history. Detection of S&R events utilizes the PTL time history obtained by summing only the 18-24 bark components to take advantage of the high signal-to-noise ratio in the high frequency range. An S&R event is identified when the value of the PTL time history exceeds the detection threshold pre-determined by a jury test. The maximum value of the PTL time history is used for rating of S&R events. Another jury test showed that the method performs much better if the PTL time history obtained by summing all frequency components is used. Therefore, rating of S&R events utilizes this modified PTL time history. Two additional jury tests were conducted to validate the developed detection and rating methods. The algorithm developed in this work will enable automatic detection and rating of S&R events with good accuracy and minimum possibility of false alarm.
NASA Technical Reports Server (NTRS)
Reeves, P. M.; Campbell, G. S.; Ganzer, V. M.; Joppa, R. G.
1974-01-01
A method is described for generating time histories which model the frequency content and certain non-Gaussian probability characteristics of atmospheric turbulence, including the large gusts and patchy nature of turbulence. Methods for generating time histories using either analog or digital computation are described. A STOL airplane was programmed into a 6-degree-of-freedom flight simulator, and turbulence time histories from several atmospheric turbulence models were introduced. The pilots' reactions are described.
Aircraft model prototypes which have specified handling-quality time histories
NASA Technical Reports Server (NTRS)
Johnson, S. H.
1978-01-01
Several techniques for obtaining linear constant-coefficient airplane models from specified handling-quality time histories are discussed. The pseudodata method solves the basic problem, yields specified eigenvalues, and accommodates state-variable transfer-function zero suppression. The algebraic equations to be solved are bilinear, at worst. The disadvantages are reduced generality and no assurance that the resulting model will be airplane-like in detail. The method is fully illustrated for a fourth-order stability-axis small-motion model with three lateral handling-quality time histories specified. The FORTRAN program which obtains and verifies the model is included and fully documented.
An improved approximate-Bayesian model-choice method for estimating shared evolutionary history
2014-01-01
Background To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergence times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa from multi-locus DNA sequence data. The model is an extension of that implemented in msBayes. Results By reparameterizing the model, introducing more flexible priors on demographic and divergence-time parameters, and implementing a non-parametric Dirichlet-process prior over divergence models, I improved the robustness, accuracy, and power of the method for estimating shared evolutionary history across taxa. Conclusions The results demonstrate that the improved performance of the new method is due to (1) more appropriate priors on divergence-time and demographic parameters that avoid prohibitively small marginal likelihoods for models with more divergence events, and (2) the Dirichlet-process prior providing a flexible prior on divergence histories that does not strongly disfavor models with intermediate numbers of divergence events. The new method yields more robust estimates of posterior uncertainty, and thus greatly reduces the tendency to incorrectly estimate models of shared evolutionary history with strong support. PMID:24992937
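The Dirichlet-process prior over divergence models mentioned above can be sampled sequentially as a Chinese restaurant process: each taxon either joins an existing divergence-time class in proportion to its size or opens a new class in proportion to a concentration parameter. The sketch below is a generic CRP sampler for intuition only; the taxon count and the concentration value are arbitrary, and the actual model in the paper couples this partition to divergence-time and demographic parameters.

```python
import random

def crp_partition(n_taxa, alpha, rng=random.Random(0)):
    """Draw a partition of n_taxa into divergence classes from a
    Chinese restaurant process, the sequential form of a
    Dirichlet-process prior over divergence models."""
    counts = []        # taxa per divergence class
    assignment = []
    for i in range(n_taxa):
        # join class k with probability counts[k]/(i+alpha),
        # open a new class with probability alpha/(i+alpha)
        r = rng.uniform(0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1
                assignment.append(k)
                break
        else:
            counts.append(1)
            assignment.append(len(counts) - 1)
    return assignment
```

Because no partition size is singled out in advance, intermediate numbers of divergence events are not strongly disfavored, which is the property the abstract highlights.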
Modified Spectral Fatigue Methods for S-N Curves With MIL-HDBK-5J Coefficients
NASA Technical Reports Server (NTRS)
Irvine, Tom; Larsen, Curtis
2016-01-01
The rainflow method is used for counting fatigue cycles from a stress response time history, where the fatigue cycles are stress-reversals. The rainflow method allows the application of Palmgren-Miner's rule in order to assess the fatigue life of a structure subject to complex loading. The fatigue damage may also be calculated from a stress response power spectral density (PSD) using the semi-empirical Dirlik, Single Moment, Zhao-Baker and other spectral methods. These methods effectively assume that the PSD has a corresponding time history which is stationary with a normal distribution. This paper shows how the probability density function for rainflow stress cycles can be extracted from each of the spectral methods. This extraction allows for the application of the MIL-HDBK-5J fatigue coefficients in the cumulative damage summation. A numerical example is given in this paper for the stress response of a beam undergoing random base excitation, where the excitation is applied separately by a time history and by its corresponding PSD. The fatigue calculation is performed in the time domain, as well as in the frequency domain via the modified spectral methods. The result comparison shows that the modified spectral methods give comparable results to the time domain rainflow counting method.
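The cumulative damage summation referred to above can be sketched with the Palmgren-Miner rule against a Basquin-type S-N curve. The coefficients A and b below are made-up placeholders standing in for the material-specific MIL-HDBK-5J values the paper uses; this is an illustration of the summation, not the paper's modified spectral method.

```python
def basquin_life(s_range, A=1.0e12, b=3.0):
    # hypothetical Basquin-type S-N curve N(S) = A * S**(-b);
    # A and b are illustrative stand-ins for handbook coefficients
    return A * s_range ** (-b)

def miner_damage(counted_cycles):
    # Palmgren-Miner linear damage accumulation: D = sum(n_i / N_i);
    # failure is predicted when D reaches 1
    return sum(n / basquin_life(s) for s, n in counted_cycles)
```

In the time-domain path the (s, n) pairs come from rainflow counting; in the frequency-domain path they come from the probability density of rainflow ranges implied by Dirlik or the other spectral methods.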
A simplified method for calculating temperature time histories in cryogenic wind tunnels
NASA Technical Reports Server (NTRS)
Stallings, R. L., Jr.; Lamb, M.
1976-01-01
Average temperature time history calculations of the test media and tunnel walls for cryogenic wind tunnels have been developed. Results are in general agreement with limited preliminary experimental measurements obtained in a 13.5-inch pilot cryogenic wind tunnel.
Kim, Jung J; Youm, Kwang-Soo; Reda Taha, Mahmoud M
2014-01-01
A numerical method to identify thermal conductivity from time history of one-dimensional temperature variations in thermal unsteady-state is proposed. The numerical method considers the change of specific heat and thermal conductivity with respect to temperature. Fire test of reinforced concrete (RC) columns was conducted using a standard fire to obtain time history of temperature variations in the column section. A thermal equilibrium model in unsteady-state condition was developed. The thermal conductivity of concrete was then determined by optimizing the numerical solution of the model to meet the observed time history of temperature variations. The determined thermal conductivity with respect to temperature was then verified against standard thermal conductivity measurements of concrete bricks. It is concluded that the proposed method can be used to conservatively estimate thermal conductivity of concrete for design purpose. Finally, the thermal radiation properties of concrete for the RC column were estimated from the thermal equilibrium at the surface of the column. The radiant heat transfer ratio of concrete representing absorptivity to emissivity ratio of concrete during fire was evaluated and is suggested as a concrete criterion that can be used in fire safety assessment. PMID:25180197
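The inverse-identification idea above can be sketched as follows: solve the 1-D unsteady heat equation forward for a candidate conductivity and keep the candidate whose simulated probe temperature best matches the measurement. A coarse grid search stands in for the paper's optimization, temperature-dependent properties are dropped, and every material value below is illustrative rather than taken from the paper's data.

```python
import numpy as np

def simulate_1d(k, rho_c, T0, T_surface, L, nx, dt, nt):
    # explicit finite-difference solution of the 1-D heat equation
    # with a fixed surface temperature and an insulated far side
    dx = L / (nx - 1)
    T = np.full(nx, T0, float)
    alpha = k / rho_c                      # thermal diffusivity
    for _ in range(nt):
        T[0] = T_surface
        T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
        T[-1] = T[-2]
    return T

def identify_k(observed, probe_ix, candidates, **kw):
    # pick the conductivity whose simulated probe temperature best
    # matches the observation (grid search in place of optimization)
    errs = [abs(simulate_1d(k, **kw)[probe_ix] - observed) for k in candidates]
    return candidates[int(np.argmin(errs))]
```

With the explicit scheme, the step sizes must satisfy the stability limit alpha*dt/dx**2 <= 0.5 for every candidate conductivity.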
NASA Astrophysics Data System (ADS)
MacDonald, Christopher L.; Bhattacharya, Nirupama; Sprouse, Brian P.; Silva, Gabriel A.
2015-09-01
Computing numerical solutions to fractional differential equations can be computationally intensive due to the effect of non-local derivatives, in which all previous time points contribute to the current iteration. In general, numerical approaches that truncate part of the system history, while efficient, can suffer from high degrees of error and inaccuracy. Here we present an adaptive time step memory method for smooth functions applied to the Grünwald-Letnikov fractional diffusion derivative. This method is computationally efficient and results in smaller errors during numerical simulations. Sampled points along the system's history at progressively longer intervals are assumed to reflect the values of neighboring time points. By including progressively fewer points backward in time, a temporally 'weighted' history is computed that includes contributions from the entire past of the system, maintaining accuracy, but with fewer points actually calculated, greatly improving computational efficiency.
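The idea can be sketched against the standard Grünwald-Letnikov sum. Below, the full-history sum is computed alongside a simplified adaptive-memory variant: recent points are used exactly, older points are sampled at power-of-two strides, and each sampled point carries the lumped weights of its skipped neighbors. The window size and doubling rule are assumptions for illustration, not the authors' exact scheme.

```python
def gl_weights(alpha, n):
    # binomial weights w_k = (-1)**k * C(alpha, k), built recursively
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1 - (alpha + 1) / k))
    return w

def gl_derivative_full(f_hist, alpha, h):
    # standard Grünwald-Letnikov sum over the entire history
    n = len(f_hist) - 1
    w = gl_weights(alpha, n)
    return sum(w[k] * f_hist[n - k] for k in range(n + 1)) / h**alpha

def gl_derivative_adaptive(f_hist, alpha, h, base=16):
    # sketch of adaptive memory: the most recent `base` points are
    # used exactly; beyond that the stride doubles, and each sampled
    # point carries the summed weights of the terms it stands in for
    n = len(f_hist) - 1
    w = gl_weights(alpha, n)
    total, k, stride = 0.0, 0, 1
    while k <= n:
        if k >= base * stride:
            stride *= 2
        wsum = sum(w[k:min(k + stride, n + 1)])
        total += wsum * f_hist[n - k]
        k += stride
    return total / h**alpha
```

For smooth histories the lumped weights change little across a stride, so the adaptive sum tracks the full sum closely while touching far fewer points.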
Transportation forecasting : analysis and quantitative methods
DOT National Transportation Integrated Search
1983-01-01
This Record contains the following papers: Development of Survey Instruments Suitable for Determining Non-Home Activity Patterns; Sequential, History-Dependent Approach to Trip-Chaining Behavior; Identifying Time and History Dependencies of Activity ...
Time-domain representation of frequency-dependent foundation impedance functions
Safak, E.
2006-01-01
Foundation impedance functions provide a simple means to account for soil-structure interaction (SSI) when studying seismic response of structures. Impedance functions represent the dynamic stiffness of the soil media surrounding the foundation. The fact that impedance functions are frequency dependent makes it difficult to incorporate SSI in standard time-history analysis software. This paper introduces a simple method to convert frequency-dependent impedance functions into time-domain filters. The method is based on the least-squares approximation of impedance functions by ratios of two complex polynomials. Such ratios are equivalent, in the time-domain, to discrete-time recursive filters, which are simple finite-difference equations giving the relationship between foundation forces and displacements. These filters can easily be incorporated into standard time-history analysis programs. Three examples are presented to show the applications of the method.
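The polynomial-ratio approximation described above can be sketched with a linearized (Levy-type) least-squares fit, which is one way, though not necessarily the paper's exact formulation, to realize it. The filter orders nb and na below are arbitrary choices; once fitted, the coefficients define the recursive difference equation between foundation displacement and force.

```python
import numpy as np

def fit_recursive_filter(omega, K, dt, nb=2, na=2):
    """Least-squares fit of K(w) ~ B(z)/A(z) with z = exp(i*w*dt) and
    a0 = 1, linearized as B(z) - K*(A(z) - 1) = K (Levy-type fit)."""
    zinv = np.exp(-1j * omega * dt)                 # z**-1 on the grid
    cols = [zinv**k for k in range(nb + 1)]         # b0..b_nb
    cols += [-K * zinv**k for k in range(1, na + 1)]  # a1..a_na
    M = np.column_stack(cols)
    A = np.vstack([M.real, M.imag])                 # stack real/imag parts
    y = np.concatenate([K.real, K.imag])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[:nb + 1], coef[nb + 1:]             # b, a

def apply_filter(u, b, a):
    # recursive difference equation: f[n] = sum b_k u[n-k] - sum a_k f[n-k]
    f = np.zeros(len(u))
    for n in range(len(u)):
        acc = sum(b[k] * u[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k - 1] * f[n - k] for k in range(1, len(a) + 1) if n - k >= 0)
        f[n] = acc
    return f
```

The resulting time-domain filter is exactly the kind of finite-difference relation between foundation forces and displacements that can be dropped into a standard time-history analysis loop.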
NASA Technical Reports Server (NTRS)
Mei, Chuh; Dhainaut, Jean-Michel
2000-01-01
The Monte Carlo simulation method in conjunction with the finite element large-deflection modal formulation is used to estimate the fatigue life of aircraft panels subjected to stationary Gaussian band-limited white-noise excitations. Ten loading cases varying from 106 dB to 160 dB OASPL with a bandwidth of 1024 Hz are considered. For each load case, response statistics are obtained from an ensemble of 10 response time histories. The finite element nonlinear modal procedure yields time histories, probability density functions (PDF), power spectral densities and higher statistical moments of the maximum deflection and stress/strain. The method of moments of PSD with Dirlik's approach is employed to estimate the panel fatigue life.
Detecting space-time cancer clusters using residential histories
NASA Astrophysics Data System (ADS)
Jacquez, Geoffrey M.; Meliker, Jaymie R.
2007-04-01
Methods for analyzing geographic clusters of disease typically ignore the space-time variability inherent in epidemiologic datasets, do not adequately account for known risk factors (e.g., smoking and education) or covariates (e.g., age, gender, and race), and do not permit investigation of the latency window between exposure and disease. Our research group recently developed Q-statistics for evaluating space-time clustering in cancer case-control studies with residential histories. This technique relies on time-dependent nearest neighbor relationships to examine clustering at any moment in the life-course of the residential histories of cases relative to that of controls. In addition, in place of the widely used null hypothesis of spatial randomness, each individual's probability of being a case is instead based on his/her risk factors and covariates. Case-control clusters will be presented using residential histories of 220 bladder cancer cases and 440 controls in Michigan. In preliminary analyses of this dataset, smoking, age, gender, race and education were sufficient to explain the majority of the clustering of residential histories of the cases. Clusters of unexplained risk, however, were identified surrounding the business address histories of 10 industries that emit known or suspected bladder cancer carcinogens. The clustering of 5 of these industries began in the 1970s and persisted through the 1990s. This systematic approach for evaluating space-time clustering has the potential to generate novel hypotheses about environmental risk factors. These methods may be extended to detect differences in space-time patterns of any two groups of people, making them valuable for security intelligence and surveillance operations.
History of the Buttonhole Technique.
Misra, Madhukar
2015-01-01
The constant-site method of access cannulation in hemodialysis, popularly known as the 'buttonhole' method, has an interesting history. Dr. Zbylut J. Twardowski, a Polish nephrologist, discovered the technique by pure serendipity in 1972. A patient with a complicated vascular access history and limited options for cannulation was repeatedly 'stuck' at the same sites by a nurse. Soon it was noticed that cannulation at the same spot became easier with time. Because the needles were being reused, their sharpness decreased with time, and the bluntness of the needle seemed to minimize the damage to the cannulation tract (another serendipity!). The method soon became popular, and many patients adopted the technique. This chapter traces the invention of the technique and its subsequent development following Dr. Twardowski's emigration to the USA. © 2015 S. Karger AG, Basel.
Response Matrix Monte Carlo for electron transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballinger, C.T.; Nielsen, D.E. Jr.; Rathkopf, J.A.
1990-11-01
A Response Matrix Monte Carlo (RMMC) method has been developed for solving electron transport problems. This method was born of the need to have a reliable, computationally efficient transport method for low-energy electrons (below a few hundred keV) in all materials. Today, condensed history methods are used which reduce the computation time by modeling the combined effect of many collisions but fail at low energy because of the assumptions required to characterize the electron scattering. Analog Monte Carlo simulations are prohibitively expensive since electrons undergo coulombic scattering with little state change after a collision. The RMMC method attempts to combine the accuracy of an analog Monte Carlo simulation with the speed of the condensed history methods. The combined effect of many collisions is modeled, as in condensed history, except that it is precalculated via an analog Monte Carlo simulation. This avoids the scattering kernel assumptions associated with condensed history methods. Results show good agreement between the RMMC method and analog Monte Carlo. 11 refs., 7 figs., 1 tab.
NASA Technical Reports Server (NTRS)
Greenwood, Eric II; Schmitz, Fredric H.
2009-01-01
A new method of separating the contributions of helicopter main and tail rotor noise sources is presented, making use of ground-based acoustic measurements. The method employs time-domain de-Dopplerization to transform the acoustic pressure time-history data collected from an array of ground-based microphones to the equivalent time-history signals observed by an array of virtual inflight microphones traveling with the helicopter. The now-stationary signals observed by the virtual microphones are then periodically averaged with the main and tail rotor once per revolution triggers. The averaging process suppresses noise which is not periodic with the respective rotor, allowing for the separation of main and tail rotor pressure time-histories. The averaged measurements are then interpolated across the range of directivity angles captured by the microphone array in order to generate separate acoustic hemispheres for the main and tail rotor noise sources. The new method is successfully applied to ground-based microphone measurements of a Bell 206B3 helicopter and demonstrates the strong directivity characteristics of harmonic noise radiation from both the main and tail rotors of that helicopter.
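The periodic-averaging step at the heart of the method can be sketched in an idealized form: assuming a constant rotor period in samples (the paper instead uses once-per-revolution triggers on the de-Dopplerized signals), averaging consecutive revolutions retains content periodic with that rotor and suppresses everything else.

```python
import numpy as np

def synchronous_average(signal, period):
    """Average consecutive segments of length `period` (samples per
    rotor revolution). Content periodic with the rotor survives;
    uncorrelated noise falls off roughly as 1/sqrt(n_revs)."""
    n = len(signal) // period
    return signal[:n * period].reshape(n, period).mean(axis=0)
```

Averaging with the main-rotor trigger suppresses tail-rotor harmonics (which are not periodic at the main-rotor rate) and vice versa, which is what separates the two pressure time-histories.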
NASA Astrophysics Data System (ADS)
Atobe, Satoshi; Nonami, Shunsuke; Hu, Ning; Fukunaga, Hisao
2017-09-01
Foreign object impact events are serious threats to composite laminates because impact damage leads to significant degradation of the mechanical properties of the structure. Identification of the location and force history of the impact that was applied to the structure can provide useful information for assessing the structural integrity. This study proposes a method for identifying impact forces acting on CFRP (carbon fiber reinforced plastic) laminated plates on the basis of the sound radiated from the impacted structure. Identification of the impact location and force history is performed using the sound pressure measured with microphones. To devise a method for identifying the impact location from the difference in the arrival times of the sound wave detected with the microphones, the propagation path of the sound wave from the impacted point to the sensor is examined. For the identification of the force history, an experimentally constructed transfer matrix is employed to relate the force history to the corresponding sound pressure. To verify the validity of the proposed method, impact tests are conducted by using a CFRP cross-ply laminate as the specimen, and an impulse hammer as the impactor. The experimental results confirm the validity of the present method for identifying the impact location from the arrival time of the sound wave detected with the microphones. Moreover, the results of force history identification show the feasibility of identifying the force history accurately from the measured sound pressure using the experimental transfer matrix.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smallwood, D.O.
It is recognized that some dynamic and noise environments are characterized by time histories which are not Gaussian. An example is high intensity acoustic noise. Another example is some transportation vibration. A better simulation of these environments can be generated if a zero mean non-Gaussian time history can be reproduced with a specified auto (or power) spectral density (ASD or PSD) and a specified probability density function (pdf). After the required time history is synthesized, the waveform can be used for simulation purposes. For example, modern waveform reproduction techniques can be used to reproduce the waveform on electrodynamic or electrohydraulic shakers. Or the waveforms can be used in digital simulations. A method is presented for the generation of realizations of zero mean non-Gaussian random time histories with a specified ASD and pdf. First a Gaussian time history with the specified auto (or power) spectral density (ASD) is generated. A monotonic nonlinear function relating the Gaussian waveform to the desired realization is then established based on the Cumulative Distribution Function (CDF) of the desired waveform and the known CDF of a Gaussian waveform. The established function is used to transform the Gaussian waveform to a realization of the desired waveform. Since the transformation preserves the zero-crossings and peaks of the original Gaussian waveform, and does not introduce any substantial discontinuities, the ASD is not substantially changed. Several methods are available to generate a realization of a Gaussian distributed waveform with a known ASD; the method of Smallwood and Paez (1993) is an example. However, the generation of random noise with a specified ASD but with a non-Gaussian distribution is less well known.
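The two steps above can be sketched directly: a random-phase inverse FFT produces a Gaussian realization with the specified spectral shape, and a monotonic CDF-to-CDF map converts it to the target distribution. The band-limited spectrum and the uniform target pdf below are arbitrary choices for illustration, not values from the paper.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def gaussian_with_psd(psd, n):
    """Random-phase IFFT realization whose spectral shape follows the
    one-sided `psd` (length n//2 + 1); normalized to unit variance."""
    spec = np.sqrt(psd) * np.exp(1j * rng.uniform(0, 2 * np.pi, len(psd)))
    spec[0] = 0.0                       # enforce zero mean
    x = np.fft.irfft(spec, n)
    return x / x.std()

def zmnl_transform(x, target_ppf):
    """Monotonic (zero-memory nonlinear) map: each sample's Gaussian
    CDF value is fed through the target inverse CDF. Zero crossings
    and the ordering of peaks are preserved, so the ASD changes little."""
    p = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2)) for v in x]))
    return target_ppf(p)

# example: band-limited spectrum, uniform[-1, 1] target pdf (both assumed)
n = 4096
psd = np.zeros(n // 2 + 1)
psd[5:50] = 1.0
y = zmnl_transform(gaussian_with_psd(psd, n), lambda p: 2.0 * p - 1.0)
```

Because the map is monotonic and continuous, the transformed series keeps the Gaussian realization's zero crossings while its amplitude distribution follows the target pdf.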
Van Oosten, John
1928-01-01
This study shows that the structural characters of the scales of the coregonid fishes of Lake Huron are so clearly recognizable as to permit their use by the scale method. It shows, further, that the fundamental assumptions underlying the scale method are warranted in so far as they apply to the lake herring (Leucichthys artedi Le Sueur). The scale method is therefore valid when applied in a study of the life history of the lake herring. The life history of the lake herring that occur in Lake Huron is described in detail in this paper for the first time.
The Past Is All before Us: The History of Education in Hard Times
ERIC Educational Resources Information Center
Jones, Ken
2012-01-01
In this article, the author explores these questions--from what position, with what focus, and through what methods can a history be produced that is sensible of the conflicts and passions of its own time, and that can illuminate those of the past?--estimating that the books under review in several ways invite such a demanding reading. Gary…
Leaking in history space: A way to analyze systems subjected to arbitrary driving
NASA Astrophysics Data System (ADS)
Kaszás, Bálint; Feudel, Ulrike; Tél, Tamás
2018-03-01
Our aim is to unfold phase space structures underlying systems with a drift in their parameters. Such systems are non-autonomous and belong to the class of non-periodically driven systems where the traditional theory of chaos (based, e.g., on periodic orbits) does not hold. We demonstrate that even such systems possess an underlying topological horseshoe-like structure, at least for a finite period of time. This result is based on a specifically developed method which allows the computation of the corresponding time-dependent stable and unstable foliations. These structures can be made visible by prescribing a certain type of history for an ensemble of trajectories in phase space and by analyzing the trajectories fulfilling this constraint. The process can be considered as a leaking in history space: a generalization of traditional leaking, a method that has become widespread in the study of chaotic systems, to leaks depending on time.
Harris, Magdalena; Rhodes, Tim
2018-06-01
A life history approach enables study of how risk or health protection is shaped by critical transitions and turning points in a life trajectory and in the context of social environment and time. We employed visual and narrative life history methods with people who inject drugs to explore how hepatitis C protection was enabled and maintained over the life course. We overview our methodological approach, with a focus on the ethics in practice of using life history timelines and life-grids with 37 participants. The life-grid evoked mixed emotions for participants: pleasure in receiving a personalized visual history and pain elicited by its contents. A minority managed this pain with additional heroin use. The methodological benefits of using life history methods and visual aids have been extensively reported. Crucial to consider are the ethical implications of this process, particularly for people who lack socially ascribed markers of a "successful life."
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Heeg, Jennifer; Perry, Boyd, III
1990-01-01
Time-correlated gust loads are time histories of two or more load quantities due to the same disturbance time history. Time correlation provides knowledge of the value (magnitude and sign) of one load when another is maximum. At least two analysis methods have been identified that are capable of computing maximized time-correlated gust loads for linear aircraft. Both methods solve for the unit-energy gust profile (gust velocity as a function of time) that produces the maximum load at a given location on a linear airplane. Time-correlated gust loads are obtained by re-applying this gust profile to the airplane and computing multiple simultaneous load responses. Such time histories are physically realizable and may be applied to aircraft structures. Within the past several years there has been much interest in obtaining a practical analysis method which is capable of solving the analogous problem for nonlinear aircraft. Such an analysis method has been the focus of an international committee of gust loads specialists formed by the U.S. Federal Aviation Administration and was the topic of a panel discussion at the Gust and Buffet Loads session at the 1989 SDM Conference in Mobile, Alabama. The kinds of nonlinearities common on modern transport aircraft are indicated. The Statistical Discrete Gust method is capable of being applied to nonlinear aircraft but so far has not been. To make the method practical for nonlinear applications, a search procedure is essential. Another method is based on Matched Filter Theory and, in its current form, is applicable to linear systems only. The purpose here is to present the status of an attempt to extend the matched filter approach to nonlinear systems. The extension uses Matched Filter Theory as a starting point and then employs a constrained optimization algorithm to attack the nonlinear problem.
Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.
2013-01-01
Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303
Senarathna, S.M.D.K. Ganga; Ranganathan, Shalini S.; Buckley, Nick; Soysa, S.S.S.B.D. Preethi; Fernandopulle, B. M. Rohini
2012-01-01
Objectives: Acute paracetamol poisoning is an emerging problem in Sri Lanka. Management guidelines recommend ingested dose and serum paracetamol concentrations to assess the risk. Our aim was to determine the usefulness of the patient's history of an ingested dose of >150 mg/kg and paracetamol concentration obtained by a simple colorimetric method to assess risk in patients with acute paracetamol poisoning. Materials and Methods: Serum paracetamol concentrations were determined in 100 patients with a history of paracetamol overdose using High Performance Liquid Chromatography (HPLC); (reference method). The results were compared to those obtained with a colorimetric method. The utility of risk assessment by reported dose ingested and colorimetric analysis were compared. Results: The area under the receiver operating characteristic curve for the history of ingested dose was 0.578 and there was no dose cut-off providing useful risk categorization. Both analytical methods had less than 5% intra- and inter-batch variation and were accurate on spiked samples. The time from blood collection to result was six times faster and ten times cheaper for colorimetry (30 minutes, US$2) than for HPLC (180 minutes, US$20). The correlation coefficient between the paracetamol levels by the two methods was 0.85. The agreement on clinical risk categorization on the standard nomogram was also good (Kappa = 0.62, sensitivity 81%, specificity 89%). Conclusions: History of dose ingested alone greatly over-estimated the number of patients who need antidotes and it was a poor predictor of risk. Paracetamol concentrations by colorimetry are rapid and inexpensive. The use of these would greatly improve the assessment of risk and greatly reduce unnecessary expenditure on antidotes. PMID:23087506
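The reported agreement between the two analytical methods (Kappa = 0.62) uses Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch of the computation on a 2x2 risk-categorization table; the counts below are hypothetical illustrations, not the study's data:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (list of lists of counts)."""
    n = sum(sum(row) for row in table)
    observed = sum(table[i][i] for i in range(len(table))) / n
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(len(table))) for j in range(len(table))]
    expected = sum(row_tot[i] * col_tot[i] for i in range(len(table))) / n ** 2
    return (observed - expected) / (1 - expected)

# hypothetical 2x2 table: rows = HPLC risk class, cols = colorimetry risk class
table = [[20, 5],
         [3, 72]]
kappa = cohens_kappa(table)
```

Off-diagonal counts are the discordant classifications; kappa approaches 1 only as they vanish.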
Research on Influencing Factors and Generalized Power of Synthetic Artificial Seismic Wave
NASA Astrophysics Data System (ADS)
Jiang, Yanpei
2018-05-01
In this paper, according to the trigonometric series method, the author adopts different envelope functions and the acceleration design spectrum in the Seismic Code for Urban Bridge Design to simulate a seismic acceleration time history that meets engineering accuracy requirements by modifying and iterating an initial wave. Spectral analysis is carried out to find the frequency distribution of the energy of the seismic time history and to determine the main factors that affect the acceleration amplitude spectrum and energy spectral density. The generalized power formula of the seismic time history is derived from the discrete energy integral formula, and the author studies how the generalized power of the seismic time history changes under different envelope functions. Examples are analyzed to illustrate that generalized power can measure the seismic performance of bridges.
ERIC Educational Resources Information Center
Turk, Laraine D.
"Ancient Egypt," an upper-division, non-required history course covering Egypt from pre-dynastic time through the Roman domination is described. General descriptive information is presented first, including the method of grading, expectation of student success rate, long-range course objectives, procedures for revising the course, major…
Retrieving rupture history using waveform inversions in time sequence
NASA Astrophysics Data System (ADS)
Yi, L.; Xu, C.; Zhang, X.
2017-12-01
The rupture history of a large earthquake is generally reconstructed through waveform inversion of seismological records. In the waveform inversion, based on the superposition principle, the rupture process is linearly parameterized. After discretizing the fault plane into sub-faults, the local source time function of each sub-fault is usually parameterized using the multi-time-window method, e.g., mutually overlapping triangular functions. The forward waveform of each sub-fault is then synthesized by convolving its source time function with its Green's function. According to the superposition principle, these forward waveforms sum, after alignment on their arrival times, to the recorded waveforms, and the slip history is retrieved by inverting the superposed forward waveforms against each corresponding seismological record. Apart from the isolation of the forward waveforms generated from each sub-fault, we also observe that these waveforms are gradually and sequentially superimposed in the recorded waveforms. We therefore propose the idea that the rupture model may be separable into sequential rupture times. According to the constrained-waveform-length approach emphasized in our previous work, the length of the waveforms used in the inversion is objectively constrained by the rupture velocity and rise time. One essential prior condition is the predetermined fault plane, which limits the duration of rupture and thus restricts the waveform inversion to a pre-set rupture duration. We therefore propose a strategy that inverts for the rupture process sequentially, using progressively shifted rupture times as the rupture front expands across the fault plane, and we have designed a synthetic inversion to test its feasibility. The test results show the promise of this idea, which requires further investigation.
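The forward-modelling step described above, triangular source time functions superposed in time windows and convolved with a Green's function, can be sketched as follows. The Green's function samples, window spacing, and slip weights are hypothetical toy values, not from any real inversion:

```python
def convolve(a, b):
    """Discrete convolution of two sampled series."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out

def triangle(half_width, dt):
    """Symmetric triangular source time function, normalized to unit area."""
    n = int(half_width / dt)
    tri = [1 - abs(i - n) / n for i in range(2 * n + 1)]
    area = sum(tri) * dt
    return [v / area for v in tri]

def shift_sum(components):
    """Superpose time series given as (offset_samples, series) pairs."""
    length = max(off + len(s) for off, s in components)
    out = [0.0] * length
    for off, s in components:
        for i, v in enumerate(s):
            out[off + i] += v
    return out

# hypothetical toy values: a Green's function and two overlapping
# triangular time windows with slip weights 0.3 and 0.7
dt = 0.1
green = [0.0, 1.0, 0.5, 0.2, 0.1]
stf = shift_sum([(0, [0.3 * v for v in triangle(0.5, dt)]),
                 (3, [0.7 * v for v in triangle(0.5, dt)])])
subfault_wave = convolve(stf, green)  # one sub-fault's forward waveform
```

The recorded trace is then modelled as the sum of such sub-fault waveforms after alignment on their arrival times.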
Yoshihama, Mieko; Bybee, Deborah
2011-03-01
Intimate partner violence (IPV) is prevalent and often recurrent in women's lives. To better understand the changing risk of IPV over the life course, which could guide more effective policies and program responses, methodological innovations are needed. Life History Calendar methods enhance respondents' recall of the timing of specific types of IPV experienced over the life course. Multilevel modeling provides a way to analyze individual and collective trajectories and examine covariates of IPV risk. We apply these complementary methods to examine IPV trajectories for a sample of women of Filipina descent living in the United States, examining life course timing and cohort effects. © The Author(s) 2011.
Andrianakis, I; Vernon, I; McCreesh, N; McKinley, T J; Oakley, J E; Nsubuga, R N; Goldstein, M; White, R G
2017-08-01
Complex stochastic models are commonplace in epidemiology, but their utility depends on their calibration to empirical data. History matching is a (pre)calibration method that has been applied successfully to complex deterministic models. In this work, we adapt history matching to stochastic models, by emulating the variance in the model outputs, and therefore accounting for its dependence on the model's input values. The method proposed is applied to a real complex epidemiological model of human immunodeficiency virus in Uganda with 22 inputs and 18 outputs, and is found to increase the efficiency of history matching, requiring 70% of the time and 43% fewer simulator evaluations compared with a previous variant of the method. The insight gained into the structure of the human immunodeficiency virus model, and the constraints placed on it, are then discussed.
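History matching typically rules out implausible input regions with an implausibility measure combining observation, emulator, and model-discrepancy variances; in the stochastic extension described here, the simulator's output variance is itself emulated as a function of the inputs. A minimal sketch under the conventional cutoff-of-3 rule; all numbers are hypothetical:

```python
import math

def implausibility(z, mean_fx, var_obs, var_emu, var_stoch):
    """Implausibility of input x given observation z.

    var_stoch is the emulated stochastic variance of the simulator at x,
    which in the stochastic extension depends on the input values.
    """
    return abs(z - mean_fx) / math.sqrt(var_obs + var_emu + var_stoch)

# hypothetical numbers: observed prevalence 0.12, emulator predicts 0.10
i_val = implausibility(0.12, 0.10, 1e-4, 4e-4, 4e-4)
accept = i_val < 3.0  # conventional three-sigma style cutoff
```

Inputs with implausibility above the cutoff for any output are discarded, shrinking the non-implausible region in successive waves.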
Calculation of vitrinite reflectance from thermal histories: A comparison of some methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, D.W.; Issler, D.R.
1993-04-01
Vitrinite reflectance values (%Ro) calculated from commonly used methods are compared with respect to time-invariant temperatures and constant heating rates. Two monofunctional methods, one involving a time-temperature index to vitrinite reflectance (TTI-%Ro) correlation and the other a %Ro to depth correlation, yield vitrinite reflectance values that are similar to those calculated by recently published Arrhenius-based methods, such as EASY%Ro. The approximate agreement between these methods supports the perception that the EASY%Ro algorithm is the most accurate method for the prediction of vitrinite reflectances throughout the range of organic maturity normally encountered. However, calibration of these methods against vitrinite reflectance data from two basin sequences with well-documented geologic histories indicates that, although the EASY%Ro method has wide applicability, it slightly overestimates vitrinite reflectances in strata of low to medium maturity up to a %Ro value of 0.9%. The two monofunctional methods may be more accurate for prediction of vitrinite reflectances in similar sequences of low maturity. An older, but previously widely accepted, TTI-%Ro correlation consistently overestimates vitrinite reflectances with respect to other methods. Underestimation of paleogeothermal gradients in the original calibration of time-temperature history to vitrinite reflectance may have introduced a systematic bias to the TTI-%Ro correlation used in this method. Also, incorporation of TAI (thermal alteration index) data and its conversion to %Ro-equivalent values may have introduced inaccuracies. 36 refs., 7 figs.
NASA Astrophysics Data System (ADS)
Cupola, F.; Tanda, M. G.; Zanini, A.
2014-12-01
The interest in approaches that allow the estimation of pollutant source release in groundwater has increased markedly over the last decades. This is due to the large number of groundwater reclamation procedures that have been carried out: remediation is expensive, and the costs can more easily be shared among the different actors if the release history is known. Moreover, a reliable release history is a useful tool for predicting the plume evolution and for minimizing the harmful effects of the contamination. In this framework, Woodbury and Ulrych (1993, 1996) adopted and improved the minimum relative entropy (MRE) method to solve linear inverse problems for the recovery of the pollutant release history in an aquifer. In this work, the MRE method has been improved to detect the source release history in a 2-D aquifer characterized by a non-uniform flow field. The approach has been tested on two cases: a 2-D homogeneous conductivity field and a strongly heterogeneous one (the hydraulic conductivity varies over three orders of magnitude). In the latter case the transfer function could not be described with an analytical formulation; thus, the transfer functions were estimated by means of the method developed by Butera et al. (2006). In order to demonstrate its scope, the method was applied to two different datasets: observations collected at the same time at 20 different monitoring points, and observations collected at 2 monitoring points at different times (15-25 monitoring points). The observations were assumed to be affected by random error. These study cases were carried out considering a Boxcar and a Gaussian function as the expected value of the prior distribution of the release history.
The agreement between the true and the estimated release history has been evaluated through the normalized Root Mean Square Error (nRMSE), which shows the ability of the method to recover the release history even in the most severe cases. Finally, a forward simulation has been carried out using the estimated release history in order to compare the true data with the estimated data: the best agreement is obtained in the homogeneous case, although the nRMSE is acceptable in the heterogeneous case as well.
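A common convention for nRMSE is the RMSE normalized by the range of the true series; the paper's exact normalization is not stated here, so the sketch below is one plausible choice, with hypothetical release-history data:

```python
import math

def nrmse(true, est):
    """RMSE normalized by the range of the true series (one common convention)."""
    rmse = math.sqrt(sum((t - e) ** 2 for t, e in zip(true, est)) / len(true))
    return rmse / (max(true) - min(true))

true = [0.0, 1.0, 4.0, 2.0, 0.0]   # hypothetical true release history
est  = [0.1, 0.9, 3.8, 2.2, 0.1]   # hypothetical estimated release history
score = nrmse(true, est)
```

Lower scores indicate a closer recovery; zero means the estimate reproduces the true history exactly.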
NASA Technical Reports Server (NTRS)
Clothiaux, John D.; Dowling, Norman E.
1992-01-01
The suitability of using rain-flow reconstructions as an alternative to an original loading spectrum for component fatigue life testing is investigated. A modified helicopter maneuver history is used for the rain-flow cycle counting and history regenerations. Experimental testing on a notched test specimen over a wide range of loads produces similar lives for the original history and the reconstructions. The test lives also agree with a simplified local strain analysis performed on the specimen utilizing the rain-flow cycle count. The rain-flow reconstruction technique is shown to be a viable test spectrum alternative to storing the complete original load history, especially in saving computer storage space and processing time. A description of the regeneration method, the simplified life prediction analysis, and the experimental methods are included in the investigation.
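The rain-flow counting underlying the reconstruction can be sketched with an ASTM E1049-style three-point algorithm; binning the resulting (range, mean) pairs yields the two-dimensional rain-flow matrix from which reconstructions are generated. This is a generic textbook sketch, not the paper's code:

```python
def rainflow(series):
    """Simplified ASTM E1049-style three-point rainflow count.

    Returns (range, mean, count) tuples; count is 1.0 for full cycles
    and 0.5 for residual half cycles.
    """
    # keep reversal (peak/valley) points only
    pts = [series[0]]
    for x in series[1:]:
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x          # still moving in the same direction
        elif x != pts[-1]:
            pts.append(x)

    cycles, stack = [], []
    for p in pts:
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            if len(stack) == 3:
                # range Y contains the starting point: count a half cycle
                cycles.append((y, (stack[0] + stack[1]) / 2, 0.5))
                stack.pop(0)
            else:
                cycles.append((y, (stack[-2] + stack[-3]) / 2, 1.0))
                del stack[-3:-1]
    for a, b in zip(stack, stack[1:]):   # leftover reversals: half cycles
        cycles.append((abs(a - b), (a + b) / 2, 0.5))
    return cycles

counts = rainflow([-2, 1, -3, 5, -1, 3, -4, 4, -2])  # the ASTM E1049 example series
```

On the ASTM example series this yields one full cycle of range 4 plus half cycles of ranges 3, 4, 6, 8, 8, and 9, matching the standard's worked example.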
Thaler, Kylie; Harris, Mark F
2012-01-01
Objective To assess if data collected by a consumer organisation are valid for a health service research study on physicians' performance in preventive care, and to report first results of the analysis of physicians' performance, such as consultation time and guideline adherence in history taking. Design Secondary data analysis of a clustered cross-sectional direct observation survey. Setting General practitioners (GPs) in Vienna, Austria, visited unannounced by mystery shoppers (incognito standardised patients (ISPs)). Participants 21 randomly selected GPs were visited by two different ISPs each. 40 observation protocols were realised. Main outcome measures Robustness of sampling and data collection by the consumer organisation; GPs' consultation and waiting times; guideline adherence in history taking. Results The double stratified random sampling method was robust and representative of the private and contracted GP mix of Vienna. The clinical scenarios presented by the ISPs were valid and believable, and no GP realised the ISPs were not genuine patients. The average consultation time was 46 min (95% CI 37 to 54 min). Waiting times differed more than consultation times between private and contracted GPs. No differences between private and contracted GPs were found in adherence to the evidence-based guidelines on history taking, including questions regarding alcohol use. According to the analysis, 20% of the GPs took a perfect history (95% CI 9% to 39%). Conclusions The analysis of secondary data collected by a consumer organisation was a valid method for drawing conclusions about GPs' preventive practice. Initial results, such as consultation times longer than anticipated and the moderate quality of history taking, encourage continuing the analysis on available clinical data. PMID:22872721
NASA Astrophysics Data System (ADS)
Hwang, James Ho-Jin; Duran, Adam
2016-08-01
Pyrotechnic shock design and test requirements for space systems are usually provided as a Shock Response Spectrum (SRS) without the input time history. Since the SRS does not describe the input or the environment, a decomposition method is used to obtain the source time history. The main objective of this paper is to develop a decomposition method producing input time histories that can satisfy the SRS requirement, based on pyrotechnic shock test data measured from a mechanical impact test apparatus. At the heart of this decomposition method is the statistical representation of the pyrotechnic shock test data measured from the MIT Lincoln Laboratory (LL) designed Universal Pyrotechnic Shock Simulator (UPSS). Each pyrotechnic shock test record measured at the interface of a test unit has been analyzed to produce the temporal peak acceleration, Root Mean Square (RMS) acceleration, and phase lag at each band center frequency. The maximum SRS of each filtered time history has been calculated to produce a relationship between the input and the response. Two new definitions are proposed as a result. The Peak Ratio (PR) is defined as the ratio between the maximum SRS and the temporal peak acceleration at each band center frequency. The ratio between the maximum SRS and the RMS acceleration is defined as the Energy Ratio (ER) at each band center frequency. Phase lag is estimated from the time delay between the temporal peak acceleration at each band center frequency and the peak acceleration at the lowest band center frequency. This stochastic process has been applied to more than one hundred pyrotechnic shock test records to produce probabilistic definitions of the PR, ER, and phase lag. The SRS is decomposed at each band center frequency using damped sinusoids, with the PR and the decays obtained by matching the ER of the damped sinusoids to the ER of the test data.
The final step in this stochastic SRS decomposition process is the Monte Carlo (MC) simulation. The MC simulation identifies combinations of the PR and decays that can meet the SRS requirement at each band center frequency. Decomposed input time histories are produced by summing the converged damped sinusoids with the MC simulation of the phase lag distribution.
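The PR and ER definitions can be illustrated by computing the maximum SRS of a single damped sinusoid with a simple SDOF time-stepping integrator. The input frequency, damping, and decay rate below are hypothetical, a single damped sinusoid stands in for a band-filtered record, and the semi-implicit Euler scheme is a simplification of production SRS algorithms:

```python
import math

def srs_peak(accel, dt, fn, zeta=0.05):
    """Peak absolute-acceleration response of a base-excited SDOF oscillator,
    integrated with a simple semi-implicit Euler scheme (illustrative only)."""
    wn = 2 * math.pi * fn
    z = v = 0.0          # relative displacement and velocity
    peak = 0.0
    for a in accel:
        v += dt * (-a - 2 * zeta * wn * v - wn * wn * z)
        z += dt * v
        peak = max(peak, abs(2 * zeta * wn * v + wn * wn * z))
    return peak

# hypothetical band-limited shock: decaying sinusoid at 100 Hz
dt, f = 2e-5, 100.0
t = [i * dt for i in range(15000)]
acc = [math.exp(-12.0 * ti) * math.sin(2 * math.pi * f * ti) for ti in t]

peak_in = max(abs(a) for a in acc)
rms_in = math.sqrt(sum(a * a for a in acc) / len(acc))
srs = srs_peak(acc, dt, f)
PR = srs / peak_in   # Peak Ratio, as defined in the abstract
ER = srs / rms_in    # Energy Ratio, as defined in the abstract
```

At resonance the SDOF response exceeds the input peak, so PR is greater than one; ER always exceeds PR because the RMS of a decaying signal is below its peak.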
NASA Astrophysics Data System (ADS)
Huang, Duruo; Du, Wenqi; Zhu, Hong
2017-10-01
In performance-based seismic design, ground-motion time histories are needed for analyzing dynamic responses of nonlinear structural systems. However, the number of ground-motion data at design level is often limited. In order to analyze seismic performance of structures, ground-motion time histories need to be either selected from recorded strong-motion database or numerically simulated using stochastic approaches. In this paper, a detailed procedure to select proper acceleration time histories from the Next Generation Attenuation (NGA) database for several cities in Taiwan is presented. Target response spectra are initially determined based on a local ground-motion prediction equation under representative deterministic seismic hazard analyses. Then several suites of ground motions are selected for these cities using the Design Ground Motion Library (DGML), a recently proposed interactive ground-motion selection tool. The selected time histories are representatives of the regional seismic hazard and should be beneficial to earthquake studies when comprehensive seismic hazard assessments and site investigations are unavailable. Note that this method is also applicable to site-specific motion selections with the target spectra near the ground surface considering the site effect.
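Spectrum-based selection tools such as DGML rank candidate records by their misfit to a target spectrum over a period range; a minimal sketch using mean squared log-spectral misfit, a common choice. The spectral ordinates and record names below are hypothetical:

```python
import math

def log_spectrum_misfit(target, candidate):
    """Mean squared difference of log spectral ordinates on a common period grid."""
    return sum((math.log(c) - math.log(t)) ** 2
               for t, c in zip(target, candidate)) / len(target)

# hypothetical 5%-damped spectral accelerations (g) on a common period grid
target = [0.8, 1.0, 0.9, 0.6, 0.3]
candidates = {
    "rec_A": [0.7, 1.1, 1.0, 0.5, 0.25],   # close in shape and amplitude
    "rec_B": [1.6, 2.0, 1.7, 1.3, 0.7],    # roughly twice the target level
}
best = min(candidates, key=lambda k: log_spectrum_misfit(target, candidates[k]))
```

Working in log space lets a record that matches the spectral shape (and could be amplitude-scaled) rank ahead of one with large ordinate-by-ordinate discrepancies.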
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, E.J.
1976-02-01
A computer program is described which calculates nuclide concentration histories, power or neutron flux histories, burnups, and fission-product birthrates for fueled experimental capsules subjected to neutron irradiations. Seventeen heavy nuclides in the chain from ²³²Th to ²⁴²Pu and a user-specified number of fission products are treated. A fourth-order Runge-Kutta calculational method solves the differential equations for nuclide concentrations as a function of time. For a particular problem, a user-specified number of fuel regions may be treated. A fuel region is described by volume, length, and specific irradiation history. A number of initial fuel compositions may be specified for each fuel region. The irradiation history for each fuel region can be divided into time intervals, and a constant power density or a time-dependent neutron flux is specified for each time interval. Also, an independent cross-section set may be selected for each time interval in each irradiation history. The fission-product birthrates for the first composition of each fuel region are summed to give the total fission-product birthrates for the problem.
Jewett, Ethan M.; Steinrücken, Matthias; Song, Yun S.
2016-01-01
Many approaches have been developed for inferring selection coefficients from time series data while accounting for genetic drift. These approaches have been motivated by the intuition that properly accounting for the population size history can significantly improve estimates of selective strengths. However, the improvement in inference accuracy that can be attained by modeling drift has not been characterized. Here, by comparing maximum likelihood estimates of selection coefficients that account for the true population size history with estimates that ignore drift by assuming allele frequencies evolve deterministically in a population of infinite size, we address the following questions: how much can modeling the population size history improve estimates of selection coefficients? How much can mis-inferred population sizes hurt inferences of selection coefficients? We conduct our analysis under the discrete Wright–Fisher model by deriving the exact probability of an allele frequency trajectory in a population of time-varying size and we replicate our results under the diffusion model. For both models, we find that ignoring drift leads to estimates of selection coefficients that are nearly as accurate as estimates that account for the true population history, even when population sizes are small and drift is high. This result is of interest because inference methods that ignore drift are widely used in evolutionary studies and can be many orders of magnitude faster than methods that account for population sizes. PMID:27550904
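Ignoring drift amounts to iterating a deterministic recursion in an effectively infinite population. A minimal haploid sketch (the diploid Wright-Fisher recursions used in such analyses are analogous; the starting frequency and selection coefficient below are arbitrary):

```python
def deterministic_trajectory(p0, s, generations):
    """Allele frequency under haploid selection s in an infinite population.

    Each generation the odds p/(1-p) are multiplied by (1+s); no drift term.
    """
    traj = [p0]
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
        traj.append(p)
    return traj

traj = deterministic_trajectory(0.1, 0.05, 100)  # hypothetical s = 0.05
```

Fitting s then reduces to matching such a deterministic curve to the observed frequency time series, which is why these methods can be orders of magnitude faster than likelihoods that integrate over drift.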
A Galerkin discretisation-based identification for parameters in nonlinear mechanical systems
NASA Astrophysics Data System (ADS)
Liu, Zuolin; Xu, Jian
2018-04-01
In this paper, a new parameter identification method is proposed for mechanical systems. Based on the idea of the Galerkin finite-element method, the displacement over the time history is approximated by piecewise linear functions, and the second-order terms in the model equation are eliminated by integrating by parts. In this way, a loss function of integration form is derived. Unlike existing methods, the loss function is a quadratic sum of integrals over the whole time history. For linear or nonlinear systems, the loss function can then be minimised with the traditional least-squares algorithm or its iterative counterpart, respectively. This method can be used to effectively identify parameters in linear and arbitrary nonlinear mechanical systems. Simulation results show that even with sparse data or a low sampling frequency, the method still guarantees high accuracy in identifying linear and nonlinear parameters.
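A simplified variant of the identification idea still shows how, for a linear system, the problem reduces to linear least squares. Note the sketch uses finite-difference derivatives, the very step the paper's Galerkin weak form is designed to avoid; the oscillator parameters are chosen purely for illustration:

```python
import math

# synthetic free response of a known oscillator: x'' + c x' + k x = 0
# with c = 0.2 and k = 4.01 has the exact solution x = exp(-0.1 t) cos(2 t)
dt = 0.001
x = [math.exp(-0.1 * i * dt) * math.cos(2 * i * dt) for i in range(5001)]

# central-difference derivatives (simplification: the paper instead removes
# second derivatives by integrating by parts against piecewise-linear tests)
v = [(x[i + 1] - x[i - 1]) / (2 * dt) for i in range(1, len(x) - 1)]
a = [(x[i + 1] - 2 * x[i] + x[i - 1]) / dt ** 2 for i in range(1, len(x) - 1)]
xm = x[1:-1]

# linear least squares for (c, k) in a + c v + k x = 0 (2x2 normal equations)
Svv = sum(vi * vi for vi in v)
Sxx = sum(xi * xi for xi in xm)
Svx = sum(vi * xi for vi, xi in zip(v, xm))
Sav = sum(ai * vi for ai, vi in zip(a, v))
Sax = sum(ai * xi for ai, xi in zip(a, xm))
det = Svv * Sxx - Svx * Svx
c_hat = (-Sav * Sxx + Sax * Svx) / det
k_hat = (-Sax * Svv + Sav * Svx) / det
```

With clean, densely sampled data both parameters are recovered almost exactly; the paper's integral formulation is aimed precisely at the sparse or noisy regimes where differentiating data twice breaks down.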
Stein, Claudia
2013-01-01
Summary This article investigates the historical method of Karl Sudhoff (1853–1938), Germany's first professor of medical history. It argues that in order to understand his ideas more fully, we need to step outside the historiography of medical history and assess his methodology in relation to the norms and ideals of German academic history writing in general. The article demonstrates that the philology-based "critical method" of Leopold von Ranke (1795–1886) was central to Sudhoff's methodological thinking. It investigates the underlying philosophical and epistemological assumptions of Ranke's method, which tend to be less appreciated than his overt empiricism, and explores how Sudhoff applied these to the newly professionalizing subdiscipline of the history of medicine. The article argues that Sudhoff's concerns with the methodology of history, which involved a particular conception of the relationship between the human sciences and the medical sciences, speak compellingly to our own times. PMID:23811710
Lee, E Henry; Wickham, Charlotte; Beedlow, Peter A; Waschmann, Ronald S; Tingey, David T
2017-10-01
A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for climate and forest disturbances (i.e., pests, diseases, fire). The statistical method is illustrated with a tree-ring width time series for a mature closed-canopy Douglas-fir stand on the west slopes of the Cascade Mountains of Oregon, USA that is impacted by Swiss needle cast disease caused by the foliar fungus, Phaecryptopus gaeumannii (Rhode) Petrak. The likelihood-based TSIA method is proposed for the field of dendrochronology to understand the interaction of temperature, water, and forest disturbances that are important in forest ecology and climate change studies.
Particle drag history in a subcritical post-shock flow - data analysis method and uncertainty
NASA Astrophysics Data System (ADS)
Ding, Liuyang; Bordoloi, Ankur; Adrian, Ronald; Prestridge, Kathy; Arizona State University Team; Los Alamos National Laboratory Team
2017-11-01
A novel data analysis method for measuring particle drag in an 8-pulse particle tracking velocimetry-accelerometry (PTVA) experiment is described. We represent the particle drag history, CD(t), using polynomials up to third order. An analytical model for the continuous particle position history is derived by integrating an equation relating CD(t) to particle velocity and acceleration. The coefficients of CD(t) are then calculated by fitting the position history model to eight measured particle locations in the least-squares sense. A preliminary test with experimental data showed that the new method yields physically more reasonable particle velocity and acceleration histories than conventional polynomial fitting. To fully assess and optimize the performance of the new method, we performed a PTVA simulation assuming a ground truth of particle motion based on an ensemble of experimental data. The results indicate a significant reduction in the RMS error of CD. We also found that for particle locating noise between 0.1 and 3 pixels, a range encountered in our experiment, the lowest RMS error is achieved by the quadratic CD(t) model. Finally, we discuss the optimization of the pulse timing configuration.
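A reduced sketch of the least-squares position-history idea: here a polynomial is fitted directly to the measured positions and differentiated, whereas the paper integrates a polynomial CD(t) model into an analytical position history before fitting. All data below are synthetic and noise-free:

```python
def polyfit_ls(ts, xs, degree):
    """Least-squares polynomial fit via normal equations + Gaussian elimination."""
    n = degree + 1
    A = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    b = [sum(x * t ** i for t, x in zip(ts, xs)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c2 in range(col, n):
                A[r][c2] -= f * A[col][c2]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in range(n - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef  # coef[k] multiplies t**k

# eight "measured" particle positions from a known motion x = 1 + 2t + 3t^2
ts = [0.1 * i for i in range(8)]
xs = [1 + 2 * t + 3 * t * t for t in ts]
c0, c1, c2 = polyfit_ls(ts, xs, 2)
accel = 2 * c2   # constant acceleration recovered from the fitted polynomial
```

With noisy locations, differentiating the fitted positions twice amplifies the noise, which is the motivation for the paper's approach of fitting an integrated CD(t) model instead.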
Comparison of reproducibility of natural head position using two methods.
Khan, Abdul Rahim; Rajesh, R N G; Dinesh, M R; Sanjay, N; Girish, K S; Venkataraghavan, Karthik
2012-01-01
Lateral cephalometric radiographs have become virtually indispensable to orthodontists in the treatment of patients. They are important in orthodontic growth analysis, diagnosis, treatment planning, monitoring of therapy, and evaluation of the final treatment outcome. The purpose of this study was to evaluate and compare the reproducibility and variance of natural head position obtained using two methods: the mirror method and the fluid level device method. The study included two sets of 40 lateral cephalograms taken using the two methods of obtaining natural head position, (1) the mirror method and (2) the fluid level device method, with a time interval of 2 months. Inclusion criteria: subjects randomly selected, aged 18 to 26 years. Exclusion criteria: history of orthodontic treatment; history of respiratory tract problems or chronic mouth breathing; congenital deformity; traumatically-induced deformity; history of myofascial pain syndrome; previous history of head and neck surgery. The results showed that the two methods were comparable, but reproducibility was greater with the fluid level device method, as shown by Dahlberg's coefficient and the Bland-Altman plot, and variance was smaller, as shown by precision and the Pearson correlation. In conclusion, the two methods for obtaining natural head position were comparable without any significant difference, and the fluid level device method was more reproducible and showed less variance than the mirror method.
A History of Instructional Methods in Uncontracted and Contracted Braille
ERIC Educational Resources Information Center
D'Andrea, Frances Mary
2009-01-01
This literature review outlines the history of the braille code as used in the United States and Canada, illustrating how both the code itself and instructional strategies for teaching it changed over time. The review sets the stage for the research questions of the recently completed Alphabetic Braille and Contracted Braille Study.
Shoulder instability in professional football players.
Leclere, Lance E; Asnis, Peter D; Griffith, Matthew H; Granito, David; Berkson, Eric M; Gill, Thomas J
2013-09-01
Shoulder instability is a common problem in American football players entering the National Football League (NFL). Treatment options include nonoperative and surgical stabilization. This study evaluated how the method of treatment of pre-NFL shoulder instability affects the rate of recurrence and the time elapsed until recurrence in players on 1 NFL team. Retrospective cohort. Medical records from 1980 to 2008 for 1 NFL team were reviewed. There were 328 players included in the study who started their career on the team and remained on the team for at least 2 years (mean, 3.9 years; range, 2-14 years). The history of instability prior to entering the NFL and the method of treatment were collected. Data on the occurrence of instability while in the NFL were recorded to determine the rate and timing of recurrence. Thirty-one players (9.5%) had a history of instability prior to entering the NFL. Of the 297 players with no history of instability, 39 (13.1%) had a primary event at a mean of 18.4 ± 22.2 months (range, 0-102 months) after joining the team. In the group of players with prior instability treated with surgical stabilization, there was no statistical difference in the rate of recurrence (10.5%) or the timing to the instability episode (mean, 26 months) compared with players with no history of instability. Twelve players had shoulder instability treated nonoperatively prior to the NFL. Five of these players (41.7%) had recurrent instability at a mean of 4.4 ± 7.0 months (range, 0-16 months). The patients treated nonoperatively had a significantly higher rate of recurrence (P = 0.02) and an earlier time of recurrence (P = 0.04). The rate of contralateral instability was 25.8%, occurring at a mean of 8.6 months. Recurrent shoulder instability is more common in NFL players with a history of nonoperative treatment. Surgical stabilization appears to restore the rate and timing of instability to that of players with no prior history of instability.
Global, local and focused geographic clustering for case-control data with residential histories
Jacquez, Geoffrey M; Kaufmann, Andy; Meliker, Jaymie; Goovaerts, Pierre; AvRuskin, Gillian; Nriagu, Jerome
2005-01-01
Background This paper introduces a new approach for evaluating clustering in case-control data that accounts for residential histories. Although many statistics have been proposed for assessing local, focused and global clustering in health outcomes, few, if any, exist for evaluating clusters when individuals are mobile. Methods Local, global and focused tests for residential histories are developed based on sets of matrices of nearest neighbor relationships that reflect the changing topology of cases and controls. Exposure traces are defined that account for the latency between exposure and disease manifestation, and that use exposure windows whose duration may vary. Several of the methods so derived are applied to evaluate clustering of residential histories in a case-control study of bladder cancer in southeastern Michigan. These data are still being collected and the analysis is conducted for demonstration purposes only. Results Statistically significant clustering of residential histories of cases was found but is likely due to delayed reporting of cases by one of the hospitals participating in the study. Conclusion Data with residential histories are preferable when causative exposures and disease latencies occur on a long enough time span that human mobility matters. To analyze such data, methods are needed that take residential histories into account. PMID:15784151
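As a minimal, hypothetical illustration of the kind of nearest-neighbour statistic this paper generalizes to residential histories, the following sketches the classic Cuzick–Edwards T_k test for static (single-residence) case-control locations; the toy coordinates and choice of k are invented:

```python
import math

def knn_indices(points, i, k):
    """Indices of the k nearest neighbours of point i (Euclidean)."""
    d = sorted((math.dist(points[i], points[j]), j)
               for j in range(len(points)) if j != i)
    return [j for _, j in d[:k]]

def cuzick_edwards_t(points, is_case, k):
    """T_k: over all cases, count how many of their k nearest
    neighbours are also cases. Large values suggest clustering."""
    return sum(1
               for i, case in enumerate(is_case) if case
               for j in knn_indices(points, i, k)
               if is_case[j])

# Toy data: 4 tightly clustered cases and 4 scattered controls.
pts = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1),
       (5, 5), (6, 1), (1, 6), (7, 7)]
cases = [True, True, True, True, False, False, False, False]
print(cuzick_edwards_t(pts, cases, k=2))  # → 8
```

In practice the null distribution of T_k is obtained by permuting case/control labels; the paper's contribution is to replace the single static point set with a sequence of nearest-neighbour matrices indexed over residential history time.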
NASA Astrophysics Data System (ADS)
Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert
2016-05-01
We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated to land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). Number and frequency of cultivation cycles are of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale level. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmer interviews, resulting in a Normalized Residual Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns.
We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond.
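BFAST fits full trend-plus-harmonic regression models; as a much-simplified sketch of the underlying idea only (not the BFAST algorithm), a single breakpoint in a time-series can be located by minimizing the residual error of piecewise-constant segments:

```python
def sse(xs):
    """Sum of squared errors around the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_breakpoint(series, min_seg=2):
    """Index b that splits the series into two segments with
    minimal total sum of squared errors around segment means."""
    return min(range(min_seg, len(series) - min_seg + 1),
               key=lambda b: sse(series[:b]) + sse(series[b:]))

# Toy NDMI-like series: stable forest, then a clearing (level shift).
series = [0.8, 0.82, 0.79, 0.81, 0.3, 0.28, 0.31, 0.29]
print(best_breakpoint(series))  # → 4, the index of the shift
```

Real BFAST additionally models seasonality, searches for multiple breakpoints, and tests their significance, which is what makes it usable on noisy satellite time-series.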
Large deviations of a long-time average in the Ehrenfest urn model
NASA Astrophysics Data System (ADS)
Meerson, Baruch; Zilber, Pini
2018-05-01
Since its inception in 1907, the Ehrenfest urn model (EUM) has served as a test bed of key concepts of statistical mechanics. Here we employ this model to study large deviations of a time-additive quantity. We consider two continuous-time versions of the EUM with K urns and N balls: with and without interactions between the balls in the same urn. We evaluate the probability distribution P that the average number of balls in one urn over time T, (1/T)∫_0^T n(t) dt, takes any specified value aN, where 0 < a < 1. For a long observation time, T → ∞, a Donsker–Varadhan large deviation principle holds: -ln P ≃ T I(a, …), where … denote additional parameters of the model. We calculate the rate function I exactly by two different methods due to Donsker and Varadhan and compare the exact results with those obtained with a variant of the WKB approximation (after Wentzel, Kramers and Brillouin). In the absence of interactions the WKB prediction for I is exact for any N. In the presence of interactions the WKB method gives asymptotically exact results for N ≫ 1. The WKB method also uncovers the (very simple) time history of the system which dominates the contribution of different time histories to P.
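A toy discrete-time simulation (the paper treats continuous-time versions) illustrates how the time-averaged occupancy of one urn concentrates near N/2 for K = 2 urns, which is why large deviations of the average away from N/2 are exponentially unlikely in T:

```python
import random

def ehrenfest_time_average(N, steps, seed=0):
    """Discrete-time Ehrenfest urn with K=2 urns and N balls: at each
    step a uniformly chosen ball switches urn. Returns the
    time-averaged number of balls in urn 1."""
    rng = random.Random(seed)
    n = N  # start with all balls in urn 1
    total = 0
    for _ in range(steps):
        # The chosen ball is in urn 1 with probability n/N; moving it
        # out decrements n, otherwise a ball moves in and n increments.
        if rng.random() < n / N:
            n -= 1
        else:
            n += 1
        total += n
    return total / steps

avg = ehrenfest_time_average(N=100, steps=200_000)
print(avg)  # equilibrium pulls the time average towards N/2 = 50
```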
Jewett, Ethan M; Steinrücken, Matthias; Song, Yun S
2016-11-01
Many approaches have been developed for inferring selection coefficients from time series data while accounting for genetic drift. These approaches have been motivated by the intuition that properly accounting for the population size history can significantly improve estimates of selective strengths. However, the improvement in inference accuracy that can be attained by modeling drift has not been characterized. Here, by comparing maximum likelihood estimates of selection coefficients that account for the true population size history with estimates that ignore drift by assuming allele frequencies evolve deterministically in a population of infinite size, we address the following questions: how much can modeling the population size history improve estimates of selection coefficients? How much can mis-inferred population sizes hurt inferences of selection coefficients? We conduct our analysis under the discrete Wright-Fisher model by deriving the exact probability of an allele frequency trajectory in a population of time-varying size and we replicate our results under the diffusion model. For both models, we find that ignoring drift leads to estimates of selection coefficients that are nearly as accurate as estimates that account for the true population history, even when population sizes are small and drift is high. This result is of interest because inference methods that ignore drift are widely used in evolutionary studies and can be many orders of magnitude faster than methods that account for population sizes. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
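A minimal sketch of the drift-free approach the authors benchmark: generate a deterministic allele-frequency trajectory under haploid selection in an effectively infinite population and recover s by least squares. This is a stand-in for the likelihood machinery; the update rule and the grid search are illustrative assumptions, not the paper's estimator:

```python
def det_trajectory(p0, s, T):
    """Deterministic allele-frequency trajectory under haploid
    selection coefficient s (infinite population, no drift)."""
    traj = [p0]
    for _ in range(T):
        p = traj[-1]
        traj.append(p * (1 + s) / (1 + p * s))
    return traj

def fit_s(observed, p0, grid=None):
    """Least-squares fit of s against the deterministic model,
    a simple stand-in for likelihood methods that ignore drift."""
    grid = grid or [i / 1000 for i in range(-200, 201)]
    T = len(observed) - 1
    def err(s):
        return sum((o, m) == () or (o - m) ** 2
                   for o, m in zip(observed, det_trajectory(p0, s, T)))
    return min(grid, key=err)

true = det_trajectory(0.1, 0.05, 40)
print(fit_s(true, 0.1))  # recovers s = 0.05 on noiseless data
```

On data simulated with drift, the same fit remains a good approximation when allele-frequency changes due to selection dominate those due to sampling, which is the regime the paper quantifies.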
Comparison of frequency-domain and time-domain rotorcraft vibration control methods
NASA Technical Reports Server (NTRS)
Gupta, N. K.
1984-01-01
Active control of rotor-induced vibration in rotorcraft has received significant attention recently. Two classes of techniques have been proposed. The more developed approach works with harmonic analysis of measured time histories and is called the frequency-domain approach. The more recent approach computes the control input directly using the measured time history data and is called the time-domain approach. The report summarizes the results of a theoretical investigation to compare the two approaches. Five specific areas were addressed: (1) techniques to derive models needed for control design (system identification methods), (2) robustness with respect to errors, (3) transient response, (4) susceptibility to noise, and (5) implementation difficulties. The system identification methods are more difficult for the time-domain models. The time-domain approach is more robust (e.g., has higher gain and phase margins) than the frequency-domain approach. It might thus be possible to avoid doing real-time system identification in the time-domain approach by storing models at a number of flight conditions. The most significant error source is the variation in open-loop vibrations caused by pilot inputs, maneuvers or gusts. The implementation requirements are similar except that the time-domain approach can be much simpler to implement if real-time system identification were not necessary.
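The frequency-domain approach starts from harmonic analysis of measured time histories. As a generic illustration (not the report's control law), the amplitude and phase of a chosen per-rev harmonic can be extracted from one sampled rotor period by Fourier correlation:

```python
import math

def harmonic(samples, n):
    """Amplitude and phase of the n-th harmonic of one period of a
    sampled time history (its frequency-domain representation)."""
    N = len(samples)
    re = sum(x * math.cos(2 * math.pi * n * k / N)
             for k, x in enumerate(samples))
    im = sum(x * math.sin(2 * math.pi * n * k / N)
             for k, x in enumerate(samples))
    amp = 2 * math.hypot(re, im) / N
    return amp, math.atan2(im, re)

# Synthetic 4/rev vibration signal with amplitude 1.5
N = 64
sig = [1.5 * math.cos(2 * math.pi * 4 * k / N + 0.3) for k in range(N)]
amp, _ = harmonic(sig, 4)
print(round(amp, 6))  # → 1.5
```

The time-domain approach skips this step and feeds the raw samples to the controller, which is the source of the trade-offs the report examines.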
NASA Technical Reports Server (NTRS)
Decker, A. J.
1984-01-01
The holographic recording of the time history of a flow feature in three dimensions is discussed. The use of diffuse-illumination holographic interferometry for the three-dimensional visualization of flow features such as shock waves and turbulent eddies is described. The double-exposure and time-average methods are compared using the characteristic function and the results from a flow simulator. A time history requires a large hologram recording rate. Results of holographic cinematography of the shock waves in a flutter cascade are presented as an example. Future directions of this effort, including the availability and development of suitable lasers, are discussed.
Immigration history of amphidromous species on a Greater Antillean island
Benjamin D. Cook; Catherine M. Pringle; Jane M. Hughes
2010-01-01
Aim To use molecular data to test for dispersal structuring in the immigration history of an amphidromous community on an island. Location The Caribbean island of Puerto Rico. Methods Mitochondrial DNA sequences were obtained from 11 amphidromous species, including shrimps, fish and a gastropod, sampled from throughout the island. The timing of population expansion (TE...
A Hot-Deck Multiple Imputation Procedure for Gaps in Longitudinal Recurrent Event Histories
Wang, Chia-Ning; Little, Roderick; Nan, Bin; Harlow, Siobán D.
2012-01-01
Summary We propose a regression-based hot deck multiple imputation method for gaps of missing data in longitudinal studies, where subjects experience a recurrent event process and a terminal event. Examples are repeated asthma episodes and death, or menstrual periods and the menopause, as in our motivating application. Research interest concerns the onset time of a marker event, defined by the recurrent-event process, or the duration from this marker event to the final event. Gaps in the recorded event history make it difficult to determine the onset time of the marker event, and hence, the duration from onset to the final event. Simple approaches such as jumping gap times or dropping cases with gaps have obvious limitations. We propose a procedure for imputing information in the gaps by substituting information in the gap from a matched individual with a completely recorded history in the corresponding interval. Predictive Mean Matching is used to incorporate information on longitudinal characteristics of the repeated process and the final event time. Multiple imputation is used to propagate imputation uncertainty. The procedure is applied to an important data set for assessing the timing and duration of the menopausal transition. The performance of the proposed method is assessed by a simulation study. PMID:21361886
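A heavily simplified sketch of the matching step, assuming a single covariate and a linear predictor (the paper's longitudinal setting is far richer): predict each recipient's mean from the donor regression, then hot-deck the observed value of the donor whose predicted mean is closest:

```python
def linreg(xs, ys):
    """Ordinary least squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

def pmm_impute(donor_x, donor_y, recipient_x):
    """Predictive mean matching: regress y on x over the donors,
    then impute each recipient with the *observed* y of the donor
    whose predicted mean is closest to the recipient's."""
    b, a = linreg(donor_x, donor_y)
    pred = [a + b * x for x in donor_x]
    out = []
    for x in recipient_x:
        target = a + b * x
        i = min(range(len(pred)), key=lambda j: abs(pred[j] - target))
        out.append(donor_y[i])  # hot deck: a real observed value
    return out

donors_x = [1, 2, 3, 4, 5]
donors_y = [2.1, 3.9, 6.2, 8.0, 9.8]
print(pmm_impute(donors_x, donors_y, [2.2, 4.6]))  # → [3.9, 9.8]
```

Because imputed values are drawn from observed donors rather than the regression line, the procedure preserves the empirical distribution of the outcome, which is the appeal of hot-deck methods; multiple imputation then repeats this with perturbed matches to propagate uncertainty.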
NASA Astrophysics Data System (ADS)
Huang, D.; Wang, G.
2014-12-01
Stochastic simulation of spatially distributed ground-motion time histories is important for performance-based earthquake design of geographically distributed systems. In this study, we develop a novel technique to stochastically simulate regionalized ground-motion time histories using wavelet packet analysis. First, a transient acceleration time history is characterized by wavelet-packet parameters proposed by Yamamoto and Baker (2013). The wavelet-packet parameters fully characterize ground-motion time histories in terms of energy content, time- and frequency-domain characteristics, and time-frequency nonstationarity. This study further investigates the spatial cross-correlations of wavelet-packet parameters based on geostatistical analysis of 1500 regionalized ground motion data from eight well-recorded earthquakes in California, Mexico, Japan and Taiwan. The linear model of coregionalization (LMC) is used to develop a permissible spatial cross-correlation model for each parameter group. The geostatistical analysis of ground-motion data from different regions reveals significant dependence of the LMC structure on regional site conditions, which can be characterized by the correlation range of Vs30 in each region. In general, the spatial correlation and cross-correlation of wavelet-packet parameters are stronger if the site condition is more homogeneous. Using the region-specific spatial cross-correlation model and cokriging technique, wavelet-packet parameters at unmeasured locations can be best estimated, and regionalized ground-motion time histories can be synthesized. Case studies and blind tests demonstrate that the simulated ground motions generally agree well with the actual recorded data when the influence of regional site conditions is considered. The developed method has great potential to be used in computation-based seismic analysis and loss estimation at a regional scale.
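Geostatistical analysis of this kind typically begins with an empirical semivariogram of each wavelet-packet parameter over station pairs, from which a permissible model (and ultimately the LMC) is fitted. A minimal sketch with invented station coordinates and parameter values:

```python
import math

def semivariogram(coords, values, bin_width):
    """Empirical semivariogram: half the mean squared difference of a
    ground-motion parameter, binned by station separation distance."""
    sums, counts = {}, {}
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            h = math.dist(coords[i], coords[j])
            b = int(h // bin_width)
            sums[b] = sums.get(b, 0.0) + 0.5 * (values[i] - values[j]) ** 2
            counts[b] = counts.get(b, 0) + 1
    return {b: sums[b] / counts[b] for b in sorted(sums)}

# Hypothetical station layout (km) and one parameter's values
coords = [(0, 0), (1, 0), (2, 0), (10, 0), (11, 0)]
vals = [1.0, 1.1, 0.9, 3.0, 3.2]
gamma = semivariogram(coords, vals, bin_width=5.0)
print(gamma)  # semivariance grows with separation distance
```

Nearby stations show small semivariance (strong spatial correlation) and distant ones large semivariance; the correlation range at which the curve flattens is the quantity the abstract links to the homogeneity of regional site conditions.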
Demographic history and gene flow during silkworm domestication
2014-01-01
Background Gene flow plays an important role in domestication history of domesticated species. However, little is known about the demographic history of domesticated silkworm involving gene flow with its wild relative. Results In this study, four model-based evolutionary scenarios to describe the demographic history of B. mori were hypothesized. Using Approximate Bayesian Computation method and DNA sequence data from 29 nuclear loci, we found that the gene flow at bottleneck model is the most likely scenario for silkworm domestication. The starting time of silkworm domestication was estimated to be approximate 7,500 years ago; the time of domestication termination was 3,984 years ago. Using coalescent simulation analysis, we also found that bi-directional gene flow occurred during silkworm domestication. Conclusions Estimates of silkworm domestication time are nearly consistent with the archeological evidence and our previous results. Importantly, we found that the bi-directional gene flow might occur during silkworm domestication. Our findings add a dimension to highlight the important role of gene flow in domestication of crops and animals. PMID:25123546
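As a generic sketch of rejection-based Approximate Bayesian Computation (the study's demographic models and summary statistics are far more elaborate), with a toy one-parameter model:

```python
import random

def abc_rejection(observed_stat, simulate, prior_draw,
                  n_draws=20_000, tol=0.05, seed=1):
    """Rejection ABC: keep prior draws whose simulated summary
    statistic falls within tol of the observed one; the kept draws
    approximate the posterior."""
    rng = random.Random(seed)
    kept = []
    for _ in range(n_draws):
        theta = prior_draw(rng)
        if abs(simulate(theta, rng) - observed_stat) < tol:
            kept.append(theta)
    return kept

# Toy model: summary statistic ~ Normal(theta, 0.1); observed value 2.0
sim = lambda th, rng: rng.gauss(th, 0.1)
prior = lambda rng: rng.uniform(0, 5)
post = abc_rejection(2.0, sim, prior)
print(sum(post) / len(post))  # posterior mean near 2.0
```

In demographic inference the "simulate" step is a coalescent simulation under a candidate scenario, and model choice (as in the four scenarios above) compares acceptance rates across competing models.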
Dissemination of the Phasor Method in Electrical Engineering in China
ERIC Educational Resources Information Center
Zhang, Liangliang; Lei, Yinzhao
2014-01-01
Synchrophasors, widely used in the monitoring and analysis of power systems, evolved from the phasor method presented by Charles Proteus Steinmetz in 1893. The phasor method is a mathematical method for solving linear sinusoidal steady-state circuits and time-varying electromagnetic fields. This paper traces the history and diffusion of the phasor…
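The phasor method itself is compact enough to show directly: replace the sinusoidal source by a complex amplitude and the circuit by its complex impedance, turning a differential equation into algebra. The component values below are invented for illustration:

```python
import cmath, math

# Series RLC circuit driven by v(t) = V0*cos(w*t). The phasor method
# represents the sinusoid by the complex amplitude V0 and each element
# by its impedance, so the steady-state current is a simple quotient.
V0, w = 10.0, 2 * math.pi * 50          # 10 V amplitude at 50 Hz
R, L, C = 100.0, 0.2, 20e-6
Z = R + 1j * w * L + 1 / (1j * w * C)   # total series impedance
I = V0 / Z                              # current phasor
print(abs(I), math.degrees(cmath.phase(I)))
```

The instantaneous current is then |I|·cos(w·t + arg I); Steinmetz's insight was exactly this reduction of sinusoidal steady-state circuit analysis to complex arithmetic.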
Linguistic Method: Yesterday and Today.
ERIC Educational Resources Information Center
Rauch, Irmengard
This paper introduces the reader to a brief history of the focus of linguistic method from prehistoric times, through the Classical era, the Middle Ages, to the present. The scientific orientation of linguistic method is exploited; a set of specific principles is found to unify most of today's diverse methods. The success of linguistics is…
Gray, R H; Simpson, J L; Kambic, R T; Queenan, J T; Mena, P; Perez, A; Barbato, M
1995-05-01
Our purpose was to ascertain the effects of timing of conception on the risk of spontaneous abortion. To assess these effects, women who conceived while using natural family planning were identified in five centers worldwide between 1987 and 1993. Timing of conception was determined from 868 natural family planning charts that recorded day of intercourse and indices of ovulation (cervical mucus peak obtained according to the ovulation method and/or basal body temperature). Conceptions on days - 1 or 0 with respect to the natural family planning estimated day of ovulation were considered to be "optimally timed," and all other conceptions were considered as "non-optimally timed." The rate of spontaneous abortions per 100 pregnancies was examined in relation to timing of conception, ages, reproductive history, and other covariates with bivariate and multivariate statistical methods. There were 88 spontaneous abortions among 868 pregnancies (10.1%). The spontaneous abortion rate was similar for 361 optimally timed conceptions (9.1%) and 507 non-optimally timed conceptions (10.9%). However, among 171 women who had experienced a spontaneous abortion in a prior pregnancy, the rate of spontaneous abortion in the index pregnancy was significantly higher with non-optimally timed conceptions (22.6%) as compared with optimally timed conceptions (7.3%). This association was not observed among 697 women with no history of pregnancy loss. The adjusted relative risk of spontaneous abortion among women with non-optimally timed conceptions and a history of pregnancy loss was 2.35 (95% confidence intervals 1.42 to 3.89). The excess risk of spontaneous abortion was observed with both preovulatory and postovulatory conceptions. Overall, there is no excess risk of spontaneous abortion among the pregnancies conceived during natural family planning use. 
However, among women with a history of pregnancy loss, there is an increased risk of spontaneous abortion associated with preovulatory or postovulatory delayed conceptions.
NASA Astrophysics Data System (ADS)
Vincenzo, F.; Matteucci, F.; Spitoni, E.
2017-04-01
We present a theoretical method for solving the chemical evolution of galaxies by assuming an instantaneous recycling approximation for chemical elements restored by massive stars and the delay time distribution formalism for delayed chemical enrichment by Type Ia Supernovae. The galaxy gas mass assembly history, together with the assumed stellar yields and initial mass function, represents the starting point of this method. We derive a simple and general equation, which closely relates the Laplace transforms of the galaxy gas accretion history and star formation history, and which can be used to simplify the problem of retrieving these quantities in galaxy evolution models assuming a linear Schmidt-Kennicutt law. We find that - once the galaxy star formation history has been reconstructed from our assumptions - the differential equation for the evolution of the chemical element X can be suitably solved with classical methods. We apply our model to reproduce the [O/Fe] and [Si/Fe] versus [Fe/H] chemical abundance patterns as observed at the solar neighbourhood by assuming a decaying exponential infall rate of gas and different delay time distributions for Type Ia Supernovae; we also explore the effect of assuming a non-linear Schmidt-Kennicutt law, with the index of the power law being k = 1.4. Although approximate, our model with the single-degenerate scenario for Type Ia Supernovae provides the best agreement with the observed set of data. Our method can also be used by complementary galaxy stellar population synthesis models to predict the chemical evolution of galaxies.
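Under the abstract's stated assumptions (instantaneous recycling with return fraction R, infall rate I(t), and a linear Schmidt-Kennicutt law ψ = S·M_gas), the Laplace-transform relation it alludes to can be sketched as follows; the notation here is ours and the exact form in the paper may differ:

```latex
% Gas evolution with infall rate I(t) and instantaneous recycling:
\frac{\mathrm{d}M_{\mathrm{gas}}}{\mathrm{d}t}
  = \mathcal{I}(t) - (1-R)\,\psi(t),
\qquad \psi(t) = S\,M_{\mathrm{gas}}(t).
% Laplace transforming with M_gas(0) = 0 gives an algebraic link
% between the gas accretion history and the star formation history:
s\,\tilde{M}_{\mathrm{gas}}(s)
  = \tilde{\mathcal{I}}(s) - (1-R)\,S\,\tilde{M}_{\mathrm{gas}}(s)
\;\Longrightarrow\;
\tilde{\psi}(s) = \frac{S\,\tilde{\mathcal{I}}(s)}{s + (1-R)\,S}.
```

Given either the accretion history or the star formation history, the other follows by inverting the transform, after which the equation for each element X can be integrated by standard methods.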
Palstra, Friso P; Heyer, Evelyne; Austerlitz, Frédéric
2015-06-01
The demographic history of modern humans constitutes a combination of expansions, colonizations, contractions, and remigrations. The advent of large scale genetic data combined with statistically refined methods facilitates inference of this complex history. Here we study the demographic history of two genetically admixed ethnic groups in Central Asia, an area characterized by high levels of genetic diversity and a history of recurrent immigration. Using Approximate Bayesian Computation, we infer that the timing of admixture markedly differs between the two groups. Admixture in the traditionally agricultural Tajiks could be dated back to the onset of the Neolithic transition in the region, whereas admixture in Kyrgyz is more recent, and may have involved the westward movement of Turkic peoples. These results are confirmed by a coalescent method that fits an isolation-with-migration model to the genetic data, with both Central Asian groups having received gene flow from the extremities of Eurasia. Interestingly, our analyses also uncover signatures of gene flow from Eastern to Western Eurasia during Paleolithic times. In conclusion, the high genetic diversity currently observed in these two Central Asian peoples most likely reflects the effects of recurrent immigration that likely started before historical times. Conversely, conquests during historical times may have had a relatively limited genetic impact. These results emphasize the need for a better understanding of the genetic consequences of transmission of culture and technological innovations, as well as those of invasions and conquests. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Adding fling effects to processed ground‐motion time histories
Kamai, Ronnie; Abrahamson, Norman A.; Graves, Robert
2014-01-01
Fling is the engineering term for the effects of the permanent tectonic offset, caused by a rupturing fault in the recorded ground motions near the fault. It is expressed by a one‐sided pulse in ground velocity and a nonzero final displacement at the end of shaking. Standard processing of earthquake time histories removes some of the fling effects that may be required for engineering applications. A method to parameterize the fling‐step time history and to superimpose it onto traditionally processed time histories has been developed by Abrahamson (2002). In this paper, we first present an update to the Abrahamson (2002) fling‐step models, in which the fling step is parameterized as a single cycle of a sine wave. Parametric models are presented for the sine‐wave amplitude (Dsite) and period (Tf). The expressions for Dsite and Tf are derived from an extensive set of finite‐fault simulations conducted on the Southern California Earthquake Center broadband platform (see Data and Resources). The simulations were run with the Graves and Pitarka (2010) hybrid simulation method and included strike‐slip and reverse scenarios for magnitudes of 6.0–8.2 and dips of 30 through 90. Next, an improved approach for developing design ground motions with fling effects is presented, which deals with the problem of double‐counting intermediate period components that were not removed by the standard ground‐motion processing. Finally, the results are validated against a set of 84 empirical recordings containing fling.
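A sketch of the sine-wave parameterization as described in the abstract (the exact functional form used by Abrahamson 2002 is assumed here): one full sine cycle in acceleration yields a one-sided velocity pulse and a permanent displacement offset Dsite:

```python
import math

def fling_step(D_site, T_f, dt=1e-3):
    """Hypothetical sine-cycle fling step: acceleration is one full
    sine cycle of period T_f, so velocity is a one-sided pulse and
    displacement ends at the permanent offset D_site."""
    n = int(T_f / dt)
    A = 2 * math.pi * D_site / T_f ** 2           # acceleration amplitude
    acc = [A * math.sin(2 * math.pi * k * dt / T_f) for k in range(n + 1)]
    vel, disp = [0.0], [0.0]
    for k in range(n):                            # trapezoidal integration
        vel.append(vel[-1] + 0.5 * (acc[k] + acc[k + 1]) * dt)
        disp.append(disp[-1] + 0.5 * (vel[-2] + vel[-1]) * dt)
    return vel, disp

vel, disp = fling_step(D_site=0.5, T_f=2.0)
print(round(disp[-1], 3), abs(round(vel[-1], 3)))  # → 0.5 0.0
```

Superimposing such a pulse on a conventionally processed record restores the tectonic offset; the paper's models supply Dsite and Tf as functions of the earthquake scenario.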
NASA Astrophysics Data System (ADS)
Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.
2015-12-01
We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated to land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. Number and frequency of cultivation cycles are of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale level. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmer interviews, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses.
We conclude that the method holds great promise for land use history mapping in the tropics and beyond. Spatial and temporal patterns were further analysed with an ecological perspective in a follow-up study. Results show that changes in land use patterns, such as land use intensification and reduced agricultural expansion, reflect the socio-economic transformations that occurred in the region.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, Shirong; Davis, Michael J.; Skodje, Rex T.
2015-11-12
The sensitivity of kinetic observables is analyzed using a newly developed sum over histories representation of chemical kinetics. In the sum over histories representation, the concentrations of the chemical species are decomposed into the sum of probabilities for chemical pathways that follow molecules from reactants to products or intermediates. Unlike static flux methods for reaction path analysis, the sum over histories approach includes the explicit time dependence of the pathway probabilities. Using the sum over histories representation, the sensitivity of an observable with respect to a kinetic parameter such as a rate coefficient is then analyzed in terms of how that parameter affects the chemical pathway probabilities. The method is illustrated for species concentration target functions in H2 combustion, where the rate coefficients are allowed to vary over their associated uncertainty ranges. It is found that large sensitivities are often associated with rate-limiting steps along important chemical pathways or with reactions that control the branching of reactive flux.
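A toy illustration of the idea (not the authors' formalism): for the first-order chain A → B → C with rate constants k1 and k2, the concentration of B is the probability of the history "jump A→B at time τ, then remain in B until t", integrated over all jump times τ, and this pathway sum reproduces the closed-form kinetics:

```python
import math

def pathway_B(k1, k2, t, n=10_000):
    """Probability that a molecule starting in A sits in B at time t,
    written as an integral over histories: survive in A until tau,
    jump A->B, then survive in B from tau to t (midpoint rule)."""
    dt = t / n
    total = 0.0
    for i in range(n):
        tau = (i + 0.5) * dt
        total += k1 * math.exp(-k1 * tau) * math.exp(-k2 * (t - tau)) * dt
    return total

def analytic_B(k1, k2, t):
    """Standard closed-form solution for the A -> B -> C chain."""
    return k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

k1, k2, t = 2.0, 0.5, 1.3
print(round(pathway_B(k1, k2, t), 6), round(analytic_B(k1, k2, t), 6))
```

Differentiating the pathway probability with respect to k1 or k2 then attributes the sensitivity of [B](t) to individual steps of the pathway, which is the flavor of analysis the abstract describes for combustion networks.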
Optimal design of structures for earthquake loads by a hybrid RBF-BPSO method
NASA Astrophysics Data System (ADS)
Salajegheh, Eysa; Gholizadeh, Saeed; Khatibinia, Mohsen
2008-03-01
The optimal seismic design of structures requires that time history analyses (THA) be carried out repeatedly. This makes the optimal design process inefficient, in particular, if an evolutionary algorithm is used. To reduce the overall time required for structural optimization, two artificial intelligence strategies are employed. In the first strategy, radial basis function (RBF) neural networks are used to predict the time history responses of structures in the optimization flow. In the second strategy, a binary particle swarm optimization (BPSO) is used to find the optimum design. Combining the RBF and BPSO, a hybrid RBF-BPSO optimization method is proposed in this paper, which achieves fast optimization with high computational performance. Two examples are presented and compared to determine the optimal weight of structures under earthquake loadings using both exact and approximate analyses. The numerical results demonstrate the computational advantages and effectiveness of the proposed hybrid RBF-BPSO optimization method for the seismic design of structures.
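A minimal sketch of the BPSO half of the scheme on a toy binary objective (the paper couples BPSO with an RBF surrogate of the time-history responses; the inertia and acceleration coefficients below are conventional choices, not the authors'):

```python
import math, random

def bpso(n_bits, fitness, n_particles=20, iters=60, seed=3):
    """Minimal binary particle swarm optimization: velocities are
    real-valued and positions are re-sampled bitwise through a
    sigmoid of the velocity."""
    rng = random.Random(seed)
    sig = lambda v: 1 / (1 + math.exp(-v))
    X = [[rng.randint(0, 1) for _ in range(n_bits)]
         for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = max(pbest, key=fitness)[:]
    for _ in range(iters):
        for p in range(n_particles):
            for b in range(n_bits):
                V[p][b] = (0.7 * V[p][b]
                           + 1.5 * rng.random() * (pbest[p][b] - X[p][b])
                           + 1.5 * rng.random() * (gbest[b] - X[p][b]))
                X[p][b] = 1 if rng.random() < sig(V[p][b]) else 0
            if fitness(X[p]) > fitness(pbest[p]):
                pbest[p] = X[p][:]
                if fitness(X[p]) > fitness(gbest):
                    gbest = X[p][:]
    return gbest

onemax = sum  # toy objective: maximize the number of 1-bits
best = bpso(12, onemax)
print(onemax(best))
```

In the structural-optimization setting, each bit string encodes a discrete design choice (e.g. a section from a catalogue), and the expensive fitness call is replaced by the RBF network's prediction of the time-history response.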
Bunge, Hans-Peter; Richards, M A; Baumgardner, J R
2002-11-15
Data assimilation is an approach to studying geodynamic models consistent simultaneously with observables and the governing equations of mantle flow. Such an approach is essential in mantle circulation models, where we seek to constrain an unknown initial condition some time in the past, and thus cannot hope to use first-principles convection calculations to infer the flow history of the mantle. One of the most important observables for mantle-flow history comes from models of Mesozoic and Cenozoic plate motion that provide constraints not only on the surface velocity of the mantle but also on the evolution of internal mantle-buoyancy forces due to subducted oceanic slabs. Here we present five mantle circulation models with an assimilated plate-motion history spanning the past 120 Myr, a time period for which reliable plate-motion reconstructions are available. All models agree well with upper- and mid-mantle heterogeneity imaged by seismic tomography. A simple standard model of whole-mantle convection, including a factor 40 viscosity increase from the upper to the lower mantle and predominantly internal heat generation, reveals downwellings related to Farallon and Tethys subduction. Adding 35% bottom heating from the core has the predictable effect of producing prominent high-temperature anomalies and a strong thermal boundary layer at the base of the mantle. Significantly delaying mantle flow through the transition zone either by modelling the dynamic effects of an endothermic phase reaction or by including a steep, factor 100, viscosity rise from the upper to the lower mantle results in substantial transition-zone heterogeneity, enhanced by the effects of trench migration implicit in the assimilated plate-motion history. 
An expected result is the failure to account for heterogeneity structure in the deepest mantle below 1500 km, which is influenced by Jurassic plate motions and thus cannot be modelled from sequential assimilation of plate motion histories limited in age to the Cretaceous. This result implies that sequential assimilation of past plate-motion models is ineffective in studying the temporal evolution of core-mantle-boundary heterogeneity, and that a method for extrapolating present-day information backwards in time is required. For short time periods (of the order of perhaps a few tens of Myr) such a method exists in the form of crude 'backward' convection calculations. For longer time periods (of the order of a mantle overturn), a rigorous approach to extrapolating information back in time exists in the form of iterative nonlinear optimization methods that carry assimilated information into the past through the use of an adjoint mantle convection model.
Real-Time Tracking by Double Templates Matching Based on Timed Motion History Image with HSV Feature
Li, Zhiyong; Li, Pengfei; Yu, Xiaoping; Hashem, Mervat
2014-01-01
It is a challenge to represent the target appearance model for moving object tracking under complex environment. This study presents a novel method with appearance model described by double templates based on timed motion history image with HSV color histogram feature (tMHI-HSV). The main components include offline template and online template initialization, tMHI-HSV-based candidate patches feature histograms calculation, double templates matching (DTM) for object location, and templates updating. Firstly, we initialize the target object region and calculate its HSV color histogram feature as offline template and online template. Secondly, the tMHI-HSV is used to segment the motion region and calculate these candidate object patches' color histograms to represent their appearance models. Finally, we utilize the DTM method to track the target and update the offline and online templates in real time. The experimental results show that the proposed method can efficiently handle the scale variation and pose change of the rigid and nonrigid objects, even in illumination change and occlusion visual environment. PMID:24592185
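A stripped-down sketch of the matching step, assuming Bhattacharyya similarity between HSV histograms and a fixed weighting of the two templates (the 3-bin histograms and the weight alpha are invented for illustration; the paper's scoring may differ):

```python
import math

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient between two normalized histograms
    (1.0 = identical appearance)."""
    return sum(math.sqrt(a * b) for a, b in zip(h1, h2))

def match(candidates, offline_t, online_t, alpha=0.5):
    """Double-template matching sketch: score each candidate patch
    histogram against a fixed offline template and an adapting online
    template, and return the best-scoring candidate index."""
    def score(h):
        return (alpha * bhattacharyya(h, offline_t)
                + (1 - alpha) * bhattacharyya(h, online_t))
    return max(range(len(candidates)), key=lambda i: score(candidates[i]))

offline = [0.7, 0.2, 0.1]          # hypothetical 3-bin HSV histograms
online  = [0.6, 0.3, 0.1]
cands = [[0.1, 0.1, 0.8],          # background-like patch
         [0.65, 0.25, 0.1],        # target-like patch
         [0.3, 0.4, 0.3]]
print(match(cands, offline, online))  # → 1, the target-like patch
```

Keeping a fixed offline template guards against drift, while the online template adapts to gradual appearance change; the tMHI restricts the candidate patches to recently moving regions.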
Minimal-assumption inference from population-genomic data
NASA Astrophysics Data System (ADS)
Weissman, Daniel; Hallatschek, Oskar
Samples of multiple complete genome sequences contain vast amounts of information about the evolutionary history of populations, much of it in the associations among polymorphisms at different loci. Current methods that take advantage of this linkage information rely on models of recombination and coalescence, limiting the sample sizes and populations that they can analyze. We introduce a method, Minimal-Assumption Genomic Inference of Coalescence (MAGIC), that reconstructs key features of the evolutionary history, including the distribution of coalescence times, by integrating information across genomic length scales without using an explicit model of recombination, demography or selection. Using simulated data, we show that MAGIC's performance is comparable to PSMC' on single diploid samples generated with standard coalescent and recombination models. More importantly, MAGIC can also analyze arbitrarily large samples and is robust to changes in the coalescent and recombination processes. Using MAGIC, we show that the inferred coalescence time histories of samples of multiple human genomes exhibit inconsistencies with a description in terms of an effective population size based on single-genome data.
Süelözgen, Tufan; Isoglu, Cemal Selcuk; Turk, Hakan; Yoldas, Mehmet; Karabicak, Mustafa; Ergani, Batuhan; Boyacioglu, Hayal; Ilbey, Yusuf Ozlem; Zorlu, Ferruh
2016-01-01
This study aimed to evaluate whether the one-shot dilation technique is as safe in patients with a history of open-stone surgery as it is in patients without previous open-stone surgery. Between January 2007 and February 2015, 82 patients who underwent percutaneous nephrolithotomy (PNL) with the one-shot dilation technique and had previous open-stone surgery were retrospectively reviewed and evaluated (Group 1). Another 82 patients were selected randomly among patients who had PNL with the one-shot dilation technique but no history of open renal surgery (Group 2). Age, gender, type of kidney stone, duration of surgery, radiation exposure time, and any bleeding requiring perioperative or postoperative transfusion were noted for each patient. The stone-free rates, operation and fluoroscopy times, and perioperative and postoperative complication rates were similar in both groups (p>0.05). Our experience indicated that PNL with the one-shot dilation technique is a reliable method in patients with a history of open-stone surgery.
NASA Astrophysics Data System (ADS)
Liu, Tianyu; Du, Xining; Ji, Wei; Xu, X. George; Brown, Forrest B.
2014-06-01
For nuclear reactor analysis such as neutron eigenvalue calculations, the time-consuming Monte Carlo (MC) simulations can be accelerated by using graphics processing units (GPUs). However, traditional MC methods are often history-based, and their performance on GPUs is affected significantly by the thread divergence problem. In this paper we describe the development of a newly designed event-based vectorized MC algorithm for solving the neutron eigenvalue problem. The code was implemented using NVIDIA's Compute Unified Device Architecture (CUDA) and tested on an NVIDIA Tesla M2090 GPU card. We found that although the vectorized MC algorithm greatly reduces the occurrence of thread divergence, thus enhancing warp execution efficiency, the overall simulation speed is roughly ten times slower than the history-based MC code on GPUs. Profiling results suggest that the slow speed is probably due to memory access latency caused by the large number of global memory transactions. Possible solutions to improve the code efficiency are discussed.
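The contrast between history-based and event-based processing can be sketched in plain Python with a toy absorption model; the collision probabilities are illustrative assumptions, not the paper's CUDA implementation. In the history-based form each particle runs its own loop of unpredictable length (the source of thread divergence on a GPU); in the event-based form the whole particle bank performs the same operation at each step.

```python
import random

def history_based(n, absorb_p=0.3, seed=1):
    """One loop per particle: each history runs until the particle is
    absorbed, so parallel threads would diverge in loop length."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(n):
        while rng.random() > absorb_p:  # particle scatters and survives
            collisions += 1
    return collisions

def event_based(n, absorb_p=0.3, seed=1):
    """Process the bank one event at a time: every particle in the active
    set does the same operation, so GPU warps would stay coherent."""
    rng = random.Random(seed)
    alive = n
    collisions = 0
    while alive:
        survivors = sum(1 for _ in range(alive) if rng.random() > absorb_p)
        collisions += survivors
        alive = survivors
    return collisions
```

Both variants sample the same geometric process (mean collisions per history of (1-p)/p), differing only in how the work is organized.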
Scaled accelerographs for design of structures in Quetta, Baluchistan, Pakistan
NASA Astrophysics Data System (ADS)
Bhatti, Abdul Qadir
2016-12-01
Structural design for seismic excitation is usually based on peak values of forces and deformations over the duration of an earthquake. Determining these peak values requires dynamic analysis, either response history analysis (RHA), also called time history analysis, or response spectrum analysis (RSA), both of which depend on ground motion severity. In the past, PGA was used to describe ground motion severity, because the seismic force on a rigid body is proportional to the ground acceleration. However, it has been pointed out that the single highest peak of an accelerogram is a very unreliable description of the accelerogram as a whole. In this study, we consider the 0.2-s and 1-s spectral accelerations. Seismic loading is defined in terms of a design spectrum and time histories, leading to the two methods of dynamic analysis. A design spectrum for Quetta will be constructed incorporating the parameters of ASCE 7-05/IBC 2006/2009, which are used by modern codes and regulations worldwide such as IBC 2006/2009, ASCE 7-05, ATC-40 and FEMA-356. A suite of time histories representing the design earthquake will also be prepared, providing a helpful tool for carrying out time history dynamic analysis of structures in Quetta.
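A response spectrum ordinate is obtained by integrating a damped single-degree-of-freedom oscillator through the ground-motion record and taking the peak response. The sketch below uses the standard Newmark average-acceleration method with unit mass; the damping ratio and test record are illustrative assumptions, not the Quetta design parameters.

```python
import math

def sdof_peak_disp(ag, dt, period, zeta=0.05):
    """Peak displacement of a damped SDOF oscillator (unit mass) under
    ground acceleration ag, via Newmark average acceleration (beta=1/4)."""
    w = 2.0 * math.pi / period
    c, k = 2.0 * zeta * w, w * w
    beta, gamma = 0.25, 0.5
    u, v = 0.0, 0.0
    a = -ag[0] - c * v - k * u
    keff = k + gamma * c / (beta * dt) + 1.0 / (beta * dt * dt)
    umax = 0.0
    for agi in ag[1:]:
        p = -agi  # effective load from ground acceleration, m = 1
        rhs = (p
               + (u / (beta * dt * dt) + v / (beta * dt) + a * (1 / (2 * beta) - 1))
               + c * (gamma * u / (beta * dt) + v * (gamma / beta - 1)
                      + dt * a * (gamma / (2 * beta) - 1)))
        un = rhs / keff
        an = (un - u) / (beta * dt * dt) - v / (beta * dt) - a * (1 / (2 * beta) - 1)
        vn = v + dt * ((1 - gamma) * a + gamma * an)
        u, v, a = un, vn, an
        umax = max(umax, abs(u))
    return umax

def pseudo_accel_spectrum(ag, dt, periods, zeta=0.05):
    """Pseudo-acceleration spectrum: Sa(T) = omega^2 * peak displacement."""
    return [(2.0 * math.pi / T) ** 2 * sdof_peak_disp(ag, dt, T, zeta)
            for T in periods]
```

Evaluating `pseudo_accel_spectrum` at 0.2 s and 1.0 s gives the two spectral ordinates the study focuses on.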
McCook, Stuart
2013-12-01
The "global turn" in the history of science offers new ways to think about how to do national and regional histories of science, in this case the history of science in Latin America. For example, it questions structuralist and diffusionist models of the spread of science and shows the often active role that people in Latin America (and the rest of the Global South) played in the construction of "universal" scientific knowledge. It suggests that even national or regional histories of science must be situated in a global context; all too often, such histories have treated global processes as a distant backdrop. At the same time, historians need to pay constant attention to the role of power in the construction of scientific knowledge. Finally, this essay highlights a methodological tool for writing globally inflected histories of science: the method of "following".
Methods in Teaching Region and Diversity in U.S. Western Women's History
ERIC Educational Resources Information Center
Jackson-Abernathy, Brenda K.
2013-01-01
History teachers may well feel challenged with the task of bringing women into their American West curriculums due to the great diversity of women in the West during the nineteenth century. At the same time, the past thirty years or so have produced a plethora of monographs, articles, and primary source collections on women in the American West.…
Ait Kaci Azzou, Sadoune; Larribe, Fabrice; Froda, Sorana
2015-01-01
The effective population size over time (demographic history) can be retraced from a sample of contemporary DNA sequences. In this paper, we propose a novel methodology based on importance sampling (IS) for exploring such demographic histories. Our starting point is the generalized skyline plot, the main difference being that our procedure, the skywis plot, uses a large number of genealogies. The information provided by these genealogies is combined according to the IS weights. Thus, we compute a weighted average of the effective population sizes on specific time intervals (epochs), where the genealogies that agree more with the data are given more weight. We illustrate by a simulation study that the skywis plot correctly reconstructs the recent demographic history under the scenarios most commonly considered in the literature. In particular, our method can capture a change point in the effective population size, and its overall performance is comparable with that of the Bayesian skyline plot. We also introduce the case of serially sampled sequences and illustrate that it is possible to improve the performance of the skywis plot in the case of an exponential expansion of the effective population size. PMID:26300910
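The core combination step, a weighted average of per-genealogy effective-size estimates on each epoch, can be sketched as follows; the data layout (one row of epoch estimates per genealogy) and the simple normalization are illustrative assumptions, not the authors' exact implementation.

```python
def skywis_combine(per_genealogy_sizes, is_weights):
    """Combine effective-population-size estimates from sampled genealogies.

    per_genealogy_sizes: list of lists, one row per genealogy, one entry
    per epoch. is_weights: one importance-sampling weight per genealogy
    (genealogies that agree better with the data carry more weight).
    Returns one weighted-average effective size per epoch.
    """
    total = sum(is_weights)
    n_epochs = len(per_genealogy_sizes[0])
    return [
        sum(w * row[e] for row, w in zip(per_genealogy_sizes, is_weights)) / total
        for e in range(n_epochs)
    ]
```

With equal weights this reduces to the plain average used by a generalized skyline plot; unequal IS weights shift the estimate toward the genealogies most consistent with the data.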
Papali, Alfred; Hines, Stella E
2015-03-01
Although the process of taking an occupational and environmental history has remained largely the same, the context in which it is done has changed dramatically over recent years. This review examines the role of the occupational and environmental history in the context of the changing nature of medical practice and discusses methods for evaluating patients with contemporary exposure-related respiratory illnesses. Surveillance for occupational lung disease using mnemonic devices, screening questions and the use of structured questionnaires can significantly increase the likelihood and accuracy of detection. Electronic health records likewise can be adapted to include the most important elements of the occupational and environmental history. The emergence of new technologies and industries will lead to respiratory diseases in novel occupational and environmental contexts. Using the methods described herein can make detecting these diseases easier and less time-consuming.
How History Helped Einstein in Special Relativity
NASA Astrophysics Data System (ADS)
Martinez, Alberto
2013-04-01
I will discuss how the German intellectual movement known as ``critical history'' motivated several physicists in the late 1800s to radically analyze the fundamental principles of mechanics, leading eventually to Einstein's special theory of relativity. Eugen Karl Dühring, Johann Bernhard Stallo, Ludwig Lange, and Ernst Mach wrote critical histories of mechanics, some of which emphasized notions of relativity and observation, in opposition to old metaphysical concepts that seemed to infect the foundations of physics. This strand of critical history included the ``genetic method'' of analyzing how concepts develop over time, in our minds, by way of ordinary experiences, which by 1904 was young Albert Einstein's favorite approach for examining fundamental notions. Thus I will discuss how history contributed to Einstein's path to relativity, as well as comment more generally on Einstein's views on history.
NASA Technical Reports Server (NTRS)
Decker, A. J.
1984-01-01
The holographic recording of the time history of a flow feature in three dimensions is discussed. The use of diffuse illumination holographic interferometry for the three-dimensional visualization of flow features such as shock waves and turbulent eddies is described. The double-exposure and time-average methods are compared using the characteristic function and the results from a flow simulator. A time history requires a large hologram recording rate. Results of holographic cinematography of the shock waves in a flutter cascade are presented as an example. Future directions of this effort, including the availability and development of suitable lasers, are discussed. Previously announced in STAR as N84-21849
A numerical analysis of phase-change problems including natural convection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Y.; Faghri, A.
1990-08-01
Fixed grid solutions for phase-change problems remove the need to satisfy conditions at the phase-change front and can be easily extended to multidimensional problems. The two most important and widely used methods are enthalpy methods and temperature-based equivalent heat capacity methods. Both methods have advantages and disadvantages. Enthalpy methods (Shamsundar and Sparrow, 1975; Voller and Prakash, 1987; Cao et al., 1989) are flexible and can handle phase-change problems occurring both at a single temperature and over a temperature range. The drawback of this method is that although the predicted temperature distributions and melting fronts are reasonable, the predicted time history of the temperature at a typical grid point may have some oscillations. The temperature-based fixed grid methods (Morgan, 1981; Hsiao and Chung, 1984) have no such time history problems and are more convenient for conjugate problems involving an adjacent wall, but have to deal with the severe nonlinearity of the governing equations when the phase-change temperature range is small. In this paper, a new temperature-based fixed-grid formulation is proposed, and the reason that the original equivalent heat capacity model is subject to such restrictions on the time step, mesh size, and the phase-change temperature range will also be discussed.
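A minimal one-dimensional enthalpy-method update illustrates the fixed-grid idea: nodal enthalpy is advanced from conductive fluxes and temperature is then recovered from the enthalpy-temperature relation, so the phase front never has to be tracked explicitly. The explicit scheme and unit material constants are illustrative assumptions, not the paper's formulation.

```python
def enthalpy_step(H, T, dx, dt, k=1.0, rho=1.0, cp=1.0, L=10.0, Tm=0.0):
    """One explicit time step of a 1-D fixed-grid enthalpy method.

    H: nodal enthalpies, T: nodal temperatures (boundary nodes held fixed).
    Returns updated (H, T); nodes with 0 <= H <= L sit at the melt
    temperature Tm while the latent heat L is absorbed or released.
    """
    n = len(H)
    Hn = list(H)
    for i in range(1, n - 1):
        Hn[i] = H[i] + dt * k * (T[i + 1] - 2.0 * T[i] + T[i - 1]) / (rho * dx * dx)
    Tn = []
    for h in Hn:
        if h < 0.0:
            Tn.append(Tm + h / cp)        # solid
        elif h > L:
            Tn.append(Tm + (h - L) / cp)  # liquid
        else:
            Tn.append(Tm)                 # mushy zone: latent heat in play
    return Hn, Tn
```

The temperature at a partially melted node stays pinned at `Tm` until its enthalpy leaves the interval `[0, L]`, which is exactly the mechanism behind the stepwise (and sometimes oscillatory) nodal temperature histories the abstract mentions.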
[Thoughts and methods of study on acupuncture medical history: an example of Mr. MA Ji-Xing].
Yang, Feng; Zhu, Ling
2014-03-01
Mr. MA Ji-xing has devoted himself to the study of acupuncture medical history for more than 70 years, culminating in the great work Zhenjiuxue Tongshi (see text, History of Acupuncture-Moxibustion). The author studied the history of acupuncture medicine extensively across time and space. Based on abundant historical materials, careful textual research and a well-grounded academic viewpoint, the book is considered a masterpiece of acupuncture scholarship in the true sense. It is worth noting that the book gives a systematic and profound account of Bian-stone therapy, unearthed acupuncture literature and relics, the bronze figures and illustrations of acupoints, and special topics in Japanese and Korean acupuncture history. It fills several gaps in the field, opens significant new paths of study, and lays the groundwork for the deeper study and interpretation of traditional acupuncture theory as well as the investigation of its academic history, giving it a profound and lasting influence. The careful sorting and deep examination of Mr. MA Ji-xing's distinctive thoughts and methods in the study of acupuncture medical history offers valuable reference and inspiration for future research on the subject.
A review on the history of tympanoplasty.
Sarkar, Saurav
2013-12-01
The history of myringoplasty and tympanoplasty is one of the most interesting in the history of ear surgery. The aims and ambitions of otologists have evolved with time and experience. The objective of this article is to give an idea of the evolution of tympanoplasty, thus giving inspiration to future surgeons in their quest for a perfect technique that would be as good as a normal ear and its hearing. The history of otology starts with the early Egyptian healers. Hippocrates in his time observed that ear infections may be a cause of death, especially in young children. Early surgeries were performed mainly for drainage, in order to save the life of the child with ear disease. With time and scientific developments, newer methods of treatment started to evolve. The invention of antimicrobials and their usage threw new light on the treatment of otologic infections. Then, with the advent of the microscope and a better understanding of the anatomy and physiology of the ear and its diseases, treatment strategies and surgical planning kept advancing. Surgeons over time have become more interested in restoring hearing along with curing infection of the ear. But the quest is on for the perfect surgical technique that would give the best results with minimal maneuvering. The history of tympanoplasty nearly sums up the history of the evolution of otology as a whole. The quest is still on to devise a way to give maximum postoperative hearing using minimal instrumentation.
Whole lot of parts: stress in extreme environments.
Steel, G Daniel
2005-06-01
Stress has been a central interest for researchers of human behavior in extreme and unusual environments, and also for those responsible for planning and carrying out expeditions involving such environments. This paper compares the actuarial and case study methods for predicting reactions to stress. Actuarial studies are useful, but do not tap enough variables to allow us to predict how a specific individual will cope with the rigors of an individual mission. Case histories provide a wealth of detail, but few investigators understand the challenges of properly applying this method. This study reviews some of the strengths and weaknesses of the actuarial and case history methods, and presents a four-celled taxonomy of stress based on method (actuarial and case history) and effects (distress and eustress). For both research and operational purposes, the person, the setting, and time should not be considered independently; rather, it is an amalgam of these variables that provides the proper basis of analysis.
ERIC Educational Resources Information Center
Gross, Jacob P. K.; Torres, Vasti
2010-01-01
Using a competing risks event history model this study explores the effects of differentiated forms of financial aid on the postsecondary enrollment patterns of Latino college students in Indiana. Much of the prior research on financial aid has employed cross-sectional methods, which assume that the effects of aid do not vary across time. This…
NASA Astrophysics Data System (ADS)
Lee, Chung-Shuo; Chen, Yan-Yu; Yu, Chi-Hua; Hsu, Yu-Chuan; Chen, Chuin-Shan
2017-07-01
We present a semi-analytical solution of a time-history kernel for the generalized absorbing boundary condition in molecular dynamics (MD) simulations. To facilitate the kernel derivation, the concept of virtual atoms in real space that can conform with an arbitrary boundary in an arbitrary lattice is adopted. The generalized Langevin equation is regularized using eigenvalue decomposition and, consequently, an analytical expression of an inverse Laplace transform is obtained. With construction of dynamical matrices in the virtual domain, a semi-analytical form of the time-history kernel functions for an arbitrary boundary in an arbitrary lattice can be found. The time-history kernel functions for different crystal lattices are derived to show the generality of the proposed method. Non-equilibrium MD simulations in a triangular lattice with and without the absorbing boundary condition are conducted to demonstrate the validity of the solution.
Madden, Tessa; Secura, Gina M; Allsworth, Jenifer E; Peipert, Jeffrey F
2011-12-01
Women undergoing induced abortion may be more motivated to choose long-acting reversible contraception (LARC), including the intrauterine device (IUD) and implant, than women without a history of abortion. Our objective was to determine whether the contraceptive method chosen is influenced by a recent history of induced abortion and access to immediate postabortion contraception. This was a subanalysis of the Contraceptive CHOICE Project. We compared contraception chosen by women with a recent history of abortion to women without a recent history. Participants with a recent history of abortion were divided into immediate postabortion contraception and delayed-start contraception groups. Data were available for 5083 women: 3410 women without a recent abortion history, 937 women who received immediate postabortion contraception and 736 women who received delayed-start postabortion contraception. Women offered immediate postabortion contraception were more than three times as likely to choose an IUD [adjusted relative risk (RR(adj)) 3.30, 95% confidence interval (CI) 2.67-4.85] and 50% more likely to choose the implant (RR(adj) 1.51, 95%CI 1.12-2.03) compared to women without a recent abortion. There was no difference in contraceptive method selected among women offered delayed-start postabortion contraception compared to women without a recent abortion. Women offered immediate postabortion contraception are more likely to choose the IUD and implant than women without a recent abortion history. Increasing access to immediate postabortion LARC is essential to preventing repeat unintended pregnancies. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Jiao, Quanjun; Zhang, Xiao; Sun, Qi
2018-03-01
The availability of dense time series of Landsat images provides a great opportunity to reconstruct forest disturbance and change history with high temporal resolution, medium spatial resolution and long coverage. This work aims to apply a forest change detection method in Hainan Jianfengling Forest Park using yearly Landsat time-series images. A simple detection method based on the dense time series of Landsat NDVI images is used to reconstruct forest change history (afforestation and deforestation). The mapping result showed a large decrease in the extent of closed forest from the 1980s to the 1990s. From the beginning of the 21st century, we found an increase in forest area with the implementation of forestry measures such as the prohibition of cutting and the closure of hillsides in our study area. Our findings provide an effective approach for quickly detecting forest changes in tropical original forest, especially afforestation and deforestation, and a comprehensive analysis tool for forest resource protection.
Jacquez, Geoffrey M; Meliker, Jaymie R; Avruskin, Gillian A; Goovaerts, Pierre; Kaufmann, Andy; Wilson, Mark L; Nriagu, Jerome
2006-08-03
Methods for analyzing space-time variation in risk in case-control studies typically ignore residential mobility. We develop an approach for analyzing case-control data for mobile individuals and apply it to study bladder cancer in 11 counties in southeastern Michigan. At this time data collection is incomplete and no inferences should be drawn; we analyze these data to demonstrate the novel methods. Global, local and focused clustering of residential histories for 219 cases and 437 controls is quantified using time-dependent nearest neighbor relationships. Business address histories for 268 industries that release known or suspected bladder cancer carcinogens are analyzed. A logistic model accounting for smoking, gender, age, race and education specifies the probability of being a case, and is incorporated into the cluster randomization procedures. Sensitivity of clustering to the definition of the proximity metric is assessed for k from 1 to 75 nearest neighbors. Global clustering is partly explained by the covariates but remains statistically significant at 12 of the 14 levels of k considered. After accounting for the covariates, 26 local clusters are found in Lapeer, Ingham, Oakland and Jackson counties, with the clusters in Ingham and Oakland counties appearing in 1950 and persisting to the present. Statistically significant focused clusters are found about the business address histories of 22 industries located in Oakland (19 clusters), Ingham (2) and Jackson (1) counties. Clusters in central and southeastern Oakland County appear in the 1930s and persist to the present day. These methods provide a systematic approach for evaluating a series of increasingly realistic alternative hypotheses regarding the sources of excess risk. So long as selection of cases and controls is population-based and not geographically biased, these tools can provide insights into geographic risk factors that were not specifically assessed in the case-control study design.
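The nearest-neighbour cluster statistic at the heart of such analyses can be sketched in a simplified, static form: a Cuzick-Edwards-type count of case-case nearest-neighbour pairs. This is an illustrative reduction; the study itself uses time-dependent nearest-neighbour relationships over residential histories with covariate-adjusted randomization.

```python
import math

def knn_case_count(coords, is_case, k):
    """For each case, count how many of its k nearest neighbours are also
    cases. Totals that are large relative to random permutations of the
    case labels indicate spatial clustering of cases."""
    total = 0
    for i, (xi, yi) in enumerate(coords):
        if not is_case[i]:
            continue
        dists = sorted(
            (math.hypot(xi - xj, yi - yj), j)
            for j, (xj, yj) in enumerate(coords) if j != i
        )
        total += sum(1 for _, j in dists[:k] if is_case[j])
    return total
```

A significance level would then come from recomputing the statistic under many random relabelings of cases and controls, with the relabeling probabilities supplied by the logistic covariate model.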
Plumbing the depths of batholiths
Zen, E.-A.
1989-01-01
Knowledge of the pressure of consolidation of a pluton and the pressure-time history of magmatic evolution should allow better understanding of the tectonic and thermal history of the crust. Available methods to estimate pressures of plutons are mainly for those of consolidation. These are either extrinsic, based on geological context, or intrinsic, based on mineral texture, mineral assemblage, fluid inclusions, mineral inclusions, apparent cooling rates, and mineral chemistry. The methods of lattice-dimension matching of mineral inclusions and of detailed chemistry for zoned minerals could lead to pressure-time reconstructions. Future barometers based on mineral chemistry should use atomic species that have low diffusion coefficients and whose values are not sensitive to computational schemes and cumulative analytical errors. Aluminum and silicon in coexisting hornblende, biotite, pyroxene, plagioclase, or garnet are reasonable candidate phases for barometry. -from Author
Repeat Pregnancy among Urban Adolescents: Sociodemographic, Family, and Health Factors.
ERIC Educational Resources Information Center
Coard, Stephanie Irby; Nitz, Katherine; Felice, Marianne E.
2000-01-01
Examines sociodemographic, family, and health factors associated with repeat pregnancy in a clinical sample of urban, first-time mothers. Results indicate that postpartum contraceptive method was associated with repeat pregnancy at year one; contraceptive use, maternal age, history of miscarriages, and postpartum contraceptive method were…
Exposure to environmental contaminants is well documented to adversely impact the development of the nervous system. However, the time, animal and resource intensive EPA and OECD testing guideline methods for developmental neurotoxicity (DNT) are not a viable solution to characte...
McDonald, Craig M.; Henricson, Erik K.; Abresch, R. Ted; Han, Jay J.; Escolar, Diana M.; Florence, Julaine M.; Duong, Tina; Arrieta, Adrienne; Clemens, Paula R.; Hoffman, Eric P.; Cnaan, Avital
2014-01-01
Contemporary natural history data in Duchenne muscular dystrophy (DMD) are needed to assess care recommendations and aid in planning future trials. Methods The Cooperative International Neuromuscular Research Group (CINRG) DMD Natural History Study (DMD-NHS) enrolled 340 individuals, aged 2–28 years, with DMD in a longitudinal, observational study at 20 centers. Assessments obtained every 3 months for 1 year, at 18 months, and annually thereafter included: clinical history; anthropometrics; goniometry; manual muscle testing; quantitative muscle strength; timed function tests; pulmonary function; and patient-reported outcome/health-related quality-of-life instruments. Results At baseline, 62% of participants were current glucocorticoid (GC) users, 14% past users, and 24% GC-naive. In those ≥6 years of age, 16% lost ambulation over the first 12 months (mean age 10.8 years). Conclusions Detailed information on the study methodology of the CINRG DMD-NHS lays the groundwork for future analyses of prospective longitudinal natural history data. These data will assist investigators in designing clinical trials of novel therapeutics. PMID:23677550
Shi, Kuan; Wu, Wenzhong; Liu, Lanying; Wang, Hesheng; Chen, Dong; Liu, Chengyong; Zhang, Cong
2017-06-12
To study the primary and secondary factors of allergic history, the frequency of acupoint application and the duration of acupoint application in the treatment of bronchial asthma, and to optimize the treatment scheme. Eighty patients with bronchial asthma were selected as the subjects of an orthogonal trial. The herbal medicines were the empirical formula for acupoint application (semen brassicae, rhizoma corydalis, unprocessed radix kansui, asarum sieboldii, ephedra, semen lepidii, syzygium aromaticum, cortex cinnamomi and fructus gleditsiae, prepared at a ratio of 2:2:1:1:1:1:1:1:1) and applied to bilateral Feishu (BL 13), Xinshu (BL 15), Geshu (BL 17) and Shenshu (BL 23). Firstly, two groups were formed according to allergic history (40 cases with allergic history and 40 cases without), and then four subgroups of 10 cases each were divided from the two main groups. By studying three factors at two levels each, i.e. allergic history (Factor A: AⅠ, with allergic history; AⅡ, without allergic history), the frequency of acupoint application (Factor B: BⅠ, 4 times; BⅡ, 10 times; in the 4-time group the application was given once every 10 days, and in the 10-time group once every 4 days) and the duration of application (Factor C: CⅠ, 4 h; CⅡ, 8 h), the optimal scheme was screened on the basis of the attack frequency before and after treatment and the score of the asthma quality of life questionnaire (AQLQ) before treatment and 6 months after treatment in each group. ① The orthogonal trial indicated that the optimal scheme was AⅠBⅡCⅠ, meaning that patients with allergic history were treated with acupoint application 10 times, retained for 4 h each. ② Factor B (frequency of acupoint application) and Factor C (duration of acupoint application) were significant influencing factors for AQLQ scores (both P<0.05). 
③ The comparison of attack frequency and AQLQ score before and after treatment in all patients showed that different combinations of factor levels had different impacts on asthma attack frequency and AQLQ scores. Except in groups No. 1 and No. 5, the improvements in the remaining groups were all significant (P<0.05, P<0.01). Acupoint application markedly reduces the attack frequency of asthma and improves quality of life. The order of importance of allergic history, frequency of acupoint application and duration of acupoint application for the therapeutic effect is: frequency of acupoint application > duration of acupoint application > allergic history. The optimal scheme is AⅠBⅡCⅠ, meaning that patients with allergic history are treated with acupoint application 10 times, retained for 4 h each.
Elastic and viscoelastic calculations of stresses in sedimentary basins
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
This study presents a method for estimating the stress state within reservoirs at depth using a time-history approach for both elastic and viscoelastic rock behavior. Two features of this model are particularly significant for stress calculations. The first is the time-history approach, where we assume that the present in situ stress is a result of the entire history of the rock mass, rather than due only to the present conditions. The model can incorporate: (1) changes in pore pressure due to gas generation; (2) temperature gradients and local thermal episodes; (3) consolidation and diagenesis through time-varying material properties; and (4) varying tectonic episodes. The second feature is the use of a new viscoelastic model. Rather than assume a form of the relaxation function, a complete viscoelastic solution is obtained from the elastic solution through the viscoelastic correspondence principle. Simple rate models are then applied to obtain the final rock behavior. Example calculations for some simple cases are presented that show the contribution of individual stress or strain components. Finally, a complete example of the stress history of rocks in the Piceance basin is attempted. This calculation compares favorably with present-day stress data in this location. The model serves as a predictor for natural fracture genesis, and expected rock fracturing from the model is compared with actual fractures observed in this region. These results show that most current estimates of in situ stress at depth do not incorporate all of the important mechanisms, and a more complete formulation, such as this study, is required for acceptable stress calculations. The method presented here is general and is applicable to any basin having a relatively simple geologic history. 25 refs., 18 figs.
Locally adaptive methods for KDE-based random walk models of reactive transport in porous media
NASA Astrophysics Data System (ADS)
Sole-Mari, G.; Fernandez-Garcia, D.
2017-12-01
Random Walk Particle Tracking (RWPT) coupled with Kernel Density Estimation (KDE) has recently been proposed to simulate reactive transport in porous media. KDE provides an optimal estimation of the area of influence of particles, which is a key element in simulating nonlinear chemical reactions. However, several important drawbacks can be identified: (1) the optimal KDE method is computationally intensive and therefore cannot be used at each time step of the simulation; (2) it does not take advantage of prior information about the physical system and the previous history of the solute plume; (3) even if the kernel is optimal, the relative error in RWPT simulations typically increases over time as the particle density diminishes by dilution. To overcome these problems, we propose an adaptive branching random walk methodology that incorporates the physics and the particle history and maintains accuracy over time. The method allows particles to efficiently split and merge when necessary, as well as to optimally adapt their local kernel shape without having to recalculate the kernel size. We illustrate the advantage of the method by simulating complex reactive transport problems in randomly heterogeneous porous media.
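The branching step can be sketched in one dimension: particles carrying too much solute mass split into half-mass copies, and nearby light particles merge into a mass-weighted one, so the particle density tracks the plume while total mass is conserved. The thresholds and the 1-D merging rule are illustrative assumptions, not the authors' scheme.

```python
def adapt_particles(particles, w_max, w_min, merge_dist):
    """particles: list of (position, mass) tuples, any order.

    Split particles heavier than w_max into two half-mass copies; merge a
    light particle into its light left neighbour when they sit within
    merge_dist of each other. Total mass is conserved by construction."""
    out = []
    for x, w in sorted(particles):
        if w > w_max:                           # split a heavy particle
            out.extend([(x, w / 2.0), (x, w / 2.0)])
        elif (out and w < w_min and out[-1][1] < w_min
              and abs(x - out[-1][0]) < merge_dist):
            x0, w0 = out.pop()                  # merge two light particles
            out.append(((x0 * w0 + x * w) / (w0 + w), w0 + w))
        else:
            out.append((x, w))
    return out
```

Calling this periodically keeps the particle count (and hence the KDE sample size) roughly proportional to the local solute mass, countering the dilution-driven loss of accuracy described above.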
Numerical considerations in the development and implementation of constitutive models
NASA Technical Reports Server (NTRS)
Haisler, W. E.; Imbrie, P. K.
1985-01-01
Several unified constitutive models were tested in uniaxial form by specifying input strain histories and comparing output stress histories. The purpose of the tests was to evaluate several time integration methods with regard to accuracy, stability, and computational economy. The sensitivity of the models to slight changes in input constants was also investigated. Results are presented for IN100 at 1350 F and Hastelloy-X at 1800 F.
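The kind of test described, feeding a prescribed strain history through a time integrator and comparing the resulting stress history, can be sketched with a one-dimensional Maxwell model as a stand-in for the unified constitutive models; the model and its constants are illustrative assumptions, not the IN100/Hastelloy-X models evaluated. The implicit (backward Euler) update is unconditionally stable, while the explicit (forward Euler) one is only conditionally stable.

```python
def maxwell_stress_history(strain, dt, E=1.0, eta=1.0, implicit=True):
    """Integrate the 1-D Maxwell model sigma' = E*(eps' - sigma/eta)
    for a prescribed strain history sampled at uniform time step dt."""
    sig = [0.0]
    for i in range(1, len(strain)):
        deps = strain[i] - strain[i - 1]
        if implicit:   # backward Euler: stable for any dt
            sig.append((sig[-1] + E * deps) / (1.0 + E * dt / eta))
        else:          # forward Euler: stable only for dt < 2*eta/E
            sig.append(sig[-1] + E * deps - E * dt * sig[-1] / eta)
    return sig
```

Running both variants over the same strain history and comparing the stress outputs against a refined-step reference is the essence of the accuracy/stability comparison the abstract describes.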
Bateson, Thomas F; Kopylev, Leonid
2015-01-01
Recent meta-analyses of occupational epidemiology studies identified two important exposure data quality factors in predicting summary effect measures for asbestos-associated lung cancer mortality risk: sufficiency of job history data and percent coverage of work history by measured exposures. The objective was to evaluate different exposure parameterizations suggested in the asbestos literature using the Libby, MT asbestos worker cohort and to evaluate influences of exposure measurement error caused by historically estimated exposure data on lung cancer risks. Focusing on workers hired after 1959, when job histories were well-known and occupational exposures were predominantly based on measured exposures (85% coverage), we found that cumulative exposure alone, and with allowance of exponential decay, fit lung cancer mortality data similarly. Residence-time-weighted metrics did not fit well. Compared with previous analyses based on the whole cohort of Libby workers hired after 1935, when job histories were less well-known and exposures less frequently measured (47% coverage), our analyses based on higher quality exposure data yielded an effect size as much as 3.6 times higher. Future occupational cohort studies should continue to refine retrospective exposure assessment methods, consider multiple exposure metrics, and explore new methods of maintaining statistical power while minimizing exposure measurement error.
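The exposure parameterizations compared above, plain cumulative exposure and cumulative exposure with exponential decay, can be sketched from a job history; the record layout (start, end, concentration) and the half-life parameter are illustrative assumptions.

```python
import math

def cumulative_exposure(job_history, t_eval, half_life=None):
    """Cumulative exposure at time t_eval from (start, end, concentration)
    job records; if half_life is given, earlier exposure decays
    exponentially toward the evaluation time."""
    lam = math.log(2.0) / half_life if half_life else 0.0
    total = 0.0
    for start, end, conc in job_history:
        end = min(end, t_eval)
        if end <= start:
            continue
        if lam == 0.0:
            total += conc * (end - start)
        else:
            # integral of conc * exp(-lam * (t_eval - t)) dt over [start, end]
            total += (conc / lam) * (math.exp(-lam * (t_eval - end))
                                     - math.exp(-lam * (t_eval - start)))
    return total
```

As the half-life grows the decayed metric converges to plain cumulative exposure, which is why the two fit similarly when the decay rate estimated from the data is small.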
Exploring Time-Lapse Photography as a Means for Qualitative Data Collection
ERIC Educational Resources Information Center
Persohn, Lindsay
2015-01-01
Collecting information via time-lapse photography is nothing new. Scientists and artists have been using this kind of data since the late 1800s. However, my research and experiments with time-lapse have shown that great potential may lie in its application to educational and social scientific research methods. This article is part history, part…
2009-01-01
Background In recent times, medical schools have committed to developing good communication and history-taking skills in students. However, there remains an unresolved question as to which educational method is best. Our study aims to investigate whether the use of videotape recording is superior to verbal feedback alone in the teaching of clinical skills, and the role of student self-assessment in history taking and communication skills. Methods A randomized controlled trial was designed. The study was conducted with 52 second-year students of the Dokuz Eylul University Faculty of Medicine. All students' performances of communication and history-taking skills were assessed twice. Between these assessments, the study group received both verbal and visual feedback by watching their video-recorded patient interviews; the control group received only verbal feedback from the teacher. Results Although the students' self-assessments did not change significantly, assessors' ratings for the videotaped group increased significantly at the second assessment. Conclusions Feedback based on videotaped interviews is superior to feedback based solely on the observations of assessors. PMID:20021688
Adapting to the Weather: Lessons from U.S. History.
Bleakley, Hoyt; Hong, Sok Chul
2017-09-01
An important unknown in understanding the impact of climate change is the scope of adaptation, which requires observations on historical time scales. We consider how weather across U.S. history (1860-2000) has affected various measures of productivity. Using cross-sectional and panel methods, we document significant responses of agricultural and individual productivity to weather. We find strong effects of hotter and wetter weather early in U.S. history, but these effects have been attenuated in recent decades. The results suggest that estimates from a given period may be of limited use in forecasting the longer-term impacts of climate change.
Predicting thermal history a-priori for magnetic nanoparticle hyperthermia of internal carcinoma
NASA Astrophysics Data System (ADS)
Dhar, Purbarun; Sirisha Maganti, Lakshmi
2017-08-01
This article proposes a simple yet realistic method by which a direct analytical expression can be derived for the temperature field within a tumour during magnetic nanoparticle hyperthermia. The approximate analytical expression for the thermal history within the tumour is derived from the lumped-capacitance approach and accounts for all therapy protocols and parameters. The method provides an easy framework for promptly estimating hyperthermia protocol parameters. The model has been validated against several experimental reports on animal models such as mice/rabbit/hamster and on human clinical trials, and it has been observed that the model accurately estimates the thermal history within the carcinoma during hyperthermia therapy. The present approach may find implications in a priori estimation of the thermal history in internal tumours for optimizing magnetic hyperthermia treatment protocols with respect to the ablation time, tumour size, magnetic drug concentration, field strength, field frequency, nanoparticle material and size, tumour location, and so on.
A hierarchical Bayesian method for vibration-based time domain force reconstruction problems
NASA Astrophysics Data System (ADS)
Li, Qiaofeng; Lu, Qiuhai
2018-05-01
Traditional force reconstruction techniques require prior knowledge of the nature of the force to determine the regularization term. When such information is unavailable, an inappropriate term is easily chosen and the reconstruction result becomes unsatisfactory. In this paper, we propose a novel method to automatically determine the appropriate q in ℓq regularization and reconstruct the force history. The method incorporates all to-be-determined variables, such as the force history, the precision parameters, and q, into a hierarchical Bayesian formulation. The posterior distributions of the variables are evaluated by a Metropolis-within-Gibbs sampler, giving point estimates of the variables together with their uncertainties. Simulations of a cantilever beam and a space truss under various loading conditions validate the proposed method, which provides adaptive determination of q and better reconstruction performance than existing Bayesian methods.
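The ill-posedness that makes the choice of regularization matter can be seen in a small deterministic sketch. The fragment below is illustrative only: it uses an assumed first-order-lag impulse response and a fixed ℓ2 (Tikhonov) penalty, i.e. the classical q = 2 special case that the paper's hierarchical sampler generalizes, to recover a force pulse from a noisy convolved response.

```python
import numpy as np

# Toy force reconstruction: response y = H f + noise, with H the Toeplitz
# convolution matrix of an assumed first-order-lag impulse response.
# A fixed l2 (Tikhonov) penalty stands in for the paper's inferred lq term.
rng = np.random.default_rng(0)
n, dt, tau = 200, 0.01, 0.2
t = np.arange(n) * dt
h = dt * np.exp(-t / tau)                        # illustrative impulse response

H = np.zeros((n, n))
for i in range(n):
    H[i, :i + 1] = h[i::-1]                      # lower-triangular convolution

f_true = np.exp(-0.5 * ((t - 0.8) / 0.1) ** 2)   # unknown force pulse
y = H @ f_true + 1e-4 * rng.standard_normal(n)   # measured noisy response

lam = 1e-6                                       # hand-picked here; the paper infers this
f_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
print(np.corrcoef(f_hat, f_true)[0, 1])          # close to 1 for a good reconstruction
```

In this well-behaved toy case a hand-picked λ works; the paper's point is that when the force's nature (impulsive, harmonic, random) is unknown, the exponent q and the precision parameters should themselves be inferred rather than fixed in advance.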
Ait Kaci Azzou, S; Larribe, F; Froda, S
2016-10-01
In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of nonhomogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history. Copyright © 2016. Published by Elsevier Inc.
Hempstead, Katherine; Nguyen, Tuan; Barber, Catherine; Rosenberg-Wohl, Sarah; Azrael, Deborah
2013-01-01
Objectives. We examined time-varying and time-invariant characteristics of nonfatal intentional self-harm episodes in relation to subsequent episodes of self-harm and suicide. Methods. We conducted a follow-up cohort study through 2007 of 3600 patients discharged from hospitals in New Jersey with a primary diagnosis of intentional self-harm in 2003. We determined repetition of self-harm from hospital records and suicide from state registers. Results. Use of methods other than drug overdose and cutting in self-harm events, greater medical severity of nonfatal episodes, and a history of multiple self-harm episodes increased the risk of suicide. However, most suicides occurred without these risk factors. Most suicides took place without intervening episodes of self-harm, and most persons used a low-lethality method (drug overdose or cutting) in their index episode, but switched to a more lethal method in their fatal episode. Conclusions. Our findings suggest that preventing suicide among persons with a history of self-harm must account for the possibility that they will adopt methods with higher case-fatality ratios than they previously tried. PMID:23597351
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, P E
Tips and case histories on computer use for idea and outline processing: Productivity software to solve problems of idea hierarchy, transitions, and developments is matched to solutions for communicators. One case is text that ranges from methods and procedures to histories and legal definitions of classification for the US Department of Energy. Applications of value to writers, editors, and managers are for research; calendars; creativity; prioritization; idea discovery and manipulation; file and time management; and contents, indexes, and glossaries. 6 refs., 7 figs.
Panic Attack History and Smoking Topography
Farris, Samantha G.; Brown, Lily A.; Goodwin, Renee D.; Zvolensky, Michael J.
2016-01-01
Background Little is known about panic attacks and puffing topography, a behavioral index of the value of smoking reinforcement. This study examined smoking style during the course of smoking of a single cigarette among adult daily smokers with and without a history of panic attacks. Method Participants (n = 124, mean age = 43.9 years, SD = 9.7; 44.4% female) were non-treatment-seeking daily smokers. Lifetime panic attack history was assessed via diagnostic assessment; 28.2% (n = 35) of the sample had a panic attack history. Participants smoked one cigarette during an ad libitum smoking trial. Puff volume, duration, and inter-puff interval were measured using the Clinical Research Support System (CReSS) pocket device. Results Regression analyses revealed that panic attack status was not associated with significant differences in average puff volume, duration, or inter-puff interval. Multi-level modeling was used to examine puffing trajectories. Puff-level data revealed a significant quadratic time × panic effect for puff volume and duration. Those with a panic attack history demonstrated relatively sustained levels of both puff volume and duration over time, whereas those without a history of panic attacks demonstrated an increase followed by a decrease in volume and duration over time. These effects were not accounted for by the presence of general psychopathology. Discussion Smokers with a panic attack history demonstrate more persistent efforts to self-regulate the delivery of nicotine, and thus may be at risk for continued smoking and dependence. Tailored treatment may be needed to address unique vulnerabilities among this group. PMID:28033542
Orsini, Luisa; Schwenk, Klaus; De Meester, Luc; Colbourne, John K.; Pfrender, Michael E.; Weider, Lawrence J.
2013-01-01
Evolutionary changes are determined by a complex assortment of ecological, demographic and adaptive histories. Predicting how evolution will shape the genetic structures of populations coping with current (and future) environmental challenges has principally relied on investigations through space, in lieu of time, because long-term phenotypic and molecular data are scarce. Yet, dormant propagules in sediments, soils and permafrost are convenient natural archives of population-histories from which to trace adaptive trajectories along extended time periods. DNA sequence data obtained from these natural archives, combined with pioneering methods for analyzing both ecological and population genomic time-series data, are likely to provide predictive models to forecast evolutionary responses of natural populations to environmental changes resulting from natural and anthropogenic stressors, including climate change. PMID:23395434
NASA Technical Reports Server (NTRS)
Cisowski, S. M.; Fuller, M.
1986-01-01
A method for determining a planetary body's magnetic field environment over time is proposed. This relative paleointensity method is based on the normalization of natural remanence to saturation remanence magnetization as measured after each sample is exposed to a strong magnetic field. It is shown that this method is well suited to delineating order-of-magnitude changes in magnetizing fields.
A Comparison of PSD Enveloping Methods for Nonstationary Vibration
NASA Technical Reports Server (NTRS)
Irvine, Tom
2015-01-01
There is a need to derive a power spectral density (PSD) envelope for nonstationary acceleration time histories, including launch vehicle data, so that components can be designed and tested accordingly. This paper presents results from three enveloping methods applied to an actual flight accelerometer record. Guidelines are given for the application of each method to nonstationary data. The methods can be extended to other scenarios, including transportation vibration.
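One common enveloping strategy (an assumption for illustration, not necessarily one of the paper's three methods) is to segment the nonstationary record, estimate a PSD per segment, and keep the per-frequency maximum, so that the envelope captures the most severe interval of the flight. A minimal sketch:

```python
import numpy as np

def psd(x, fs):
    """One-sided Hann-windowed periodogram, units^2/Hz."""
    n = len(x)
    w = np.hanning(n)
    X = np.fft.rfft(x * w)
    scale = 2.0 / (fs * np.sum(w ** 2))
    return scale * np.abs(X) ** 2

def psd_envelope(x, fs, nseg=8):
    """Max-hold PSD over equal-length segments of a nonstationary record."""
    segs = np.array_split(x, nseg)
    m = min(len(s) for s in segs)
    psds = np.array([psd(s[:m], fs) for s in segs])
    return np.fft.rfftfreq(m, 1.0 / fs), psds.max(axis=0)

# Synthetic launch-like record: broadband noise, then an intense 100 Hz burst.
fs = 1024.0
t = np.arange(0, 8, 1 / fs)
x = 0.1 * np.random.default_rng(1).standard_normal(t.size)
burst = (t > 3) & (t < 4)
x[burst] += np.sin(2 * np.pi * 100 * t[burst])

f, env = psd_envelope(x, fs)
print(f[np.argmax(env)])   # envelope peaks at the 100 Hz burst frequency
```

A max-hold envelope is deliberately conservative; the point of comparing methods, as the abstract notes, is to balance that conservatism against realistic test levels.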
Performance of cancer cluster Q-statistics for case-control residential histories
Sloan, Chantel D.; Jacquez, Geoffrey M.; Gallagher, Carolyn M.; Ward, Mary H.; Raaschou-Nielsen, Ole; Nordsborg, Rikke Baastrup; Meliker, Jaymie R.
2012-01-01
Few investigations of health event clustering have evaluated residential mobility, though causative exposures for chronic diseases such as cancer often occur long before diagnosis. Recently developed Q-statistics incorporate human mobility into disease cluster investigations by quantifying space- and time-dependent nearest neighbor relationships. Using residential histories from two cancer case-control studies, we created simulated clusters to examine Q-statistic performance. Results suggest the intersection of cases with significant clustering over their life course, Qi, with cases who are constituents of significant local clusters at given times, Qit, yielded the best performance, which improved with increasing cluster size. Upon comparison, a larger proportion of true positives were detected with Kulldorff’s spatial scan method if the time of clustering was provided. We recommend using Q-statistics to identify when and where clustering may have occurred, followed by the scan method to localize the candidate clusters. Future work should investigate the generalizability of these findings. PMID:23149326
Do work and family care histories predict health in older women?
Benson, Rebecca; Glaser, Karen; Corna, Laurie M.; Platts, Loretta G.; Di Gessa, Giorgio; Worts, Diana; Price, Debora; McDonough, Peggy; Sacker, Amanda
2017-01-01
Background Social and policy changes in the last several decades have increased women’s options for combining paid work with family care. We explored whether specific combinations of work and family care over the lifecourse are associated with variations in women’s later life health. Methods We used sequence analysis to group women in the English Longitudinal Study of Ageing according to their work histories and fertility. Using logistic regression, we tested for group differences in later life disability, depressive symptomology and mortality, while controlling for childhood health and socioeconomic position and a range of adult socio-economic circumstances and health behaviours. Results Women who transitioned from family care to either part-time work after a short break from the labour force, or to full-time work, reported lower odds of having a disability compared with the reference group of women with children who were mostly employed full-time throughout. Women who shifted from family care to part-time work after a long career break had lower odds of mortality than the reference group. Depressive symptoms were not associated with women’s work and family care histories. Conclusion Women’s work histories are predictive of their later life disability and mortality. This relationship may be useful in targeting interventions aimed at improving later life health. Further research is necessary to explore the mechanisms linking certain work histories to poorer later life health and to design interventions for those affected. PMID:29036311
Optimal trajectory generation for mechanical arms. M.S. Thesis
NASA Technical Reports Server (NTRS)
Iemenschot, J. A.
1972-01-01
A general method of generating optimal trajectories between an initial and a final position of an n degree of freedom manipulator arm with nonlinear equations of motion is proposed. The method is based on the assumption that the time history of each of the coordinates can be expanded in a series of simple time functions. By searching over the coefficients of the terms in the expansion, trajectories which minimize the value of a given cost function can be obtained. The method has been applied to a planar three degree of freedom arm.
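The expansion-and-search idea can be sketched on a single link (a hypothetical 1-DOF stand-in for the paper's three-degree-of-freedom planar arm; constants and basis functions are illustrative). The joint angle is written as a boundary-condition-satisfying base motion plus a series of simple time functions with free coefficients, and the coefficients are searched to minimize an effort-like cost computed through the nonlinear inverse dynamics.

```python
import numpy as np

# 1-DOF pendulum link as a stand-in manipulator (illustrative constants).
I, m, g, l, T = 1.0, 1.0, 9.81, 1.0, 2.0
th0, thf = 0.0, np.pi / 2
t = np.linspace(0, T, 201)
s = t / T

def theta(c):
    # rest-to-rest quintic base motion + free terms whose value and
    # derivative vanish at both endpoints
    base = th0 + (thf - th0) * (10 * s**3 - 15 * s**4 + 6 * s**5)
    return base + sum(ci * (s * (1 - s)) ** (k + 2) for k, ci in enumerate(c))

def cost(c):
    th = theta(c)
    thdd = np.gradient(np.gradient(th, t), t)
    torque = I * thdd + m * g * l * np.sin(th)   # nonlinear inverse dynamics
    return np.sum(torque ** 2) * (t[1] - t[0])   # integral of squared torque

# Crude random search over two expansion coefficients (accept only improvements).
rng = np.random.default_rng(0)
best = np.zeros(2)
for _ in range(500):
    trial = best + 0.5 * rng.standard_normal(2)
    if cost(trial) < cost(best):
        best = trial
print(cost(np.zeros(2)), cost(best))   # the search never does worse than the base motion
```

The paper's method is the same in outline: a truncated expansion turns an infinite-dimensional trajectory optimization into a finite search over coefficients.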
Generation of Stationary Non-Gaussian Time Histories with a Specified Cross-spectral Density
Smallwood, David O.
1997-01-01
The paper reviews several methods for the generation of stationary realizations of sampled time histories with non-Gaussian distributions and introduces a new method which can be used to control the cross-spectral density matrix and the probability density functions (pdfs) of the multiple input problem. Discussed first are two methods for the specialized case of matching the auto (power) spectrum, the skewness, and kurtosis using generalized shot noise and using polynomial functions. It is then shown that the skewness and kurtosis can also be controlled by the phase of a complex frequency domain description of the random process. The general case of matching a target probability density function using a zero memory nonlinear (ZMNL) function is then covered. Next, methods for generating vectors of random variables with a specified covariance matrix for a class of spherically invariant random vectors (SIRV) are discussed. Finally, the general case of matching the cross-spectral density matrix of a vector of inputs with non-Gaussian marginal distributions is presented.
Blomquist, Kerstin K.; Grilo, Carlos M.
2015-01-01
Objective A preliminary examination of the significance of family histories of anxiety in the expression of binge eating disorder (BED) and associated functioning. Methods Participants were 166 overweight patients with BED assessed using diagnostic interviews. Participants were administered a structured psychiatric history interview about their first-degree relatives (parents, siblings, children) (N=897) to determine lifetime diagnoses of DSM-IV anxiety disorders and completed a battery of questionnaires assessing current and historical eating and weight variables and associated psychological functioning (depression). Results BED patients with a family history of anxiety disorder were significantly more likely than BED patients without a family history of anxiety disorder to have lifetime diagnoses of anxiety disorders and mood disorders but not substance use disorders. A family history of anxiety was not significantly associated with timing or sequencing of age at onset of anxiety disorder, binge eating, dieting, or obesity, or with variability in current levels of binge eating, eating disorder psychopathology, or psychological functioning. Conclusions Although replication with a direct-interview method is needed, our preliminary findings suggest that a family history of anxiety confers greater risk for comorbid anxiety and mood disorders but is largely unrelated to the development of binge eating, dieting, or obesity and unrelated to variability in eating disorder psychopathology or psychological functioning in overweight patients with BED. PMID:26343481
Time's American Adventures: American Historians and their Writing since 1776
ERIC Educational Resources Information Center
Goetzmann, William
1976-01-01
Surveys the activities and methods of American historians from 1776-1976 and investigates ways in which contemporary historians are reacting to the lack of interest in history as a major discipline. (Author/DB)
Diffusion of Impaired Driving Laws Among US States
Silver, Diana
2015-01-01
Objectives. We examined internal and external determinants of states’ adoption of impaired driving laws. Methods. Data included 7 state-level, evidence-based public health laws collected from 1980 to 2010. We used event history analyses to identify predictors of first-time law adoption and subsequent adoption between state pairs. The independent variables were internal state factors, including the political environment, legislative professionalism, government capacity, state resources, legislative history, and policy-specific risk factors. The external factors were neighboring states’ history of law adoption and changes in federal law. Results. We found a strong secular trend toward an increased number of laws over time. The proportion of younger drivers and the presence of a neighboring state with similar laws were the strongest predictors of first-time law adoption. The predictors of subsequent law adoption included neighbor state adoption and previous legislative action. Alcohol laws were negatively associated with first-time adoption of impaired driving laws, suggesting substitution effects among policy choices. Conclusions. Organizations seeking to stimulate state policy changes may need to craft strategies that engage external actors, such as neighboring states, in addition to mobilizing within-state constituencies. PMID:26180969
NASA Astrophysics Data System (ADS)
Hiramatsu, K.; Matsui, T.; Ito, A.; Miyakita, T.; Osada, Y.; Yamamoto, T.
2004-10-01
Aircraft noise measurements were recorded in residential areas in the vicinity of Kadena Air Base, Okinawa, in 1968 and 1972, at the time of the Vietnam war. The estimated equivalent continuous A-weighted sound pressure level LAeq for 24 h was 85 dB. The time history of sound level during 24 h was estimated from the measurement conducted in 1968, and the sound level was converted into the spectrum level at the centre frequency of the critical band of temporary threshold shift (TTS) using the results of spectrum analysis of aircraft noise operated at the airfield. With the information of spectrum level and its time history, TTS was calculated as a function of time and level change. The permanent threshold shift was also calculated by means of Robinson's method and ISO's method. The results indicate that the noise exposure around Kadena Air Base was hazardous to hearing and is likely to have caused hearing loss in people living in its vicinity.
ANNUAL PATIENT TIME COSTS ASSOCIATED WITH MEDICAL CARE AMONG CANCER SURVIVORS IN THE UNITED STATES
Yabroff, K. Robin; Guy, Gery P.; Ekwueme, Donatus U.; McNeel, Timothy; Rozjabek, Heather M.; Dowling, Emily; Li, Chunyu; Virgo, Katherine S.
2014-01-01
Background Although patient time costs are recommended for inclusion in cost-effectiveness analyses, these data are not routinely collected. We used nationally representative data and a medical service-based approach to estimate annual patient time costs among cancer survivors. Methods We identified 6,699 cancer survivors and 86,412 individuals without a cancer history, all aged ≥ 18 years, from the 2008–2011 Medical Expenditure Panel Survey (MEPS). Service use was categorized as hospitalizations, emergency room (ER) use, provider visits, ambulatory surgery, chemotherapy, and radiation therapy. Service time estimates were applied to frequencies for each service category and the U.S. median wage rate in 2011 was used to value time. We evaluated the association between cancer survivorship and service use frequencies and patient time costs with multivariable regression models, stratified by age group (18–64 and 65+ years). Sensitivity analyses evaluated different approaches for valuing time. Results Cancer survivors were more likely to have hospitalizations, ER visits, ambulatory surgeries, and provider visits in the past year than individuals without a cancer history in adjusted analyses (p<0.05). Annual patient time was higher for cancer survivors than individuals without a cancer history among those ages 18–64 (30.2 vs. 13.6 hours; p<0.001) and ages 65+ (55.1 vs. 36.6 hours; p<0.001), as were annual patient time costs (18–64 years: $500 vs. $226; p<0.001 and 65+ years: $913 vs. $607; p<0.001). Conclusions Cancer survivors had greater annual medical service use and patient time costs than individuals without a cancer history. This medical service-based approach for estimating annual time costs can also be applied to other conditions. PMID:24926706
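The service-based costing described in the abstract is, at its core, annual service counts multiplied by per-service time estimates and a wage rate. A minimal sketch with hypothetical hour and wage figures (not the paper's MEPS-derived values):

```python
# Medical service-based patient time cost, with assumed (illustrative)
# hours per encounter including travel and waiting, and an illustrative
# hourly wage valuation -- not the paper's actual estimates.
SERVICE_HOURS = {
    "hospitalization": 24.0, "er_visit": 4.0, "provider_visit": 1.5,
    "ambulatory_surgery": 6.0, "chemotherapy": 4.0, "radiation": 1.0,
}
WAGE = 16.57  # assumed $/hour (median-wage-style valuation)

def annual_time_cost(use_counts):
    """use_counts: {service_category: annual number of encounters}."""
    hours = sum(SERVICE_HOURS[s] * n for s, n in use_counts.items())
    return hours, hours * WAGE

hours, cost = annual_time_cost({"provider_visit": 6, "er_visit": 1})
print(hours, round(cost, 2))   # 13.0 hours -> $215.41
```

Sensitivity analyses of the kind the paper describes amount to re-running this computation with alternative wage rates or per-service time assumptions.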
Determination of near and far field acoustics for advanced propeller configurations
NASA Technical Reports Server (NTRS)
Korkan, K. D.; Jaeger, S. M.; Kim, J. H.
1989-01-01
A method has been studied for predicting the acoustic field of the SR-3 transonic propfan using flow data generated by two versions of the NASPROP-E computer code. Since the flow fields calculated by the solvers include the shock-wave system of the propeller, the nonlinear quadrupole noise source term is included along with the monopole and dipole noise sources in the calculation of the acoustic near field. Acoustic time histories in the near field are determined by transforming the azimuthal coordinate in the rotating, blade-fixed coordinate system to the time coordinate in a nonrotating coordinate system. Fourier analysis of the pressure time histories is used to obtain the frequency spectra of the near-field noise.
MacNeil, Adam; Lee, Chung-Won; Dietz, Vance
2014-09-03
Accurate estimates of vaccination coverage are crucial for assessing routine immunization program performance. Community based household surveys are frequently used to assess coverage within a country. In household surveys to assess routine immunization coverage, a child's vaccination history is classified on the basis of observation of the immunization card, parental recall of receipt of vaccination, or both; each of these methods has been shown to commonly be inaccurate. The use of serologic data as a biomarker of vaccination history is a potential additional approach to improve accuracy in classifying vaccination history. However, potential challenges, including the accuracy of serologic methods in classifying vaccination history, varying vaccine types and dosing schedules, and logistical and financial implications must be considered. We provide historic and scientific context for the potential use of serologic data to assess vaccination history and discuss in detail key areas of importance for consideration in the context of using serologic data for classifying vaccination history in household surveys. Further studies are needed to directly evaluate the performance of serologic data compared with use of immunization cards or parental recall for classification of vaccination history in household surveys, as well as to assess the impact of age at the time of sample collection on serologic titers, the predictive value of serology to identify a fully vaccinated child for multi-dose vaccines, and the cost impact and logistical issues on outcomes associated with different types of biological samples for serologic testing. Published by Elsevier Ltd.
Cortisol at the Emergency Room Rape Visit as a Predictor of PTSD and Depression Symptoms Over Time
Walsh, Kate; Nugent, Nicole R.; Kotte, Amelia; Amstadter, Ananda B.; Wang, Sheila; Guille, Constance; Acierno, Ron; Kilpatrick, Dean G.; Resnick, Heidi S.
2013-01-01
Background Dysregulation of the hypothalamic-pituitary-adrenal axis, typically reflected by alterations in cortisol responsivity, has been associated with exposure to traumatic events and the development of stress-related disorders such as posttraumatic stress disorder (PTSD) and depression. Methods Serum cortisol was measured at the time of a post-sexual-assault medical exam among a sample of 323 female victims of recent sexual assault. Analyses were conducted among 235 participants who provided data regarding history of previous assault as well as PTSD and depression symptoms during at least one of three follow-ups. Results Growth curve models suggested that prior history of assault and serum cortisol were positively associated with the intercept and negatively associated with the slope of PTSD and depression symptoms after controlling for covariates. Prior history of assault and serum cortisol also interacted to predict the intercept and slope of PTSD and depression symptoms such that women with a prior history of assault and lower ER cortisol had higher initial symptoms that decreased at a slower rate relative to women without a prior history and those with higher ER cortisol. Conclusions Prior history of assault was associated with diminished acute cortisol responsivity at the emergency room visit. Prior assault history and cortisol both independently and interactively predicted PTSD and depression symptoms at first follow-up and over the course of a six-month follow-up. PMID:23806832
Guinot, Guillaume; Adnet, Sylvain; Cappetta, Henri
2012-01-01
Modern selachians and their supposed sister group (hybodont sharks) have a long and successful evolutionary history. Yet, although selachian remains are considered relatively common in the fossil record in comparison with other marine vertebrates, little is known about the quality of their fossil record. Similarly, only a few works based on specific time intervals have attempted to identify major events that marked the evolutionary history of this group. Phylogenetic hypotheses concerning modern selachians' interrelationships are numerous but differ significantly and no consensus has been found. The aim of the present study is to take advantage of the range of recent phylogenetic hypotheses in order to assess the fit of the selachian fossil record to phylogenies, according to two different branching methods. Compilation of these data allowed the inference of an estimated range of diversity through time and evolutionary events that marked this group over the past 300 Ma are identified. Results indicate that with the exception of high taxonomic ranks (orders), the selachian fossil record is by far imperfect, particularly for generic and post-Triassic data. Timing and amplitude of the various identified events that marked the selachian evolutionary history are discussed. Some identified diversity events were mentioned in previous works using alternative methods (Early Jurassic, mid-Cretaceous, K/T boundary and late Paleogene diversity drops), thus reinforcing the efficiency of the methodology presented here in inferring evolutionary events. Other events (Permian/Triassic, Early and Late Cretaceous diversifications; Triassic/Jurassic extinction) are newly identified. Relationships between these events and paleoenvironmental characteristics and other groups' evolutionary history are proposed.
History without time: Buffon's natural history as a nonmathematical physique.
Hoquet, Thierry
2010-03-01
While "natural history" is practically synonymous with the name of Buffon, the term itself has been otherwise overlooked by historians of science. This essay attempts to address this omission by investigating the meanings of "physique," "natural philosophy," and "history," among other terms, with the purpose of understanding Buffon's actual objectives. It also shows that Buffon never claimed to be a Newtonian and should not be considered as such; the goal is to provide a historical analysis that resituates Buffon's thought within his own era. This is done, primarily, by eschewing the often-studied question of time in Buffon. Instead, this study examines the nontemporal meanings of the word "history" within the naturalist's theory and method. The title of his Natural History is examined both as an indicator of the kind of science that Buffon was hoping to achieve and as a source of great misinterpretation among his peers. Unlike Buffon, many of his contemporaries actually envisioned the study of nature from a Baconian perspective where history was restricted to the mere collection of facts and where philosophy, which was the implicit and ultimate goal of studying nature, was seen, at least for the present, as unrealizable. Buffon confronts this tendency insofar as his Histoire naturelle claims to be the real physique that, along with describing nature, also sought to identify general laws and provide clear insight into what true knowledge of nature is or should be. According to Buffon, history (both natural and civil) is not analogous to mathematics; it is a nonmathematical method whose scope encompasses both nature and society. This methodological stance gives rise to the "physicization" of certain moral concepts--a gesture that was interpreted by his contemporaries as Epicurean and atheist. 
In addition, Buffon reduces a number of metaphysically tainted historical concepts (e.g., antediluvian monuments) to objects of physical analysis, thereby confronting the very foundation of natural theology. In Buffon, as this essay makes clear, natural history is paving the way for a new physique (science of natural beings), independent from mathematics and from God, that treats naturalia in a philosophical and "historical" manner that is not necessarily "temporal."
[Chromosome as a chronicler: Genetic dating, historical events, and DNA-genealogic temptation].
Balanovsky, O P; Zaporozhchenko, V V
2016-07-01
Nonrecombining portions of the genome, the Y chromosome and mitochondrial DNA, are widely used in research on human population gene pools and in reconstructing their history. These systems allow genetic dating of clusters of related haplotypes. The main method for age estimation is the ρ statistic, the average number of mutations separating the founder haplotype from all present-day haplotypes; a researcher can estimate the age of the cluster by multiplying this number by the mutation rate. The second estimation method, ASD, is used for STR haplotypes of the Y chromosome and is based on the squared difference in the number of repeats. Beyond these calculation methods, Bayesian modeling approaches are gaining importance. They carry greater computational cost and complexity, but they yield a posterior distribution of the quantity of interest that is most consistent with the experimental data. Both the calculation methods and the modeling methods require the mutation rate to be known. It can be determined either by direct analysis of lineages or from calibration points based on populations with a known formation time. These two approaches yielded rate estimates for Y-chromosomal STR haplotypes that differ threefold. This contradiction was resolved only recently through the use of sequence data for the complete Y chromosome: “whole-genome” rates of single-nucleotide mutations obtained by both approaches are mutually consistent and delimit the domains of applicability of the different STR rates. An issue even more crucial than that of the rates is the relationship between the reconstructed history of a haplogroup (a cluster of haplotypes) and the history of the population. Although the need to distinguish “lineage history” from “population history” arose in the earliest days of phylogeographic research, reconstructing population history from genetic dating requires a number of methods and conditions.
It is known that population history events leave distinct traces in the history of haplogroups only under certain demographic conditions. Directly identifying a nation's history with the history of the haplogroups occurring in it is inappropriate and is avoided in population genetic studies, although its simplicity and attractiveness make it a constant temptation for researchers. DNA genealogy, an amateur field that falls outside the borders of even citizen science and consistently equates haplogroup with lineage and population, leading to absurd results (e.g., Eurasia as the origin of humankind), can serve as a warning against such a simplified approach to interpreting genetic dating results.
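The ρ-statistic dating described in the abstract above is simple enough to sketch. The following is a minimal toy illustration, not the authors' software: haplotypes are strings compared position by position, and `years_per_mutation` is a hypothetical calibration constant (real analyses use locus-specific STR or SNP rates and must handle multi-step mutations and non-star-like genealogies).

```python
def rho_age_estimate(haplotypes, founder, years_per_mutation):
    """Date a haplotype cluster with the rho statistic.

    rho is the average number of mutations separating each sampled
    haplotype from the founder haplotype; the age estimate is rho
    multiplied by the calibrated time per mutation.
    """
    diffs = [sum(a != b for a, b in zip(h, founder)) for h in haplotypes]
    rho = sum(diffs) / len(diffs)
    return rho * years_per_mutation

# toy cluster: two haplotypes one step from the founder, one identical
age = rho_age_estimate(["AAAT", "AACA", "AAAA"], "AAAA", 3000)
print(age)  # 2000.0
```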
Bayesian analysis of biogeography when the number of areas is large.
Landis, Michael J; Matzke, Nicholas J; Moore, Brian R; Huelsenbeck, John P
2013-11-01
Historical biogeography is increasingly studied from an explicitly statistical perspective, using stochastic models to describe the evolution of species ranges as a continuous-time Markov process of dispersal between and extinction within a set of discrete geographic areas. The main constraint of these methods is the computational limit on the number of areas that can be specified. We propose a Bayesian approach for inferring biogeographic history that extends the application of biogeographic models to the analysis of more realistic problems that involve a large number of areas. Our solution is based on a "data-augmentation" approach, in which we first populate the tree with a history of biogeographic events that is consistent with the observed species ranges at the tips of the tree. We then calculate the likelihood of a given history by adopting a mechanistic interpretation of the instantaneous-rate matrix, which specifies both the exponential waiting times between biogeographic events and the relative probabilities of each biogeographic change. We develop this approach in a Bayesian framework, marginalizing over all possible biogeographic histories using Markov chain Monte Carlo (MCMC). Besides dramatically increasing the number of areas that can be accommodated in a biogeographic analysis, our method allows the parameters of a given biogeographic model to be estimated and different biogeographic models to be objectively compared. Our approach is implemented in the program BayArea.
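The mechanistic interpretation above (exponential waiting times between events, with relative rates selecting the event type) can be sketched with a toy continuous-time Markov simulation. This is a hypothetical illustration of the waiting-time mechanics only, not the BayArea data-augmentation sampler; `rates` is an assumed callback mapping a state to its outgoing instantaneous rates.

```python
import random

def simulate_history(rates, state, t_max, seed=1):
    """Draw one event history of a continuous-time Markov process.

    `rates(state)` returns a dict {next_state: instantaneous rate}.
    Waiting times are exponential with the total exit rate; the next
    state is chosen with probability proportional to its rate.
    """
    rng = random.Random(seed)
    t, history = 0.0, [(0.0, state)]
    while True:
        out = rates(state)
        total = sum(out.values())
        if total == 0.0:          # absorbing state: no further events
            break
        t += rng.expovariate(total)
        if t >= t_max:            # past the observation window
            break
        r, acc = rng.random() * total, 0.0
        for nxt, q in out.items():
            acc += q
            if r <= acc:
                state = nxt
                break
        history.append((t, state))
    return history

# toy two-area sketch: dispersal A->B at rate 1.0, contraction B->A at 0.5
toy_rates = lambda s: {"B": 1.0} if s == "A" else {"A": 0.5}
events = simulate_history(toy_rates, "A", t_max=5.0)
```

In the full method, histories like this are proposed on each branch of the tree and marginalized over by MCMC rather than simulated forward.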
Drug exposure in register-based research—An expert-opinion based evaluation of methods
Taipale, Heidi; Koponen, Marjaana; Tolppanen, Anna-Maija; Hartikainen, Sirpa; Ahonen, Riitta; Tiihonen, Jari
2017-01-01
Background: In register-based pharmacoepidemiological studies, construction of drug exposure periods from drug purchases is a major methodological challenge. Various methods have been applied, but their validity is rarely evaluated. Our objective was to conduct an expert-opinion based evaluation of the correctness of drug use periods produced by different methods. Methods: Drug use periods were calculated with three fixed methods: time windows, assumption of one Defined Daily Dose (DDD) per day and one tablet per day, and with PRE2DUP, which is based on modelling of individual drug purchasing behavior. The expert-opinion based evaluation was conducted with 200 randomly selected purchase histories of warfarin, bisoprolol, simvastatin, risperidone and mirtazapine in the MEDALZ-2005 cohort (28,093 persons with Alzheimer’s disease). Two experts reviewed purchase histories and judged which methods had joined correct purchases and given the correct duration for each of 1000 drug exposure periods. Results: The evaluated correctness of drug use periods was 70–94% for PRE2DUP and, depending on grace periods and time window lengths, 0–73% for tablet methods, 0–41% for DDD methods and 0–11% for time window methods. The highest rate of evaluated correct solutions for each method class was observed for 1 tablet per day with a 180-day grace period (TAB_1_180, 43–73%) and 1 DDD per day with a 180-day grace period (1–41%). Time window methods produced at maximum only 11% correct solutions. The best performing fixed method, TAB_1_180, reached its highest correctness for simvastatin at 73% (95% CI 65–81%), whereas 89% (95% CI 84–94%) of PRE2DUP periods were judged as correct. Conclusions: This study shows the inaccuracy of fixed methods and the urgent need for new data-driven methods. In the expert-opinion based evaluation, the lowest error rates were observed with the data-driven method PRE2DUP. PMID:28886089
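The fixed "one tablet per day with a grace period" family evaluated above can be sketched as follows. This is a hypothetical simplification for illustration only: each purchase is assumed to supply one day per tablet, and purchases are joined into a single use period when the gap does not exceed the grace period. PRE2DUP's modelling of individual purchasing behaviour is far richer and is not reproduced here.

```python
from datetime import date, timedelta

def tablet_periods(purchases, grace_days=180):
    """Join purchases into drug use periods, one tablet per day.

    purchases: iterable of (purchase_date, n_tablets). A purchase
    extends the running period when it falls within the grace period
    after that period's current end; otherwise it opens a new period.
    Returns a list of (start, end) date pairs.
    """
    periods = []
    for day, tablets in sorted(purchases):
        supply_end = day + timedelta(days=tablets)
        if periods and day <= periods[-1][1] + timedelta(days=grace_days):
            # within grace period: merge into the running period
            periods[-1] = (periods[-1][0], max(periods[-1][1], supply_end))
        else:
            periods.append((day, supply_end))
    return periods

# two purchases joined across a gap, a third far enough away to start anew
history = [(date(2020, 1, 1), 100), (date(2020, 5, 1), 100), (date(2021, 6, 1), 100)]
periods = tablet_periods(history)  # -> two use periods
```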
Diffusion archeology for diffusion progression history reconstruction.
Sefer, Emre; Kingsford, Carl
2016-11-01
Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.
Isolation with Migration Models for More Than Two Populations
Hey, Jody
2010-01-01
A method for studying the divergence of multiple closely related populations is described and assessed. The approach of Hey and Nielsen (2007, Integration within the Felsenstein equation for improved Markov chain Monte Carlo methods in population genetics. Proc Natl Acad Sci USA. 104:2785–2790) for fitting an isolation-with-migration model was extended to the case of multiple populations with a known phylogeny. Analysis of simulated data sets reveals the kinds of history that are accessible with a multipopulation analysis. Necessarily, processes associated with older time periods in a phylogeny are more difficult to estimate; and histories with high levels of gene flow are particularly difficult with more than two populations. However, for histories with modest levels of gene flow, or for very large data sets, it is possible to study large complex divergence problems that involve multiple closely related populations or species. PMID:19955477
A multi-domain spectral method for time-fractional differential equations
NASA Astrophysics Data System (ADS)
Chen, Feng; Xu, Qinwu; Hesthaven, Jan S.
2015-07-01
This paper proposes an approach for high-order time integration within a multi-domain setting for time-fractional differential equations. Since the kernel is singular or nearly singular, two main difficulties arise after the domain decomposition: how to properly account for the history/memory part and how to perform the integration accurately. To address these issues, we propose a novel hybrid approach for the numerical integration based on the combination of three-term-recurrence relations of Jacobi polynomials and high-order Gauss quadrature. The different approximations used in the hybrid approach are justified theoretically and through numerical examples. Based on this, we propose a new multi-domain spectral method for high-order accurate time integrations and study its stability properties by identifying the method as a generalized linear method. Numerical experiments confirm hp-convergence for both time-fractional differential equations and time-fractional partial differential equations.
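The three-term recurrence of Jacobi polynomials that the hybrid integration leans on can be sketched directly. The function below evaluates P_n^(α,β)(x) by the standard ascending recurrence; it illustrates only this building block, not the paper's multi-domain quadrature or its treatment of the singular kernel.

```python
def jacobi(n, alpha, beta, x):
    """Evaluate the Jacobi polynomial P_n^(alpha,beta)(x) using the
    standard three-term recurrence, ascending in degree."""
    if n == 0:
        return 1.0
    p_prev = 1.0
    p = (alpha + 1) + (alpha + beta + 2) * (x - 1) / 2  # P_1
    for m in range(2, n + 1):
        a = 2 * m * (m + alpha + beta) * (2 * m + alpha + beta - 2)
        b = (2 * m + alpha + beta - 1) * (
            (2 * m + alpha + beta) * (2 * m + alpha + beta - 2) * x
            + alpha ** 2 - beta ** 2)
        c = 2 * (m + alpha - 1) * (m + beta - 1) * (2 * m + alpha + beta)
        p_prev, p = p, (b * p - c * p_prev) / a
    return p

# alpha = beta = 0 reduces to the Legendre polynomials: P_2(0.5) = -0.125
value = jacobi(2, 0, 0, 0.5)
```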
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campione, Salvatore; Warne, Larry K.; Sainath, Kamalesh
In this report we overview the fundamental concepts for a pair of techniques which together greatly hasten computational predictions of electromagnetic pulse (EMP) excitation of finite-length dissipative conductors over a ground plane. In a time-domain, transmission line (TL) model implementation, predictions are computationally bottlenecked time-wise, either for late-time predictions (about the 100 ns-10000 ns range) or predictions concerning EMP excitation of long TLs (on the order of kilometers or more). This is because the method requires a temporal convolution to account for the losses in the ground. Addressing this to facilitate practical simulation of EMP excitation of TLs, we first apply a technique to extract an (approximate) complex exponential function basis-fit to the ground/Earth's impedance function, followed by incorporating this into a recursion-based convolution acceleration technique. Because the recursion-based method only requires the evaluation of the most recent voltage history data (versus the entire history in a "brute-force" convolution evaluation), we achieve the necessary time speed-ups across a variety of TL/Earth geometry/material scenarios.
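The recursion-based convolution idea can be sketched generically. Assuming the kernel has already been fitted as a sum of complex exponentials h(t) = Σ_k r_k exp(p_k t), each partial convolution state is updated from the previous time step alone, so cost grows linearly rather than quadratically with the number of steps. The pole and residue below are illustrative, not a real Earth-impedance fit.

```python
import cmath

def recursive_convolution(x, poles, residues, dt):
    """Convolve samples x with h(t) = sum_k residues[k]*exp(poles[k]*t).

    Rectangle-rule discretization: each exponential term carries a
    running state updated from the previous step only, instead of
    re-summing the entire voltage history at every step.
    """
    state = [0j] * len(poles)
    y = []
    for xn in x:
        total = 0j
        for k, (p, r) in enumerate(zip(poles, residues)):
            # one multiply-add per pole per step replaces the full sum
            state[k] = cmath.exp(p * dt) * state[k] + r * dt * xn
            total += state[k]
        y.append(total.real)
    return y

# toy single-pole kernel h(t) = 2*exp(-t)
y = recursive_convolution([1.0, 0.5, -0.2, 0.3], [-1.0 + 0j], [2.0 + 0j], 0.1)
```

The same update applied to a multi-term exponential fit of the ground impedance is what yields the late-time speed-ups described above.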
NASA Astrophysics Data System (ADS)
Sole-Mari, G.; Fernandez-Garcia, D.
2016-12-01
Random Walk Particle Tracking (RWPT) coupled with Kernel Density Estimation (KDE) has recently been proposed to simulate reactive transport in porous media. KDE provides an optimal estimate of each particle's area of influence, which is a key element in simulating nonlinear chemical reactions. However, several important drawbacks can be identified: (1) the optimal KDE method is computationally intensive and thereby cannot be used at each time step of the simulation; (2) it does not take advantage of prior information about the physical system and the previous history of the solute plume; (3) even if the kernel is optimal, the relative error in RWPT simulations typically increases over time as the particle density diminishes by dilution. To overcome these problems, we propose an adaptive branching random walk methodology that incorporates the physics and the particle history and maintains accuracy over time. The method allows particles to efficiently split and merge when necessary, as well as to optimally adapt their local kernel shape without having to recalculate the kernel size. We illustrate the advantage of the method by simulating complex reactive transport problems in randomly heterogeneous porous media.
Martin, Thomas E.; Riordan, Margaret M.; Repin, Rimi; Mouton, James C.; Blake, William M.
2017-01-01
Aim: Adult survival is central to theories explaining latitudinal gradients in life history strategies. Life history theory predicts higher adult survival in tropical than north temperate regions given lower fecundity and parental effort. Early studies were consistent with this prediction, but standard-effort netting studies in recent decades suggested that apparent survival rates in temperate and tropical regions strongly overlap. Such results do not fit with life history theory. Targeted marking and resighting of breeding adults yielded higher survival estimates in the tropics, but this approach is thought to overestimate survival because it does not sample social and age classes with lower survival. We compared the effect of field methods on tropical survival estimates and their relationships with life history traits. Location: Sabah, Malaysian Borneo. Time period: 2008–2016. Major taxon: Passeriformes. Methods: We used standard-effort netting and resighted individuals of all social and age classes of 18 tropical songbird species over 8 years. We compared apparent survival estimates between these two field methods with differing analytical approaches. Results: Estimated detection and apparent survival probabilities from standard-effort netting were similar to those from other tropical studies that used standard-effort netting. Resighting data verified that a high proportion of individuals that were never recaptured in standard-effort netting remained in the study area, and many were observed breeding. Across all analytical approaches, addition of resighting yielded substantially higher survival estimates than did standard-effort netting alone. These apparent survival estimates were higher than for temperate zone species, consistent with latitudinal differences in life histories. Moreover, apparent survival estimates from addition of resighting, but not from standard-effort netting alone, were correlated with parental effort as measured by egg temperature across species. Main conclusions: Inclusion of resighting showed that standard-effort netting alone can negatively bias apparent survival estimates and obscure life history relationships across latitudes and among tropical species.
Pre-mare cratering and early solar system history
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetherill, G.W.
1974-01-01
An evaluation of the application of the high extra-lunar flux in pre-mare times to more general problems of early solar system history is attempted by combining the results of dynamic studies with lunar chronological data. Dynamical studies permit separate evaluation of the possible sources for both the normal flux during the first 600 m.y. of lunar history and the peak which apparently occurred 4.0 b.y. ago. Dynamical studies have been carried out in order to determine the extent to which a heliocentric flux could be confined to the Moon (and Earth). A Monte Carlo method has been used to calculate the relative impact rates of planet-crossing bodies with the moon and the terrestrial planets. It is concluded that the time-variation of the flux on these planets is closely related to that on the moon. (STAR)
Investigating European genetic history through computer simulations.
Currat, Mathias; Silva, Nuno M
2013-01-01
The genetic diversity of Europeans has been shaped by various evolutionary forces including their demographic history. Genetic data can thus be used to draw inferences on the population history of Europe using appropriate statistical methods such as computer simulation, which constitutes a powerful tool to study complex models. Here, we focus on spatially explicit simulation, a method which takes population movements over space and time into account. We present its main principles and then describe a series of studies using this approach that we consider as particularly significant in the context of European prehistory. All simulation studies agree that ancient demographic events played a significant role in the establishment of the European gene pool; but while earlier works support a major genetic input from the Near East during the Neolithic transition, the most recent ones revalue positively the contribution of pre-Neolithic hunter-gatherers and suggest a possible impact of very ancient demographic events. This result of a substantial genetic continuity from pre-Neolithic times to the present challenges some recent studies analyzing ancient DNA. We discuss the possible reasons for this discrepancy and identify future lines of investigation in order to get a better understanding of European evolution.
An ultra-high-speed cinematographic method for the study of wakes in hypersonic ballistic ranges
NASA Astrophysics Data System (ADS)
Koeneke, Axel; Jaeggy, Bernard Charles; Koerber, Germain
1987-11-01
Optical methods are among the only possibilities to study hypersonic wakes in ballistic ranges. Because of the flow velocities involved the methods employed must permit exposure time well below one microsecond. The ISL has used ultrahigh speed visualization techniques for the study of the transition of hypersonic wakes for quite some time, but the means available up to now did not permit investigation of the time-history of the instabilities in the wake. The use of a laser equipped with an acousto-optical modulator is proposed as a source of ultrashort, highly energetic pulses with high repetition rate to be used to record a certain number of images of the same experiment in order to study the time history of these instabilities. Advantages of the laser as a light source are not only the high energies available together with pulse duration down to 20 nanoseconds, but mostly the free choice of repetition rate independently of exposure time, and the possibility to synchronize the pulses with external events. The laser is a point source and as such can be used in a variety of different optical setups. The coherent nature of the laser light even permits holographic techniques. The reception system capable of recording the images at a sufficient rate is the basic problem in the development and use of the proposed setup.
Historical theses on nursing and caring sciences in Finland: a literature review.
Lukana, Anne; Leena, Salminen; Marjo, Kaartinen; Helena, Leino-Kilpi
2013-12-01
The purpose of this literature review was to review the theses (masters, licentiate and doctoral theses) on the history of nursing and caring sciences in Finland. The research questions were as follows: 1. How many historical theses on nursing and caring sciences have been produced in Finland, and what are their characteristics (target groups, methods and sources)? 2. What periods of time have been under investigation in these theses? 3. What topics have been investigated in these theses? The theses on the history of nursing and caring sciences were retrieved from the theses index of the universities that offer education in nursing and caring sciences in Finland. The literature search covered the time period 1979-2010. Altogether, 58 theses were reviewed and analysed via content analysis. Of all of the theses (n = 3969) produced in nursing and caring sciences, 58 focused on historical topics (<2%). The most common target group was healthcare personnel. The most common research method was the traditional historical method. Primary and secondary sources were used both together and separately. Nearly all of the theses examined the history of the 1900s, whereas only a few of them examined earlier time periods. The four main topics of the theses were nursing practice, nursing education, nursing management and philosophy of nursing. The most common topic was nursing practice, especially psychiatric nursing. Research on the history of nursing and caring sciences in Finland has received only marginal attention from researchers. This literature review offers a description of the historical research produced on nursing and caring sciences and the topics of interest. In future, it will be necessary to examine more closely several historical topics that have been neglected in the study of nursing and caring sciences. © 2012 The Authors Scandinavian Journal of Caring Sciences © 2012 Nordic College of Caring Science.
2006-01-01
Background: Methods for analyzing space-time variation in risk in case-control studies typically ignore residential mobility. We develop an approach for analyzing case-control data for mobile individuals and apply it to study bladder cancer in 11 counties in southeastern Michigan. At this time data collection is incomplete and no inferences should be drawn – we analyze these data to demonstrate the novel methods. Global, local and focused clustering of residential histories for 219 cases and 437 controls is quantified using time-dependent nearest neighbor relationships. Business address histories for 268 industries that release known or suspected bladder cancer carcinogens are analyzed. A logistic model accounting for smoking, gender, age, race and education specifies the probability of being a case, and is incorporated into the cluster randomization procedures. Sensitivity of clustering to the definition of the proximity metric is assessed for k nearest neighbors, with k ranging from 1 to 75. Results: Global clustering is partly explained by the covariates but remains statistically significant at 12 of the 14 levels of k considered. After accounting for the covariates, 26 local clusters are found in Lapeer, Ingham, Oakland and Jackson counties, with the clusters in Ingham and Oakland counties appearing in 1950 and persisting to the present. Statistically significant focused clusters are found about the business address histories of 22 industries located in Oakland (19 clusters), Ingham (2) and Jackson (1) counties. Clusters in central and southeastern Oakland County appear in the 1930s and persist to the present day. Conclusion: These methods provide a systematic approach for evaluating a series of increasingly realistic alternative hypotheses regarding the sources of excess risk.
So long as selection of cases and controls is population-based and not geographically biased, these tools can provide insights into geographic risk factors that were not specifically assessed in the case-control study design. PMID:16887016
An efficient temporal database design method based on EER
NASA Astrophysics Data System (ADS)
Liu, Zhi; Huang, Jiping; Miao, Hua
2007-12-01
Many existing methods of modeling temporal information are based on the logical model, which makes relational schema optimization more difficult and complicated. In this paper, based on the conventional EER model, we analyse and abstract temporal information in the conceptual modelling phase according to the concrete requirements for historical information. A temporal data model named BTEER is then presented. BTEER not only retains all the design ideas and methods of EER, which gives it good upward compatibility, but also effectively supports the modelling of both valid time and transaction time. In addition, BTEER can be transformed to EER easily and automatically. Practice has shown that this method models temporal information well.
The History of Optical Analysis of Milk: The Development and Use of Lactoscopes
NASA Astrophysics Data System (ADS)
Millan-Verdú, C.; Garrigós-Oltra, Ll.; Blanes-Nadal, G.; Domingo-Beltrán, M.
2003-07-01
The 19th century use of optical methods to detect the fraudulent adulteration of milk (by means of adding water) is presented in this article. The development and use of these optical methods was based on the principle of diaphanometry and illustrates the conflict that existed between two different approaches to scientific knowledge at that time. On the one hand, there were physicians who were more concerned with the practicality of the methods than on their accuracy; on the other hand, there were physicists who were fundamentally more interested in the accuracy of the results than the practicality of its use. This paper examines that conflict by analyzing the original lactoscope of Alfred Donné and the history of subsequent developments of the lactoscope.
[Croupous pneumonia: the history of studies (from S.P. Botkin to our days)].
Fesenko, O V; Sinopal'nikov, A I
2012-01-01
The etiology of croupous pneumonia remains a challenging problem and has a long history of investigation. The term croupous pneumonia is traditionally used in this country to describe the form of the disease characterized by hyperergic inflammation and specific clinical and laboratory features. A valuable contribution to the study of this condition was made by S.P. Botkin. This paper deals with the evolution of views on the epidemiology, etiology, pathomorphology, and methods of therapy of croupous pneumonia from Botkin's time to the present day.
Analysis of wind resistance and stability for cable towers in a cable-stayed bridge with four towers
NASA Astrophysics Data System (ADS)
Meng, Yangjun; Li, Can
2017-06-01
Wind speed time history simulation methods are introduced first, with the harmonic synthesis method described in detail. Second, taking the Chishi bridge as an example and choosing particular sections, simulations of the three-component coefficients for attack angles between -4° and 4° were carried out in the Fluent software in combination with the design wind speed. The results show that the drag coefficient reaches its maximum at an attack angle of 1°. From measured wind speed samples, time history curves of wind speed at the bridge deck and tower roof were obtained, and a wind-resistant time history analysis of tower No. 5 was carried out. The results show that the dynamic coefficients differ under different calculation standards, especially for the transverse bending moment, and that the pulsating crosswind load does not show a dynamic amplification effect. Under pulsating wind loads at the bridge deck or tower roof, the maximum displacement at the top of the tower and the maximum stress at the bottom of the tower are within the allowable range. The transverse stiffness of the tower is greater than the longitudinal stiffness; wind-resistant analysis should therefore give priority to the longitudinal direction. Since dynamic coefficients differ across standards, the maximum dynamic coefficient should be used for the pseudo-static analysis. Finally, the static stability of the tower is analyzed under different load combinations, and the galloping stability of the cable tower is verified.
Caplan, Daniel J.; Pankow, James S.; Cai, Jianwen; Offenbacher, Steven; Beck, James D.
2009-01-01
Background: Results from numerous studies have suggested links between periodontal disease and coronary heart disease (CHD), but endodontic disease has not been studied extensively in this regard. Methods: The authors evaluated the relationship between self-reported history of endodontic therapy (ET) and prevalent CHD in the Atherosclerosis Risk in Communities (ARIC) Study, a prospective epidemiologic study sponsored by the National Heart, Lung, and Blood Institute. The authors used multivariable logistic regression to analyze data obtained from oral health questionnaires, medical evaluations and clinical dental examinations. Results: Of 6,651 participants analyzed, 50.4 percent reported never having had ET; 21.5 percent reported having had ET one time; and 28.0 percent reported having had ET two or more times. Final multivariable regression models indicated that among participants with 25 or more teeth, those reporting having had ET two or more times had 1.62 (95 percent confidence interval [CI], 1.04–2.53) times the odds of prevalent CHD compared with those reporting never having had ET. Among participants with 24 or fewer teeth, no significant differences in CHD prevalence were observed among groups regardless of their history of ET. Conclusions: Among participants with 25 or more teeth, those with a greater self-reported history of ET were more likely to have CHD than were those reporting no history of ET. Clinical Implications: More accurate epidemiologic quantification of endodontic infection and inflammation is required before definitive conclusions can be made about potential relationships between endodontic disease and CHD. PMID:19654253
Real time aircraft fly-over noise discrimination
NASA Astrophysics Data System (ADS)
Genescà, M.; Romeu, J.; Pàmies, T.; Sánchez, A.
2009-06-01
A method for measuring aircraft noise time history with automatic elimination of simultaneous urban noise is presented in this paper. A 3 m-long 12-microphone sparse array has been proven to give good performance in a wide range of urban placements. At present, urban placements have to be avoided because their background noise has a great influence on the measurements made by sound level meters or single microphones. Because of the small device size and low number of microphones (which make it easy to set up), the resolution of the device is not high enough to provide a clean aircraft noise time history by only applying frequency domain beamforming to the spatial cross-correlations of the microphones' signals. Therefore, a new step has been added to the processing algorithm to overcome this limitation.
Huang, Shaodan; Xiong, Jianyin; Zhang, Yinping
2013-10-15
The indoor pollution caused by formaldehyde and volatile organic compounds (VOCs) emitted from building materials has an adverse effect on people's health. It is necessary to understand and control the behavior of the emission sources. Based on a detailed mass transfer analysis of the emission process in a ventilated chamber, this paper proposes a novel method for measuring the three emission characteristic parameters, i.e., the initial emittable concentration, the diffusion coefficient and the partition coefficient. A linear correlation between the logarithm of the dimensionless concentration and time is derived. The three parameters can then be calculated from the intercept and slope of the correlation. Compared with the closed-chamber C-history method, the test is performed under ventilated conditions, so some commonly used measurement instruments (e.g., GC/MS, HPLC) can be applied. Compared with other methods, the present method can rapidly and accurately measure the three parameters, with an experimental time of less than 12 h and R(2) ranging from 0.96 to 0.99 for the cases studied. An independent experiment was carried out to validate the developed method, and good agreement was observed between the experiments and simulations based on the determined parameters. The present method should prove useful for quick characterization of formaldehyde/VOC emissions from indoor materials.
Xiong, Jianyin; Yao, Yuan; Zhang, Yinping
2011-04-15
The initial emittable concentration (C(m,0)), the diffusion coefficient (D(m)), and the material/air partition coefficient (K) are the three characteristic parameters influencing emissions of formaldehyde and volatile organic compounds (VOCs) from building materials or furniture. It is necessary to determine these parameters to understand emission characteristics and how to control them. In this paper we develop a new method, the C-history method for a closed chamber, to measure these three parameters. Compared to the available methods of determining the three parameters described in the literature, our approach has the following salient features: (1) the three parameters can be simultaneously obtained; (2) it is time-saving, generally taking less than 3 days for the cases studied (the available methods tend to need 7-28 days); (3) the maximum relative standard deviations of the measured C(m,0), D(m) and K are 8.5%, 7.7%, and 9.8%, respectively, which are acceptable for engineering applications. The new method was validated by using the characteristic parameters determined in the closed chamber experiment to predict the observed emissions in a ventilated full scale chamber experiment, proving that the approach is reliable and convincing. Our new C-history method should prove useful for rapidly determining the parameters required to predict formaldehyde and VOC emissions from building materials as well as for furniture labeling.
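The core fitting step shared by both C-history variants is a linear regression: the logarithm of a dimensionless concentration is regressed against time, and the slope and intercept are then mapped to the three parameters through the analytical solution. A minimal sketch of that step on synthetic single-exponential data, with illustrative values (the mapping from slope and intercept to C(m,0), D(m) and K follows the papers' derivations and is not reproduced here):

```python
import numpy as np

# Hypothetical chamber data: concentration relaxing toward equilibrium,
# C(t) = C_eq * (1 - exp(-k t)), so the dimensionless concentration
# 1 - C/C_eq decays as a single exponential and its log is linear in time.
k_true, c_eq = 0.05, 120.0            # 1/h and ug/m^3, illustrative only
t = np.linspace(1.0, 48.0, 30)        # sampling times, hours
c = c_eq * (1.0 - np.exp(-k_true * t))

# Linear fit of ln(dimensionless concentration) vs. time; in the papers,
# slope and intercept are converted to the three emission parameters.
y = np.log(1.0 - c / c_eq)
slope, intercept = np.polyfit(t, y, 1)

print(round(-slope, 3))   # ≈ 0.05, recovering the true decay constant
```

In practice the measured concentrations carry noise, so the quality of the fit (the papers report R² of 0.96–0.99) is itself a useful diagnostic.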
Harvey, Philip D.; Reichenberg, Abraham; Bowie, Christopher R.; Patterson, Thomas L.; Heaton, Robert K.
2010-01-01
Background Chronically institutionalized patients with schizophrenia have been reported to manifest cognitive and functional decline. Previous studies were limited by the fact that current environment could not be separated from life-time illness course. The present study examined older outpatients who varied in their lifetime history of long-term psychiatric inpatient stay. Methods Community dwelling patients with schizophrenia (n=111) and healthy comparison subjects (n=76) were followed up to 45 months and examined two or more times with a neuropsychological (NP) battery and performance-based measures of everyday living skills (UCSD Performance-based skills assessment; UPSA) and social competence. A mixed-effects model repeated-measures method was used to examine changes. Results There was a significant effect of institutional stay on the course of the UPSA. When the schizophrenia patients who completed all three assessments were divided on the basis of length of institutional stay and compared to healthy comparison subjects, patients with longer stays worsened on the UPSA and social competence while patients with shorter stays improved. For NP performance, both patient samples worsened slightly while the HC group manifested a practice effect. Reliable change index (RCI) analyses showed that worsening on the UPSA for longer stay patients was definitely nonrandom. Conclusions Life-time history of institutional stay was associated with worsening on measures of social and everyday living skills. NP performance in schizophrenia did not evidence the practice effect seen in the HC sample. These data suggest that schizophrenia patients with a history of long institutional stay may worsen even if they are no longer institutionalized. PMID:20202624
Family history of stroke and severity of neurologic deficit after stroke
Case, L.D.; Worrall, B.B.; Brown, R.D.; Brott, T.G.; Frankel, M.; Silliman, S.; Rich, S.S.
2008-01-01
Background A family history of stroke is an independent risk factor for stroke. Objective To assess whether severity of neurologic deficit after stroke is associated with a family history of stroke. Methods The Ischemic Stroke Genetics Study, a five-center study of first-ever symptomatic ischemic stroke, assessed case subjects prospectively for a family history of stroke-affected first-degree relatives. Certified adjudicators used the NIH Stroke Scale (NIHSS) to determine the severity of neurologic deficit. Results A total of 505 case subjects were enrolled (median age, 65 years; 55% male), with 81% enrolled within 1 week of onset of symptoms. A sibling history of stroke was associated with more severe stroke. The odds of an NIHSS score of 5 or higher were 2.0 times greater for cases with a sibling history of stroke compared with cases with no sibling history (95% CI, 1.0 to 3.9). An association of family history of stroke in parents or children with stroke severity was not detected. Conclusions A sibling history of stroke increased the likelihood of a more severe stroke in the case subjects, independent of age, sex, and other potential confounding factors. Other family history characteristics were not associated with stroke severity. PMID:17060565
The search for extra-terrestrial intelligence.
Drake, Frank
2011-02-13
Modern history of the search for extra-terrestrial intelligence is reviewed. The history of radio searches is discussed, as well as the major advances that have occurred in radio searches and prospects for new instruments and search strategies. Recent recognition that searches for optical and infrared signals make sense, and the reasons for this are described, as well as the equipment and special detection methods used in optical searches. The long-range future of the search for extra-terrestrial intelligence (SETI) is discussed in the context of the history of rapid change, on the cosmic and even the human time scale, of the paradigms guiding SETI searches. This suggests that SETI searches be conducted with a very open mind.
ERIC Educational Resources Information Center
Hillbrand, Marc; Waite, Bradley M.
1992-01-01
Used Experience Sampling Method to investigate experiences of anger in 10 patients at maximum security forensic institute who had histories of severe, violent behavior. Found severity of anger influenced by type of activity in which subject was engaged and by emotional valence of preceding events but not by time of day nor by type of interpersonal…
ERIC Educational Resources Information Center
Adams-Budde, Melissa; Howard, Christy; Jolliff, Grant; Myers, Joy
2014-01-01
The purpose of this mixed methods sequential explanatory study was to explain the relationship between literacy experiences over time and the literacy identities of the doctoral students in a teacher education and higher education program. The quantitative phase, surveying 36 participants, revealed a positive correlation between participant's…
[The history of home hemodialysis and its likely revival].
Ralli, Chiara; Imperiali, Patrizio; Duranti, Ennio
2016-01-01
Home extracorporeal hemodialysis, which aroused great interest in the past, has not kept its promises, owing to its complexity and the demands it places on family involvement in treatment management. In the United States, the NxStage One portable system was proposed and designed for home use. In this work we describe, starting from the history of home hemodialysis, the NxStage method, comparing it with conventional HD in 5 patients. Dialysis efficiency was similar between the two treatments, although home hemodialysis showed a reduction in serum urea, creatinine and phosphorus. At the same time, phosphate binder use decreased with an increase in serum calcium, while hemoglobin increased, allowing reduced doses of erythropoietin. The method was successful in training the patients and their partners, both during hospital training and at home. Patients showed great enthusiasm at the start of and during therapy, which is organized around each user's personal needs: treatment can be scheduled at any time over the 24 hours, and recovery after dialysis is faster. This method clearly improved the patients' well-being and increased their autonomy.
Comparison of three explicit multigrid methods for the Euler and Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Chima, Rodrick V.; Turkel, Eli; Schaffer, Steve
1987-01-01
Three explicit multigrid methods, Ni's method, Jameson's finite-volume method, and a finite-difference method based on Brandt's work, are described and compared for two model problems. All three methods use an explicit multistage Runge-Kutta scheme on the fine grid, and this scheme is also described. Convergence histories for inviscid flow over a bump in a channel for the fine-grid scheme alone show that convergence rate is proportional to Courant number and that implicit residual smoothing can significantly accelerate the scheme. Ni's method was slightly slower than the implicitly-smoothed scheme alone. Brandt's and Jameson's methods are shown to be equivalent in form but differ in their node versus cell-centered implementations. They are about 8.5 times faster than Ni's method in terms of CPU time. Results for an oblique shock/boundary layer interaction problem verify the accuracy of the finite-difference code. All methods slowed considerably on the stretched viscous grid but Brandt's method was still 2.1 times faster than Ni's method.
Issues Using the Life History Calendar in Disability Research
Scott, Tiffany N.; Harrison, Tracie
2011-01-01
Background Overall, there is a dearth of research reporting mixed-method data collection procedures using the LHC within disability research. Objective This report provides practical knowledge on use of the life history calendar (LHC) from the perspective of a mixed-method life history study of mobility impairment situated within a qualitative paradigm. Methods In this paper the method related literature referring to the LHC was reviewed along with its epistemological underpinnings. Further, the uses of the LHC in disability research were illustrated using preliminary data from reports of disablement in Mexican American and Non-Hispanic White women with permanent mobility impairment. Results From our perspective, the LHC was most useful when approached from an interpretive paradigm when gathering data from women of varied ethnic and socioeconomic strata. While we found the LHC the most useful tool currently available for studying disablement over the life course, there were challenges associated with its use. The LHC required extensive interviewer training. In addition, large segments of time were needed for completion depending on the type of participant responses. Conclusions Researchers planning to conduct a disability study may find our experience using the LHC valuable for anticipating issues that may arise when the LHC is used in mixed-method research. PMID:22014674
NASA Astrophysics Data System (ADS)
Prawin, J.; Rama Mohan Rao, A.
2018-01-01
The knowledge of dynamic loads acting on a structure is always required for many practical engineering problems, such as structural strength analysis, health monitoring and fault diagnosis, and vibration isolation. In this paper, we present an online input force time history reconstruction algorithm using Dynamic Principal Component Analysis (DPCA) from the acceleration time history response measurements using moving windows. We also present an optimal sensor placement algorithm to place limited sensors at dynamically sensitive spatial locations. The major advantage of the proposed input force identification algorithm is that it does not require finite element idealization of structure unlike the earlier formulations and therefore free from physical modelling errors. We have considered three numerical examples to validate the accuracy of the proposed DPCA based method. Effects of measurement noise, multiple force identification, different kinds of loading, incomplete measurements, and high noise levels are investigated in detail. Parametric studies have been carried out to arrive at optimal window size and also the percentage of window overlap. Studies presented in this paper clearly establish the merits of the proposed algorithm for online load identification.
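As a toy illustration of the time-lagged data matrix at the heart of dynamic PCA (not the authors' full algorithm, which adds moving windows and optimal sensor placement), a common input signal can be recovered from multichannel responses via an SVD of the lagged matrix. All signals and coefficients below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy responses: three "sensors" driven by one common force plus noise.
n, lags = 500, 5
force = np.sin(np.linspace(0, 20, n))
responses = np.stack([a * force + 0.05 * rng.standard_normal(n)
                      for a in (1.0, 0.6, -0.8)])

# Dynamic PCA augments each channel with time-lagged copies so that
# temporal correlation enters the decomposition, not just spatial.
rows = [np.roll(ch, s)[lags:] for ch in responses for s in range(lags)]
X = np.stack(rows)
X -= X.mean(axis=1, keepdims=True)

# SVD of the lagged matrix; the leading principal component serves as
# an (unscaled, sign-ambiguous) estimate of the common input history.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
estimate = Vt[0]

corr = np.corrcoef(estimate, force[lags:])[0, 1]
print(abs(corr) > 0.9)   # the estimate tracks the true force closely
```

The principal component is only determined up to scale and sign; recovering physical force amplitudes requires the calibration step that the full method provides.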
Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem
2008-01-01
A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In the methodology, HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to the nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for the nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, here we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment resisting frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on resulting fragility functions through sensitivity analysis. The findings improve capacity curves and thereby fragility and/or vulnerability models for generic types of structures.
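A multilinear backbone of the kind described — an elastic branch, hardening up to a capping point, then negative stiffness — can be written as a simple piecewise function. The numbers below are illustrative placeholders, not the HAZUS S1L parameter values proposed in the study:

```python
# Multilinear capacity (backbone) curve: elastic up to yield, hardening
# up to an ultimate "capping" point, then a negative-stiffness branch.
d_y, f_y = 0.02, 100.0      # yield displacement (m), yield force (kN) -- illustrative
d_u, f_u = 0.10, 130.0      # capping point -- illustrative
k_neg = -300.0              # post-capping (softening) stiffness, kN/m

def backbone(d):
    """Force on the multilinear capacity curve at displacement d >= 0."""
    if d <= d_y:
        return f_y * d / d_y                                 # elastic
    if d <= d_u:
        return f_y + (f_u - f_y) * (d - d_y) / (d_u - d_y)   # hardening
    return max(0.0, f_u + k_neg * (d - d_u))                 # softening, floored at zero

print(backbone(0.01), backbone(0.10), backbone(0.2))
```

In a nonlinear time history analysis this backbone would serve as the skeleton of a hysteretic SDOF model; the negative post-capping branch is what lets the fragility analysis capture collapse-like behavior that the curvilinear HAZUS curve cannot.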
ERIC Educational Resources Information Center
Reardon, Sean F.; Brennan, Robert T.; Buka, Stephen L.
2002-01-01
Developed procedures for constructing a retrospective person-period data set from cross-sectional data and discusses modeling strategies for estimating multilevel discrete-time event history models. Applied the methods to the analysis of cigarette use by 1,979 urban adolescents. Results show the influence of the racial composition of the…
Bayesian Analysis of Biogeography when the Number of Areas is Large
Landis, Michael J.; Matzke, Nicholas J.; Moore, Brian R.; Huelsenbeck, John P.
2013-01-01
Historical biogeography is increasingly studied from an explicitly statistical perspective, using stochastic models to describe the evolution of species range as a continuous-time Markov process of dispersal between and extinction within a set of discrete geographic areas. The main constraint of these methods is the computational limit on the number of areas that can be specified. We propose a Bayesian approach for inferring biogeographic history that extends the application of biogeographic models to the analysis of more realistic problems that involve a large number of areas. Our solution is based on a “data-augmentation” approach, in which we first populate the tree with a history of biogeographic events that is consistent with the observed species ranges at the tips of the tree. We then calculate the likelihood of a given history by adopting a mechanistic interpretation of the instantaneous-rate matrix, which specifies both the exponential waiting times between biogeographic events and the relative probabilities of each biogeographic change. We develop this approach in a Bayesian framework, marginalizing over all possible biogeographic histories using Markov chain Monte Carlo (MCMC). Besides dramatically increasing the number of areas that can be accommodated in a biogeographic analysis, our method allows the parameters of a given biogeographic model to be estimated and different biogeographic models to be objectively compared. Our approach is implemented in the program, BayArea. [ancestral area analysis; Bayesian biogeographic inference; data augmentation; historical biogeography; Markov chain Monte Carlo.] PMID:23736102
Micheletti, Vania Celina Dezoti; Moreira, José da Silva; Ribeiro, Marta Osório; Kritski, Afranio Lineu; Braga, José Ueleres
2014-01-01
OBJECTIVE: To describe the prevalence of multidrug-resistant tuberculosis (MDR-TB) among tuberculosis patients in a major Brazilian city, evaluated via the Second National Survey on Antituberculosis Drug Resistance, as well as the social, demographic, and clinical characteristics of those patients. METHODS: Clinical samples were collected from tuberculosis patients seen between 2006 and 2007 at three hospitals and five primary health care clinics participating in the survey in the city of Porto Alegre, Brazil. The samples were subjected to drug susceptibility testing. The species of mycobacteria was confirmed using biochemical methods. RESULTS: Of the 299 patients included, 221 (73.9%) were men and 77 (27.3%) had a history of tuberculosis. The mean age was 36 years. Of the 252 patients who underwent HIV testing, 66 (26.2%) tested positive. The prevalence of MDR-TB in the sample as a whole was 4.7% (95% CI: 2.3-7.1), whereas it was 2.2% (95% CI: 0.3-4.2) among the new cases of tuberculosis and 12.0% (95% CI: 4.5-19.5) among the patients with a history of tuberculosis treatment. The multivariate analysis showed that a history of tuberculosis and a longer time to diagnosis were both associated with MDR-TB. CONCLUSIONS: If our results are corroborated by other studies conducted in Brazil, a history of tuberculosis treatment and a longer time to diagnosis could be used as predictors of MDR-TB. PMID:24831400
1991-03-21
discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections. In Section 1.0, we...estimates. The motivation for developing our multi-channel deconvolution method was to gain information about seismic sources, most notably, nuclear...with complex constraints for estimating the rupture history. Such methods (applied mostly to data sets that also include strong motion data), were
A new method for determining the plasma electron density using three-color interferometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arakawa, Hiroyuki; Kawano, Yasunori; Itami, Kiyoshi
2012-06-15
A new method for determining the plasma electron density using the fractional fringes of a three-color interferometer is proposed. The integrated phase shift on each interferometer is derived without using the temporal history of the fractional fringes. The dependence on fringe resolution and electrical noise is simulated for the wavelengths of a CO2 laser. Short-time integrations of the fractional fringes enhance the reliability of this method.
Lu, Wentian; Benson, Rebecca; Glaser, Karen; Corna, Laurie M; Worts, Diana; McDonough, Peggy; Price, Debora; Sacker, Amanda
2017-01-01
Background Given the acceleration of population ageing and policy changes to extend working lives, evidence is needed on the ability of older adults to work for longer. To understand more about the health impacts of work, this study examined the relationship between employment histories before retirement and trajectories of frailty thereafter. Methods The sample comprised 2765 women and 1621 men from the English Longitudinal Study of Ageing. We used gendered typologies of life-time employment and a frailty index (FI). Multilevel growth curve models were used to predict frailty trajectories by employment histories. Results Women who had a short break for family care, then did part-time work till 59 years had a lower FI after 60 years than those who undertook full-time work until 59 years. Women who were largely family carers or non-employed throughout adulthood, had higher levels of frailty at 60 years but experienced a slower decline with age. Men who worked full-time but early exited at either 49 or 60 years had a higher FI at 65 years than those who worked full-time up to 65 years. Interaction between employment histories and age indicated that men in full-time work who experienced an early exit at 49 tended to report slower declines. Conclusions For women, experiencing distinct periods throughout the lifecourse of either work or family care may be advantageous for lessening frailty risk in later life. For men, leaving paid employment before 65 years seems to be beneficial for decelerating increases in frailty thereafter. Continuous full-time work until retirement age conferred no long-term health benefits. PMID:27913614
Application of wavelet multi-resolution analysis for correction of seismic acceleration records
NASA Astrophysics Data System (ADS)
Ansari, Anooshiravan; Noorzad, Assadollah; Zare, Mehdi
2007-12-01
During an earthquake, many stations record the ground motion, but only a few of them could be corrected using conventional high-pass and low-pass filtering methods and the others were identified as highly contaminated by noise and as a result useless. There are two major problems associated with these noisy records. First, since the signal to noise ratio (S/N) is low, it is not possible to discriminate between the original signal and noise either in the frequency domain or in the time domain. Consequently, it is not possible to cancel out noise using conventional filtering methods. The second problem is the non-stationary characteristics of the noise. In other words, in many cases the characteristics of the noise are varied over time and in these situations, it is not possible to apply frequency domain correction schemes. When correcting acceleration signals contaminated with high-level non-stationary noise, there is an important question whether it is possible to estimate the state of the noise in different bands of time and frequency. Wavelet multi-resolution analysis decomposes a signal into different time-frequency components, and besides introducing a suitable criterion for identification of the noise among each component, also provides the required mathematical tool for correction of highly noisy acceleration records. In this paper, the characteristics of the wavelet de-noising procedures are examined through the correction of selected real and synthetic acceleration time histories. It is concluded that this method provides a very flexible and efficient tool for the correction of very noisy and non-stationary records of ground acceleration. In addition, a two-step correction scheme is proposed for long period correction of the acceleration records. This method has the advantage of stable results in displacement time history and response spectrum.
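The band-by-band treatment that wavelet multi-resolution analysis enables can be sketched with a hand-rolled Haar transform: decompose the record, threshold each detail band with a robust noise estimate, and reconstruct. This is a generic illustration on synthetic data, not the authors' correction scheme or their choice of wavelet:

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform: (approx, detail)."""
    x = x.reshape(-1, 2)
    s = (x[:, 0] + x[:, 1]) / np.sqrt(2)   # approximation (low-pass)
    d = (x[:, 0] - x[:, 1]) / np.sqrt(2)   # detail (high-pass)
    return s, d

def haar_idwt(s, d):
    """Invert one Haar level."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2)
    x[1::2] = (s - d) / np.sqrt(2)
    return x

# Synthetic "acceleration record": a smooth pulse plus white noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
clean = np.exp(-((t - 0.5) / 0.05) ** 2)
noisy = clean + 0.2 * rng.standard_normal(t.size)

# Three-level decomposition, hard-thresholding each detail band --
# treating each time-frequency band separately is what MRA buys.
details, approx = [], noisy
for _ in range(3):
    approx, d = haar_dwt(approx)
    thr = 3.0 * np.median(np.abs(d)) / 0.6745   # robust (MAD) noise estimate
    details.append(np.where(np.abs(d) > thr, d, 0.0))

# Reconstruct from the coarsest level back up.
for d in reversed(details):
    approx = haar_idwt(approx, d)

err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
err_denoised = np.sqrt(np.mean((approx - clean) ** 2))
print(err_denoised < err_noisy)   # thresholding reduced the RMS error
```

Because the threshold is computed per band, a noise source whose character drifts over scale (as with the non-stationary noise discussed above) is handled locally rather than with one global filter.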
NASA Astrophysics Data System (ADS)
Hoch, Jeffrey C.
2017-10-01
Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development.
"Tectonic Petrameter," An Alternative Method to Teaching the Geologic Time Scale
NASA Astrophysics Data System (ADS)
Posner, E. S.
2011-12-01
I have over a decade of experience as a performance poet and am now a graduate student in the geosciences. I have created a performance poem / play script, "Tectonic Petrameter," as an alternative method of teaching the geologic time scale. "The Archean came next and it was a blast. Tectonic plates were smaller and they moved pretty fast. In an enthusiastic flash of ash, volcanic islands smashed together." The use of rhyme and rhythm presents a different and interdisciplinary approach to teaching Earth history that appeals to a wide range of learning styles and makes science fun, while clearly describing important concepts in geology and events in Earth history. "Now it's time to get down with the Coal Swamp Stomp! Tap your feet to the beat of the formation of peat like a plant plantation soaking up the bright heat." "Tectonic Petrameter" by itself is an illustrated spoken-word poem that leads audiences from all levels of scientific background on an excitingly educational journey through geologic time. I will perform my 10-minute memorized poem and present results from my ongoing study to assess the effectiveness of "Tectonic Petrameter" as a teaching tool in K-12 and introductory undergraduate classroom curricula. I propose that using "Tectonic Petrameter" as a performance piece and theatrical play script in K-12 and introductory undergraduate classrooms, as well as in broader community venues, may be an avenue for breaking down barriers related to teaching about Earth's long and complex history. Digital copies of "Tectonic Petrameter" will be made available to interested parties.
Continuous Time in Consistent Histories
NASA Astrophysics Data System (ADS)
Savvidou, Konstantina
1999-12-01
We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the simple harmonic oscillator case. We show that the action operator is the generator of two types of time transformations that may be related to the two laws of time-evolution of the standard quantum theory: the `state-vector reduction' and the unitary time-evolution. We construct the corresponding classical histories and demonstrate the relevance with the quantum histories; we demonstrate how the requirement of the temporal logic structure of the theory is sufficient for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.
Constructing stage-structured matrix population models from life tables: comparison of methods
Fujiwara, Masami; Diaz-Lopez, Jasmin
2017-01-01
A matrix population model is a convenient tool for summarizing per capita survival and reproduction rates (collectively vital rates) of a population and can be used for calculating an asymptotic finite population growth rate (λ) and generation time. These two pieces of information can be used for determining the status of a threatened species. The use of stage-structured population models has increased in recent years, and the vital rates in such models are often estimated using a life table analysis. However, potential bias introduced when converting age-structured vital rates estimated from a life table into parameters for a stage-structured population model has not been assessed comprehensively. The objective of this study was to investigate the performance of methods for such conversions using simulated life histories of organisms. The underlying models incorporate various types of life history and true population growth rates of varying levels. The performance was measured by comparing differences in λ and the generation time calculated using the Euler-Lotka equation, age-structured population matrices, and several stage-structured population matrices that were obtained by applying different conversion methods. The results show that the discretization of age introduces only small bias in λ or generation time. Similarly, assuming a fixed age of maturation at the mean age of maturation does not introduce much bias. However, aggregating age-specific survival rates into a stage-specific survival rate and estimating a stage-transition rate can introduce substantial bias depending on the organism’s life history type and the true values of λ. In order to aggregate survival rates, the use of the weighted arithmetic mean was the most robust method for estimating λ. Here, the weights are given by survivorship curve after discounting with λ. 
To estimate a stage-transition rate, matching the proportion of individuals transitioning, with λ used for discounting the rate, was the best approach. However, stage-structured models performed poorly in estimating generation time, regardless of the methods used for constructing the models. Based on the results, we recommend using an age-structured matrix population model or the Euler-Lotka equation for calculating λ and generation time when life table data are available. Then, these age-structured vital rates can be converted into a stage-structured model for further analyses. PMID:29085763
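The age-structured side of this comparison can be sketched numerically: build a Leslie matrix from life-table values, take λ as its dominant eigenvalue, and check it against the Euler-Lotka equation. The vital rates below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical age-structured life table.
p = np.array([0.5, 0.8, 0.8, 0.0])   # survival from age x to x+1
f = np.array([0.0, 1.0, 2.0, 2.5])   # offspring per individual of age x

# Leslie matrix: fecundities on the first row, survivals on the subdiagonal.
A = np.zeros((4, 4))
A[0, :] = f
A[1, 0], A[2, 1], A[3, 2] = p[0], p[1], p[2]

# Asymptotic growth rate: the dominant (Perron) eigenvalue.
lam = max(np.linalg.eigvals(A).real)

# Consistency check with the Euler-Lotka equation,
#   sum_x l_x f_x lam^-(x+1) = 1,  l_0 = 1,  l_x = p_0 ... p_{x-1}.
l = np.concatenate(([1.0], np.cumprod(p[:-1])))
ages = np.arange(4) + 1.0
euler_lotka = np.sum(l * f * lam ** -ages)
print(round(euler_lotka, 6))   # 1.0: lambda solves Euler-Lotka

# Generation time as the mean age of mothers of newborns at stability.
T = np.sum(ages * l * f * lam ** -ages)
```

Collapsing the four age classes into fewer stages is where the biases discussed above enter; the age-structured λ computed here is the benchmark the stage-structured conversions are judged against.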
Constructing stage-structured matrix population models from life tables: comparison of methods.
Fujiwara, Masami; Diaz-Lopez, Jasmin
2017-01-01
A matrix population model is a convenient tool for summarizing per capita survival and reproduction rates (collectively vital rates) of a population and can be used for calculating an asymptotic finite population growth rate ( λ ) and generation time. These two pieces of information can be used for determining the status of a threatened species. The use of stage-structured population models has increased in recent years, and the vital rates in such models are often estimated using a life table analysis. However, potential bias introduced when converting age-structured vital rates estimated from a life table into parameters for a stage-structured population model has not been assessed comprehensively. The objective of this study was to investigate the performance of methods for such conversions using simulated life histories of organisms. The underlying models incorporate various types of life history and true population growth rates of varying levels. The performance was measured by comparing differences in λ and the generation time calculated using the Euler-Lotka equation, age-structured population matrices, and several stage-structured population matrices that were obtained by applying different conversion methods. The results show that the discretization of age introduces only small bias in λ or generation time. Similarly, assuming a fixed age of maturation at the mean age of maturation does not introduce much bias. However, aggregating age-specific survival rates into a stage-specific survival rate and estimating a stage-transition rate can introduce substantial bias depending on the organism's life history type and the true values of λ . In order to aggregate survival rates, the use of the weighted arithmetic mean was the most robust method for estimating λ . Here, the weights are given by survivorship curve after discounting with λ . 
To estimate a stage-transition rate, matching the proportion of individuals transitioning, with λ used for discounting the rate, was the best approach. However, stage-structured models performed poorly in estimating generation time, regardless of the methods used for constructing the models. Based on the results, we recommend using an age-structured matrix population model or the Euler-Lotka equation for calculating λ and generation time when life table data are available. Then, these age-structured vital rates can be converted into a stage-structured model for further analyses.
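The age-structured route the authors recommend is compact enough to sketch. The life table values below are invented for illustration (they are not from the paper), and the cross-check simply restates the Euler-Lotka identity: the dominant eigenvalue of the Leslie matrix satisfies it.

```python
import numpy as np

# Hypothetical life table (illustrative values, not the paper's data):
# s[x] = survival rate from age class x to x+1, f[x] = fecundity of class x.
s = np.array([0.5, 0.7, 0.8])
f = np.array([0.0, 1.0, 1.5, 1.2])

# Age-structured (Leslie) matrix: fecundities on the first row,
# survival rates on the subdiagonal.
n = len(f)
A = np.zeros((n, n))
A[0, :] = f
A[np.arange(1, n), np.arange(n - 1)] = s

# Asymptotic growth rate lambda is the dominant eigenvalue of A.
lam = max(np.linalg.eigvals(A).real)

# Cross-check with the Euler-Lotka equation:
#   sum_x f[x] * l[x] * lam**-(x+1) = 1, with survivorship
#   l[0] = 1 and l[x] = s[0]*...*s[x-1].
l = np.concatenate(([1.0], np.cumprod(s)))
residual = np.sum(f * l * lam ** -(np.arange(n) + 1.0)) - 1.0
```

With these numbers the population grows slowly (λ a little above 1), and the Euler-Lotka residual is zero to machine precision, which is the consistency the abstract relies on when comparing the two calculations.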
Real-Time Assessment of Mechanical Tissue Trauma in Surgery.
Chandler, James H; Mushtaq, Faisal; Moxley-Wyles, Benjamin; West, Nicholas P; Taylor, Gregory W; Culmer, Peter R
2017-10-01
This work presents a method to assess and prevent tissue trauma in real-time during surgery. Tissue trauma occurs routinely during laparoscopic surgery with potentially severe consequences. As such, it is crucial that a surgeon is able to regulate the pressure exerted by surgical instruments. We propose a novel method to assess the onset of tissue trauma by considering the mechanical response of tissue as it is loaded in real-time. We conducted a parametric study using a lab-based grasping model and differing load conditions. Mechanical stress-time data were analyzed to characterize the tissue response to grasps. Qualitative and quantitative histological analyses were performed to inspect damage characteristics of the tissue under different load conditions. These were correlated against the mechanical measures to identify the nature of trauma onset with respect to our predictive metric. Results showed increasing tissue trauma with load and a strong correlation with the mechanical response of the tissue. Load rate and load history also showed a clear effect on tissue response. The proposed method for trauma assessment was effective in identifying damage. The metric can be normalized with respect to loading rate and history, making it feasible in the unconstrained environment of intraoperative surgery. This work demonstrates that tissue trauma can be predicted using mechanical measures in real-time. Applying this technique to laparoscopic tools has the potential to reduce unnecessary tissue trauma and its associated complications by indicating through user feedback or actively regulating the mechanical impact of surgical instruments.
Chau, Destiny F; Reddy, Arundathi; Breheny, Patrick; Young, Anna Rebecca; Ashford, Eric; Song, Megan; Zhang, Christina; Taylor, Tammy; Younes, Abbas; Vazifedan, Turaj
2017-01-01
Background and Aims: Post-operative vomiting (POV) in children remains a significant clinical problem. This prospective study aims to investigate the applicability of well-established adult early post-operative nausea and vomiting (PONV) risk factors to paediatric POV after adenotonsillectomies under regulated anaesthetic conditions. Methods: After Institutional Review Board approval, 213 children aged 3–10 years were enrolled. The participants had pre-operative questionnaires completed, followed protocolised anaesthetic plans and had saliva analysed for cotinine. The primary outcomes were POV as correlated with age, gender, family or personal history of PONV, motion sickness history, opioid use, surgical time, anaesthetic time and environmental tobacco smoke (ETS) exposure, as assessed by cotinine levels and questionnaire reports. Data on analgesics, antiemetics and POV incidence before post-anaesthesia care unit discharge were collected. Statistical analysis was done through multiple logistic regression. Results: A total of 200 patients finalised the study. Early POV occurred in 32%. Family history of PONV (odds ratio [OR] = 5.3, P < 0.01) and motion sickness history (OR = 4.4, P = 0.02) were highly significant risk factors. Age reached borderline statistical significance (OR = 1.4, P = 0.05). None of the other factors reached statistical significance. Conclusion: Early POV occurs frequently in paediatric patients undergoing adenotonsillectomies. In this paediatric age group, the incidence of POV was affected by family history of PONV and history of motion sickness. Age, female gender, opioid use, surgical and anaesthetic times did not affect the incidence of POV. ETS exposure, as assessed by cotinine levels and questionnaire reports, had no protective effect on early paediatric POV. PMID:29307901
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smallwood, D.O.
In a previous paper, Smallwood and Paez (1991) showed how to generate realizations of partially coherent stationary normal time histories with a specified cross-spectral density matrix. This procedure is generalized for the case of multiple inputs with a specified cross-spectral density function and a specified marginal probability density function (pdf) for each of the inputs. The specified pdfs are not required to be Gaussian. A zero memory nonlinear (ZMNL) function is developed for each input to transform a Gaussian or normal time history into a time history with a specified non-Gaussian distribution. The transformation functions have the property that a transformed time history will have nearly the same auto spectral density as the original time history. A vector of Gaussian time histories is then generated with the specified cross-spectral density matrix. These waveforms are then transformed into the required time history realizations using the ZMNL function.
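The ZMNL idea can be illustrated with a deliberately simple special case. The sketch below is not the paper's algorithm: it uses a single input, an AR(1) Gaussian series, and a lognormal target marginal, for which the transform g(x) = F_target⁻¹(Φ(x)) reduces to the closed form g(x) = exp(x).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch (not the paper's general multi-input algorithm):
# build a correlated Gaussian time history, then apply a zero memory
# nonlinear (ZMNL) transform to impose a non-Gaussian marginal.
n = 20000
phi = 0.9                        # AR(1) coefficient sets the correlation
white = rng.standard_normal(n)
x = np.empty(n)
x[0] = white[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + white[t]
x = (x - x.mean()) / x.std()     # approximately standard normal marginal

# ZMNL transform: for a lognormal target, g(x) = exp(x).
y = np.exp(x)

def lag1(z):
    """Lag-1 serial correlation of a series."""
    z = z - z.mean()
    return float(z[:-1] @ z[1:] / (z @ z))
```

Because the transform is memoryless and monotone, the serial correlation of the Gaussian series carries over to y only mildly distorted, which is the property the abstract describes for the auto spectral density.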
Investigating Supermassive Black Hole Spin at Different Redshift
NASA Astrophysics Data System (ADS)
Sinanan-Singh, Jasmine
2018-01-01
Supermassive black hole (SMBH) spin encodes vital information about the history of SMBH growth. High spins indicate a history of growth through large mass accretion events, which spin up the black hole; intermediate spins indicate a history of galactic mergers, which do not tend to systematically spin up or spin down black holes; low spins are attributed to successive, small accretion events with random orientations. Examining spin over different redshifts will help us understand the relative growth of SMBHs by mergers or accretion over cosmic time, an important part of understanding how SMBHs and their host galaxies co-evolved. To study spin, we compute the Fe K alpha emission line from the X-ray spectra of AGN sources in the Chandra-COSMOS Legacy Survey. We stack rest-frame AGN spectra to improve the signal-to-noise ratio, since the photon counts are low for individual spectra, and then average the spectra using an unweighted mean. Our method is derived from Corral et al. (2008). We test our method on the two brightest sources in the COSMOS Survey and compute the rest-frame average Fe K alpha emission line for different redshift bins. The SAO REU program is funded by the National Science Foundation REU and Department of Defense ASSURE programs under NSF Grant AST-1659473, and by the Smithsonian Institution.
Vig, Hetal S.; McCarthy, Anne Marie; Liao, Kaijun; Demeter, Mirar Bristol; Fredericks, Tracey; Armstrong, Katrina
2013-01-01
Background Standard BRCA genetic testing criteria include young age of diagnosis, family history, and Jewish ancestry. The purpose of this study was to assess the effect of these criteria on BRCA test utilization in breast cancer patients. Methods Breast cancer patients aged 18-64 yrs living in Pennsylvania in 2007 completed a survey on family history of breast and ovarian cancer and BRCA testing (N=2213). Multivariate logistic regression was used to estimate odds of BRCA testing by patient characteristics, and predicted probabilities of testing were calculated for several clinical scenarios. Results Young age at diagnosis (<50 yrs.) was strongly associated with BRCA testing, with women diagnosed before age 50 yrs. having nearly five times the odds of receiving BRCA testing compared to women diagnosed at age 50 or older (OR=4.81, 95% CI: 3.85-6.00, p<0.001). Despite a similar BRCA mutation prevalence estimate (8-10%), a young Jewish patient <50 yrs. with no family history had a markedly higher predicted probability of testing (63%) compared with an older, non-Jewish breast cancer patient with more than 1 first-degree relative (FDR) (43%). Conclusion Age at diagnosis, Jewish ancestry, and both maternal and paternal family history are strongly predictive of BRCA testing. However, among women diagnosed at age 50 or older, family history may be an underutilized criterion that may benefit from targeted intervention. Impact Robust methods specific to ascertaining detailed family history, such as through electronic medical records (EMR), are needed to accurately identify patients for BRCA testing. PMID:23917453
Constitutive Modelling of Resins in the Stiffness Domain
NASA Astrophysics Data System (ADS)
Klasztorny, M.
2004-09-01
An analytic method for inverting the constitutive compliance equations of viscoelasticity for resins is developed. These equations describe the HWKK/H rheological model, which makes it possible to simulate, with good accuracy, short-, medium- and long-term viscoelastic processes in epoxy and polyester resins. These processes are of first-rank reversible isothermal type. The time histories of deviatoric stresses are simulated with three independent strain history functions of fractional and normal exponential types. The stiffness equations are described by two elastic and six viscoelastic constants having a clear physical meaning (three long-term relaxation coefficients and three relaxation times). The time histories of axiatoric stresses are simulated as perfectly elastic. The inversion method utilizes approximate constitutive stiffness equations of viscoelasticity for the HWKK/H model. The constitutive compliance equations for the model are a basis for determining the exact complex shear stiffness, whereas the approximate constitutive stiffness equations are used for determining the approximate complex shear stiffness. The viscoelastic constants in the stiffness domain are derived by equating the exact and approximate complex shear stiffnesses. The viscoelastic constants are obtained for Epidian 53 epoxy and Polimal 109 polyester resins. The accuracy of the approximate constitutive stiffness equations is assessed by comparing the approximate and exact complex shear stiffnesses. The constitutive stiffness equations for the HWKK/H model are presented in uncoupled (shear/bulk) and coupled forms. Formulae for converting the constants of shear viscoelasticity into the constants of coupled viscoelasticity are given as well.
Analysis of general-aviation accidents using ATC radar records
NASA Technical Reports Server (NTRS)
Wingrove, R. C.; Bach, R. E., Jr.
1982-01-01
It is pointed out that general aviation aircraft usually do not carry flight recorders, and in accident investigations the only available data may come from the Air Traffic Control (ATC) records. A description is presented of a technique for deriving time-histories of aircraft motions from ATC radar records. The employed procedure involves a smoothing of the raw radar data. The smoothed results, in combination with other available information (meteorological data and aircraft aerodynamic data), are used to derive the expanded set of motion time-histories. Applications of the considered analytical methods are related to different types of aircraft, such as light piston-props, executive jets, and commuter turboprops, as well as different accident situations, such as takeoff, climb-out, icing, and deep stall.
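The smooth-then-differentiate step can be caricatured in a few lines. Everything below is an assumption for illustration: a plain moving average stands in for whatever smoother the investigators actually used, and the sweep interval and noise level are invented.

```python
import numpy as np

# Illustrative reconstruction step: smooth noisy radar position fixes,
# then difference the result to obtain a velocity time-history.
def smooth(x, window=5):
    """Moving-average smoother (illustrative stand-in)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

dt = 4.0                           # assumed radar sweep interval, seconds
t = np.arange(0.0, 200.0, dt)
true_pos = 80.0 * t                # constant 80 m/s ground speed
rng = np.random.default_rng(2)
noisy = true_pos + rng.normal(0.0, 50.0, t.size)  # assumed fix noise, m

pos = smooth(noisy)
vel = np.diff(pos) / dt            # derived velocity time-history, m/s
```

Differencing the raw fixes directly would amplify the position noise; smoothing first is what makes the derived velocity (and, by extension, further derived motion quantities) usable.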
Tree age, disturbance history, and carbon stocks and fluxes in subalpine Rocky Mountain forests
J.B. Bradford; R.A. Birdsey; L.A. Joyce; M.G. Ryan
2008-01-01
Forest carbon stocks and fluxes vary with forest age, and relationships with forest age are often used to estimate fluxes for regional or national carbon inventories. Two methods are commonly used to estimate forest age: observed tree age or time since a known disturbance. To clarify the relationships between tree age, time since disturbance and forest carbon storage...
NASA Astrophysics Data System (ADS)
Johnson, Kyle L.; Rodgers, Theron M.; Underwood, Olivia D.; Madison, Jonathan D.; Ford, Kurtis R.; Whetten, Shaun R.; Dagel, Daryl J.; Bishop, Joseph E.
2018-05-01
Additive manufacturing enables the production of previously unachievable designs in conjunction with time and cost savings. However, spatially and temporally fluctuating thermal histories can lead to residual stress states and microstructural variations that challenge conventional assumptions used to predict part performance. Numerical simulations offer a viable way to explore the root causes of these characteristics, and can provide insight into methods of controlling them. Here, the thermal history of a 304L stainless steel cylinder produced using the Laser Engineered Net Shape process is simulated using finite element analysis (FEA). The resultant thermal history is coupled to both a solid mechanics FEA simulation to predict residual stress and a kinetic Monte Carlo model to predict the three-dimensional grain structure evolution. Experimental EBSD measurements of grain structure and in-process infrared thermal data are compared to the predictions.
The history and evolution of sutures in pelvic surgery
Muffly, Tyler M; Tizzano, Anthony P; Walters, Mark D
2011-01-01
The purpose of the study is to review the history and innovations of sutures used in pelvic surgery. Based on a review of the literature using electronic- and hand-searched databases we identified appropriate articles and gynaecology surgical textbooks regarding suture for wound closure. The first documented uses of suture are explored and then the article focuses on the use of knotted materials in pelvic surgery. The development of suture of natural materials is followed chronologically until the present time where synthetic suture is implanted during countless surgeries every day. This millennial history of suture contains an appreciation of the early work of Susruta, Celsus, Paré and Lister, including a survey of some significant developments of suture methods over the last 100 years. Most surgeons know little about the history and science of sutures. A retrospective view of suture is critical to the appreciation of the current work and development of this common tool. PMID:21357979
History by history statistical estimators in the BEAM code system.
Walters, B R B; Kawrakow, I; Rogers, D W O
2002-12-01
A history by history method for estimating uncertainties has been implemented in the BEAMnrc and DOSXYZnrc codes, replacing the method of statistical batches. This method groups scored quantities (e.g., dose) by primary history. When phase-space sources are used, this method groups incident particles according to the primary histories that generated them. This necessitated adding markers (negative energy) to phase-space files to indicate the first particle generated by a new primary history. The new method greatly reduces the uncertainty in the uncertainty estimate. The new method eliminates one dimension (which kept the results for each batch) from all scoring arrays, resulting in the memory requirement being decreased by a factor of 2. Correlations between particles in phase-space sources are taken into account. The only correlations with any significant impact on uncertainty are those introduced by particle recycling. Failure to account for these correlations can result in a significant underestimate of the uncertainty. The previous method of accounting for correlations due to recycling, by placing all recycled particles in the same batch, did work. Neither the new method nor the batch method takes into account correlations between incident particles when a phase-space source is restarted, so one must avoid restarts.
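The estimator itself is simple to sketch. The scoring model below is invented (the names are illustrative, not BEAMnrc's variables), but the grouping by primary history and the variance-of-the-mean formula follow the description above: all depositions belonging to one primary history are summed first, and only the per-history total enters the running sums.

```python
import math
import random

def scored_mean_and_uncertainty(n_histories, seed=1):
    """History-by-history scoring sketch (toy model, not BEAMnrc)."""
    rng = random.Random(seed)
    sum_x = 0.0    # sum over primary histories of the scored quantity
    sum_x2 = 0.0   # sum over primary histories of its square
    for _ in range(n_histories):
        # One primary history deposits in several correlated steps;
        # grouping them first keeps those correlations out of the estimate.
        per_history = sum(rng.expovariate(1.0)
                          for _ in range(rng.randint(1, 5)))
        sum_x += per_history
        sum_x2 += per_history * per_history
    mean = sum_x / n_histories
    # Variance of the mean from the two running sums; no batch arrays needed.
    var_of_mean = (sum_x2 / n_histories - mean * mean) / (n_histories - 1)
    return mean, math.sqrt(var_of_mean)
```

Note that only two scalars per scored quantity are accumulated, which is exactly why the batch dimension of the scoring arrays can be dropped.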
Bai, Shirong; Skodje, Rex T
2017-08-17
A new approach is presented for simulating the time-evolution of chemically reactive systems. This method provides an alternative to conventional modeling of mass-action kinetics that involves solving differential equations for the species concentrations. The method presented here avoids the need to solve the rate equations by switching to a representation based on chemical pathways. In the Sum Over Histories Representation (or SOHR) method, any time-dependent kinetic observable, such as concentration, is written as a linear combination of probabilities for chemical pathways leading to a desired outcome. In this work, an iterative method is introduced that allows the time-dependent pathway probabilities to be generated from a knowledge of the elementary rate coefficients, thus avoiding the pitfalls involved in solving the differential equations of kinetics. The method is successfully applied to the model Lotka-Volterra system and to a realistic H2 combustion model.
Factors associated with sterilization use among women leaving a U.S. jail: a mixed methods study.
Ramaswamy, Megha; Kelly, Patricia J
2014-07-31
Despite the high rates of reported sterilization use among women who have spent time in correctional facilities, little is known about the context in which women in this population choose this option. The objective of our study was to use both quantitative and qualitative methods to understand factors associated with sterilization use among women leaving a U.S. jail. We administered a cross-sectional survey with 102 jailed women who were participating in a study about contraceptive use after release from jail, and then conducted semi-structured interviews with 29 of those women after their release from jail. We used logistic regression and analytic induction to assess factors associated with self-reported sterilization use. In our cross-sectional survey, one-third of our sample reported a history of sterilization use. Controlling for age and past pregnancies, the only factor associated with sterilization use was physical abuse history before age 16. In semi-structured interviews, we found that women's primary motivation for sterilization was the desire to limit childbearing permanently, in some cases where other contraceptive methods had failed them. The decision for sterilization was generally supported by family, partners, and providers. Many women who opted for sterilization expressed financial concern about supporting children and/or reported family histories of sterilization. The decision to use the permanent method of sterilization as a contraceptive method is a complex one. Results from this study suggest that while explicit coercion may not be a factor in women's choice for sterilization, interpersonal relationship histories, negative experiences with contraceptives, and structural constraints, such as financial concerns and ongoing criminal justice involvement, seem to influence sterilization use among the vulnerable group of women with criminal justice histories. 
Public health programs that connect women to reproductive health services should acknowledge constraints on contraceptive decision-making in vulnerable populations.
[The practice of research in history].
Poisson, Michel
2013-11-01
History is a discipline whose research methods present similarities with those of other human and social sciences, with certain specificities. The nursing profession can use these methods to showcase its history and the history of nursing care.
Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam
2015-02-01
Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging victims upside down and compressing the chest in order to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper regarding heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time, the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years; the 2010 guidelines were current at the time of writing. In this paper, the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of earlier lifesaving methods on today's technologies, equipment and guidelines, which help save those women and men whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.
Hooked on Inquiry: History Labs in the Methods Course
ERIC Educational Resources Information Center
Wood, Linda Sargent
2012-01-01
Methods courses provide a rich opportunity to unpack what it means to "learn history by doing history." To help explain what "doing history" means, the author has created history labs to walk teacher candidates through the historical process. Each lab poses a historical problem, requires analysis of primary and secondary…
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
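The "reduced representation" idea admits a minimal sketch: summarize each series by a short feature vector and do retrieval in feature space. The three features below are ad hoc stand-ins for the thousands of analysis methods catalogued in the paper.

```python
import numpy as np

def features(ts):
    """Toy reduced representation of a time series (illustrative features)."""
    ts = np.asarray(ts, dtype=float)
    z = (ts - ts.mean()) / ts.std()
    lag1 = float(z[:-1] @ z[1:] / (z @ z))       # serial correlation
    return np.array([ts.mean(), ts.std(), lag1])

def most_similar(query, library):
    """Index of the library series closest to the query in feature space."""
    fq = features(query)
    return int(np.argmin([np.linalg.norm(features(s) - fq)
                          for s in library]))
```

Even this crude vector separates smooth oscillations from white noise (via the lag-1 correlation), which is the kind of automatic organization-by-properties the abstract describes, scaled down to three dimensions.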
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nitao, J J
The goal of the Event Reconstruction Project is to find the location and strength of atmospheric release points, both stationary and moving. Source inversion relies on observational data as input. The methodology is sufficiently general to allow various forms of data. In this report, the authors focus primarily on concentration measurements obtained at point monitoring locations at various times. The algorithms being investigated in the Project are MCMC (Markov Chain Monte Carlo) and SMC (Sequential Monte Carlo) methods, classical inversion methods, and hybrids of these. They refer the reader to the report by Johannesson et al. (2004) for explanations of these methods. These methods require computing the concentrations at all monitoring locations for a given ''proposed'' source characteristic (locations and strength history). It is anticipated that the largest portion of the CPU time will be spent performing this computation. MCMC and SMC will require this computation to be done at least tens of thousands of times. Therefore, an efficient means of computing forward model predictions is important to making the inversion practical. In this report they show how Green's functions and reciprocal Green's functions can significantly accelerate forward model computations. First, instead of computing a plume for each possible source strength history, they can compute plumes from unit impulse sources only. By using linear superposition, they can obtain the response for any strength history; this response is given by the forward Green's function. Second, they may use the law of reciprocity. Suppose that they require the concentration at a single monitoring point x_m due to a potential (unit impulse) source located at x_s. Instead of computing a plume with source location x_s, they compute a ''reciprocal plume'' whose (unit impulse) source is at the monitoring location x_m. The reciprocal plume is computed using a reversed-direction wind field. The wind field and transport coefficients must also be appropriately time-reversed. Reciprocity says that the concentration of the reciprocal plume at x_s is related to the desired concentration at x_m. Since there are many fewer monitoring points than potential source locations, the number of forward model computations is drastically reduced.
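The superposition step lends itself to a tiny numerical sketch. The Green's function values and the strength history below are made up for illustration; the point is only that, once the unit-impulse response is tabulated, scoring a new proposed source costs one convolution instead of one plume simulation.

```python
import numpy as np

# g[k] = concentration at the monitor k time steps after a unit-impulse
# release at the candidate source location (illustrative values).
g = np.array([0.0, 0.4, 0.25, 0.1, 0.05])

# q[t] = one "proposed" source strength history to be scored.
q = np.array([1.0, 2.0, 0.0, 0.5])

# Linearity: the monitor's concentration time series is the discrete
# convolution of the strength history with the Green's function, so the
# plume solver never has to be rerun for a new q.
c = np.convolve(q, g)
```

In an MCMC or SMC loop, only `q` changes between the tens of thousands of proposals, so the expensive transport calculation that produced `g` is done once per source/monitor pair rather than once per proposal.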
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esmaeili, Shahrzad; Lloyd, David J.
2005-11-15
Differential scanning calorimetry is used to quantify the evolution of the volume fraction of precipitates during age hardening in AlMgSiCu alloys. The calorimetry tests are run on alloy samples after aging for various times at 180 °C, and the change in the collective heat effects from the major precipitation and dissolution processes in each run is used to determine the precipitation state of the samples. The method is implemented on alloys with various thermal histories prior to artificial aging, including commercial pre-aging histories. The estimated values for the relative volume fraction of precipitates are compared with the results from a newly developed analytical method using isothermal calorimetry and a related quantitative transmission electron microscopy work. Excellent agreement is obtained between the results from the various methods.
FLEXAN (version 2.0) user's guide
NASA Technical Reports Server (NTRS)
Stallcup, Scott S.
1989-01-01
The FLEXAN (Flexible Animation) computer program, Version 2.0, is described. FLEXAN animates 3-D wireframe structural dynamics on the Evans and Sutherland PS300 graphics workstation with a VAX/VMS host computer. Animation options include unconstrained vibrational modes, mode time histories (multiple modes), delta time histories (modal and/or nonmodal deformations), color time histories (elements of the structure change colors through time), and rotational time histories (parts of the structure rotate through time). Concurrent color, mode, delta, and rotation time history animations are supported. FLEXAN does not model structures or calculate the dynamics of structures; it only animates data from other computer programs. FLEXAN was developed to aid in the study of the structural dynamics of spacecraft.
Enlightening the life sciences: the history of halobacterial and microbial rhodopsin research.
Grote, Mathias; O'Malley, Maureen A
2011-11-01
The history of research on microbial rhodopsins offers a novel perspective on the history of the molecular life sciences. Events in this history play important roles in the development of fields such as general microbiology, membrane research, bioenergetics, metagenomics and, very recently, neurobiology. New concepts, techniques, methods and fields have arisen as a result of microbial rhodopsin investigations. In addition, the history of microbial rhodopsins sheds light on the dynamic connections between basic and applied science, and hypothesis-driven and data-driven approaches. The story begins with the late nineteenth century discovery of microorganisms on salted fish and leads into ecological and taxonomical studies of halobacteria in hypersaline environments. These programmes were built on by the discovery of bacteriorhodopsin in organisms that are part of what is now known as the archaeal genus Halobacterium. The transfer of techniques from bacteriorhodopsin studies to the metagenomic discovery of proteorhodopsin in 2000 further extended the field. Microbial rhodopsins have also been used as model systems to understand membrane protein structure and function, and they have become the target of technological applications such as optogenetics and nanotechnology. Analysing the connections between these historical episodes provides a rich example of how science works over longer time periods, especially with regard to the transfer of materials, methods and concepts between different research fields. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barker, C.E.; Crysdale, B.L.
1990-05-01
The burial history of this fractured Niobrara Limestone reservoir and source rock offers a setting for studying the stabilization of thermal maturity because soon after peak temperature of approximately 100°C was reached, exhumation lowered temperature to about 60-70°C. Vitrinite reflectance (Rm = 0.6-0.7%) and published clay mineralogy data from the Niobrara Limestone indicate that peak paleotemperature was approximately 100°C. Fluid inclusion data also indicate oil migration occurred at 100°C. Burial history reconstruction indicates 100°C was reached in the Niobrara Limestone only during maximum burial, which occurred at 70 Ma and 8000 ft depth. However, erosion beginning at 70 Ma and continuing until 50 Ma removed over 3,000 ft of rock. This depth of erosion agrees with an Rm of 0.4% measured in surface samples of the Pierre Shale. The exhumation of the reservoir decreased temperature by about 30°C to near the corrected bottom-hole temperature of 50-70°C. Lopatin time-temperature index (TTI) analysis suggests the Niobrara Limestone as a source rock matured to the oil generation stage (TTI = 10) about 25 Ma, significantly later than maximum burial, and after exhumation caused cooling. The Lopatin TTI method in this case seems to overestimate the influence of heating time. If time is an important factor, thermal maturity should continue to increase after peak burial and temperature, so that vitrinite reflectance will not be comparable to peak paleotemperatures estimated from geothermometers set at near-peak temperature and those estimated from burial history reconstruction. The agreement between geothermometry and the burial history reconstruction in Berthoud State 4 suggests that the influence of heating time must be small. The elapsed time available at near-peak temperatures was sufficient to allow stabilization of thermal maturation in this case.
Review and Future Research Directions about Major Monitoring Method of Soil Erosion
NASA Astrophysics Data System (ADS)
LI, Yue; Bai, Xiaoyong; Tian, Yichao; Luo, Guangjie
2017-05-01
Soil erosion is a highly serious ecological problem that occurs worldwide. Hence, scientific methods for accurate monitoring are needed to obtain soil erosion data. At present, numerous methods of soil erosion monitoring are being used internationally. In this paper, we present a systematic classification of these methods based on the date of establishment and type of approach. This classification comprises five categories: runoff plot method, erosion pin method, radionuclide tracer method, model estimation, and 3S technology combined method. The backgrounds of their establishment are briefly introduced, the history of their development is reviewed, and the conditions for their application are enumerated. Their respective advantages and disadvantages are compared and analysed, and future prospects regarding their development are discussed. We conclude that the methods of soil erosion monitoring in the past 100 years of their development constantly considered the needs of the time. According to the progress of soil erosion monitoring technology throughout its history, we predict that the future trend in this field would move toward the development of quantitative, precise, and composite methods. This report serves as a valuable reference for scientific and technological workers globally, especially those engaged in soil erosion research.
Adaptive form-finding method for form-fixed spatial network structures
NASA Astrophysics Data System (ADS)
Lan, Cheng; Tu, Xi; Xue, Junqing; Briseghella, Bruno; Zordan, Tobia
2018-02-01
An effective form-finding method for form-fixed spatial network structures is presented in this paper. The adaptive form-finding method is introduced along with the example of designing an ellipsoidal network dome with bar length variations kept as small as possible. A typical spherical geodesic network, whose bar lengths fall into a limited number of groups, is selected as the initial state. Next, this network is transformed into the desired ellipsoidal shape by applying compressions on bars according to the bar length variations caused by the transformation. Afterwards, the dynamic relaxation method is employed to explicitly integrate the node positions by applying residual forces. During the form-finding process, the boundary condition constraining nodes to the ellipsoid surface is innovatively treated as reactions along the normal direction of the surface at the node positions, which balance the components of the nodal forces induced in the reverse direction by the compressions on bars. The node positions are also corrected according to the fixed-form condition in each explicit iteration step. The optimal solution is then selected from the time history of states by properly choosing convergence criteria, and the presented form-finding procedure is shown to be applicable to form-fixed problems.
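The dynamic relaxation step described above can be sketched in miniature. The following Python example is an illustration, not the paper's implementation: it relaxes a toy two-bar network with lumped unit nodal masses, linear-elastic bars of an assumed common stiffness and rest length, and simple velocity scaling in place of kinetic damping.

```python
import numpy as np

def dynamic_relaxation(nodes, bars, fixed, loads, stiffness=1.0,
                       rest_length=1.0, dt=0.01, damping=0.98, steps=10000):
    """Explicit dynamic-relaxation sketch: integrate nodal residual forces
    with viscous damping until the bar network settles into equilibrium.

    Assumptions (not from the paper): linear-elastic bars sharing one
    stiffness and rest length, unit lumped masses, velocity-scaling damping.
    """
    x = np.array(nodes, float)
    v = np.zeros_like(x)
    for _ in range(steps):
        f = np.array(loads, float)           # external nodal loads
        for i, j in bars:                    # internal bar forces
            d = x[j] - x[i]
            L = np.linalg.norm(d)
            t = stiffness * (L - rest_length) * d / L
            f[i] += t
            f[j] -= t
        f[fixed] = 0.0                       # supports absorb residual forces
        v = damping * v + dt * f             # unit masses: a = f
        x += dt * v
    return x
```

With two fixed end nodes and one free node started off-axis, the free node relaxes onto the line of symmetry, which is the equilibrium position of this toy network.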
Clerici, Carlo Alfredo; Veneroni, Laura; Patriarca, Carlo
2014-11-01
Andrea Pasta was an eclectic visionary light years ahead of his time. He made numerous contributions to the field of medicine, some recognized by his contemporaries and others so visionary that they are being applied only in modern times. His contributions spanned the disciplines of psychology, gynaecology, haematology, infectious diseases and the doctor-patient relationship. Well known among his contemporaries, he combined a passion for clinical medicine and a keen interest in history and art with a strict research methodology and an approach to caring for patients as human beings. By studying his life and works, we can better understand the magnitude and significance of his innovative method and its applicability in modern times and also the significance of his many contributions. © The Author(s) 2013 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Multiaxis Rainflow Fatigue Methods for Nonstationary Vibration
NASA Technical Reports Server (NTRS)
Irvine, T.
2016-01-01
Mechanical structures and components may be subjected to cyclical loading conditions, including sine and random vibration. Such systems must be designed and tested accordingly. Rainflow cycle counting is the standard method for reducing a stress time history to a table of amplitude-cycle pairings prior to the Palmgren-Miner cumulative damage calculation. The damage calculation is straightforward for sinusoidal stress but very complicated for random stress, particularly for nonstationary vibration. This paper evaluates candidate methods and makes a recommendation for further study of a hybrid technique.
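The rainflow reduction step named above can be sketched as follows. This is a minimal implementation of the standard three-point counting rule (as in ASTM E1049), not the paper's hybrid technique; the resulting (range, mean, count) pairings are what a Palmgren-Miner damage sum would consume.

```python
def rainflow_cycles(series):
    """Reduce a stress time history to rainflow cycles.

    Returns a list of (range, mean, count) tuples, where count is
    1.0 for a full cycle and 0.5 for a half cycle.
    """
    # Reduce the signal to its turning points (peaks and valleys).
    pts = []
    for x in series:
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) >= 0:
            pts[-1] = x                      # still moving the same way: extend
        else:
            pts.append(x)
    cycles = []
    stack = []
    for p in pts:
        stack.append(p)
        while len(stack) >= 3:
            X = abs(stack[-1] - stack[-2])   # most recent range
            Y = abs(stack[-2] - stack[-3])   # previous range
            if X < Y:
                break
            if len(stack) == 3:              # range involves the first point: half cycle
                cycles.append((Y, (stack[0] + stack[1]) / 2, 0.5))
                stack.pop(0)
            else:                            # interior range: full cycle
                cycles.append((Y, (stack[-2] + stack[-3]) / 2, 1.0))
                del stack[-3:-1]
    for a, b in zip(stack, stack[1:]):       # leftover residue: half cycles
        cycles.append((abs(a - b), (a + b) / 2, 0.5))
    return cycles
```

On the classic nine-point example history [-2, 1, -3, 5, -1, 3, -4, 4, -2] this reproduces the textbook result: one full cycle of range 4 and six half cycles.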
Al-Saran, Yazeed; Al-Moawi, Ahlam; Bin Dous, Abdullah; Al-Ahaideb, Abdulaziz
2017-01-01
Aim The study aim was to determine the prevalence of neck, shoulder, and low-back pains and to explore the factors associated with musculoskeletal pain (MSP) among medical students at university hospitals in central Saudi Arabia. Method This cross-sectional study was conducted at a government institution using an online self-administered, modified version of the Standardised Nordic Questionnaire in the English language. Results A total of 469 students responded to our survey. The prevalence of MSP in at least one body site at any time, in the past week, and in the past year was 85.3%, 54.4%, and 81.9%, respectively. Factors significantly associated with MSP in at least one body site at any time were being in the clinical year (P = 0.032), history of trauma (P = 0.036), history of depressive symptoms (P < 0.001), and history of psychosomatic symptoms (P < 0.001). On multivariable regression analysis, factors associated with MSP were history of trauma (P = 0.016) and depressive (P = 0.002) or psychosomatic symptoms (P = 0.004). Conclusion MSP among Saudi medical students is high, particularly among those in the clinical years and those with history of trauma and with depressive or psychosomatic symptoms. Medical institutions should be aware of this serious health issue and preventive measures are warranted. PMID:29238618
NASA Astrophysics Data System (ADS)
Henry, Mary Catherine
The use of active and passive remote sensing systems for relating forest spatial patterns to fire history was tested over one of the Arizona Sky Islands. Using Landsat Thematic Mapper (TM), Shuttle Imaging Radar (SIR-C), and data fusion I examined the relationship between landscape metrics and a range of fire history characteristics. Each data type (TM, SIR-C, and fused) was processed in the following manner: each band, channel, or derived feature was simplified to a thematic layer and landscape statistics were calculated for plots with known fire history. These landscape metrics were then correlated with fire history characteristics, including number of fire-free years in a given time period, mean fire-free interval, and time since fire. Results from all three case studies showed significant relationships between fire history and forest spatial patterns. Data fusion performed as well or better than Landsat TM alone, and better than SIR-C alone. These comparisons were based on number and strength of significant correlations each method achieved. The landscape metric that was most consistent and obtained the greatest number of significant correlations was Shannon's Diversity Index. Results also agreed with field-based research that has linked higher fire frequency to increased landscape diversity and patchiness. An additional finding was that the fused data seem to detect fire-related spatial patterns over a range of scales.
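Shannon's Diversity Index, the landscape metric the study found most consistent, is straightforward to compute from a thematic layer. The sketch below assumes only a flat iterable of per-pixel class labels; plot extraction and the other landscape statistics are taken to be handled elsewhere.

```python
import math
from collections import Counter

def shannon_diversity(class_map):
    """Shannon's Diversity Index H = -sum(p_i * ln p_i) over the
    proportions p_i of each class in a thematic layer.

    `class_map` is any iterable of class labels (one per pixel).
    """
    counts = Counter(class_map)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

H is 0 for a single-class landscape and reaches ln(k) when k classes occur in equal proportion, so higher values indicate the greater diversity and patchiness that the study linked to higher fire frequency.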
Carroy, Jacqueline; Schmidgen, Henning
2004-01-01
This article discusses from a comparative perspective the complex history of the reaction experiment with the Hipp chronoscope, one of the central experiments of late 19th-century psychology. It focuses on Wilhelm Wundt's (1832-1920) Institute for Experimental Psychology in Leipzig and on the Paris Laboratory for Physiological Psychology at the Sorbonne, which was initially directed by Henry Beaunis (1830-1921), but soon came to be dominated by the research activities of Alfred Binet (1857-1911). When the Paris psychologists founded their Laboratory in 1889, they took the Leipzig Institute as their model. In the early 1890s they adopted the reaction time experiment that had been central to Wundt's psychology. Shortly after, they modified this experiment according to their own specific interests. For Binet, it no longer served as a method for identifying the elementary components of "general" consciousness (as in Wundt), but for classifying "individual" personalities. The methodological and technological changes that Binet introduced into the experimental practice of psychology had no immediate impact on the research work in Leipzig. However, they influenced the "Würzburg School" of psychology under Wundt's former assistant, Oswald Külpe (1862-1915). This illustrates that the comparative history of transfers of "experimental systems" (Rheinberger) across national borders is not simply a history of mere transports. Rather, it is a history of transferences that sometimes includes surprising "re-transferences".
Provenance information as a tool for addressing engineered nanoparticle reproducibility challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.; Munusamy, Prabhakaran; Thrall, Brian D.
Nanoparticles of various types are of increasing research and technological importance in biological and other applications. Difficulties in the production and delivery of nanoparticles with consistent and well defined properties appear in many forms and have a variety of causes. Among several issues are those associated with incomplete information about the history of particles involved in research studies, including the synthesis method, the sample history after synthesis (the time and nature of storage), and the detailed nature of any sample processing or modification. In addition, the tendency of particles to change with time or environmental condition suggests that the time between analysis and application is important and that some type of consistency or verification process is needed. The essential history of a set of particles can be captured as provenance information, which tells the origin or source of a batch of nano-objects along with information related to handling and any changes that may have taken place since it was originated. A record of sample provenance information for a set of particles can play a useful role in identifying some of the sources of particle variability and in decreasing both that variability and the lack of reproducibility observed by many researchers.
Relationship between Bullying and Suicidal Behaviour in Youth presenting to the Emergency Department
Alavi, Nazanin; Reshetukha, Taras; Prost, Eric; Antoniak, Kristen; Patel, Charmy; Sajid, Saad; Groll, Dianne
2017-01-01
Objective Increasing numbers of adolescents are visiting emergency departments with suicidal ideation. This study examines the relationship between bullying and suicidal ideation in emergency department settings. Method A chart review was conducted for all patients under 18 years of age presenting with a mental health complaint to the emergency departments at Kingston General or Hotel Dieu Hospitals in Kingston, Canada, between January 2011 and January 2015. Factors such as age, gender, history of abuse, history of bullying, type and time of bullying, and diagnoses were documented. Results 77% of the adolescents had experienced bullying, while 68.9% had suicide ideation at presentation. While controlling for age, gender, grade, psychiatric diagnosis, and abuse, a history of bullying was the most significant predictor of suicidal ideation. Individuals in this study who reported cyber bullying were 11.5 times more likely to have suicidal ideation documented on presentation, while individuals reporting verbal bullying were 8.4 times more likely. Conclusions The prevalence of bullying in adolescent patients presenting to emergency departments is high. The relationship found between suicidal ideation and bullying demonstrates that clinicians should ask questions about bullying as a risk factor for suicide ideation during the assessment of children and adolescents. PMID:28747929
Wu, Hang; Wu, Shixiang; Qiu, Nansheng; Chang, Jian; Bao, Rima; Zhang, Xin; Liu, Nian; Liu, Shuai
2018-01-01
Apatite fission-track (AFT) analysis, a widely used low-temperature thermochronology method, can provide details of the hydrocarbon generation history of source rocks for use in hydrocarbon exploration. The AFT method is based on the annealing behavior of fission tracks generated by ²³⁸U fission in apatite particles during geological history. Due to the cumbersome experimental steps and high expense, it is imperative to find an efficient and inexpensive technique to determine the annealing degree of AFT. In this study, on the basis of the ellipsoid configuration of tracks, the track volume fraction model (TVFM) is established and the fission-track volume index is proposed. Furthermore, terahertz time domain spectroscopy (THz-TDS) is used for the first time to identify the variation of the AFT annealing degree of Durango apatite particles heated at 20, 275, 300, 325, 450, and 500 °C for 10 h. The THz absorbance of the sample increases with the degree of annealing. In addition, the THz absorption index is exponentially related to annealing temperature and can be used to characterize the fission-track volume index. Terahertz time domain spectroscopy can be an ancillary technique for AFT thermochronological research. More work is urgently needed to extrapolate experimental data to geological conditions.
Hoch, Jeffrey C
2017-10-01
Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development. Copyright © 2017 Elsevier Inc. All rights reserved.
Gilarte, Patricia; Kreuzinger-Janik, Bianca; Majdi, Nabil; Traunspurger, Walter
2015-01-01
The nematode Pristionchus pacificus is of growing interest as a model organism in evolutionary biology. However, despite multiple studies of its genetics, developmental cues, and ecology, the basic life-history traits (LHTs) of P. pacificus remain unknown. In this study, we used the hanging drop method to follow P. pacificus at the individual level and thereby quantify its LHTs. This approach allowed direct comparisons with the LHTs of Caenorhabditis elegans recently determined using this method. When provided with 5×10⁹ Escherichia coli cells ml⁻¹ at 20°C, the intrinsic rate of natural increase of P. pacificus was 1.125 (individually, per day); mean net production was 115 juveniles produced during the life-time of each individual, and each nematode laid an average of 270 eggs (both fertile and infertile). The mean age of P. pacificus individuals at first reproduction was 65 h, and the average life span was 22 days. The life cycle of P. pacificus is therefore slightly longer than that of C. elegans, with a longer average life span and hatching time and the production of fewer progeny.
Using Embedded Visual Coding to Support Contextualization of Historical Texts
ERIC Educational Resources Information Center
Baron, Christine
2016-01-01
This mixed-method study examines the think-aloud protocols of 48 randomly assigned undergraduate students to understand what effect embedding a visual coding system, based on reliable visual cues for establishing historical time period, would have on novice history students' ability to contextualize historic documents. Results indicate that using…
The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...
Field testing a mobile inelastic neutron scattering system to measure soil carbon
USDA-ARS?s Scientific Manuscript database
Cropping history in conjunction with soil management practices can have a major impact on the amount of organic carbon (C) stored in soil. Current methods of assessing soil C based on soil coring and subsequent processing procedures prior to laboratory analysis are labor intensive and time consuming...
Retirement Patterns from Career Employment
ERIC Educational Resources Information Center
Cahill, Kevin E.; Giandrea, Michael D.; Quinn, Joseph F.
2006-01-01
Purpose: This article investigates how older Americans leave their career jobs and estimates the extent of intermediate labor force activity (bridge jobs) between full-time work on a career job and complete labor-force withdrawal. Design and Methods: Using data from the Health and Retirement Study, we explored the work histories and retirement…
New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Lung, Shun-Fat
2017-01-01
A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.
[Diagnosis of musculoskeletal ambulation disability symptom complex (MADS)].
Ito, Hiromoto
2008-11-01
The diagnosis of musculoskeletal ambulation disability symptom complex (MADS) is described. MADS is defined as a person of advanced years with reduced balance and walking ability, at high risk of falling, and largely confined to the house. The diagnosis of MADS is based on a history of 11 musculoskeletal disorders and an evaluation of balance and walking function. Two assessment methods are described: the time of one-leg standing and the 3 m timed up and go test.
NASA Astrophysics Data System (ADS)
Czarnecki, Łukasz; Grech, Dariusz; Pamuła, Grzegorz
2008-12-01
We confront global and local methods to analyze financial crash-like events on the Polish financial market from the critical phenomena point of view. These methods are based on the analysis of log-periodicity and the local fractal properties of financial time series in the vicinity of phase transitions (crashes). The whole history (1991-2008) of the Warsaw Stock Exchange Index (WIG), describing the largest developing financial market in Europe, is analyzed in a daily time horizon. We find that crash-like events on the Polish financial market are described better by the log-divergent price model decorated with log-periodic behavior than by the corresponding power-law-divergent price model. Predictions coming from the log-periodicity scenario are verified for all main crashes that took place in WIG history. It is argued that crash predictions within the log-periodicity model strongly depend on the amount of data taken to make a fit and therefore are likely to contain huge inaccuracies. Turning to local fractal description, we calculate the so-called local (time dependent) Hurst exponent H for the WIG time series and we find a dependence between the behavior of the local fractal properties of the WIG time series and the appearance of crashes on the financial market. The latter method seems to work better than the global approach, for developing as well as developed markets. The current situation on the market, particularly related to the Fed intervention in September 2007 and the situation on the market immediately after this intervention, is also analyzed from the fractional Brownian motion point of view.
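The local (time-dependent) Hurst exponent H can be sketched with a sliding window over the return series. The example below uses a crude single-scale rescaled-range (R/S) estimate per window; the paper's exact estimator is not specified here, so both the estimator and the window length are illustrative assumptions.

```python
import numpy as np

def hurst_rs(window_returns):
    """Single-scale rescaled-range Hurst estimate for one window:
    H ≈ log(R/S) / log(n), a rough one-window approximation."""
    x = np.asarray(window_returns, float)
    y = np.cumsum(x - x.mean())          # mean-adjusted cumulative series
    R = y.max() - y.min()                # range of the cumulative deviations
    S = x.std()                          # standard deviation of the window
    return np.log(R / S) / np.log(len(x))

def local_hurst(returns, window=120):
    """Time-dependent Hurst exponent H(t): one R/S estimate per
    sliding window ending at time t."""
    return [hurst_rs(returns[i - window:i])
            for i in range(window, len(returns) + 1)]
```

For uncorrelated returns H(t) stays near 0.5; in the local-fractal picture, systematic drifts of H(t) away from 0.5 before a crash are the signal of interest.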
Pull-out fibers from composite materials at high rate of loading
NASA Technical Reports Server (NTRS)
Amijima, S.; Fujii, T.
1981-01-01
Numerical and experimental results are presented on the pullout phenomenon in composite materials at a high rate of loading. The finite element method was used, taking into account the existence of a virtual shear deformation layer as the interface between fiber and matrix. Experimental results agree well with those obtained by the finite element method. Numerical results show that the interlaminar shear stress is time dependent; in addition, it is shown to depend on the applied load time history. Under step pulse loading, the interlaminar shear stress fluctuates, finally decaying to its value under static loading.
First-passage problems: A probabilistic dynamic analysis for degraded structures
NASA Technical Reports Server (NTRS)
Shiao, Michael C.; Chamis, Christos C.
1990-01-01
Structures subjected to random excitations, with uncertain system parameters that degrade under the surrounding environment (a random time history), are studied. Methods are developed to determine the statistics of dynamic responses, such as the time-varying mean, the standard deviation, the autocorrelation functions, and the joint probability density function of any response and its derivative. Moreover, first-passage problems with deterministic and stationary/evolutionary random barriers are evaluated. The time-varying (joint) mean crossing rate and the probability density function of the first-passage time for various random barriers are derived.
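A first-passage problem of this kind can be illustrated by brute-force Monte Carlo, which is a useful cross-check on analytical crossing-rate results. The randomly excited linear oscillator below, its parameters, the fixed symmetric barrier, and the Euler-Maruyama integration are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def first_passage_times(barrier, n_paths=500, dt=0.01, t_max=20.0,
                        omega=2.0, zeta=0.05, sigma=1.0, seed=0):
    """Monte-Carlo sketch of a first-passage problem: integrate a
    white-noise-excited damped oscillator (Euler-Maruyama) and record
    the first time |x(t)| exceeds a fixed barrier, per sample path."""
    rng = np.random.default_rng(seed)
    times = []
    steps = int(t_max / dt)
    for _ in range(n_paths):
        x, v = 0.0, 0.0
        for k in range(steps):
            # x'' + 2*zeta*omega*x' + omega^2*x = sigma * white noise
            a = (-2 * zeta * omega * v - omega ** 2 * x
                 + sigma * rng.standard_normal() / np.sqrt(dt))
            v += a * dt
            x += v * dt
            if abs(x) > barrier:
                times.append((k + 1) * dt)
                break
    return np.array(times)
```

A histogram of the returned times estimates the first-passage-time probability density; time-varying or random barriers would replace the fixed `barrier` with a sampled barrier process per path.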
NASA Technical Reports Server (NTRS)
Barbely, Natasha L.; Sim, Ben W.; Kitaplioglu, Cahit; Goulding, Pat, II
2010-01-01
Difficulties in obtaining full-scale rotor low-frequency noise measurements in wind tunnels, caused by residual sound reflections from non-ideal anechoic wall treatments, are addressed. Examples from the Boeing-SMART rotor test in the National Full-Scale Aerodynamics Complex (NFAC) 40- by 80-Foot Wind Tunnel facility demonstrated that these reflections introduced distortions in the measured acoustic time histories that are not representative of free-field rotor noise radiation. A simplified reflection analysis, based on the method of images, is used to examine the sound measurement quality in such a "less-than-anechoic" environment. Predictions of reflection-adjusted acoustic time histories are qualitatively shown to account for some of the spurious fluctuations observed in wind tunnel noise measurements.
The historiography of medical history: from great men to archaeology.
King, C. R.
1991-01-01
The history of medicine is always written from the perspective of the historian. Contemporary historiography provides an understanding of the major methods of historical analysis and their influences on the writing of medical history. Medical history in the 20th century has emphasized the historiographic methods of the history of great men, historicism, social history, and intellectual history. Each methodology has inherent biases that influence the historian's analysis of the past. Understanding the historian's biases provides the reader important tools for the interpretation of medical history. PMID:1933068
Identifying when weather influences life-history traits of grazing herbivores.
Sims, Michelle; Elston, David A; Larkham, Ann; Nussey, Daniel H; Albon, Steve D
2007-07-01
1. There is increasing evidence that density-independent weather effects influence life-history traits and hence the dynamics of populations of animals. Here, we present a novel statistical approach to estimate when such influences are strongest. The method is demonstrated by analyses investigating the timing of the influence of weather on the birth weight of sheep and deer. 2. The statistical technique allowed for the pattern of temporal correlation in the weather data enabling the effects of weather in many fine-scale time intervals to be investigated simultaneously. Thus, while previous studies have typically considered weather averaged across a single broad time interval during pregnancy, our approach enabled examination simultaneously of the relationships with weekly and fortnightly averages throughout the whole of pregnancy. 3. We detected a positive effect of temperature on the birth weight of deer, which is strongest in late pregnancy (mid-March to mid-April), and a negative effect of rainfall on the birthweight of sheep, which is strongest during mid-pregnancy (late January to early February). The possible mechanisms underlying these weather-birth weight relationships are discussed. 4. This study enhances our insight into the pattern of the timing of influence of weather on early development. The method is of much more general application and could provide valuable insights in other areas of ecology in which sequences of intercorrelated explanatory variables have been collected in space or in time.
AP American History and the History Major: Keeping Body and Soul Together.
ERIC Educational Resources Information Center
Holbo, Paul S.
For college-level American History, in the high school advanced placement (AP) program and on university campuses, these are the best of times and the worst of times. For the American History AP program, the early 1970's were difficult times, with the examinations under attack as elitist and irrelevant to contemporary problems. The program…
The LEGACY Girls Study: Growth and development in the context of breast cancer family history
John, Esther M.; Terry, Mary Beth; Keegan, Theresa H.M.; Bradbury, Angela R.; Knight, Julia A.; Chung, Wendy K.; Frost, Caren J.; Lilge, Lothar; Patrick-Miller, Linda; Schwartz, Lisa A.; Whittemore, Alice S.; Buys, Saundra S.; Daly, Mary B.; Andrulis, Irene L.
2017-01-01
Background Although the timing of pubertal milestones has been associated with breast cancer risk, few studies of girls’ development include girls at increased breast cancer risk due to their family history. Methods The LEGACY (Lessons in Epidemiology and Genetics of Adult Cancer from Youth) Girls Study was initiated in 2011 in the USA and Canada to assess the relation between early-life exposures and intermediate markers of breast cancer risk (e.g., pubertal development, breast tissue characteristics) and to investigate psychosocial well-being and health behaviors in the context of family history. We describe the methods used to establish and follow a cohort of 1,040 girls ages 6–13 years at baseline, half with a breast cancer family history, and the collection of questionnaire data (family history, early-life exposures, growth and development, psychosocial and behavioral), anthropometry, biospecimens, and breast tissue characteristics using optical spectroscopy. Results During this initial 5-year phase of the study, follow-up visits are conducted every six months for repeated data and biospecimen collection. Participation in baseline components was high (98% for urine, 97.5% for blood or saliva, and 98% for anthropometry). At enrollment, 77% of girls were pre-menarcheal and 49% were at breast Tanner stage T1. Conclusions This study design allows thorough examination of events affecting girls’ growth and development and how they differ across the spectrum of breast cancer risk. A better understanding of early-life breast cancer risk factors will be essential to enhance prevention across the lifespan for those with and without a family history of the disease. PMID:26829160
Melbourne-Thomas, Jessica; Corney, Stuart P.; McMahon, Clive R.; Hindell, Mark A.
2018-01-01
Higher trophic-level species are an integral component of any marine ecosystem. Despite their importance, methods for representing these species in end-to-end ecosystem models often have limited representation of life histories, energetics and behaviour. We built an individual-based model coupled with a dynamic energy budget for female southern elephant seals Mirounga leonina to demonstrate a method for detailed representation of marine mammals. We aimed to develop a model which could i) simulate energy use and life histories, as well as breeding traits of southern elephant seals in an emergent manner, ii) project a stable population over time, and iii) have realistic population dynamics and structure based on emergent life history features (such as age at first breeding, lifespan, fecundity and (yearling) survival). We evaluated the model’s ability to represent a stable population over long time periods (>10 generations), including the sensitivity of the emergent properties to variations in key parameters. Analyses indicated that the model is sensitive to changes in resource availability and energy requirements for the transition from pup to juvenile, and juvenile to adult stage. This was particularly the case for breeding success and yearling survival. This model is suitable for use as a standalone tool for investigating the impacts of changes to behaviour and population responses of southern elephant seals. PMID:29596456
Boundary particle method for Laplace transformed time fractional diffusion equations
NASA Astrophysics Data System (ADS)
Fu, Zhuo-Jia; Chen, Wen; Yang, Hai-Tian
2013-02-01
This paper develops a novel boundary meshless approach, Laplace transformed boundary particle method (LTBPM), for numerical modeling of time fractional diffusion equations. It implements Laplace transform technique to obtain the corresponding time-independent inhomogeneous equation in Laplace space and then employs a truly boundary-only meshless boundary particle method (BPM) to solve this Laplace-transformed problem. Unlike the other boundary discretization methods, the BPM does not require any inner nodes, since the recursive composite multiple reciprocity technique (RC-MRM) is used to convert the inhomogeneous problem into the higher-order homogeneous problem. Finally, the Stehfest numerical inverse Laplace transform (NILT) is implemented to retrieve the numerical solutions of time fractional diffusion equations from the corresponding BPM solutions. In comparison with finite difference discretization, the LTBPM introduces Laplace transform and Stehfest NILT algorithm to deal with time fractional derivative term, which evades costly convolution integral calculation in time fractional derivation approximation and avoids the effect of time step on numerical accuracy and stability. Consequently, it can effectively simulate long time-history fractional diffusion systems. Error analysis and numerical experiments demonstrate that the present LTBPM is highly accurate and computationally efficient for 2D and 3D time fractional diffusion equations.
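The Stehfest NILT step that closes the LTBPM pipeline is compact enough to sketch. The routine below implements the standard Gaver-Stehfest weights and the inversion formula f(t) ≈ (ln 2 / t) Σ_k V_k F(k ln 2 / t); here F stands in for the Laplace-space BPM solution, which the sketch does not reproduce.

```python
import math

def stehfest_coefficients(N):
    """Gaver-Stehfest weights V_k for an even number of terms N."""
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        V.append((-1) ** (k + N // 2) * s)
    return V

def stehfest_invert(F, t, N=12):
    """Numerical inverse Laplace transform at time t:
    f(t) ≈ (ln 2 / t) * sum_k V_k * F(k ln 2 / t),
    where F(s) is evaluated only at real s > 0."""
    ln2 = math.log(2.0)
    return (ln2 / t) * sum(V * F(k * ln2 / t)
                           for k, V in enumerate(stehfest_coefficients(N), start=1))
```

Because F is sampled only on the positive real axis, a time-independent solve in Laplace space (here, the BPM solve) suffices at each of the N sample points, which is what lets the method sidestep time stepping; note the alternating weights grow with N, so N beyond roughly 16 loses accuracy in double precision.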
NASA Astrophysics Data System (ADS)
Wu, Bitao; Wu, Gang; Yang, Caiqian; He, Yi
2018-05-01
A novel damage identification method for concrete continuous girder bridges based on spatially-distributed long-gauge strain sensing is presented in this paper. First, the variation regularity of the long-gauge strain influence line of continuous girder bridges, which changes with the location of vehicles on the bridge, is studied. According to this variation regularity, a calculation method for the distribution regularity of the area of long-gauge strain history is investigated. Second, a numerical simulation of damage identification based on the distribution regularity of the area of long-gauge strain history is conducted, and the results indicate that this method is effective for identifying damage and is not affected by the speed, axle number and weight of vehicles. Finally, a real bridge test on a highway is conducted, and the experimental results also show that this method is very effective for identifying damage in continuous girder bridges, and the local element stiffness distribution regularity can be revealed at the same time. This identified information is useful for the maintenance of continuous girder bridges on highways.
Schildmann, Jan; Bruns, Florian; Hess, Volker; Vollmann, Jochen
2017-01-01
Objective: “History, Theory, Ethics of Medicine” (German: “Geschichte, Theorie, Ethik der Medizin”, abbreviation: GTE) has formed part of the obligatory curriculum for medical students in Germany since the winter semester 2003/2004. This paper presents the results of a national survey on the contents, methods and framework of GTE teaching. Methods: Semi-structured questionnaire dispatched in July 2014 to 38 institutions responsible for GTE teaching. Descriptive analysis of quantitative data and content analysis of free-text answers. Results: It was possible to collect data from 29 institutes responsible for GTE teaching (response rate: 76%). There is at least one professorial chair for GTE in 19 faculties; two professorial chairs or professorships remained vacant at the time of the survey. The number of students taught per academic year ranges from <100 to >350. Teaching in GTE comprises an average of 2.18 hours per week per semester (min: 1, max: 6). Teaching in GTE is distributed, on arithmetic average, as follows: history 35.4%, theory 14.7% and ethics 49.9%. Written learning objectives were formulated for GTE in 24 faculties. The preferred themes which, according to respondents, should be taught in history, theory or ethics comprise a broad spectrum and vary. Teaching in ethics (79 of a maximum of 81 possible points) is, when compared to history (61/81) and theory (53/81), attributed the most significance for the training of medical doctors. Conclusion: 10 years after the introduction of GTE, the number of students and the personnel resources available at the institutions vary considerably. In light of the differences regarding content elicited in this study, the pros and cons of heterogeneity in GTE should be discussed. PMID:28584871
Karp, Igor; Sylvestre, Marie-Pierre; Abrahamowicz, Michal; Leffondré, Karen; Siemiatycki, Jack
2016-11-01
Assessment of individual risk of illness is an important activity in preventive medicine. Development of risk-assessment models has heretofore relied predominantly on studies involving follow-up of cohort-type populations, while case-control studies have generally been considered unfit for this purpose. To present a method for individualized assessment of absolute risk of an illness (as illustrated by lung cancer) based on data from a 'non-nested' case-control study. We used data from a case-control study conducted in Montreal, Canada in 1996-2001. Individuals diagnosed with lung cancer (n = 920) and age- and sex-matched lung-cancer-free subjects (n = 1288) completed questionnaires documenting life-time cigarette-smoking history and occupational, medical, and family history. Unweighted and weighted logistic models were fitted. Model overfitting was assessed using bootstrap-based cross-validation and 'shrinkage.' The discriminating ability was assessed by the c-statistic, and the risk-stratifying performance was assessed by examination of the variability in risk estimates over hypothetical risk-profiles. In the logistic models, the logarithm of incidence-density of lung cancer was expressed as a function of age, sex, cigarette-smoking history, history of respiratory conditions and exposure to occupational carcinogens, and family history of lung cancer. The models entailed a minimal degree of overfitting ('shrinkage' factor: 0.97 for both unweighted and weighted models) and moderately high discriminating ability (c-statistic: 0.82 for the unweighted model and 0.66 for the weighted model). The method's risk-stratifying performance was quite high. The presented method allows for individualized assessment of risk of lung cancer and can be used for development of risk-assessment models for other illnesses.
NASA Technical Reports Server (NTRS)
Marti, K.
1986-01-01
A technique of cosmic ray exposure age dating using cosmic ray produced I-129 and Xe-129 components is discussed. The live I-129 - Xe-129 method provides an ideal monitor for cosmic ray flux variations on the 10^7 y to 10^8 y time scale. It is based on low-energy neutron reactions on Te, and these data, when coupled to those from other methods, may facilitate the detection of complex exposure histories.
Conformal Electromagnetic Particle in Cell: A Review
Meierbachtol, Collin S.; Greenwood, Andrew D.; Verboncoeur, John P.; ...
2015-10-26
We review conformal (or body-fitted) electromagnetic particle-in-cell (EM-PIC) numerical solution schemes. Included is a chronological history of relevant particle physics algorithms often employed in these conformal simulations. We also provide brief mathematical descriptions of particle-tracking algorithms and current weighting schemes, along with a brief summary of major time-dependent electromagnetic solution methods. Several research areas are also highlighted for recommended future development of new conformal EM-PIC methods.
Bontems, Vincent
2014-01-01
A historical frame of reference is constructed based on the distinction between, and articulation of, phenomenological and chronological times. Because it relativises the notion of simultaneity and inverts its relation to causality, the special theory of relativity can induce analogous modes of reflection on the themes of "contemporaneity" in the history of art (Panofsky) and in epistemology (Bachelard). This "relativist" method, often misunderstood, sheds light on both historical and presentist methods.
A Review of Spectral Methods for Variable Amplitude Fatigue Prediction and New Results
NASA Technical Reports Server (NTRS)
Larsen, Curtis E.; Irvine, Tom
2013-01-01
A comprehensive review of the available methods for estimating fatigue damage from variable amplitude loading is presented. The dependence of fatigue damage accumulation on power spectral density (psd) is investigated for random processes relevant to real structures such as in offshore or aerospace applications. Beginning with the Rayleigh (or narrow band) approximation, attempts at improved approximations or corrections to the Rayleigh approximation are examined by comparison to rainflow analysis of time histories simulated from psd functions representative of simple theoretical and real world applications. Spectral methods investigated include corrections by Wirsching and Light, Ortiz and Chen, the Dirlik formula, and the Single-Moment method, among other more recent proposed methods. Good agreement is obtained between the spectral methods and the time-domain rainflow identification for most cases, with some limitations. Guidelines are given for using the several spectral methods to increase confidence in the damage estimate.
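The narrow-band starting point above reduces to a short computation: estimate the spectral moments of the stress PSD, then apply the Rayleigh closed form. The sketch below is illustrative only (not the report's code); the S-N convention N·S^b = C and the amplitude scale sqrt(2·m0) are assumptions whose exact form varies between references.

```python
import numpy as np
from math import gamma, sqrt

def spectral_moment(f, G, n):
    """n-th spectral moment m_n = integral of f^n * G(f) df (trapezoid rule)."""
    y = (np.asarray(f, float) ** n) * np.asarray(G, float)
    return 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(f))

def rayleigh_damage_rate(f, G, b, C):
    """Narrow-band (Rayleigh) fatigue damage per unit time for an S-N
    curve N * S^b = C, with Rayleigh-distributed stress amplitudes of
    scale sqrt(2*m0) and up-crossing rate sqrt(m2/m0)."""
    m0 = spectral_moment(f, G, 0)
    m2 = spectral_moment(f, G, 2)
    nu0 = sqrt(m2 / m0)  # expected rate of zero up-crossings (Hz)
    return (nu0 / C) * sqrt(2.0 * m0) ** b * gamma(1.0 + b / 2.0)
```

The broad-band corrections surveyed in the abstract (Wirsching-Light, Ortiz-Chen, Dirlik, Single-Moment) all start from moments computed exactly this way and modify the Rayleigh estimate, which is conservative for wide-band processes.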
A new method to extract modal parameters using output-only responses
NASA Astrophysics Data System (ADS)
Kim, Byeong Hwa; Stubbs, Norris; Park, Taehyo
2005-04-01
This work proposes a new output-only modal analysis method to extract mode shapes and natural frequencies of a structure. The proposed method is based on an approach with a single-degree-of-freedom in the time domain. For a set of given mode-isolated signals, the un-damped mode shapes are extracted utilizing the singular value decomposition of the output energy correlation matrix with respect to sensor locations. The natural frequencies are extracted from a noise-free signal that is projected on the estimated modal basis. The proposed method is particularly efficient when a high resolution of mode shape is essential. The accuracy of the method is numerically verified using a set of time histories that are simulated using a finite-element method. The feasibility and practicality of the method are verified using experimental data collected at the newly constructed King Storm Water Bridge in California, United States.
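The mode-shape step lends itself to a compact sketch. Assuming a mode-isolated response matrix (one band-pass filtered signal per sensor) is already in hand, the dominant singular vector of the output energy correlation matrix estimates the un-damped mode shape; this is an illustration of the idea, not the authors' implementation.

```python
import numpy as np

def extract_mode_shape(Y):
    """Estimate a mode shape from a mode-isolated response matrix Y
    (n_sensors x n_samples) via the SVD of the output energy
    correlation matrix taken across sensor locations."""
    R = Y @ Y.T / Y.shape[1]           # energy correlation between sensors
    U, s, _ = np.linalg.svd(R)
    phi = U[:, 0]                      # dominant singular vector
    return phi / np.abs(phi).max()     # normalize to unit peak amplitude
```

The sign of a singular vector is arbitrary, so in practice the estimate is fixed against a chosen reference sensor before comparison across setups.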
Menstrual characteristics and prevalence of dysmenorrhea in college going girls
Kural, MoolRaj; Noor, Naziya Nagori; Pandit, Deepa; Joshi, Tulika; Patil, Anjali
2015-01-01
Background: Dysmenorrhea is a common gynecological condition with painful menstrual cramps of uterine origin. The prevalence of primary dysmenorrhea has not yet been clearly studied in central India. Objective: To study the prevalence of primary dysmenorrhea in young girls and to evaluate associated clinical markers of dysmenorrhea. Materials and Methods: In a cross-sectional study, data were collected from 310 girls (18–25 years) on age at menarche, presence or absence of dysmenorrhea, dysmenorrhea duration, pre-menstrual symptoms (PMS), family history, menstrual irregularities, menstrual history, and pain severity graded on a visual analogue scale (VAS), using a semi-structured questionnaire. Results: Dysmenorrhea was reported by 84.2% (261) of girls, and 15.8% (49) reported no dysmenorrhea. On the VAS, 34.2% of girls experienced severe pain, 36.6% moderate and 29.2% mild pain. Bleeding duration was significantly associated with dysmenorrhea (χ2 = 10.5; P < 0.05); girls with bleeding duration of more than 5 days had a 1.9 times higher chance of dysmenorrhea (OR: 1.9; 95% CI: 1.7–3). Moreover, girls with the presence of clots had a 2.07 times higher chance of dysmenorrhea (OR: 2.07; 95% CI: 1.04–4.1) (P < 0.05). Of the 53.7% of girls with a family history of dysmenorrhea, 90.9% experienced the condition themselves (χ2 = 11.5; P < 0.001). Girls with a family history of dysmenorrhea had a three times greater chance of having the same problem (OR: 3.0; 95% CI: 1.5–5.8; P = 0.001). Conclusion: Dysmenorrhea is highly prevalent among college-going girls. Family history, bleeding duration and presence of clots were significant risk factors for dysmenorrhea. PMID:26288786
Wininger, Austin E; Fischer, James P; Likine, Elive F; Gudeman, Andrew S; Brinker, Alexander R; Ryu, Jonathan; Maupin, Kevin A; Lunsford, Shatoria; Whipple, Elizabeth C; Loder, Randall T; Kacena, Melissa A
2017-12-01
In academia, authorship is considered a currency and is important for career advancement. As the Journal of Bone and Mineral Research (JBMR) is the highest-ranked journal in the field of bone, muscle, and mineral metabolism and is the official publication of the American Society for Bone and Mineral Research, we sought to examine authorship changes over JBMR's 30-year history. Two bibliometric methods were used to collect the data. The "decade method" included all published manuscripts throughout 1 year in each decade over the past 30 years starting with the inaugural year, yielding 746 manuscripts for analysis. The "random method" examined 10% of published manuscripts from each of the 30 years, yielding 652 manuscripts for analysis. Using both methods, the average number of authors per manuscript, numerical location of the corresponding author, number of collaborating institutions, number of collaborating countries, number of printed manuscript pages, and the number of times each manuscript was cited all significantly increased between 1986 and 2015 (p < 10 -4 ). Using the decade method, there was a significant increase in the percentage of female first authors over time from 35.8% in 1986 to 47.7% in 2015 (p = 0.02), and this trend was confirmed using the random method. The highest percentage of female first authors in 2015 was in Europe (60.0%), and Europe also had the most dramatic increase in female first authors over time (more than double in 2015 compared with 1986). Likewise, the overall number of female corresponding authors significantly increased during the past 30 years. With the increasing demands of publishing in academic medicine, understanding changes in publishing characteristics over time and by geographical region is important. These findings highlight JBMR's authorship trends over the past 30 years and demonstrate those countries having the most changes and where challenges still exist. © 2017 American Society for Bone and Mineral Research. 
Code of Federal Regulations, 2014 CFR
2014-10-01
... American history, American government, social studies, or political science for full-time graduate study... history, American government, social studies, or political science for part-time graduate study toward a..., must teach American history, American government, social studies, or political science on a full-time...
School Curriculum, Globalisation and the Constitution of Policy Problems and Solutions
ERIC Educational Resources Information Center
Winter, Christine
2012-01-01
To varying degrees, education policy reforms around the world are driven by educational discourses relating to globalisation. At the same time, national and local histories, cultures and politics mediate the effects of globalisation discourses. This paper employs methods of analysis that draw on the concepts of "vernacular globalization"…
Maria Montessori: Portrait of a Young Woman
ERIC Educational Resources Information Center
Povell, Phyllis
2007-01-01
In this article, the author presents the biography of Maria Montessori, who pioneered early childhood education and introduced a new method of pedagogy. The innovations in education that Montessori introduced were enough to reserve a place for her in the history books. Montessori was ahead of her time in many aspects of her life. The decisions…
Greci, Laura S; Katz, David L; Jekel, James
2005-04-01
Although the CDC ACIP (Advisory Committee on Immunization Practices) recommends that appropriate inpatients receive pneumococcal and influenza vaccines, adult vaccination rates for these remain low. We therefore examined perihospitalization vaccination rates for high-risk pneumonia inpatients. A retrospective chart review of all pneumonia patients admitted to one community hospital from 6/1/95 to 5/31/96. Vaccination history, co-morbidity, mortality, and prior and subsequent pneumonia admissions were recorded. Primary care providers and nursing homes were contacted to complete and verify vaccine histories. For 173 total admissions (160 subjects), vaccine histories were documented in the hospital chart in less than 0.5% of patients. While 97% had indications for both vaccines at the time of admission, no vaccines were given in the hospital and less than 5% had documented vaccinations during the subsequent 3 years. Despite clear indications, few patients had documented vaccination at any time. These data lend urgency to the recommendation that pneumococcal and influenza vaccines should be routinely administered to pneumonia inpatients at discharge. Furthermore, they illustrate the need for an improved method for tracking individual adult vaccinations.
Ozcakar, Nilgun; Mevsim, Vildan; Guldal, Dilek; Gunvar, Tolga; Yildirim, Ediz; Sisli, Zafer; Semin, Ilgi
2009-12-19
In recent times, medical schools have committed to developing good communication and history taking skills in students. However, which educational method best achieves this remains an open question. Our study aims to investigate whether the use of videotape recording is superior to verbal feedback alone in the teaching of clinical skills, and the role of student self-assessment in history taking and communication skills. A randomized controlled trial was designed. The study was conducted with 52 second-year students of the Dokuz Eylul University Faculty of Medicine. All students' performance of communication and history taking skills was assessed twice. Between these assessments, the study group received both verbal and visual feedback by watching their video-recorded patient interviews; the control group received only verbal feedback from the teacher. Although the students' self-assessments did not change significantly, assessors' ratings increased significantly at the second assessment for the videotaped interviews. Feedback based on videotaped interviews is superior to feedback based solely on the observation of assessors.
Nadachowska-Brzyska, Krystyna; Burri, Reto; Olason, Pall I.; Kawakami, Takeshi; Smeds, Linnéa; Ellegren, Hans
2013-01-01
Profound knowledge of demographic history is a prerequisite for the understanding and inference of processes involved in the evolution of population differentiation and speciation. Together with new coalescent-based methods, the recent availability of genome-wide data enables investigation of differentiation and divergence processes at unprecedented depth. We combined two powerful approaches, full Approximate Bayesian Computation analysis (ABC) and pairwise sequentially Markovian coalescent modeling (PSMC), to reconstruct the demographic history of the split between two avian speciation model species, the pied flycatcher and collared flycatcher. Using whole-genome re-sequencing data from 20 individuals, we investigated 15 demographic models including different levels and patterns of gene flow, and changes in effective population size over time. ABC provided high support for recent (mode 0.3 my, range <0.7 my) species divergence, declines in effective population size of both species since their initial divergence, and unidirectional recent gene flow from pied flycatcher into collared flycatcher. The estimated divergence time and population size changes, supported by PSMC results, suggest that the ancestral species persisted through one of the glacial periods of middle Pleistocene and then split into two large populations that first increased in size before going through severe bottlenecks and expanding into their current ranges. Secondary contact appears to have been established after the last glacial maximum. The severity of the bottlenecks at the last glacial maximum is indicated by the discrepancy between current effective population sizes (20,000–80,000) and census sizes (5–50 million birds) of the two species. The recent divergence time challenges the supposition that avian speciation is a relatively slow process with extended times for intrinsic postzygotic reproductive barriers to evolve. 
Our study emphasizes the importance of using genome-wide data to unravel tangled demographic histories. Moreover, it constitutes one of the first examples of the inference of divergence history from genome-wide data in non-model species. PMID:24244198
Refractory status epilepticus in children with and without prior epilepsy or status epilepticus
Sánchez Fernández, Iván; Jackson, Michele C.; Abend, Nicholas S.; Arya, Ravindra; Brenton, James N.; Carpenter, Jessica L.; Chapman, Kevin E.; Gaillard, William D.; Gaínza-Lein, Marina; Glauser, Tracy A.; Goldstein, Joshua L.; Goodkin, Howard P.; Helseth, Ashley; Kapur, Kush; McDonough, Tiffani L.; Mikati, Mohamad A.; Peariso, Katrina; Riviello, James; Tasker, Robert C.; Topjian, Alexis A.; Wainwright, Mark S.; Wilfong, Angus; Williams, Korwyn
2017-01-01
Objective: To compare refractory convulsive status epilepticus (rSE) management and outcome in children with and without a prior diagnosis of epilepsy and with and without a history of status epilepticus (SE). Methods: This was a prospective observational descriptive study performed from June 2011 to May 2016 on pediatric patients (1 month–21 years of age) with rSE. Results: We enrolled 189 participants (53% male) with a median (25th–75th percentile) age of 4.2 (1.3–9.6) years. Eighty-nine (47%) patients had a prior diagnosis of epilepsy. Thirty-four (18%) patients had a history of SE. The time to the first benzodiazepine was similar in participants with and without a diagnosis of epilepsy (15 [5–60] vs 16.5 [5–42.75] minutes, p = 0.858). Patients with a diagnosis of epilepsy received their first non-benzodiazepine (BZD) antiepileptic drug (AED) later (93 [46–190] vs 50.5 [28–116] minutes, p = 0.002) and were less likely to receive at least one continuous infusion (35/89 [39.3%] vs 57/100 [57%], p = 0.03). Compared to patients with no history of SE, patients with a history of SE received their first BZD earlier (8 [3.5–22.3] vs 20 [5–60] minutes, p = 0.0073), although they had a similar time to first non-BZD AED (76.5 [45.3–124] vs 65 [32.5–156] minutes, p = 0.749). Differences were mostly driven by the patients with an out-of-hospital rSE onset. Conclusions: Our study establishes that children with rSE do not receive more timely treatment if they have a prior diagnosis of epilepsy; however, a history of SE is associated with more timely administration of abortive medication. PMID:28011930
De Jesus, Sol; Wu, Samuel S.; Pei, Qinglin; Hassan, Anhar; Armstrong, Melissa J.; Martinez-Ramirez, Daniel; Schmidt, Peter; Okun, Michael S.
2017-01-01
Background: Patients with Parkinson disease (PD) are at high risk of hospital encounters with increasing morbidity and mortality. This study aimed to determine the rate of hospital encounters in a cohort followed over 5 years and to identify associated factors. Methods: We queried the data from the International Multicenter National Parkinson Foundation Quality Improvement study. Multivariate logistic regression with backward selection was performed to identify factors associated with hospital encounter prior to baseline visit. Kaplan-Meier estimates were obtained and Cox regression performed on time to hospital encounter after the baseline visit. Results: Of the 7,507 PD patients (mean age 66.5±9.9 years and disease duration 8.9±6.4 years at baseline visit), 1919 (25.6%) had a history of a hospital encounter prior to their baseline visit. Significant factors associated with a history of a hospital encounter prior to baseline included race (white race: OR 0.49), utilization of physical therapy (OR 1.47), history of deep brain stimulation (OR 1.87), number of comorbidities (OR 1.30), caregiver strain (OR 1.17 per standard deviation), and the standardized Timed Up and Go Test (OR 1.21). Patients with a history of hospitalization prior to the baseline were more likely to have a re-hospitalization (HR 1.67, P<0.0001) compared to those without a prior hospitalization. In addition, the time to hospital encounter from baseline was significantly associated with age and number of medications. In patients with a history of hospitalization prior to the baseline visit, time to a second hospital encounter was significantly associated with caregiver strain and number of comorbidities. Conclusion: Hospitalization and re-hospitalization were common in this cohort of people with PD. 
Our results suggest addressing caregiver burden, simplifying medications, and emphasizing primary and multidisciplinary care for comorbidities are potential avenues to explore for reducing hospitalization rates. PMID:28683150
A Bayesian account of quantum histories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marlow, Thomas
2006-05-15
We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.
The validity of birth and pregnancy histories in rural Bangladesh.
Espeut, Donna; Becker, Stan
2015-08-28
Maternity histories provide a means of estimating fertility and mortality from surveys. The present analysis compares two types of maternity histories, birth histories and pregnancy histories, in three respects: (1) completeness of live birth and infant death reporting; (2) accuracy of the time placement of live births and infant deaths; and (3) the degree to which reported versus actual total fertility measures differ. The analysis covers a 15-year time span and is based on two data sources from Matlab, Bangladesh: the 1994 Matlab Demographic and Health Survey and, as gold standard, the vital events data from Matlab's Demographic Surveillance System. Both histories are near perfect in live-birth completeness; however, pregnancy histories do better in the completeness and time accuracy of deaths during the first year of life. Birth or pregnancy histories can be used for fertility estimation, but pregnancy histories are advised for estimating infant mortality.
Biedermann, N; Hayes, B; Usher, K; Williams, A
2000-01-01
In research, there is no perfection: no perfect method, no perfect sample, and no perfect data analysis tool. Coming to this understanding helps researchers identify the inadequacies of their preferred method. This paper discusses the criticisms of the oral history method, drawing reference to its challenges and difficulties in relation to its use in nursing research. Oral history has the advantage over more traditional historical approaches in that the narrators can interpret events, personalities and relationships within the interview that are not accessible from written sources. The oral history interview may also provide a forum for unveiling documents and photographs, which might not have been otherwise discovered. Nonetheless, oral history, like most methodologies, is not flawless. This paper discusses the limitations of oral history and suggests ways in which a nurse can use oral history to provide an account of aspects of nursing history.
Stuebner, Michael; Haider, Mansoor A
2010-06-18
A new and efficient method for numerical solution of the continuous spectrum biphasic poroviscoelastic (BPVE) model of articular cartilage is presented. Development of the method is based on a composite Gauss-Legendre quadrature approximation of the continuous spectrum relaxation function that leads to an exponential series representation. The separability property of the exponential terms in the series is exploited to develop a numerical scheme that can be reduced to an update rule requiring retention of the strain history at only the previous time step. The cost of the resulting temporal discretization scheme is O(N) for N time steps. Application and calibration of the method is illustrated in the context of a finite difference solution of the one-dimensional confined compression BPVE stress-relaxation problem. Accuracy of the numerical method is demonstrated by comparison to a theoretical Laplace transform solution for a range of viscoelastic relaxation times that are representative of articular cartilage. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
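The O(N) update rule the abstract describes can be sketched as a recursion in which each exponential mode of the relaxation series carries its own state, so only the previous time step's history needs to be retained. The coefficients c_k and time constants tau_k below are illustrative placeholders (in the paper they would come from composite Gauss-Legendre quadrature of the continuous spectrum), as is the synthetic strain history:

```python
import numpy as np

def relax_stress(strain, dt, c, tau, g_inf=1.0):
    """March a hereditary integral for a relaxation function
    G(t) = g_inf + sum_k c_k * exp(-t/tau_k).

    Each mode's contribution obeys the one-step recursion
    h_k(t_n) = exp(-dt/tau_k) * h_k(t_{n-1}) + c_k * d_eps,
    so the full strain history never has to be stored: total cost
    is O(N) for N time steps."""
    h = np.zeros_like(c, dtype=float)   # per-mode history state
    sigma = []
    eps_prev = 0.0
    for eps in strain:
        deps = eps - eps_prev
        h = np.exp(-dt / tau) * h + c * deps   # decay old state, add increment
        sigma.append(g_inf * eps + h.sum())
        eps_prev = eps
    return np.array(sigma)
```

For a step strain the computed stress starts at the instantaneous modulus (g_inf plus the sum of the c_k) and relaxes monotonically toward g_inf, which is the qualitative behavior the recursion is meant to reproduce.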
Dong, Yi; Mihalas, Stefan; Russell, Alexander; Etienne-Cummings, Ralph; Niebur, Ernst
2012-01-01
When a neuronal spike train is observed, what can we say about the properties of the neuron that generated it? A natural way to answer this question is to make an assumption about the type of neuron, select an appropriate model for this type, and then to choose the model parameters as those that are most likely to generate the observed spike train. This is the maximum likelihood method. If the neuron obeys simple integrate and fire dynamics, Paninski, Pillow, and Simoncelli (2004) showed that its negative log-likelihood function is convex and that its unique global minimum can thus be found by gradient descent techniques. The global minimum property requires independence of spike time intervals. Lack of history dependence is, however, an important constraint that is not fulfilled in many biological neurons which are known to generate a rich repertoire of spiking behaviors that are incompatible with history independence. Therefore, we expanded the integrate and fire model by including one additional variable, a variable threshold (Mihalas & Niebur, 2009) allowing for history-dependent firing patterns. This neuronal model produces a large number of spiking behaviors while still being linear. Linearity is important as it maintains the distribution of the random variables and still allows for maximum likelihood methods to be used. In this study we show that, although convexity of the negative log-likelihood is not guaranteed for this model, the minimum of the negative log-likelihood function yields a good estimate for the model parameters, in particular if the noise level is treated as a free parameter. Furthermore, we show that a nonlinear function minimization method (r-algorithm with space dilation) frequently reaches the global minimum. PMID:21851282
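As a toy illustration of the maximum likelihood principle applied to spike trains, deliberately simpler than the variable-threshold model discussed here, one can fit a memoryless Poisson neuron whose interspike intervals are exponential by numerically minimizing the negative log-likelihood. This is an assumed didactic example, not the Mihalas-Niebur model:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fit_rate(isis):
    """ML estimate of a constant firing rate lam from observed
    interspike intervals, by minimizing the negative log-likelihood
    -log L(lam) = -n*log(lam) + lam * sum(isi)."""
    isis = np.asarray(isis, dtype=float)
    nll = lambda lam: -len(isis) * np.log(lam) + lam * isis.sum()
    res = minimize_scalar(nll, bounds=(1e-6, 1e3), method="bounded")
    return res.x
```

For this simple model the numerical minimum coincides with the closed-form answer n / sum(isi); the point of the numerical route, as in the paper, is that it still works when no closed form exists.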
NASA Technical Reports Server (NTRS)
Young, J. W.; Schy, A. A.; Johnson, K. G.
1977-01-01
An analytical method has been developed for predicting critical control inputs for which nonlinear rotational coupling may cause sudden jumps in aircraft response. The analysis includes the effect of aerodynamics which are nonlinear in angle of attack. The method involves the simultaneous solution of two polynomials in roll rate, whose coefficients are functions of angle of attack and the control inputs. Results obtained using this procedure are compared with calculated time histories to verify the validity of the method for predicting jump-like instabilities.
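The core numerical task named above, finding roll rates at which two polynomials vanish simultaneously, can be sketched as a common-real-root search. The coefficients in the usage example are hypothetical, not aerodynamic data from the report:

```python
import numpy as np

def common_real_roots(c1, c2, tol=1e-6):
    """Real values p satisfying both p1(p) = 0 and p2(p) = 0.
    Coefficients are given highest degree first: take the real roots
    of the first polynomial and keep those at which the second also
    evaluates to (near) zero."""
    r1 = np.roots(c1)
    real = r1[np.abs(r1.imag) < tol].real
    return sorted(p for p in real if abs(np.polyval(c2, p)) < tol)
```

For instance, for (p - 1)(p - 2) and (p - 2)(p - 3), i.e. coefficient lists [1, -3, 2] and [1, -5, 6], the only shared root is p = 2.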
Applications of Fault Detection in Vibrating Structures
NASA Technical Reports Server (NTRS)
Eure, Kenneth W.; Hogge, Edward; Quach, Cuong C.; Vazquez, Sixto L.; Russell, Andrew; Hill, Boyd L.
2012-01-01
Structural fault detection and identification remains an area of active research. Solutions to fault detection and identification may be based on subtle changes in the time series history of vibration signals originating from various sensor locations throughout the structure. The purpose of this paper is to document the application of vibration based fault detection methods applied to several structures. Overall, this paper demonstrates the utility of vibration based methods for fault detection in a controlled laboratory setting and limitations of applying the same methods to a similar structure during flight on an experimental subscale aircraft.
Physical and Sexual Violence and Incident Sexually Transmitted Infections
Anand, Mallika; Redding, Colleen A.; Peipert, Jeffrey F.
2009-01-01
Objective To investigate whether women aged 13–35 who were victims of interpersonal violence were more likely than nonvictims to experience incident sexually transmitted infections (STIs). Methods We examined 542 women aged 13–35 enrolled in Project PROTECT, a randomized clinical trial that compared two different methods of computer-based intervention to promote the use of dual methods of contraception. Participants completed a baseline questionnaire that included questions about their history of interpersonal violence and were followed for incident STIs over the 2-year study period. We compared the incidence of STIs in women with and without a history of interpersonal violence using bivariate analyses and multiple logistic regression. Results In the bivariate analyses, STI incidence was significantly associated with African American race/ethnicity, a higher number of sexual partners in the past month, and a lower likelihood of avoiding sexual partners who pressure them to have sex without a condom. In both crude and adjusted regression analyses, time to incident STI was shorter among women who reported physical or sexual abuse in the year before study enrollment (HRRadj = 1.68, 95% CI 1.06, 2.65). Conclusions Women with a recent history of abuse are at significantly greater risk of incident STIs than nonvictims. PMID:19245303
Potassium-argon (argon-argon), structural fabrics
Cosca, Michael A.; Rink, W. Jack; Thompson, Jereon
2014-01-01
Definition: 40Ar/39Ar geochronology of structural fabrics: The application of 40Ar/39Ar methods to date development of structural fabrics in geologic samples. Introduction: Structural fabrics develop during rock deformation at variable pressures (P), temperatures (T), fluid compositions (X), and time (t). Structural fabrics are represented in rocks by features such as foliations and shear zones developed at the mm to km scale. In ideal cases, the P-T-X history of a given structural fabric can be constrained using stable isotope, cation exchange, and/or mineral equilibria thermobarometry (Essene 1989). The timing of structural fabric development can be assessed qualitatively using geologic field observations or quantitatively using isotope-based geochronology. High-precision geochronology of the thermal and fluid flow histories associated with structural fabric development can answer fundamental geologic questions including (1) when hydrothermal fluids transported and deposited ore minerals, ...
System and method for clock synchronization and position determination using entangled photon pairs
NASA Technical Reports Server (NTRS)
Shih, Yanhua (Inventor)
2010-01-01
A system and method for clock synchronization and position determination using entangled photon pairs is provided. The present invention relies on the measurement of the second order correlation function of entangled states. Photons from an entangled photon source travel one-way to the clocks to be synchronized. By analyzing photon registration time histories generated at each clock location, the entangled states allow for high accuracy clock synchronization as well as high accuracy position determination.
[The history of correction of refractive errors: spectacles].
Wojtyczkak, E
2000-01-01
A historical analysis of discoveries related to the treatment of defects of vision is described. In particular, opinions on visual processes, optics, and methods of treating myopia, hypermetropia and astigmatism from ancient times through the Middle Ages, the Renaissance and the following centuries are presented. The beginning of the usage of glasses is discussed. Examples of the techniques which have been used to improve the subjective and objective methods of measuring refractive errors are also presented.
Hirsch, Robert M.; Moyer, Douglas; Archfield, Stacey A.
2010-01-01
A new approach to the analysis of long-term surface water-quality data is proposed and implemented. The goal of this approach is to increase the amount of information that is extracted from the types of rich water-quality datasets that now exist. The method is formulated to allow for maximum flexibility in representations of the long-term trend, seasonal components, and discharge-related components of the behavior of the water-quality variable of interest. It is designed to provide internally consistent estimates of the actual history of concentrations and fluxes as well as histories that eliminate the influence of year-to-year variations in streamflow. The method employs the use of weighted regressions of concentrations on time, discharge, and season. Finally, the method is designed to be useful as a diagnostic tool regarding the kinds of changes that are taking place in the watershed related to point sources, groundwater sources, and surface-water nonpoint sources. The method is applied to datasets for the nine large tributaries of Chesapeake Bay from 1978 to 2008. The results show a wide range of patterns of change in total phosphorus and in dissolved nitrate plus nitrite. These results should prove useful in further examination of the causes of changes, or lack of changes, and may help inform decisions about future actions to reduce nutrient enrichment in the Chesapeake Bay and its watershed.
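A minimal sketch of the weighted-regression idea, regressing log concentration on time, log discharge, and seasonal terms, with weights that fall off with distance from the estimation point, is below. The tricube windows, half-widths, and synthetic data are illustrative assumptions, not the calibrated values used for the Chesapeake Bay tributaries:

```python
import numpy as np

def wrtds_estimate(t_obs, q_obs, c_obs, t0, q0, h_t=7.0, h_q=2.0, h_s=0.5):
    """Estimate concentration at decimal time t0 and discharge q0 by a
    weighted regression of log(concentration) on time, log(discharge),
    and seasonal sin/cos terms. Observations are down-weighted by their
    distance from (t0, q0) in time, log-discharge, and season."""
    ln_q = np.log(q_obs)

    def tricube(d, h):
        u = np.clip(np.abs(d) / h, 0.0, 1.0)
        return (1.0 - u**3) ** 3

    dt = np.abs(t_obs - t0) % 1.0
    season = np.minimum(dt, 1.0 - dt)          # circular seasonal distance
    w = (tricube(t_obs - t0, h_t)
         * tricube(ln_q - np.log(q0), h_q)
         * tricube(season, h_s))

    X = np.column_stack([np.ones_like(t_obs), t_obs, ln_q,
                         np.sin(2 * np.pi * t_obs), np.cos(2 * np.pi * t_obs)])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], np.log(c_obs) * sw, rcond=None)
    x0 = np.array([1.0, t0, np.log(q0),
                   np.sin(2 * np.pi * t0), np.cos(2 * np.pi * t0)])
    return float(np.exp(x0 @ beta))
```

Repeating the local fit over a grid of times and discharges yields the internally consistent concentration and flux histories described above; holding discharge fixed while varying time gives the flow-normalized histories.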
Intervention time until discharge for newborns on transition from gavage to exclusive oral feeding.
Medeiros, Andréa Monteiro Correia; Ramos, Blenda Karen Batista; Bomfim, Déborah Letticia Santana Santos; Alvelos, Conceição Lima; Silva, Talita Cardoso da; Barreto, Ikaro Daniel de Carvalho; Santos, Felipe Batista; Gurgel, Ricardo Queiroz
2018-01-01
Purpose To measure the intervention time required for transition from gavage to exclusive oral feeding, comparing newborns exposed exclusively to the mother's breast with those who, in addition to breastfeeding, received supplementation using a cup or baby bottle. Methods Analytical, longitudinal, cohort study conducted with 165 newborns (NB) divided into groups according to severity of medical complications (G1, with no complications; G2, with significant complications), and into subgroups according to feeding mechanism (A and B). All NBs were low birth weight, on Kangaroo Mother Care, and received breast stimulation according to medical prescription and hospital routine. Regarding feeding pattern, subgroup A comprised NBs exclusively breastfed at hospital discharge, whereas subgroup B was composed of NBs fed through cup/bottle at some time during hospitalization. The number of days spent in each stage of transition was recorded for each NB. Results History of clinical complications significantly influenced total intervention time. Study participants in subgroups G1-A (10 days), G1-B (9 days), and G2-A (12 days) displayed greater chances of early discharge compared with those in subgroup G2-B (16 days). Conclusion NBs with no important history of clinical complications displayed greater chances of early hospital discharge. NBs with a significant history of clinical complications that underwent the gavage to exclusive breastfeeding transition presented shorter intervention times than those that required supplementation using cup/bottle. Feeding transition using the gavage-to-exclusive oral feeding technique is recommended for Speech-Language Pathology practice in Neonatology.
Acoustic Full Waveform Inversion to Characterize Near-surface Chemical Explosions
NASA Astrophysics Data System (ADS)
Kim, K.; Rodgers, A. J.
2015-12-01
Recent high-quality, atmospheric overpressure data from chemical high-explosive experiments provide a unique opportunity to characterize near-surface explosions, specifically estimating yield and source time function. Typically, yield is estimated from measured signal features, such as peak pressure, impulse, duration and/or arrival time of acoustic signals. However, the application of full waveform inversion to acoustic signals for yield estimation has not been fully explored. In this study, we apply a full waveform inversion method to local overpressure data to extract accurate pressure-time histories of acoustic sources during chemical explosions. A robust and accurate inversion technique for the acoustic source is investigated using numerical Green's functions that take into account atmospheric and topographic propagation effects. The inverted pressure-time history represents the pressure fluctuation at the source region associated with the explosion and thus provides valuable information about acoustic source mechanisms and characteristics in greater detail. We compare acoustic source properties (i.e., peak overpressure, duration, and non-isotropic shape) of a series of explosions having different emplacement conditions and investigate the relationship of the acoustic sources to the yields of explosions. The time histories of acoustic sources may refine our knowledge of sound-generation mechanisms of shallow explosions, and thereby allow for accurate yield estimation based on acoustic measurements. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
John Dewey on History Education and the Historical Method
ERIC Educational Resources Information Center
Fallace, Thomas D.
2010-01-01
This essay constructs a comprehensive view of Dewey's approach to history, the historical method, and history education. Drawing on Dewey's approach to the subject at the University of Chicago Laboratory School (1896-1904), Dewey's chapter on the historical method in "Logic: A Theory of Inquiry" (1938), and a critique of Dewey's…
Global solutions to the equation of thermoelasticity with fading memory
NASA Astrophysics Data System (ADS)
Okada, Mari; Kawashima, Shuichi
2017-07-01
We consider the initial-history value problem for the one-dimensional equation of thermoelasticity with fading memory. It is proved that if the data are smooth and small, then a unique smooth solution exists globally in time and converges to the constant equilibrium state as time goes to infinity. Our proof is based on a technical energy method which makes use of the strict convexity of the entropy function and the properties of strongly positive definite kernels.
Applications of Generalized Derivatives to Viscoelasticity.
1979-11-01
The response is computed at discrete frequencies, and the inverse transform of the response is evaluated numerically to produce the time history. The major drawback of this method is the arduous task of calculating the inverse transform for every point in time at which the value of the response is required.
Hu, Zhongkai; Jin, Bo; Shin, Andrew Y; Zhu, Chunqing; Zhao, Yifan; Hao, Shiying; Zheng, Le; Fu, Changlin; Wen, Qiaojun; Ji, Jun; Li, Zhen; Wang, Yong; Zheng, Xiaolin; Dai, Dorothy; Culver, Devore S; Alfreds, Shaun T; Rogow, Todd; Stearns, Frank; Sylvester, Karl G; Widen, Eric; Ling, Xuefeng B
2015-01-13
An easily accessible real-time Web-based utility to assess patient risks of future emergency department (ED) visits can help the health care provider guide the allocation of resources to better manage higher-risk patient populations and thereby reduce unnecessary use of EDs. Our main objective was to develop a Health Information Exchange-based, next 6-month ED risk surveillance system in the state of Maine. Data on electronic medical record (EMR) encounters integrated by HealthInfoNet (HIN), Maine's Health Information Exchange, were used to develop the Web-based surveillance system for population-wide prediction of 6-month ED risk. For model development, a retrospective cohort of 829,641 patients with comprehensive clinical histories from January 1 to December 31, 2012, was used for training; the model was then tested on a prospective cohort of 875,979 patients from July 1, 2012, to June 30, 2013. The multivariate statistical analysis identified 101 variables predictive of the defined 6-month risk of a future ED visit: 4 age groups, history of 8 different encounter types, history of 17 primary and 8 secondary diagnoses, 8 specific chronic diseases, 28 laboratory test results, history of 3 radiographic tests, and history of 25 outpatient prescription medications. The c-statistics for the retrospective and prospective cohorts were 0.739 and 0.732 respectively. Integration of our method into the HIN secure statewide data system in real time prospectively validated its performance. Cluster analysis in both the retrospective and prospective analyses revealed discrete subpopulations of high-risk patients, grouped around multiple "anchoring" demographics and chronic conditions. With the Web-based population risk-monitoring enterprise dashboards, the effectiveness of the active case finding algorithm has been validated by clinicians and caregivers in Maine.
The active case finding model and associated real-time Web-based app were designed to track the evolving nature of total population risk, in a longitudinal manner, for ED visits across all payers, all diseases, and all age groups. Therefore, providers can implement targeted care management strategies to the patient subgroups with similar patterns of clinical histories, driving the delivery of more efficient and effective health care interventions. To the best of our knowledge, this prospectively validated EMR-based, Web-based tool is the first one to allow real-time total population risk assessment for statewide ED visits.
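The c-statistics quoted above (0.739 retrospective, 0.732 prospective) measure discrimination: the probability that a randomly chosen patient who went on to visit the ED was scored higher than a randomly chosen patient who did not. A minimal pairwise computation, illustrative rather than the authors' implementation, looks like this:

```python
import numpy as np

def c_statistic(y_true, y_score):
    """C-statistic (equivalently, ROC AUC): the probability that a
    random positive case outranks a random negative case, with ties
    counted as one half."""
    y_true = np.asarray(y_true, dtype=bool)
    y_score = np.asarray(y_score, dtype=float)
    pos, neg = y_score[y_true], y_score[~y_true]
    greater = (pos[:, None] > neg[None, :]).sum()   # concordant pairs
    ties = (pos[:, None] == neg[None, :]).sum()     # tied pairs
    return (greater + 0.5 * ties) / (pos.size * neg.size)
```

A value of 0.5 is chance-level ranking and 1.0 is perfect separation, which is why values around 0.73 indicate useful but imperfect discrimination. (The all-pairs comparison here is O(n²); rank-based formulas scale better for cohorts of this study's size.)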
A Century of Enzyme Kinetic Analysis, 1913 to 2013
Johnson, Kenneth A.
2013-01-01
This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. PMID:23850893
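The numerical-integration approach to progress-curve fitting highlighted in the review rests on a forward model like the following, which integrates the Michaelis-Menten rate law dS/dt = -Vmax·S/(Km + S); the parameter values in the usage below are illustrative, and a full fit would wrap this model in a least-squares search over (Vmax, Km):

```python
import numpy as np
from scipy.integrate import solve_ivp

def progress_curve(S0, Vmax, Km, t_eval):
    """Substrate concentration S(t) obtained by numerically integrating
    the Michaelis-Menten rate equation dS/dt = -Vmax*S/(Km + S),
    for which no explicit elementary closed form exists."""
    sol = solve_ivp(lambda t, S: [-Vmax * S[0] / (Km + S[0])],
                    (t_eval[0], t_eval[-1]), [S0],
                    t_eval=t_eval, rtol=1e-9, atol=1e-12)
    return sol.y[0]
```

A useful correctness check is the implicit integral of the rate law, S(t) + Km·ln S(t) = S0 + Km·ln S0 - Vmax·t, which the numerical solution should satisfy to within the integration tolerance.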
Torsion of the fallopian tube--a late complication of sterilisation.
Sivanesaratnam, V
1986-02-01
Torsion of an intact fallopian tube, unaccompanied by torsion of the ipsilateral ovary, was noted as a complication of bilateral tubal occlusion by the Pomeroy method in a 45-year-old Indian woman. The sterilization was performed 5 years previously, at the time of Cesarean section delivery. The patient presented with a history of pain in the right iliac fossa. Laparotomy showed that the distal segment of the right fallopian tube was twisted 3 times on the distal mesosalpinx and appeared tense and gangrenous. The right ovary was normal and a 2 cm gap was noted between the proximal and distal segments of the tube. As a rare complication of the Pomeroy method, the gap in the tube can allow the distal mesosalpinx to act as a pedicle, and with a long mesosalpinx, the fimbriated segment of the tube lies free and may swing and twist to produce torsion. The occurrence of torsion is further promoted by a vascular disturbance leading to venous congestion, edema, and increased weight of the free fimbrial end of the tube. In those patients with a history of sterilization, torsion of the fallopian tube should be considered in the differential diagnosis of acute lower abdominal pain. Torsion of the fallopian tube has also been reported following other methods of tubal occlusion, including cautery and clips.
[The early history of "Ecstasy"].
Benzenhöfer, U; Passie, T
2006-01-01
There is no consensus in the literature regarding the early history of MDMA (methylenedioxymethamphetamine, so-called "Ecstasy"). Various authors credit the first synthesis of MDMA to the German chemist Fritz Haber, but it appears neither in his doctoral thesis (Berlin 1891) nor in his accompanying articles. The man who first synthesized MDMA was the chemist Dr. Anton Köllisch, who worked for the German pharmaceutical company Merck. He created MDMA as a by-product while trying to synthesize hydrastinine, a styptic substance. In 1912, Merck filed to patent the applied method of preparation. The patent was issued in 1914, yet no pharmaceutical testing followed at that time.
'I Am a Nurse': Oral Histories of African Nurses.
Wall, Barbra Mann; Dhurmah, Krist; Lamboni, Bassan; Phiri, Benson Edwinson
2015-08-01
Much of African history has been written by colonial "masters" and is skewed by cultural bias. The voices of indigenous peoples have largely been ignored. The purpose of this study was to collect the oral histories of African nursing leaders who studied and practiced nursing from the late colonial era (1950s) through decolonization and independence (1960s-70s), in order to better understand their experiences and perspectives. This study relied on historical methodology, grounded specifically within the context of decolonization and independence. The method used was oral history. Oral histories were collected from 13 retired nurses from Mauritius, Malawi, and Togo. Participants' educational and work histories bore the distinct imprint of European educational and medical norms. Nursing education provided a means of earning a living and offered professional advancement and affirmation. Participants were reluctant to discuss the influence of race, but several recalled difficulties in working with both expatriate and indigenous physicians and matrons. Differences in African nurses' experiences were evident at the local level, particularly with regard to language barriers, gender-related divisions, and educational and practice opportunities. The data show that although institutional models and ideas were transported from colonial nursing leaders to African nursing students, the African nurses in this study adapted those models and ideas to meet their own needs. The findings also support the use of storytelling as a culturally appropriate research method. Participants' stories provide a better understanding of how time, place, and social and cultural forces influenced and affected local nursing practices. Their stories also reveal that nursing has held various meanings for participants, including as a means to personal and professional opportunities and as a way to help their countries' citizens.
The Association between Unintended Pregnancy and Violence among Incarcerated Men and Women
Kelly, Patricia J.; Ramaswamy, Megha
2018-01-01
Background In this article, we examine the association between unintended pregnancy and individual and community level indicators of violence in a population of both women and men in the criminal justice system. Methods We conducted a cross-sectional survey with 290 women and 306 men in 3 correctional facilities in Kansas City and used logistic regression models to assess relationships between key independent variables and unintended pregnancy. Findings In gender-specific logistic regression models, women with a history of intimate partner violence were 2.02 times more likely (CI 1.15, 3.56), and those with a history of sexual abuse before age 16 were 1.23 times more likely (CI 1.02–1.49) to have experienced unintended pregnancy. Men or their family members who were victimized by neighborhood violence were 1.82 times more likely to have experienced unintended pregnancy (CI 1.01, 3.28). Discussion These findings suggest the need for gender and community-specific interventions that address the relationship between violence and unintended pregnancy. PMID:23136860
Risk Factors for Self-Reported Cholera Within HIV-Affected Households in Rural Haiti
Cheung, Hoi Ching; Meiselbach, Mark K; Jerome, Gregory; Ternier, Ralph; Ivers, Louise C
2018-01-01
Abstract Background Cholera continues to be a major cause of morbidity and mortality worldwide and is now endemic in Haiti since first being introduced in 2010. Cholera and HIV have significant geographic overlap globally, but little is known about the clinical features and risk of cholera among HIV-infected people and their households. Methods We assessed HIV-affected households originally recruited for a randomized controlled trial of food supplements. We assessed for correlation between household and individual factors and reported history of cholera since 2010 using univariable and multivariable analyses. Results There were 352 HIV-infected household members, 32 with reported history of medically attended cholera, and 1968 other household members, 55 with reported history of medically attended cholera. Among HIV-infected individuals in this study, no variables correlated with reported history of cholera in univariable analyses. Among all household members, known HIV infection (adjusted odds ratio [AOR], 3.75; 95% CI, 2.43–5.79; P < .0001), source of income in the household (AOR, 1.82; 95% CI, 1.05–3.15; P = .034), time required to fetch water (AOR, 1.07 per 5-minute increase; 95% CI, 1.01–1.12; P = .015), and severe household food insecurity (AOR, 3.23; 95% CI, 1.25–8.34; P = .016) were correlated with reported history of cholera in a multivariable analysis. Conclusions Known HIV infection, source of household income, time required to fetch water, and severe household food insecurity were independently associated with reported history of medically attended cholera in HIV-affected households in rural Haiti. Further research is required to better understand the interactions between HIV and cholera. PMID:29942825
Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (ORION)
NASA Technical Reports Server (NTRS)
Mott, Diana L.; Bigler, Mark A.
2017-01-01
NASA uses two HRA assessment methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration of environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value or placeholder as a preliminary estimate, which is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment must consider more than the time available, including factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists, and internal human stresses. The more detailed assessment is still expected to be more realistic than one based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a PRA model that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more problematic. To determine what is expected of future operational parameters, the experience of individuals who were familiar with the systems and processes previously implemented by NASA was used to provide the "best" available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations.
Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules and operational requirements are developed and then finalized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazloom, M.
2008-07-08
The idea of the safe room has been developed to decrease earthquake casualties in masonry buildings. Information obtained from previous ground motions in seismic zones shows that these buildings lack adequate safety against earthquakes. For this reason, an attempt has been made to create safe areas inside existing masonry buildings, which are called safe rooms. The practical method for making these safe areas is to install prefabricated steel frames in some parts of the existing structure. These frames do not carry any service loads before an earthquake. However, if a devastating earthquake happens and the load-bearing walls of the building are destroyed, some parts of the floors, which are in the safe areas, will fall on the roof of the installed frames, and the occupants who have sheltered there will survive. This paper presents, with favorable conclusions, the performance of these frames in a three-storey masonry building undergoing destruction. In fact, the experimental pushover diagram of the safe room located at the ground-floor level of this building is compared with the analytical results, and it is concluded that pushover analysis is a good method for seismic performance evaluation of safe rooms. For time history analysis, the 1940 El Centro, the 2003 Bam, and the 1990 Manjil earthquake records with maximum peak accelerations of 0.35g were utilized, and the design spectrum of Iranian Standard No. 2800-05 for ground type 2 was used for response spectrum analysis. The results of time history, response spectrum and pushover analyses show that the strength and displacement capacity of the steel frames are adequate to accommodate the distortions generated by seismic loads and aftershocks properly.
A high fidelity real-time simulation of a small turboshaft engine
NASA Technical Reports Server (NTRS)
Ballin, Mark G.
1988-01-01
A high-fidelity component-type model and real-time digital simulation of the General Electric T700-GE-700 turboshaft engine were developed for use with current generation real-time blade-element rotor helicopter simulations. A control system model based on the specification fuel control system used in the UH-60A Black Hawk helicopter is also presented. The modeling assumptions and real-time digital implementation methods particular to the simulation of small turboshaft engines are described. The validity of the simulation is demonstrated by comparison with analysis-oriented simulations developed by the manufacturer, available test data, and flight-test time histories.
Durability of Structural Adhesively Bonded System.
1981-06-01
Keywords: Composites, Finite Element Method. That investigation was mainly devoted to the temperature effects in time on the mechanical behavior of fiber-reinforced plastic (FRP) composites and to the effects of environmental-loading history on the mechanical performance of similar FRP composites (which may serve as adherends in structural bonded systems).
The Solution of Large Time-Dependent Problems Using Reduced Coordinates.
1987-06-01
numerical integration schemes for dynamic problems, the algorithm known as Newmark's method. The behavior of the Newmark scheme, as well as the basic... The horizontal displacements at the mid-height and the bottom of the building are shown in figure 4.13. The solution history illustrated is for a
A Contribution to the History of Assessment: How a Conversation Simulator Redeems Socratic Method
ERIC Educational Resources Information Center
Nelson, Robert; Dawson, Phillip
2014-01-01
Assessment in education is a recent phenomenon. Although there were counterparts in former epochs, the term assessment only began to be spoken about in education after the Second World War; and, since that time, views, strategies and concerns over assessment have proliferated according to an uncomfortable dynamic. We fear that, increasingly,…
NASA Astrophysics Data System (ADS)
Kleinman, Leonard
2001-03-01
The history of pseudopotentials from 1934 to the present time will be discussed. The speaker's personal involvement will be described but not to the neglect of the many others who have made huge contributions to the field. We end with the question, 'Is it possible that pseudopotential calculations could be more accurate than those made using the full potential augmented plane wave method?'.
Hartzell, S.; Liu, P.
1996-01-01
A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of its two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a gridwork of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear least-squares inversion for slip amplitudes only. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
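The local-minima-escaping behavior central to the hybrid search above can be illustrated with a bare-bones simulated-annealing loop. This is a sketch only: the authors couple annealing with the downhill simplex method and search over fault-model parameters, whereas the test function, step size, and cooling schedule below are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=2000, seed=1):
    """Minimize f over R^n with a basic simulated-annealing loop.
    Worse moves are accepted with probability exp(-delta/T), which is
    what lets the search escape local minima."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x0), fx
    t = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Double-well test function: local minimum near x = +1, global minimum near x = -1.
f = lambda x: (x[0] ** 2 - 1) ** 2 + 0.3 * x[0]
xb, fb = simulated_annealing(f, [1.0])  # start in the wrong (local) basin
```

Because uphill moves are accepted with probability exp(-delta/T) while the temperature is high, the walk can cross the barrier near x = 0 separating the starting basin from the global one, which a purely downhill method cannot do.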
Stockman, Jamila K.; Syvertsen, Jennifer L.; Robertson, Angela M.; Ludwig-Barron, Natasha T.; Bergmann, Julie N.; Palinkas, Lawrence A.
2014-01-01
BACKGROUND Female-initiated barrier methods for the prevention of HIV may be an effective alternative for drug-using women who are unable to negotiate safe sex, often as a result of physical and/or sexual partner violence. METHODS Utilizing a SAVA (substance abuse, violence, and AIDS) syndemic framework, we qualitatively examined perspectives on female condoms and vaginal microbicides among 18 women with histories of methamphetamine abuse and partner violence in San Diego, CA, USA. FINDINGS Most women were not interested in female condoms due to perceived discomfort, difficulty of insertion, time-intensive effort, and unappealing appearance. Alternatively, most women viewed vaginal microbicides as a useful method. Positive aspects included convenience, ability to disguise as a lubricant, and a sense of control and empowerment. Concerns included possible side effects, timing of application, and unfavorable characteristics of the gel. Acceptability of female-initiated barrier methods was context dependent (i.e., partner type, level of drug use and violence that characterized the sexual relationship). CONCLUSIONS Findings indicate that efforts are needed to address barriers identified for vaginal microbicides to increase its uptake in future HIV prevention trials and marketing of future FDA-approved products. Strategies should address gender-based inequalities (e.g., partner violence) experienced by drug-using women and promote female empowerment. Education on female-initiated barrier methods is also needed for women who use drugs, as well as health care providers and other professionals providing sexual health care and contraception to women with histories of drug use and partner violence. PMID:24837396
Tag SNP selection via a genetic algorithm.
Mahdevar, Ghasem; Zahiri, Javad; Sadeghi, Mehdi; Nowzari-Dalini, Abbas; Ahrabian, Hayedeh
2010-10-01
Single Nucleotide Polymorphisms (SNPs) provide valuable information on human evolutionary history and may lead us to identify genetic variants responsible for complex human diseases. Unfortunately, molecular haplotyping methods are costly, laborious, and time consuming; therefore, algorithms for constructing full haplotype patterns from small available data through computational methods (the tag SNP selection problem) are convenient and attractive. This problem is proved to be NP-hard, so heuristic methods may be useful. In this paper we present a heuristic method based on a genetic algorithm to find a reasonable solution within acceptable time. The algorithm was tested on a variety of simulated and experimental data. In comparison with the exact algorithm, based on a brute-force approach, results show that our method can obtain optimal solutions in almost all cases and runs much faster than the exact algorithm when the number of SNP sites is large. Our software is available upon request to the corresponding author.
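A generic genetic algorithm for this kind of subset-selection problem can be sketched as follows. The toy haplotype matrix, fitness function, and GA parameters are illustrative assumptions, not the authors' encoding:

```python
import random

# Toy haplotype matrix: rows = haplotypes, columns = SNP sites (0/1 alleles).
# Columns 3..5 duplicate columns 0..2, so a small tag set suffices.
HAPS = [
    [0, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 1],
    [1, 1, 0, 1, 1, 0],
]
N_SNPS = len(HAPS[0])

def column(j):
    return tuple(h[j] for h in HAPS)

def fitness(mask):
    """Reward tag sets whose column patterns (or their complements) 'explain'
    every SNP; penalize large tag sets so smaller sets are preferred."""
    tags = {column(j) for j, bit in enumerate(mask) if bit}
    comp = {tuple(1 - a for a in c) for c in tags}
    covered = sum(1 for j in range(N_SNPS) if column(j) in tags | comp)
    return covered - 0.1 * sum(mask)

def genetic_algorithm(pop_size=30, gens=60, pmut=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_SNPS)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                      # elitism: keep the best two
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:10], 2)      # parents from the fittest decile
            cut = rng.randrange(1, N_SNPS)
            child = a[:cut] + b[cut:]           # one-point crossover
            child = [bit ^ (rng.random() < pmut) for bit in child]  # mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = genetic_algorithm()
```

The bitstring encoding (one bit per SNP site) and fitness-with-penalty structure carry over directly to realistic tag SNP selection; only the coverage criterion becomes more elaborate.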
Li, J J; Li, Y F; Li, Q; Liu, X L; Liu, Y L; Shi, W Y; Liu, R
2018-06-05
Objective: To develop a self-designed questionnaire for history collection from patients with vertigo or dizziness, and to analyze its effect on clinical work. Method: An observational study was conducted of 69 patients who had undergone vestibular function tests in our department with main complaints of vertigo or dizziness. Information was extracted from the questionnaires filled in by the patients themselves and from the inpatient medical records written by doctors. The differences between the questionnaire and the medical record in their ability to reflect clinically important information were investigated. Result: The questionnaire is more comprehensive and meticulous for history collection and outperforms the inpatient medical documents. It covers the character of the vertigo, duration, frequency of attacks, time of onset, inducing and aggravating factors, relation to position and posture, concomitant symptoms, ear condition, vision, headache, conditions of other systems, consciousness, medication, VAS score of instability, previous history, personal history, family history, and positive examination results. It has a better detection rate for vertigo frequency, duration, suspected otolith involvement, and vestibular compensation than the inpatient medical record (P<0.05). Conclusion: This self-designed questionnaire can help doctors collect the medical history of patients with vertigo or dizziness. It is worthy of clinical promotion as an important supplement to inpatient and outpatient medical records. Copyright© by the Editorial Department of Journal of Clinical Otorhinolaryngology Head and Neck Surgery.
Krampen, Günter
Examines scientometrically the trends in and the recent situation of research on and teaching of the history of psychology in the German-speaking countries, and compares the findings with the situation in other countries (mainly the United States) by means of the psychology databases PSYNDEX and PsycINFO. Declines in publications on the history of psychology since the 1990s are described scientometrically for both research communities. Some impulses are suggested for the future of research on and teaching of the history of psychology. These include (1) the necessity and significance of an intensified use of quantitative, unobtrusive scientometric methods in historiography in times of digital "big data", (2) the necessity and possibilities of integrating qualitative and quantitative methodologies in historical research and teaching, (3) the reasonableness of interdisciplinary cooperation among specialist historians, scientometricians, and psychologists, and (4) the meaningfulness and necessity of exploring, investigating, and teaching more intensively the past and the problem history of psychology, as well as the understanding of the subject matter of psychology in its historical development in cultural contexts. The outlook on the future of such a more up-to-date research on and teaching of the history of psychology is, with some caution, positive.
Kashif, Muhammad; Bonnety, Jérôme; Guibert, Philippe; Morin, Céline; Legros, Guillaume
2012-12-17
A Laser Extinction Method has been set up to provide two-dimensional soot volume fraction field time history at a tunable frequency up to 70 Hz inside an axis-symmetric diffusion flame experiencing slow unsteady phenomena preserving the symmetry. The use of a continuous wave laser as the light source enables this repetition rate, which is an incremental advance in the laser extinction technique. The technique is shown to allow a fine description of the soot volume fraction field in a flickering flame exhibiting a 12.6 Hz flickering phenomenon. Within this range of repetition rate, the technique and its subsequent post-processing require neither any method for time-domain reconstruction nor any correction for energy intrusion. Possibly complemented by such a reconstruction method, the technique should support further soot volume fraction database in oscillating flames that exhibit characteristic times relevant to the current efforts in the validation of soot processes modeling.
NASA Technical Reports Server (NTRS)
1979-01-01
The computer program Linear SCIDNT, which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data, is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.
A method for determining the weak statistical stationarity of a random process
NASA Technical Reports Server (NTRS)
Sadeh, W. Z.; Koper, C. A., Jr.
1978-01-01
A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
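The core of the procedure above, segmenting one long record into an "equivalent ensemble" and checking that the ensemble averages are time-invariant, can be sketched in a few lines of Python. The tolerance, record count, and test signals are illustrative assumptions, not the paper's specific variance tests:

```python
import random
import statistics

def equivalent_ensemble(x, n_records):
    """Segment one long time history into equal, contiguous sub-records,
    treated as an 'equivalent ensemble' of independent realizations."""
    m = len(x) // n_records
    return [x[i * m:(i + 1) * m] for i in range(n_records)]

def ensemble_mean_history(records):
    """Ensemble average taken across records at each within-record time index."""
    m = len(records[0])
    return [statistics.fmean(r[t] for r in records) for t in range(m)]

def is_weakly_stationary(x, n_records=50, tol=2.0):
    """Heuristic check: for a weakly stationary process the ensemble-mean
    history should be time-invariant up to sampling noise."""
    hist = ensemble_mean_history(equivalent_ensemble(x, n_records))
    return max(hist) - min(hist) < tol

rng = random.Random(42)
stationary = [rng.gauss(0.0, 1.0) for _ in range(20000)]             # constant mean
drifting = [0.01 * i + rng.gauss(0.0, 1.0) for i in range(20000)]    # drifting mean
```

For the signals above, the stationary series passes the check while the drifting one fails: the trend makes the ensemble-mean history vary systematically within each record. The same construction extends to equivalent-ensemble autocorrelations, as in the paper's tests.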
ERIC Educational Resources Information Center
Snyder, Jennifer
2012-01-01
Students often have a hard time equating time spent on art history as time well spent in the art room. Likewise, art teachers struggle with how to keep interest in their classrooms high when the subject turns to history. Some teachers show endless videos, with the students nodding sleepily along to the narrator. Others try to incorporate small…
Strange Gravity: Toward a Unified Theory of Joint Warfighting
2001-05-31
A Brief History of Time: From...74. 115 Carl von Clausewitz, On War, 606. 116 Stephen Hawking, A Brief History of Time, 169. 63 APPENDIX Clausewitz on the Center of Gravity This...Storm over Iraq: Airpower in the Gulf War. Washington: Smithsonian Institution Press, 1992. Hawking, Stephen. A Brief History of Time: From
Star Formation Histories of Local Group Dwarf Galaxies. (Ludwig Biermann Award Lecture 1996)
NASA Astrophysics Data System (ADS)
Grebel, E. K.
The star formation histories of dwarf galaxies in the Local Group are reviewed. First the question of Local Group membership is considered based on various criteria. The properties of 31 (36) galaxies are consistent with likely (potential) Local Group membership. To study the star formation histories of these galaxies, a multi-parameter problem needs to be solved: Ages, metallicities, population fractions, and spatial variations must be determined, which depend crucially on the knowledge of reddening and distance. The basic methods for studying resolvable stellar populations are summarized. One method is demonstrated using the Fornax dwarf spheroidal galaxy. A comprehensive compilation of the star formation histories of dwarf irregulars, dwarf ellipticals, and dwarf spheroidals in the Local Group is presented and visualized through Hodge's population boxes. All galaxies appear to have differing fractions of old and intermediate-age populations, and those sufficiently massive and undisturbed to retain and recycle their gas are still forming stars today. Star formation has occurred either in distinct episodes or continuously over long periods of time. Metallicities and enrichment vary widely. Constraints on merger and remnant scenarios are discussed, and a unified picture based on the current knowledge is presented. Primary goals for future observations are: accurate age determinations based on turnoff photometry, detection of subpopulations distinct in age, metallicity, and/or spatial distribution; improved distances; and astrometric studies to derive orbits and constrain past and future interactions.
Matteson, Kristen A; Munro, Malcolm G; Fraser, Ian S
2011-09-01
Abnormal uterine bleeding (AUB) is a prevalent symptom that encompasses abnormalities in menstrual regularity, duration, frequency, and/or volume, and it is encountered frequently by both primary care physicians and obstetrician-gynecologists. Research on AUB has used numerous methods to measure bleeding and assess symptoms, but the lack of universally accepted outcome measures hinders the quality of research and the ability of clinical investigators to collaborate in multicenter trials. Similarly, clinical care for women reporting heavy, prolonged, or irregular menstrual bleeding is not optimized because standard ways of evaluating symptoms, and change in symptoms over time, do not exist. This article (1) describes the current methods of evaluating women with AUB, both in research and clinical care, and (2) offers suggestions for the development of a standardized structured menstrual history for use in both research and clinical care. © Thieme Medical Publishers.
The History of a Decision: A Standard Vibration Test Method for Qualification
Rizzo, Davinia; Blackburn, Mark
2017-01-01
As Mil-Std-810G and subsequent versions have included multiple degree of freedom vibration test methodologies, it is important to understand the history and factors that drove the original decision in Mil-Std-810 to focus on single degree of freedom (SDOF) vibration testing. By assessing the factors and thought process behind early Mil-Std-810 vibration test methods, one can better consider the use of multiple degree of freedom testing now that it is feasible with today's technology and documented in Mil-Std-810. This paper delves into the details of the decision made in the 1960s for the SDOF vibration testing standards in Mil-Std-810, beyond the limitations of technology at the time. We also consider the implications for effective test planning today, considering the advances in test capabilities and improvements in understanding of the operational environment.
Composition measurements of binary mixture droplets by rainbow refractometry.
Wilms, J; Weigand, B
2007-04-10
So far, refractive index measurements by rainbow refractometry have been used to determine the temperature of single droplets and ensembles of droplets. Rainbow refractometry is, for the first time, to the best of our knowledge, applied to measure composition histories of evaporating, binary mixture droplets. An evaluation method is presented that makes use of Airy theory and the simultaneous size measurement by Mie scattering imaging. The method further includes an empirical correction function for a certain diameter and refractive index range. The measurement uncertainty was investigated by numerical simulations with Lorenz-Mie theory. For the experiments, an optical levitation setup was used allowing for long measurement periods. Temperature measurements of single-component droplets at different temperature levels are shown to demonstrate the accuracy of rainbow refractometry. Measurements of size and composition histories of binary mixture droplets are presented for two different mixtures. Experimental results show good agreement with numerical results using a rapid-mixing model.
Spatial methods for deriving crop rotation history
NASA Astrophysics Data System (ADS)
Mueller-Warrant, George W.; Trippe, Kristin M.; Whittaker, Gerald W.; Anderson, Nicole P.; Sullivan, Clare S.
2017-08-01
Benefits of converting 11 years of remote sensing classification data into cropping history of agricultural fields included measuring lengths of rotation cycles and identifying specific sequences of intervening crops grown between final years of old grass seed stands and establishment of new ones. Spatial and non-spatial methods were complementary. Individual-year classification errors were often correctable in spreadsheet-based non-spatial analysis, whereas their presence in spatial data generally led to exclusion of fields from further analysis. Markov-model testing of non-spatial data revealed that year-to-year cropping sequences did not match average frequencies for transitions among crops grown in western Oregon, implying that rotations into new grass seed stands were influenced by growers' desires to achieve specific objectives. Moran's I spatial analysis of length of time between consecutive grass seed stands revealed that clustering of fields was relatively uncommon, with high and low value clusters only accounting for 7.1 and 6.2% of fields.
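Moran's I, the clustering statistic used in the study above, is straightforward to compute for a toy grid of fields. A minimal sketch with binary rook-adjacency weights (the grid and values are illustrative, not the study's data):

```python
def morans_i(values, coords):
    """Moran's I spatial autocorrelation with binary rook-adjacency weights.
    I > 0 means like values cluster; I < 0 means neighbors tend to differ."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = 0.0
    w_sum = 0.0
    for i, (xi, yi) in enumerate(coords):
        for j, (xj, yj) in enumerate(coords):
            if i != j and abs(xi - xj) + abs(yi - yj) == 1:  # rook neighbors
                num += dev[i] * dev[j]
                w_sum += 1.0
    denom = sum(d * d for d in dev)
    return (n / w_sum) * (num / denom)

# A 4x4 grid of 'fields' with a binary attribute (e.g., short vs long rotation).
coords = [(x, y) for y in range(4) for x in range(4)]
clustered = [0 if x < 2 else 1 for (x, y) in coords]   # left/right halves
checker = [(x + y) % 2 for (x, y) in coords]           # alternating values
```

For the half-and-half grid I = 2/3 (strong positive clustering), while the checkerboard gives I = -1 (perfect dispersion); values near zero, as the study reports for most fields, indicate little spatial clustering.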
Automatic detection of key innovations, rate shifts, and diversity-dependence on phylogenetic trees.
Rabosky, Daniel L
2014-01-01
A number of methods have been developed to infer differential rates of species diversification through time and among clades using time-calibrated phylogenetic trees. However, we lack a general framework that can delineate and quantify heterogeneous mixtures of dynamic processes within single phylogenies. I developed a method that can identify arbitrary numbers of time-varying diversification processes on phylogenies without specifying their locations in advance. The method uses reversible-jump Markov Chain Monte Carlo to move between model subspaces that vary in the number of distinct diversification regimes. The model assumes that changes in evolutionary regimes occur across the branches of phylogenetic trees under a compound Poisson process and explicitly accounts for rate variation through time and among lineages. Using simulated datasets, I demonstrate that the method can be used to quantify complex mixtures of time-dependent, diversity-dependent, and constant-rate diversification processes. I compared the performance of the method to the MEDUSA model of rate variation among lineages. As an empirical example, I analyzed the history of speciation and extinction during the radiation of modern whales. The method described here will greatly facilitate the exploration of macroevolutionary dynamics across large phylogenetic trees, which may have been shaped by heterogeneous mixtures of distinct evolutionary processes.
Aschenbrenner, Andrew J.; Balota, David A.; Gordon, Brian A.; Ratcliff, Roger; Morris, John C.
2015-01-01
Objective A family history of Alzheimer disease (AD) increases the risk of developing AD and can influence the accumulation of well-established AD biomarkers. There is some evidence that family history can influence episodic memory performance even in cognitively normal individuals. We attempted to replicate the effect of family history on episodic memory and used a specific computational model of binary decision making (the diffusion model) to understand precisely how family history influences cognition. Finally, we assessed the sensitivity of model parameters to family history controlling for standard neuropsychological test performance. Method Across two experiments, cognitively healthy participants from the Adult Children Study completed an episodic recognition test consisting of high and low frequency words. The diffusion model was applied to decompose accuracy and reaction time into latent parameters which were analyzed as a function of family history. Results In both experiments, individuals with a family history of AD exhibited lower recognition accuracy and this occurred in the absence of an apolipoprotein E (APOE) ε4 allele. The diffusion model revealed this difference was due to changes in the quality of information accumulation (the drift rate) and not differences in response caution or other model parameters. This difference remained after controlling for several standard neuropsychological tests. Conclusions These results confirm that the presence of a family history of AD confers a subtle cognitive deficit in episodic memory as reflected by decreased drift rate that cannot be attributed to APOE. This measure may serve as a novel cognitive marker of preclinical AD. PMID:26192539
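The diffusion model referred to above can be sketched as a noisy random walk toward one of two absorbing bounds, where a lower drift rate (poorer quality of evidence accumulation) yields lower accuracy. This is a minimal illustration with illustrative parameters, not the fitting procedure the study used:

```python
import random

def diffusion_trial(drift, bound=1.0, noise=1.0, dt=0.001, rng=None):
    """One diffusion-model trial: evidence starts at 0 and accumulates with
    drift plus Gaussian noise until hitting +bound (correct) or -bound (error).
    Returns (correct, decision_time)."""
    rng = rng or random
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return x >= bound, t

def accuracy(drift, n_trials=1000, seed=7):
    rng = random.Random(seed)
    hits = sum(diffusion_trial(drift, rng=rng)[0] for _ in range(n_trials))
    return hits / n_trials

high = accuracy(1.5)   # strong evidence accumulation
low = accuracy(0.5)    # weaker drift, as with the family-history deficit
```

In the abstract's terms, reducing only the drift rate degrades accuracy while leaving response caution (the bound) and the other parameters untouched, which is exactly the dissociation the model decomposition is designed to detect.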
Deterministically estimated fission source distributions for Monte Carlo k-eigenvalue problems
Biondo, Elliott D.; Davidson, Gregory G.; Pandya, Tara M.; ...
2018-04-30
The standard Monte Carlo (MC) k-eigenvalue algorithm involves iteratively converging the fission source distribution using a series of potentially time-consuming inactive cycles before quantities of interest can be tallied. One strategy for reducing the computational time requirements of these inactive cycles is the Sourcerer method, in which a deterministic eigenvalue calculation is performed to obtain an improved initial guess for the fission source distribution. This method has been implemented in the Exnihilo software suite within SCALE using the SPN or SN solvers in Denovo and the Shift MC code. The efficacy of this method is assessed with different Denovo solution parameters for a series of typical k-eigenvalue problems, including small criticality benchmarks, full-core reactors, and a fuel cask. Here it is found that, in most cases, when a large number of histories per cycle are required to obtain a detailed flux distribution, the Sourcerer method can be used to reduce the computational time requirements of the inactive cycles.
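The benefit of a good initial source guess can be illustrated with the deterministic analogue of the inactive-cycle iteration: power iteration for a dominant eigenpair, where a near-converged starting vector needs fewer iterations than a flat one. The 2x2 matrix and tolerances below are illustrative assumptions, unrelated to Denovo or Shift:

```python
def power_iteration(A, source, tol=1e-10, max_iters=10000):
    """Converge the dominant eigenpair of A by repeated application, the
    deterministic analogue of converging a fission source over inactive
    cycles. Returns (eigenvalue k, normalized source, iterations used)."""
    k_old = 0.0
    for it in range(1, max_iters + 1):
        y = [sum(row[j] * source[j] for j in range(len(source))) for row in A]
        k = sum(y)                       # source is normalized to sum to 1
        source_new = [yi / k for yi in y]
        if abs(k - k_old) < tol and max(
                abs(a - b) for a, b in zip(source_new, source)) < tol:
            return k, source_new, it
        k_old, source = k, source_new
    return k_old, source, max_iters

# Toy 'reactor': dominant eigenvalue 1.0 with eigenvector (0.75, 0.25).
A = [[0.9, 0.3], [0.1, 0.7]]
k_flat, s_flat, it_flat = power_iteration(A, [0.5, 0.5])     # flat initial guess
k_good, s_good, it_good = power_iteration(A, [0.74, 0.26])   # deterministic-style guess
```

The flat guess needs several more iterations than the near-converged one, mirroring how a deterministic estimate of the fission source shortens the inactive cycles before tallying can begin.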
Light scattering methods to test inorganic PCMs for application in buildings
NASA Astrophysics Data System (ADS)
De Paola, M. G.; Calabrò, V.; De Simone, M.
2017-10-01
Thermal performance and stability over time are key parameters for the characterization and application of PCMs in the building sector. Generally, inorganic PCMs are dispersions of hydrated salts and additives in water that counteract phase segregation phenomena and subcooling. Traditional or "in-house" methods can be used for evaluating thermal properties, while stability can be estimated over time by using optical techniques. Following this double approach, in this work thermal and structural analyses of Glauber salt based composite PCMs are conducted by means of non-conventional equipment: the T-history method (thermal analysis) and Turbiscan (stability analysis). Three samples with the same composition (Glauber salt with additives) were prepared using different sonication times, and their thermal performances were compared by testing both thermal cycling and thermal properties. The stability of the mixtures was verified by identifying destabilization phenomena, evaluating the migration velocities of particles, and estimating the variation of particle size.
NASA Astrophysics Data System (ADS)
Rana, Sachin; Ertekin, Turgay; King, Gregory R.
2018-05-01
Reservoir history matching is frequently viewed as an optimization problem that involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, typically requiring a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train GP proxy models for any given data. The history matching solutions are found via Bayesian optimization (BO) on the forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS finds history match solutions in approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF, and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true values for the PUNQ-S3 reservoir.
Multiresolution forecasting for futures trading using wavelet decompositions.
Zhang, B L; Coggins, R; Jabri, M A; Dersch, D; Flower, B
2001-01-01
We investigate the effectiveness of a financial time-series forecasting strategy that exploits the multiresolution property of the wavelet transform. A financial series is decomposed into an overcomplete, shift-invariant, scale-related representation. In transform space, each individual wavelet series is modeled by a separate multilayer perceptron (MLP). We apply the Bayesian method of automatic relevance determination to choose short past windows (short-term history) as inputs to the MLPs at lower scales and long past windows (long-term history) at higher scales. To form the overall forecast, the individual forecasts are then recombined by the linear reconstruction property of the inverse transform with the chosen autocorrelation shell representation, or by another perceptron which learns the weight of each scale in the prediction of the original time series. The forecast results are then passed to a money management system to generate trades.
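The decompose-then-recombine machinery can be sketched with the simplest wavelet, the Haar transform. The series and level count are illustrative; the paper uses an autocorrelation shell representation and per-scale MLPs rather than this pyramid Haar transform:

```python
def haar_decompose(x):
    """One-level Haar split: pairwise averages (approximation) and pairwise
    differences (detail). len(x) must be even."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Exact inverse of haar_decompose: recombine the scales linearly."""
    x = []
    for a, d in zip(approx, detail):
        x.extend([a + d, a - d])
    return x

def multiresolution(x, levels):
    """Multi-level decomposition: one detail series per scale, plus the
    coarsest approximation (the long-term trend)."""
    details = []
    for _ in range(levels):
        x, d = haar_decompose(x)
        details.append(d)
    return x, details

series = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
coarse, details = multiresolution(series, 2)
```

Each detail list is the candidate input to one per-scale forecaster; because the inverse transform is linear, per-scale forecasts recombine into a forecast of the original series exactly as the coefficients recombine in reconstruction.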
Signal processing of aircraft flyover noise
NASA Technical Reports Server (NTRS)
Kelly, Jeffrey J.
1991-01-01
A detailed analysis of signal processing concerns for measuring aircraft flyover noise is presented. Development of a de-Dopplerization scheme for both corrected time history and spectral data is discussed, along with an analysis of motion effects on measured spectra. A computer code was written to implement the de-Dopplerization scheme. Input to the code is the aircraft position data and the pressure time histories. To facilitate ensemble averaging, a uniform level flyover is considered, but the code can accept more general flight profiles. The effects of spectral smearing and its removal are discussed. Using data acquired from an XV-15 tilt rotor flyover test, comparisons are made between the measured and corrected spectra. Frequency shifts are accurately accounted for by the method. It is shown that correcting for spherical spreading, Doppler amplitude, and frequency can give some idea about source directivity. The analysis indicated that smearing increases with frequency and is more severe on approach than on recession.
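The frequency side of de-Dopplerization can be sketched with the standard moving-source relation f_obs = f_src / (1 - M cos θ). The speeds, angles, and function names below are illustrative assumptions, not the paper's scheme, which also resamples the time histories and corrects amplitude:

```python
import math

def doppler_shift(f_source, speed, angle_deg, c=340.0):
    """Observed frequency for a source moving at `speed` (m/s), where
    angle_deg is the angle between the flight path and the source-to-
    microphone line (0 deg = approaching head-on)."""
    mach = speed / c
    return f_source / (1.0 - mach * math.cos(math.radians(angle_deg)))

def de_dopplerize(f_observed, speed, angle_deg, c=340.0):
    """Invert the shift to recover the emitted (source) frequency."""
    mach = speed / c
    return f_observed * (1.0 - mach * math.cos(math.radians(angle_deg)))

f_emit = 100.0                                 # e.g. a rotor harmonic, Hz
f_app = doppler_shift(f_emit, 80.0, 20.0)      # approach: shifted up
f_rec = doppler_shift(f_emit, 80.0, 160.0)     # recession: shifted down
```

On approach (small θ) the observed frequency is shifted upward and on recession downward; multiplying by (1 - M cos θ) at each instant recovers the emitted frequency, which is what restores smeared tones to their true spectral bins.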
Star formation history of the galaxy merger Mrk848 with SDSS-IV MaNGA
NASA Astrophysics Data System (ADS)
Yuan, Fang-Ting; Shen, Shiyin; Hao, Lei; Fernandez, Maria Argudo
2017-03-01
With the 3D spectra of SDSS-IV MaNGA (Bundy et al. 2015) and multi-wavelength SED modeling, we expect to gain a better understanding of the distribution of dust, gas, and star formation in galaxy mergers. As a case study of the merging galaxy Mrk848, we use both the UV-to-IR broadband SED and the MaNGA integral field spectroscopy to obtain its star formation histories in the tail and core regions. From the SED fitting and full spectral fitting, we find that star formation in the tail regions was affected by the interaction earlier than in the core regions. The core regions show two apparent episodes of star formation, with a strong burst within the last 500 Myr, indicating that the recent star formation was triggered by the interaction. The star formation histories derived from these two methods are basically consistent.
Exploring the Earth's crust: history and results of controlled-source seismology
Prodehl, Claus; Mooney, Walter D.
2012-01-01
This volume contains a comprehensive, worldwide history of seismological studies of the Earth’s crust using controlled sources from 1850 to 2005. Essentially all major seismic projects on land and the most important oceanic projects are covered. The time period 1850 to 1939 is presented as a general synthesis, and from 1940 onward the history and results are presented in separate chapters for each decade, with the material organized by geographical region. Each chapter highlights the major advances achieved during that decade in terms of data acquisition, processing technology, and interpretation methods. For all major seismic projects, the authors provide specific details on field observations, interpreted crustal cross sections, and key references. They conclude with global and continental-scale maps of all field measurements and interpreted Moho contours. An accompanying DVD contains important out-of-print publications and an extensive collection of controlled-source data, location maps, and crustal cross sections.
Real time charge efficiency monitoring for nickel electrodes in NICD and NIH2 cells
NASA Astrophysics Data System (ADS)
Zimmerman, A. H.
1987-09-01
The charge efficiency of nickel-cadmium and nickel-hydrogen battery cells is critical in spacecraft applications for determining the amount of time required for a battery to reach a full state of charge. As the nickel-cadmium or nickel-hydrogen batteries approach about 90 percent state of charge, the charge efficiency begins to drop towards zero, making estimation of the total amount of stored charge uncertain. Charge efficiency estimates are typically based on prior history of available capacity following standardized conditions for charge and discharge. These methods work well as long as performance does not change significantly. A relatively simple method for determining charge efficiencies during real time operation for these battery cells would be a tremendous advantage. Such a method was explored and appears to be quite well suited for application to nickel-cadmium and nickel-hydrogen battery cells. The charge efficiency is monitored in real time, using only voltage measurements as inputs. With further evaluation such a method may provide a means to better manage charge control of batteries, particularly in systems where a high degree of autonomy or system intelligence is required.
Real time charge efficiency monitoring for nickel electrodes in NICD and NIH2 cells
NASA Technical Reports Server (NTRS)
Zimmerman, A. H.
1987-01-01
NASA Astrophysics Data System (ADS)
Young, A. J.; Kuiken, T. A.; Hargrove, L. J.
2014-10-01
Objective. The purpose of this study was to determine the contribution of electromyography (EMG) data, in combination with a diverse array of mechanical sensors, to locomotion mode intent recognition in transfemoral amputees using powered prostheses. Additionally, we determined the effect of adding time history information using a dynamic Bayesian network (DBN) for both the mechanical and EMG sensors. Approach. EMG signals from the residual limbs of amputees have been proposed to enhance pattern recognition-based intent recognition systems for powered lower limb prostheses, but mechanical sensors on the prosthesis (such as inertial measurement units, position and velocity sensors, and load cells) may be just as useful. EMG and mechanical sensor data were collected from 8 transfemoral amputees using a powered knee/ankle prosthesis over basic locomotion modes such as walking, slopes and stairs. An offline study was conducted to determine the benefit of different sensor sets for predicting intent. Main results. EMG information was not as accurate alone as mechanical sensor information (p < 0.05) for any classification strategy. However, EMG in combination with the mechanical sensor data did significantly reduce intent recognition errors (p < 0.05) both for transitions between locomotion modes and steady-state locomotion. The sensor time history (DBN) classifier significantly reduced error rates compared to a linear discriminant classifier for steady-state steps, without increasing the transitional error, for both EMG and mechanical sensors. Combining EMG and mechanical sensor data with sensor time history reduced the average transitional error from 18.4% to 12.2% and the average steady-state error from 3.8% to 1.0% when classifying level-ground walking, ramps, and stairs in eight transfemoral amputee subjects. Significance. These results suggest that a neural interface in combination with time history methods for locomotion mode classification can enhance intent recognition performance; this strategy should be considered for future real-time experiments.
Probabilistic seismic history matching using binary images
NASA Astrophysics Data System (ADS)
Davolio, Alessandra; Schiozer, Denis Jose
2018-02-01
Currently, the goal of history-matching procedures is not only to provide a model matching the observed data but also to generate multiple matched models to properly handle uncertainties. One such approach is a probabilistic history-matching methodology based on the discrete Latin hypercube sampling algorithm, proposed in previous works, which was particularly efficient for matching well data (production rates and pressure). 4D seismic (4DS) data have been increasingly included in history-matching procedures. A key issue in seismic history matching (SHM) is to transfer data into a common domain: impedance, amplitude, or pressure and saturation. In any case, seismic inversion and/or modeling are required, which can be time consuming. An alternative that avoids these procedures is to use binary images in SHM, as they allow the shape, rather than the physical values, of observed anomalies to be matched. This work presents the incorporation of binary images in SHM within the aforementioned probabilistic history matching. The application was performed with real data from a segment of the Norne benchmark case that presents strong 4D anomalies, including softening signals due to pressure build-up. The binary images are used to match the pressurized zones observed in time-lapse data. Three history matchings were conducted using: only well data, well and 4DS data, and only 4DS data. The methodology is very flexible and successfully accommodated the addition of binary images to the seismic objective functions. Results showed good convergence of the method within a few iterations for all three cases. The matched models of the first two cases provided the best results, with similar well-matching quality. The second case provided models with pore pressure changes that follow the expected dynamic behavior (pressurized zones) observed in the 4DS data. The use of binary images in SHM is relatively new, with few examples in the literature. This work enriches this discussion by presenting a new application to match pressure in a reservoir segment with complex pressure behavior.
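The core idea of matching the shape of a 4D anomaly rather than its physical values can be illustrated with a simple binary-map mismatch objective. The thresholding step and the Jaccard-style distance below are illustrative assumptions, not the paper's exact objective function:

```python
import numpy as np

def binarize(attr_map, threshold):
    """Turn a seismic attribute map into a binary anomaly map."""
    return np.asarray(attr_map) > threshold

def binary_mismatch(observed, simulated):
    """Jaccard-style distance between observed and simulated anomaly shapes:
    0.0 means identical anomaly regions, 1.0 means no overlap at all."""
    obs = np.asarray(observed, dtype=bool)
    sim = np.asarray(simulated, dtype=bool)
    union = np.logical_or(obs, sim).sum()
    if union == 0:
        return 0.0                      # no anomaly in either map
    inter = np.logical_and(obs, sim).sum()
    return 1.0 - inter / union
```

An objective of this form needs no seismic inversion of the simulated model into calibrated physical units; only the spatial footprint of the pressurized zone has to be reproduced, which is what makes the binary-image route attractive.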
Andrianakis, Ioannis; Vernon, Ian R.; McCreesh, Nicky; McKinley, Trevelyan J.; Oakley, Jeremy E.; Nsubuga, Rebecca N.; Goldstein, Michael; White, Richard G.
2015-01-01
Advances in scientific computing have allowed the development of complex models that are being routinely applied to problems in disease epidemiology, public health and decision making. The utility of these models depends in part on how well they can reproduce empirical data. However, fitting such models to real world data is greatly hindered both by large numbers of input and output parameters, and by long run times, such that many modelling studies lack a formal calibration methodology. We present a novel method that has the potential to improve the calibration of complex infectious disease models (hereafter called simulators). We present this in the form of a tutorial and a case study where we history match a dynamic, event-driven, individual-based stochastic HIV simulator, using extensive demographic, behavioural and epidemiological data available from Uganda. The tutorial describes history matching and emulation. History matching is an iterative procedure that reduces the simulator's input space by identifying and discarding areas that are unlikely to provide a good match to the empirical data. History matching relies on the computational efficiency of a Bayesian representation of the simulator, known as an emulator. Emulators mimic the simulator's behaviour, but are often several orders of magnitude faster to evaluate. In the case study, we use a 22-input simulator, fitting its 18 outputs simultaneously. After 9 iterations of history matching, a non-implausible region of the simulator input space was identified that was substantially smaller than the original input space. Simulator evaluations made within this region were found to have a 65% probability of fitting all 18 outputs. History matching and emulation are useful additions to the toolbox of infectious disease modellers. Further research is required to explicitly address the stochastic nature of the simulator as well as to account for correlations between outputs. PMID:25569850
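The discarding step in history matching is usually driven by an implausibility measure: the standardized distance between the empirical target and the emulator's prediction. A minimal version follows the standard formulation; the variance decomposition and the cutoff of 3 are conventional choices in the history-matching literature, not specifics taken from this study.

```python
import math

def implausibility(z, emulator_mean, emulator_var, obs_var, discrepancy_var=0.0):
    """Standardized distance between the empirical target z and the
    emulator's prediction; large values mark an input as implausible.
    Uncertainty from the emulator, the observation, and model
    discrepancy all widen the tolerance."""
    total_var = emulator_var + obs_var + discrepancy_var
    return abs(z - emulator_mean) / math.sqrt(total_var)

def non_implausible(z, mean, var, obs_var, cutoff=3.0):
    """Keep an input point only if every output's implausibility is
    below the cutoff; otherwise the point is discarded from the
    non-implausible region for the next iteration."""
    return all(implausibility(zi, mi, vi, ov) <= cutoff
               for zi, mi, vi, ov in zip(z, mean, var, obs_var))
```

Each iteration ("wave") refits the emulator only on the surviving region, so the emulator sharpens exactly where the match is still plausible.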
Research and Research Methods in Geographical Education.
ERIC Educational Resources Information Center
Graves, Norman J., Ed.
This collection of papers examines research methods in geographical education in nine countries. "Research Methods in the History of Geographical Education" (William Marsden, the United Kingdom) examines the methods used and some of the research undertaken in the history of geographical education. "Research Methods in Investigating…
NASA Technical Reports Server (NTRS)
Quinn, Robert D.; Gong, Leslie
2000-01-01
This report describes a method that can calculate transient aerodynamic heating and transient surface temperatures at supersonic and hypersonic speeds. This method can rapidly calculate temperature and heating rate time-histories for complete flight trajectories. Semi-empirical theories are used to calculate laminar and turbulent heat transfer coefficients and a procedure for estimating boundary-layer transition is included. Results from this method are compared with flight data from the X-15 research vehicle, YF-12 airplane, and the Space Shuttle Orbiter. These comparisons show that the calculated values are in good agreement with the measured flight data.
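As an illustration of the kind of quantity such a code tracks along a trajectory, the adiabatic-wall (recovery) temperature that drives aerodynamic heating follows directly from the Mach number and ambient temperature. The textbook relation below is a sketch of one ingredient, not the report's semi-empirical method; the recovery factor r ≈ 0.89 is a typical turbulent-boundary-layer value, assumed here.

```python
def recovery_temperature(t_ambient_k, mach, r=0.89, gamma=1.4):
    """Adiabatic-wall temperature: Taw = T * (1 + r*(gamma-1)/2 * M^2)."""
    return t_ambient_k * (1.0 + r * (gamma - 1.0) / 2.0 * mach ** 2)

def heating_history(trajectory, r=0.89):
    """Map a flight trajectory [(time_s, mach, T_ambient_K), ...] to a
    recovery-temperature time history, point by point."""
    return [(t, recovery_temperature(T, m, r)) for t, m, T in trajectory]
```

A full heating code like the one described would combine a driving temperature of this kind with semi-empirical heat transfer coefficients and a transition estimate to integrate surface temperature over the whole trajectory.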
Lorenz, Tierney K.; Harte, Christopher B.; Meston, Cindy M.
2015-01-01
Introduction Women with histories of childhood sexual abuse (CSA) have higher rates of sexual difficulties, as well as high sympathetic nervous system (SNS) response to sexual stimuli. Aim To examine whether treatment-related changes in autonomic balance, as indexed by heart rate variability (HRV), were associated with changes in sexual arousal and orgasm function. Methods In Study 1, we measured HRV while writing a sexual essay in 42 healthy, sexually functional women without any history of sexual trauma. These data, along with demographics, were used to develop HRV norms equations. In Study 2, 136 women with a history of CSA were randomized to one of three active expressive writing treatments that focused on their trauma, sexuality, or daily life (control condition). We recorded HRV while writing a sexual essay at pre-treatment, post-treatment, and 2-week, 1-month, and 6-month follow-ups; we also calculated the expected HRV for each participant based on the norms equations from Study 1. Main Outcome Measures Heart rate variability, the Female Sexual Function Index (FSFI), and the Sexual Satisfaction Scale – Women (SSS-W). Results The difference between expected and observed HRV decreased over time, indicating that, post-treatment, CSA survivors displayed HRV closer to the expected HRV of a demographics-matched woman with no history of sexual trauma. Also, over time, participants whose HRV became less dysregulated showed the biggest gains in sexual arousal and orgasm function. These effects were consistent across condition. Conclusions Treatments that reduce autonomic imbalance may improve sexual wellbeing among CSA populations. PMID:25963394
NOVAM Evaluation Utilizing Electro-Optics and Meteorological Data from KEY-90
1993-09-01
Excerpts from the report's list of figures: (21) ... from TNO lidar; (22) a segment of time history of the aircraft altitude determined from the NRL data for 14 July 1990; (23) a time history of the optical depth between the NRL aircraft and the ocean surface on 14 July 1990; (24) ... of two sets of lidar shots taken at different times and places on 14 July 1990; (25) a time history of the boundary-layer
Seed after-ripening and dormancy determine adult life history independently of germination timing.
de Casas, Rafael Rubio; Kovach, Katherine; Dittmar, Emily; Barua, Deepak; Barco, Brenden; Donohue, Kathleen
2012-05-01
• Seed dormancy can affect life history through its effects on germination time. Here, we investigate its influence on life history beyond the timing of germination. • We used the response of Arabidopsis thaliana to chilling at the germination and flowering stages to test the following: how seed dormancy affects germination responses to the environment; whether variation in dormancy affects adult phenology independently of germination time; and whether environmental cues experienced by dormant seeds have an effect on adult life history. • Dormancy conditioned the germination response to low temperatures, such that prolonged periods of chilling induced dormancy in nondormant seeds, but stimulated germination in dormant seeds. The alleviation of dormancy through after-ripening was associated with earlier flowering, independent of germination date. Experimental dormancy manipulations showed that prolonged chilling at the seed stage always induced earlier flowering, regardless of seed dormancy. Surprisingly, this effect of seed chilling on flowering time was observed even when low temperatures did not induce germination. • In summary, seed dormancy influences flowering time and hence life history independent of its effects on germination timing. We conclude that the seed stage has a pronounced effect on life history, the influence of which goes well beyond the timing of germination. © 2012 The Authors. New Phytologist © 2012 New Phytologist Trust.
World History. Volumes I and II. [Sahuarita High School Career Curriculum Project].
ERIC Educational Resources Information Center
Hoffman, Judy
Volumes I and II of a world history course, part of a high school career curriculum project, are outlined. Objectives are listed by course title. Course titles include: Early Communication - Languages and Writing; World History; Law and Order in Ancient Times; Early Transportation; Women in Ancient Times; Art and Literature in Ancient Times;…
The Effects of Art History-Enriched Art Therapy on Anxiety, Time on Task, and Art Product Quality.
ERIC Educational Resources Information Center
Miller, Carol L.
1993-01-01
Investigated effects of art history enrichment of art therapy task on anxiety, time on task, and art product quality among 13 chronic adult psychiatric day hospital patients. Results indicated art history enrichment task reduced anxiety and increased time on task. Art organization level tended toward significant increase compared with control…
An Economic History of Medicare Part C
Mcguire, Thomas G; Newhouse, Joseph P; Sinaiko, Anna D
2011-01-01
Context: Twenty-five years ago, private insurance plans were introduced into the Medicare program with the stated dual aims of (1) giving beneficiaries a choice of health insurance plans beyond the fee-for-service Medicare program and (2) transferring to the Medicare program the efficiencies and cost savings achieved by managed care in the private sector. Methods: In this article we review the economic history of Medicare Part C, known today as Medicare Advantage, focusing on the impact of major changes in the program's structure and of plan payment methods on trends in the availability of private plans, plan enrollment, and Medicare spending. Additionally, we compare the experience of Medicare Advantage and of employer-sponsored health insurance with managed care over the same time period. Findings: Beneficiaries' access to private plans has been inconsistent over the program's history, with higher plan payments resulting in greater choice and enrollment and vice versa. But Medicare Advantage generally has cost more than the traditional Medicare program, an overpayment that has increased in recent years. Conclusions: Major changes in Medicare Advantage's payment rules are needed to simultaneously encourage the participation of private plans, promote the provision of high-quality care, and save Medicare money. PMID:21676024
NASA Astrophysics Data System (ADS)
Beaman, Joseph
2015-03-01
Starting in the late 1980s, several new technologies were created that have the potential to revolutionize manufacturing. These technologies are, for the most part, additive processes that build up parts layer by layer. In addition, the processes that are being touted for hard-core manufacturing are primarily laser- or e-beam-based processes. This presentation gives a brief history of Additive Manufacturing and an assessment of these technologies. These technologies initially grew out of a commercial need for rapid prototyping, a market with different requirements for process and quality control than traditional manufacturing. The relatively poor process control of existing commercial Additive Manufacturing equipment is a vestige of this history. This presentation discusses this history and improvements in quality over time. The emphasis is on Additive Manufacturing processes that are being considered for direct manufacturing, which is a different market than the 3D Printing ``Makerbot'' market. Topics discussed include the machine sensors, materials, and operational methods used in the past and those used today to create manufactured parts. Finally, a discussion of new methods and future directions of AM is presented.
Enchanted rendezvous: John C. Houbolt and the genesis of the lunar-orbit rendezvous concept
NASA Technical Reports Server (NTRS)
Hansen, James R.
1995-01-01
This is the fourth publication of the 'Monographs in Aerospace History' series, prepared by the NASA History Office. These publications are intended to be tightly focused in terms of subject, relatively short in length, and reproduced to allow timely and broad dissemination to researchers in aerospace history. This publication details the arguments of John C. Houbolt, an engineer at the Langley Research Center in Hampton, Virginia, in his 1961-1962 campaign to support the lunar-orbit rendezvous (LOR). The LOR was eventually selected during Project Apollo as the method of flying to the Moon, landing on the surface, and returning to Earth. Under the LOR approach, the entire lunar spacecraft was sent up in one launch; it then entered lunar orbit and dispatched a small lander to the lunar surface. It was the simplest of the various methods, in terms of both development and operational costs, but it was risky: there was no room for error, or the crew could not get home, and the more difficult maneuvers had to be performed after the spacecraft was committed to a circumlunar flight. Houbolt was one of the most vocal supporters of the LOR.
The management of kidney stones as suggested by Goeury-Duvivier.
Bellinghieri, Guido; Satta, Ersilia; Savica, Vincenzo; Gembillo, Guido; Salvo, Antonino; Buemi, Michele; Santoro, Domenico
2016-02-01
The management of kidney stones has always been a major problem for doctors of every era. Goeury-Duvivier, in his masterpiece "Guide des malades atteints d'affections des voies urinaires ou des organes de la génération chez l'homme et chez la femme", shows us the different kinds of diseases that affect the urinary tract and, in particular, lists the main methods that have characterized the treatment of renal calculi throughout history. Duvivier gives us descriptions of the invasive innovative techniques of the time (the taille, lithotripsy, and lithotomy) and the negative effects or limits of each technique. He also describes the different kinds of palliative methods used in the 19th century to treat renal lithiasis, together with clinical case reports of the time.
Hashemi, Hassan; Khabazkhoob, Mehdi; Emamian, Mohammad Hassan; Shariati, Mohammad; Mohazzab-Torabi, Saman; Fotouhi, Akbar
2015-01-01
Purpose: The purpose of this study was to determine the prevalence of a history of ocular trauma and its association with age, sex, and biometric components. Materials and Methods: Residents of Shahroud, Iran aged 40–64 years were sampled through a cross-sectional study using multistage cluster sampling. Three hundred clusters were randomly selected, and 20 individuals were systematically selected from each cluster. The subjects underwent optometric and ophthalmic examinations, and ocular imaging. A history of ocular trauma was determined through personal interviews. Results: The prevalence of a history of any trauma, blunt trauma, sharp trauma, and chemical burns was 8.57%, 3.91%, 3.82%, and 1.93%, respectively. After adjusting for age, the rate of all types of trauma was significantly higher for males. Only the prevalence of chemical burns significantly decreased with aging. A history of hospitalization was stated by 1.64% of the subjects. The axial length was significantly longer in cases with a history of trauma. The corneal curvature was significantly larger in cases with a history of sharp trauma and chemical burns. The prevalence of corneal opacities was significantly higher among cases with a history of blunt trauma (odds ratio [OR] = 2.33) and sharp trauma (OR = 4.46). Based on corrected visual acuity, the odds of blindness were 3.32 times higher in those with a history of ocular trauma (P < 0.001). Conclusion: A considerable proportion of the 40–64-year-old population reported a history of ocular trauma. This observation has important health implications. Blindness, corneal opacities, and posterior subcapsular cataract were observed more frequently among these cases, and they demonstrated differences in some ocular biometric components. PMID:26180480
NASA Technical Reports Server (NTRS)
Hubbard, Harvey H.; Shepherd, Kevin P.
1990-01-01
Available information on the physical characteristics of the noise generated by wind turbines is summarized, with example sound pressure time histories, narrow- and broadband frequency spectra, and noise radiation patterns. Reviewed are noise measurement standards, analysis technology, and a method of characterizing wind turbine noise. Prediction methods are given for both low-frequency rotational harmonics and broadband noise components. Also included are atmospheric propagation data showing the effects of distance and refraction by wind shear. Human perception thresholds, based on laboratory and field tests, are given. Building vibration analysis methods are summarized. The bibliography of this report lists technical publications on all aspects of wind turbine acoustics.
Uncovering History for Future History Teachers
ERIC Educational Resources Information Center
Fischer, Fritz
2010-01-01
The art of history teaching is at a crossroads. Recent scholarship focuses on the need to change the teaching of history so students can better learn history, and insists that history teachers must move beyond traditional structures and methods of teaching in order to improve their students' abilities to think with history. This article presents…
Basal accretion, a major mechanism for mountain building in Taiwan revealed in rock thermal history
NASA Astrophysics Data System (ADS)
Chen, Chih-Tung; Chan, Yu-Chang; Lo, Ching-Hua; Malavieille, Jacques; Lu, Chia-Yu; Tang, Jui-Ting; Lee, Yuan-Hsi
2018-02-01
Deep tectonic processes are key integral components in the evolution of mountain belts, while observations of their temporal development are generally obscured by thermal resetting, retrograde alteration and structural overprinting. Here we recorded an integrated rock time-temperature history for the first time in the pro-wedge part of the active Taiwan arc-continent collision, starting from sedimentation through the cleavage-forming state to its final exhumation. The integrated thermal and age results from the Raman Spectroscopy of Carbonaceous Material (RSCM) method, zircon U-Pb laser ablation dating, and in-situ 40Ar/39Ar laser microprobe dating suggest that the basal accretion process was crucial to the development of the Taiwanese orogenic wedge. The basal accretion process commenced early in the mountain building history (∼6 Ma) and gradually migrated to greater depths, as constrained by persistent plate convergence and cleavage formation under a nearly isothermal state at similar depths until ∼2.5 Ma recorded in the early-accreted units. Such development essentially contributed to mountain root growth by the increased depth of the wedge detachment and the downward wedge thickening during the incipient to full collision stages in the Taiwan mountain belt.
NASA Astrophysics Data System (ADS)
Hess Webber, Shea A.; Thompson, Barbara J.; Kwon, Ryun Young; Ireland, Jack
2018-01-01
An improved understanding of the kinematic properties of CMEs and CME-associated phenomena has several impacts: 1) a less ambiguous method of mapping propagating structures into their inner coronal manifestations, 2) a clearer view of the relationship between the “main” CME and CME-associated brightenings, and 3) an improved identification of the heliospheric sources of shocks, Type II bursts, and SEPs. We present the results of a mapping technique that facilitates the separation of CMEs and CME-associated brightenings (such as shocks) from background corona. The Time Convolution Mapping Method (TCMM) segments coronagraph data to identify the time history of coronal evolution, the advantage being that the spatiotemporal evolution profiles allow users to separate features with different propagation characteristics. For example, separating “main” CME mass from CME-associated brightenings or shocks is a well-known obstacle, which the TCMM aids in differentiating. A TCMM CME map is made by first recording the maximum value each individual pixel in the image reaches during the traversal of the CME. Then the maximum value is convolved with an index to indicate the time that the pixel reached that value. The TCMM user is then able to identify continuous “kinematic profiles,” indicating related kinematic behavior, and also identify breaks in the profiles that indicate a discontinuity in kinematic history (i.e. different structures or different propagation characteristics). The maps obtained from multiple spacecraft viewpoints (i.e., STEREO and SOHO) can then be fit with advanced structural models to obtain the 3D properties of the evolving phenomena. We will also comment on the TCMM's further applicability toward the tracking of prominences, coronal hole boundaries and coronal cavities.
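The per-pixel bookkeeping behind a TCMM-style map can be sketched as follows. The exact convolution/encoding used by the authors is not specified in the abstract, so this version simply pairs each pixel's peak brightness with the frame index at which it peaked; treat that pairing as an assumption.

```python
import numpy as np

def tcmm_map(frames):
    """frames: array of shape (n_times, ny, nx) from a coronagraph sequence.

    Returns (peak, t_peak): the maximum value each pixel reaches during
    the CME traversal, and the time index at which it reaches it. A
    coherently propagating front shows up as a smooth gradient in
    t_peak, while breaks in that gradient hint at distinct structures
    (e.g., "main" CME mass vs a CME-associated shock)."""
    frames = np.asarray(frames, dtype=float)
    peak = frames.max(axis=0)        # per-pixel maximum over time
    t_peak = frames.argmax(axis=0)   # time index of that maximum
    return peak, t_peak
```

For example, a bright feature sweeping across the field of view produces a t_peak map that increases monotonically along the propagation direction, which is the "kinematic profile" a TCMM user would trace.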
Bidirectional light-scattering image processing method for high-concentration jet sprays
NASA Astrophysics Data System (ADS)
Shimizu, I.; Emori, Y.; Yang, W.-J.; Shimoda, M.; Suzuki, T.
1985-01-01
In order to study the distributions of droplet size and volume density in high-concentration jet sprays, a new technique is developed, which combines the forward and backward light scattering method and an image processing method. A pulsed ruby laser is used as the light source. The Mie scattering theory is applied to the results obtained from image processing on the scattering photographs. The time history is obtained for the droplet size and volume density distributions, and the method is demonstrated by diesel fuel sprays under various injecting conditions. The validity of the technique is verified by a good agreement in the injected fuel volume distributions obtained by the present method and by injection rate measurements.
Digital methods for the history of psychology: Introduction and resources.
Fox Lee, Shayna
2016-02-01
At the York University Digital History of Psychology Laboratory, we have been working on projects that explore what digital methodologies have to offer historical research in our field. This piece provides perspective on the history and theory of digital history, as well as introductory resources for those who are curious about incorporating these methods into their own work. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
ERIC Educational Resources Information Center
Healy, Daniel J.
2014-01-01
The jazz ensemble represents an important performance opportunity in many school music programs. Due to the cultural history of jazz as an improvisatory art form, school jazz ensemble directors must address methods of teaching improvisation concepts to young students. Progress has been made in the field of prescribed improvisation activities and…
The Role of Student Growth Percentiles in Monitoring Learning and Predicting Learning Outcomes
ERIC Educational Resources Information Center
Seo, Daeryong; McGrane, Joshua; Taherbhai, Husein
2015-01-01
Most formative assessments rely on the performance status of a student at a particular time point. However, such a method does not provide any information on the "propensity" of the student to achieve a predetermined target score or whether the student is performing as per the expectations from identical students with the same history of…
The Political Uses of Sign Language: The Case of the French Revolution
ERIC Educational Resources Information Center
Rosenfeld, Sophia
2005-01-01
The story of the Abbe de l'Epee's "methodical signs" is best known as a key moment in Deaf history. However, at the time of the French Revolution this story served a larger political function. The example of de l'Epee's deaf students, and their seemingly miraculous command of ideas learned through gestural signs, helped the French…
Helicopter Noise Reduction Design Trade-Off Study
1977-01-01
level, A-weighted sound pressure level, perceived noise level and tone-corrected perceived noise level time histories, and are further analyzed to...BASELINE VEHICLE EFFECTIVE PERCEIVED NOISE LEVELS (EPNL) AND RANGE FOR MAXIMUM TONE CORRECTED PERCEIVED...and tone-corrected perceived noise level (PNLT) units. All noise level calculation methods have been computerized in the FORTRAN language for use on the
Aversive Intervention: Research and Reflection
ERIC Educational Resources Information Center
Rice, Deanna K.; Kohler, Patty
2012-01-01
From the beginning of time, there has been a pervasive interest in getting some members of the human race to conform to the wishes of other members of the human race. The use of positive as well as not so positive methods have been utilized, researched and espoused upon. The current paper presents a review of the vast literature on the history of…
Risk factors for the development of colorectal carcinoma: A case control study from South India
Iswarya, Santhana Krishnan; Premarajan, Kariyarath Cheriyath; Kar, Sitanshu Sekhar; Kumar, Sathasivam Suresh; Kate, Vikram
2016-01-01
AIM: To study the association of colorectal carcinoma (CRC) with diet, smoking, alcohol, physical activity, body mass index, family history and diabetes. METHODS: All consecutive patients with CRC confirmed by histopathology diagnosis were included. Age (± 5 years) and gender matched controls were selected among the patients admitted in the surgery ward for various conditions without any co-existing malignancy. A food frequency questionnaire (FFQ) was developed and validated after pretesting by an investigator trained in data collection techniques. Cases and controls were interviewed ensuring privacy, in a similar interview setting, with the same duration of time for both cases and controls, without any leading questions. Biological variables such as family history of CRC in first degree relatives and history of diabetes mellitus, and behavioral factors such as tobacco use (both smoking and smokeless forms), alcohol consumption and physical activity, were recorded. Dietary details were recorded using a FFQ consisting of 29 food items in seven categories. Analysis was done using appropriate statistical methods. RESULTS: Ninety-four histopathologically confirmed cases of CRC and an equal number of age and gender matched controls treated over a period of two years were studied. Age distribution, mean age, male to female ratio, education level and socioeconomic status were similar in cases and controls. Intake of food items was categorized into tertiles owing to the skewed distribution of subjects relative to the recommended cut-offs for consumption of each food item. On univariate analysis, red meat [OR = 7.4 (2.935-18.732)], egg [OR = 5.1 (2.26-11.36)], fish, fried food and oil consumption were found to be risk factors for CRC. On multivariate analysis, red meat consumption of more than 2-3 times a month (OR = 5.4; 95%CI: 1.55-19.05) and egg consumption of more than 2-3 times a week (OR = 3.67; 95%CI: 1.23-9.35) were found to be independent risk factors for the development of CRC. 
CONCLUSION: Egg and red meat consumption were found to be independent risk factors for CRC. Smoking, alcohol, physical activity and family history were not associated with increased risk. PMID:26909135
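Odds ratios with Wald confidence intervals, as quoted in the abstract above, can be reproduced from a 2×2 exposure table. A minimal sketch; the cell counts below are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 exposure table.
    a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    # standard error of ln(OR) from the four cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for a single exposure (e.g., frequent red meat intake):
or_, lo, hi = odds_ratio_ci(40, 54, 15, 79)
```

Matched case-control designs are often analyzed with conditional methods instead; the unconditional Wald interval shown here is only the simplest textbook estimate.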
Zone clearance in an infinite TASEP with a step initial condition
NASA Astrophysics Data System (ADS)
Cividini, Julien; Appert-Rolland, Cécile
2017-06-01
The TASEP is a paradigmatic model of out-of-equilibrium statistical physics, for which many quantities have been computed, either exactly or by approximate methods. In this work we study two new kinds of observables that have some relevance in biological or traffic models. They represent the probability for a given clearance zone of the lattice to be empty (for the first time) at a given time, starting from a step density profile. Exact expressions are obtained for single-time quantities, while more involved history-dependent observables are studied by Monte Carlo simulation, and partially predicted by a phenomenological approach.
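The history-dependent observables studied by Monte Carlo in the abstract above can be illustrated with a toy random-sequential TASEP started from a step density profile. Everything below (lattice size, zone, time horizon, update scheme) is an illustrative assumption, not the authors' implementation:

```python
import random

def tasep_zone_empty_prob(L=200, zone=range(100, 105), t_max=50,
                          trials=200, seed=1):
    """Monte Carlo estimate of P(clearance zone empty at time t_max) for a
    TASEP started from a step profile (left half occupied, right half empty).
    Random-sequential updates approximate continuous time: L update attempts
    correspond roughly to one unit of time. A toy sketch only."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        occ = [i < L // 2 for i in range(L)]  # step initial condition
        for _ in range(t_max * L):
            i = rng.randrange(L - 1)
            if occ[i] and not occ[i + 1]:      # hop right if target empty
                occ[i], occ[i + 1] = False, True
        if not any(occ[j] for j in zone):
            hits += 1
    return hits / trials

p = tasep_zone_empty_prob()
```

First-passage ("empty for the first time") statistics would additionally require checking the zone after every sweep rather than only at the final time.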
History and development of the Schmidt-Hunter meta-analysis methods.
Schmidt, Frank L
2015-09-01
In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM felt a need for a better way to demonstrate test validity, especially in light of court cases challenging selection methods. In response, we created our method of meta-analysis (initially called validity generalization). Results showed that most of the variability of validity estimates from study to study was because of sampling error and other research artifacts such as variations in range restriction and measurement error. Corrections for these artifacts in our research and in replications by others showed that the predictive validity of most tests was high and generalizable. This conclusion challenged long-standing beliefs and so provoked resistance, which over time was overcome. The 1982 book that we published extending these methods to research areas beyond personnel selection was positively received and was followed by expanded books in 1990, 2004, and 2014. Today, these methods are being applied in a wide variety of areas. Copyright © 2015 John Wiley & Sons, Ltd.
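The core of the validity-generalization argument sketched above is a comparison of the observed variance of validity coefficients with the variance expected from sampling error alone. A "bare-bones" sketch under that simplification; the coefficients below are hypothetical, and the artifact corrections (range restriction, measurement error) described in the article are omitted:

```python
def bare_bones_meta(rs, ns):
    """Bare-bones meta-analysis in the Schmidt-Hunter style:
    sample-size-weighted mean correlation, observed variance across studies,
    expected sampling-error variance, and the share of observed variance
    attributable to sampling error. Artifact corrections omitted."""
    N = sum(ns)
    rbar = sum(r * n for r, n in zip(rs, ns)) / N
    var_obs = sum(n * (r - rbar) ** 2 for r, n in zip(rs, ns)) / N
    n_mean = N / len(ns)
    # expected sampling variance of r around rbar for the average study size
    var_e = (1 - rbar ** 2) ** 2 / (n_mean - 1)
    pct = min(1.0, var_e / var_obs) if var_obs > 0 else 1.0
    return rbar, var_obs, var_e, pct

# Hypothetical validity coefficients from four small-N studies:
rbar, var_obs, var_e, pct = bare_bones_meta([0.20, 0.35, 0.10, 0.30],
                                            [60, 75, 50, 90])
```

When `pct` approaches 1, the study-to-study spread is consistent with sampling error alone, which is the pattern the abstract reports for personnel selection validities.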
Randomized Controlled Evaluation of an Early Intervention to Prevent Post-Rape Psychopathology
Resnick, Heidi; Acierno, Ron; Waldrop, Angela E.; King, Lynda; King, Daniel; Danielson, Carla; Ruggiero, Kenneth J.; Kilpatrick, Dean
2007-01-01
A randomized between-group design was used to evaluate efficacy of a video intervention to reduce PTSD and other mental health problems, implemented prior to the forensic medical exam conducted within 72 hours post-sexual assault. Participants were 140 female victims of sexual assault (68 video/72 nonvideo) ages 15 or older. Assessments were targeted for 6 weeks (Time 1) and 6 months (Time 2) post-assault. At Time 1, the intervention was associated with lower scores on measures of PTSD and depression among women with prior rape history relative to scores among women with prior rape history in the standard care condition. At Time 2, depression scores were also lower among those with a prior history who were in the video relative to standard care condition. Small effects indicating higher PTSD and BAI scores among women without a prior history in the video condition were observed at Time 1. Accelerated longitudinal growth curve analysis indicated a video x prior rape history interaction for PTSD, yielding four patterns of symptom trajectory over time. Women with a prior rape history in the video condition generally maintained the lowest level of symptoms. PMID:17585872
Randomized controlled evaluation of an early intervention to prevent post-rape psychopathology.
Resnick, Heidi; Acierno, Ron; Waldrop, Angela E; King, Lynda; King, Daniel; Danielson, Carla; Ruggiero, Kenneth J; Kilpatrick, Dean
2007-10-01
A randomized between-group design was used to evaluate the efficacy of a video intervention to reduce post-traumatic stress disorder (PTSD) and other mental health problems, implemented prior to the forensic medical examination conducted within 72 h post-sexual assault. Participants were 140 female victims of sexual assault (68 video/72 nonvideo) aged 15 years or older. Assessments were targeted for 6 weeks (Time 1) and 6 months (Time 2) post-assault. At Time 1, the intervention was associated with lower scores on measures of PTSD and depression among women with a prior rape history relative to scores among women with a prior rape history in the standard care condition. At Time 2, depression scores were also lower among those with a prior rape history who were in the video relative to the standard care condition. Small effects indicating higher PTSD and Beck Anxiety Inventory (BAI) scores among women without a prior rape history in the video condition were observed at Time 1. Accelerated longitudinal growth curve analysis indicated a video × prior rape history interaction for PTSD, yielding four patterns of symptom trajectory over time. Women with a prior rape history in the video condition generally maintained the lowest level of symptoms.
Downscaling ocean conditions: Experiments with a quasi-geostrophic model
NASA Astrophysics Data System (ADS)
Katavouta, A.; Thompson, K. R.
2013-12-01
The predictability of small-scale ocean variability, given the time history of the associated large scales, is investigated using a quasi-geostrophic model of two wind-driven gyres separated by an unstable, mid-ocean jet. Motivated by the recent theoretical study of Henshaw et al. (2003), we propose a straightforward method for assimilating information on the large scale in order to recover the small-scale details of the quasi-geostrophic circulation. The similarity of this method to the spectral nudging of limited area atmospheric models is discussed. Results from the spectral nudging of the quasi-geostrophic model, and an independent multivariate regression-based approach, show that important features of the ocean circulation, including the position of the meandering mid-ocean jet and the associated pinch-off eddies, can be recovered from the time history of a small number of large-scale modes. We next propose a hybrid approach for assimilating both the large scales and additional observed time series from a limited number of locations that alone are too sparse to recover the small scales using traditional assimilation techniques. The hybrid approach significantly improved the recovery of the small scales. The results highlight the importance of the coupling between length scales in downscaling applications, and the value of assimilating limited point observations after the large scales have been set correctly. The application of the hybrid and spectral nudging approaches to practical ocean forecasting, and to projecting changes in ocean conditions on climate time scales, is discussed briefly.
Characterization of platelet adhesion under flow using microscopic image sequence analysis.
Machin, M; Santomaso, A; Cozzi, M R; Battiston, M; Mazzuccato, M; De Marco, L; Canu, P
2005-07-01
A method for quantitative analysis of platelet deposition under flow is discussed here. The model system is based upon perfusion of blood platelets over an adhesive substrate immobilized on a glass coverslip acting as the lower surface of a rectangular flow chamber. The perfusion apparatus is mounted onto an inverted microscope equipped with epifluorescent illumination and an intensified CCD video camera. Characterization is based on information obtained from a specific image analysis method applied to continuous sequences of microscopic images. Platelet recognition across the sequence of images is based on a time-dependent, bidimensional, Gaussian-like pdf. Once a platelet is located, the variation of its position and shape as a function of time (i.e., the platelet history) can be determined. Analyzing this history, we can establish whether the platelet is moving on the surface, the frequency of this movement, and the distance traveled before it resumes the velocity of a non-interacting cell. Therefore, we can determine how long the adhesion lasts, which is correlated with the resistance of the platelet-substrate bond. This algorithm enables the dynamic quantification of trajectories, as well as residence times and arrest and release frequencies, for a high number of platelets at the same time. Statistically significant conclusions on platelet-surface interactions can then be obtained. An image analysis tool of this kind can dramatically help the investigation and characterization of the thrombogenic properties of artificial surfaces such as those used in artificial organs and biomedical devices.
Hartzell, S.; Guatteri, Mariagiovanna; Mai, P.M.; Liu, P.-C.; Fisk, M. R.
2005-01-01
In the evolution of methods for calculating synthetic time histories of ground motion for postulated earthquakes, kinematic source models have dominated to date because of their ease of application. Dynamic models, however, which incorporate a physical relationship between the important faulting parameters of stress drop, slip, rupture velocity, and rise time, are becoming more accessible. This article compares a class of kinematic models based on the summation of a fractal distribution of subevent sizes with a dynamic model based on the slip-weakening friction law. Kinematic modeling is done for the frequency band 0.2 to 10.0 Hz; dynamic models are calculated from 0.2 to 2.0 Hz. The strong motion data set for the 1994 Northridge earthquake is used to evaluate and compare the synthetic time histories. Source models are propagated to the far field by convolution with 1D and 3D theoretical Green’s functions. In addition, the kinematic model is used to evaluate the importance of propagation path effects: velocity structure, scattering, and nonlinearity. At present, the kinematic model gives a better broadband fit to the Northridge ground motion than the simple slip-weakening dynamic model. In general, the dynamic model overpredicts rise times and produces insufficient shorter-period energy. Within the context of the slip-weakening model, the Northridge ground motion requires a short slip-weakening distance, on the order of 0.15 m or less. A more complex dynamic model including rate weakening, or one that allows shorter rise times near the hypocenter, may fit the data better.
Time reversal imaging and cross-correlations techniques by normal mode theory
NASA Astrophysics Data System (ADS)
Montagner, J.; Fink, M.; Capdeville, Y.; Phung, H.; Larmat, C.
2007-12-01
Time-reversal methods were successfully applied in the past to acoustic waves in many fields such as medical imaging, underwater acoustics, non-destructive testing and, recently, to seismic waves in seismology for earthquake imaging. The increasing power of computers and numerical methods (such as spectral element methods) enables one to simulate more and more accurately the propagation of seismic waves in heterogeneous media and to develop new applications, in particular time reversal in the three-dimensional Earth. Generalizing the scalar approach of Draeger and Fink (1999), the theoretical understanding of the time-reversal method can be addressed for the 3D elastic Earth by using normal mode theory. It is shown how to relate time-reversal methods, on the one hand, with autocorrelation of seismograms for source imaging and, on the other hand, with cross-correlation between receivers for structural imaging and retrieving the Green function. The loss of information will be discussed. In the case of source imaging, automatic location in time and space of earthquakes and unknown sources is obtained by the time-reversal technique. In the case of big earthquakes such as the Sumatra-Andaman earthquake of December 2004, we were able to reconstruct the spatio-temporal history of the rupture. We present here some new applications of these techniques at the global scale, on synthetic tests and on real data.
Response of a tethered aerostat to simulated turbulence
NASA Astrophysics Data System (ADS)
Stanney, Keith A.; Rahn, Christopher D.
2006-09-01
Aerostats are lighter-than-air vehicles tethered to the ground by a cable and used for broadcasting, communications, surveillance, and drug interdiction. The dynamic response of tethered aerostats subject to extreme atmospheric turbulence often dictates survivability. This paper develops a theoretical model that predicts the planar response of a tethered aerostat subject to atmospheric turbulence and simulates the response to 1000 simulated hurricane scale turbulent time histories. The aerostat dynamic model assumes the aerostat hull to be a rigid body with non-linear fluid loading, instantaneous weathervaning for planar response, and a continuous tether. Galerkin's method discretizes the coupled aerostat and tether partial differential equations to produce a non-linear initial value problem that is integrated numerically given initial conditions and wind inputs. The proper orthogonal decomposition theorem generates, based on Hurricane Georges wind data, turbulent time histories that possess the sequential behavior of actual turbulence, are spectrally accurate, and have non-Gaussian density functions. The generated turbulent time histories are simulated to predict the aerostat response to severe turbulence. The resulting probability distributions for the aerostat position, pitch angle, and confluence point tension predict the aerostat behavior in high gust environments. The dynamic results can be up to twice as large as a static analysis indicating the importance of dynamics in aerostat modeling. The results uncover a worst case wind input consisting of a two-pulse vertical gust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavez, Marcus Alexander; Willis, Michael David; Covert, Timothy Todd
2014-09-01
The miniaturization of explosive components has driven the need for a corresponding miniaturization of the current diagnostic techniques available to measure the explosive phenomena. Laser interferometry and the use of spectrally coated optical windows have proven to be an essential interrogation technique to acquire particle velocity time history data in one-dimensional gas gun and relatively large-scale explosive experiments. A new diagnostic technique described herein allows for experimental measurement of apparent particle velocity time histories in microscale explosive configurations and can be applied to shocks/non-shocks in inert materials. The diagnostic, Embedded Fiber Optic Sensors (EFOS), has been tested in challenging microscopic experimental configurations that give confidence in the technique's ability to measure the apparent particle velocity time histories of an explosive with pressure outputs in the tenths of kilobars to several kilobars. Embedded Fiber Optic Sensors also allow for several measurements to be acquired in a single experiment because they are microscopic, thus reducing the number of experiments necessary. The future of EFOS technology will focus on further miniaturization, material selection appropriate for the operating pressure regime, and extensive hydrocode and optical analysis to transform apparent particle velocity time histories into true particle velocity time histories as well as the more meaningful pressure time histories.
VAiRoma: A Visual Analytics System for Making Sense of Places, Times, and Events in Roman History.
Cho, Isaac; Dou, Wenwen; Wang, Derek Xiaoyu; Sauda, Eric; Ribarsky, William
2016-01-01
Learning and gaining knowledge of Roman history is an area of interest for students and citizens at large. This is an example of a subject with great sweep (with many interrelated sub-topics over, in this case, a 3,000-year history) that is hard to grasp by any individual and, in its full detail, is not available as a coherent story. In this paper, we propose a visual analytics approach to construct a data-driven view of Roman history based on a large collection of Wikipedia articles. Extracting and enabling the discovery of useful knowledge on events, places, times, and their connections from large amounts of textual data has always been a challenging task. To this aim, we introduce VAiRoma, a visual analytics system that couples state-of-the-art text analysis methods with an intuitive visual interface to help users make sense of events, places, times, and, more importantly, the relationships between them. VAiRoma goes beyond textual content exploration, as it permits users to compare, make connections, and externalize the findings all within the visual interface. As a result, VAiRoma allows users to learn and create new knowledge regarding Roman history in an informed way. We evaluated VAiRoma with 16 participants through a user study, with the task being to learn about Roman piazzas through finding relevant articles and new relationships. Our study results showed that the VAiRoma system enabled the participants to find more relevant articles and connections compared to Web searches and literature search conducted in a Roman library. Subjective feedback on VAiRoma was also very positive. In addition, we ran two case studies that demonstrate how VAiRoma can be used for deeper analysis, permitting the rapid discovery and analysis of a small number of key documents even when the original collection contains hundreds of thousands of documents.
Computer-assisted versus oral-and-written dietary history taking for diabetes mellitus.
Wei, Igor; Pappas, Yannis; Car, Josip; Sheikh, Aziz; Majeed, Azeem
2011-12-07
Diabetes is a chronic illness characterised by insulin resistance or deficiency, resulting in elevated glycosylated haemoglobin A1c (HbA1c) levels. Diet and adherence to dietary advice are associated with lower HbA1c levels and control of disease. Dietary history may be an effective clinical tool for diabetes management and has traditionally been taken by oral-and-written methods, although it can also be collected using computer-assisted history taking systems (CAHTS). Although CAHTS were first described in the 1960s, there remains uncertainty about the impact of these methods on dietary history collection, clinical care and patient outcomes such as quality of life. To assess the effects of computer-assisted versus oral-and-written dietary history taking on patient outcomes for diabetes mellitus. We searched The Cochrane Library (issue 6, 2011), MEDLINE (January 1985 to June 2011), EMBASE (January 1980 to June 2011) and CINAHL (January 1981 to June 2011). Reference lists of retrieved articles were also searched, and no limits were imposed on language or publication status. Randomised controlled trials of computer-assisted versus oral-and-written history taking in patients with diabetes mellitus. Two authors independently scanned the title and abstract of retrieved articles. Potentially relevant articles were investigated as full text. Studies that met the inclusion criteria were abstracted for relevant population and intervention characteristics, with any disagreements resolved by discussion or by a third party. Risk of bias was similarly assessed independently. Of the 2991 studies retrieved, only one study, with 38 study participants, compared the two methods of history taking over a total of eight weeks. The authors found that as patients became increasingly familiar with using CAHTS, the correlation between patients' food records and computer assessments improved. Reported fat intake decreased in the control group and increased when queried by the computer. 
The effect of the intervention on the management of diabetes mellitus and blood glucose levels was not reported. Risk of bias was considered moderate for this study. Based on one small study judged to be of moderate risk of bias, we tentatively conclude that CAHTS may be well received by study participants and potentially offer time savings in practice. However, more robust studies with larger sample sizes are needed to confirm these findings. We cannot draw any conclusions in relation to any other clinical outcomes at this stage.
Completeness of pedigree and family cancer history for ovarian cancer patients.
Son, Yedong; Lim, Myong Cheol; Seo, Sang Soo; Kang, Sokbom; Park, Sang Yoon
2014-10-01
To investigate the completeness of pedigrees, and the number of pedigree analyses needed to obtain an acceptable familial history, in Korean women with ovarian cancer. Interviews to obtain familial history were conducted with 50 ovarian cancer patients three times over 6 weeks. The completeness of the pedigree was estimated in terms of familial history of disease (cancer), health status (healthy living, disease, and death), and age at onset of disease and death. The completeness of the pedigree was 79.3%, 85.1%, and 85.6% at the 1st, 2nd, and 3rd interviews, and the time for pedigree analysis was 34.3, 10.8, and 3.1 minutes, respectively. The factors limiting pedigree analysis were as follows: out of contact with their relatives (38%), no living ancestors who know the family history (34%), family members dispersed because of the Korean War (16%), unknown cause of death (12%), reluctance to ask about the medical history of relatives (10%), and concealment of their ovarian cancer (10%). The percentage of cancers revealed in 1st (2%) and 2nd degree (8%) relatives increased over the successive surveys, especially colorectal cancer related to Lynch syndrome (4%). Analyzing the pedigree at least two times is acceptable in Korean women with ovarian cancer. The completeness of the pedigree increased, while the time taken to obtain the family history decreased, over the three surveys.
A paleolatitude approach to assessing surface temperature history for use in burial heating models
Barker, Charles E.
2000-01-01
Calculations using heat flow theory as well as case histories show that over geologic time scales (10^6 years), changes in mean annual surface temperature (Ts) on the order of 10°C penetrate kilometers deep into the crust. Thus, burial heating models of sedimentary basins, which typically span kilometers in depth and persist over geological time frames, should consider Ts history to increase their accuracy. In any case, Ts history becomes important when it changes enough to be detected by a thermal maturation index like vitrinite reflectance, a parameter widely used to constrain burial heating models. Assessment of the general temperature conditions leading to petroleum generation indicates that changes in Ts as small as 6°C can be detected by vitrinite reflectance measurements. This low temperature threshold indicates that oil and gas windows can be significantly influenced by Ts history. A review of paleoclimatic factors suggests the significant and geologically resolvable factors affecting Ts history are paleolatitude, long-term changes between cool and warm geological periods (climate mode), the degree to which a basin is removed from the sea (geographic isolation), and elevation or depth relative to sea level. Case studies using geologically realistic data ranges or different methods of estimating Ts in a burial heating model indicate a significant impact of Ts when: (1) continental drift, subduction, tectonism and erosion significantly change paleolatitude, paleoaltitude, or paleogeography; (2) strata are at, or near, maximum burial, and changes in Ts directly influence maximum burial temperature; and (3) a significant change in Ts occurs near the opening or closing of the oil or gas windows, causing petroleum generation to begin or cease. Case studies show that during the burial heating and petroleum generation phase of basin development, changes in climate mode alone can influence Ts by about 15°C. 
At present, Ts changes from the poles to the equator by about 50°C. Thus, in extreme cases, continental drift alone can seemingly produce Ts changes on the order of 50°C over a time frame of 10^7 years.
User's manual for THPLOT, A FORTRAN 77 Computer program for time history plotting
NASA Technical Reports Server (NTRS)
Murray, J. E.
1982-01-01
A general purpose FORTRAN 77 computer program (THPLOT) for plotting time histories using Calcomp pen plotters is described. The program is designed to read a time history data file and to generate time history plots for selected time intervals and/or selected data channels. The capabilities of the program are described. The card input required to define the plotting operation is described and examples of card input and the resulting plotted output are given. The examples are followed by a description of the printed output, including both normal output and error messages. Lastly, implementation of the program is described. A complete listing of the program with reference maps produced by the CDC FTN 5.0 compiler is included.
2017-04-20
Adhesives: Test Method, Group Assignment, and Categorization Guide for High-Loading-Rate Applications – History and Rationale, by Robert Jensen, David Flanagan, Daniel DeSchepper, and Charles...
Life cycle and population growth rate of Caenorhabditis elegans studied by a new method.
Muschiol, Daniel; Schroeder, Fabian; Traunspurger, Walter
2009-05-16
The free-living nematode Caenorhabditis elegans is the predominant model organism in biological research, being used by a huge number of laboratories worldwide. Many researchers have evaluated life-history traits of C. elegans in investigations covering quite different aspects such as ecotoxicology, inbreeding depression and heterosis, dietary restriction/supplement, mutations, and ageing. Such traits include juvenile growth rates, age at sexual maturity, adult body size, age-specific fecundity/mortality, total reproduction, mean and maximum lifespan, and intrinsic population growth rates. However, we found that in life-cycle experiments care is needed regarding protocol design. Here, we test a recently developed method that overcomes some problems associated with traditional cultivation techniques. In this fast and yet precise approach, single individuals are maintained within hanging drops of semi-fluid culture medium, allowing the simultaneous investigation of various life-history traits at any desired degree of accuracy. Here, the life cycles of wild-type C. elegans strains N2 (Bristol, UK) and MY6 (Münster, Germany) were compared at 20°C with 5 × 10^9 Escherichia coli ml^-1 as food source. High-resolution life tables and fecundity schedules of the two strains are presented. Though isolated 700 km and 60 years apart from each other, the two strains barely differed in life-cycle parameters. For strain N2 (n = 69), the intrinsic rate of natural increase (r_m d^-1), calculated according to the Lotka equation, was 1.375, the net reproductive rate (R_0) 291, the mean generation time (T) 90 h, and the minimum generation time (T_min) 73.0 h. The corresponding values for strain MY6 (n = 72) were r_m = 1.460, R_0 = 289, T = 84 h, and T_min = 67.3 h. Peak egg-laying rates in both strains exceeded 140 eggs d^-1. Juvenile and early adulthood mortality was negligible. 
Strain N2 lived, on average, for 16.7 d, while strain MY6 died 2 days earlier; however, differences in survivorship curves were statistically non-significant. We found no evidence that adaptation to the laboratory altered the life history traits of C. elegans strain N2. Our results, discussed in the light of earlier studies on C. elegans, demonstrate certain advantages of the hanging drop method in investigations of nematode life cycles. Assuming that its reproducibility is validated in further studies, the method will reduce the inter-laboratory variability of life-history estimates and may ultimately prove to be more convenient than the current standard methods used by C. elegans researchers.
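The intrinsic rate of natural increase reported above comes from the Euler–Lotka equation, sum over ages x of exp(-r·x)·l(x)·m(x) = 1, which can be solved numerically from a life table and fecundity schedule. A minimal sketch that solves it by bisection; the life table below is hypothetical, not the published N2 schedule:

```python
import math

def lotka_r(ages, lx, mx):
    """Solve the discrete Euler-Lotka equation
    sum(exp(-r*x) * l(x) * m(x)) = 1 for the intrinsic rate of
    increase r by bisection. ages in days; lx survivorship;
    mx age-specific fecundity per age class."""
    def euler(r):
        return sum(math.exp(-r * x) * l * m
                   for x, l, m in zip(ages, lx, mx)) - 1.0
    lo, hi = 0.0, 10.0       # euler() is decreasing in r on this bracket
    for _ in range(100):
        mid = (lo + hi) / 2
        if euler(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical life table (days, survivorship, eggs per age class):
ages = [3.0, 3.5, 4.0, 4.5, 5.0]
lx   = [1.0, 1.0, 1.0, 0.99, 0.99]
mx   = [40, 120, 140, 100, 30]
r  = lotka_r(ages, lx, mx)
R0 = sum(l * m for l, m in zip(lx, mx))  # net reproductive rate
```

With early, concentrated reproduction the solved r is on the order of 1.5 per day, the same regime as the published values, while R0 alone would greatly overstate growth because it ignores generation time.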
Life cycle and population growth rate of Caenorhabditis elegans studied by a new method
Muschiol, Daniel; Schroeder, Fabian; Traunspurger, Walter
2009-01-01
Background The free-living nematode Caenorhabditis elegans is the predominant model organism in biological research, being used by a huge number of laboratories worldwide. Many researchers have evaluated life-history traits of C. elegans in investigations covering quite different aspects such as ecotoxicology, inbreeding depression and heterosis, dietary restriction/supplement, mutations, and ageing. Such traits include juvenile growth rates, age at sexual maturity, adult body size, age-specific fecundity/mortality, total reproduction, mean and maximum lifespan, and intrinsic population growth rates. However, we found that in life-cycle experiments care is needed regarding protocol design. Here, we test a recently developed method that overcomes some problems associated with traditional cultivation techniques. In this fast and yet precise approach, single individuals are maintained within hanging drops of semi-fluid culture medium, allowing the simultaneous investigation of various life-history traits at any desired degree of accuracy. Here, the life cycles of wild-type C. elegans strains N2 (Bristol, UK) and MY6 (Münster, Germany) were compared at 20°C with 5 × 10^9 Escherichia coli ml^-1 as food source. Results High-resolution life tables and fecundity schedules of the two strains are presented. Though isolated 700 km and 60 years apart from each other, the two strains barely differed in life-cycle parameters. For strain N2 (n = 69), the intrinsic rate of natural increase (r_m d^-1), calculated according to the Lotka equation, was 1.375, the net reproductive rate (R_0) 291, the mean generation time (T) 90 h, and the minimum generation time (T_min) 73.0 h. The corresponding values for strain MY6 (n = 72) were r_m = 1.460, R_0 = 289, T = 84 h, and T_min = 67.3 h. Peak egg-laying rates in both strains exceeded 140 eggs d^-1. Juvenile and early adulthood mortality was negligible. 
Strain N2 lived, on average, for 16.7 d, while strain MY6 died 2 days earlier; however, differences in survivorship curves were statistically non-significant. Conclusion We found no evidence that adaptation to the laboratory altered the life history traits of C. elegans strain N2. Our results, discussed in the light of earlier studies on C. elegans, demonstrate certain advantages of the hanging drop method in investigations of nematode life cycles. Assuming that its reproducibility is validated in further studies, the method will reduce the inter-laboratory variability of life-history estimates and may ultimately prove to be more convenient than the current standard methods used by C. elegans researchers. PMID:19445697
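The rm values reported above come from the Euler-Lotka equation, 1 = Σx e^(-r·x)·lx·mx. A minimal sketch of that calculation follows; the survivorship (lx) and fecundity (mx) schedule is hypothetical, not the paper's data.

```python
import math

def euler_lotka_r(ages, lx, mx, lo=0.0, hi=5.0, tol=1e-9):
    """Bisect for the r that satisfies sum(exp(-r*x) * l_x * m_x) == 1."""
    def f(r):
        return sum(math.exp(-r * x) * l * m for x, l, m in zip(ages, lx, mx)) - 1.0
    # f(r) is decreasing in r; a root is bracketed when f(lo) > 0 > f(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical daily schedule: reproduction on days 3-6, near-complete survival
ages = [3, 4, 5, 6]               # age x (days)
lx = [1.00, 0.99, 0.98, 0.97]     # probability of surviving to age x
mx = [60, 120, 80, 30]            # offspring produced at age x

r = euler_lotka_r(ages, lx, mx)             # intrinsic rate of increase (d^-1)
R0 = sum(l * m for l, m in zip(lx, mx))     # net reproductive rate
```

With this invented schedule r comes out near the magnitudes quoted for N2 and MY6, illustrating why small shifts in minimum generation time move rm strongly.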
Forgotten marriages? Measuring the reliability of marriage histories
Chae, Sophia
2016-01-01
BACKGROUND Marriage histories are a valuable data source for investigating nuptiality. While researchers typically acknowledge the problems associated with their use, it is unknown to what extent these problems occur and how marriage analyses are affected. OBJECTIVE This paper seeks to investigate the quality of marriage histories by measuring levels of misreporting, examining the characteristics associated with misreporting, and assessing whether misreporting biases marriage indicators. METHODS Using data from the Malawi Longitudinal Study of Families and Health (MLSFH), I compare marriage histories reported by the same respondents at two different points in time. I investigate whether respondents consistently report their spouses (by name), status of marriage, and dates of marriage. I use multivariate regression models to investigate the characteristics associated with misreporting. Finally, I examine whether misreporting marriages and marriage dates affects marriage indicators. RESULTS Results indicate that 28.3% of men and 17.9% of women omitted at least one marriage in one of the survey waves. Multivariate regression models show that misreporting is not random: marriage, individual, interviewer, and survey characteristics are associated with marriage omission and marriage date inconsistencies. Misreporting also affects marriage indicators. CONCLUSIONS This is the first study of its kind to examine the reliability of marriage histories collected in the context of Sub-Saharan Africa. Although marriage histories are frequently used to study marriage dynamics, until now no knowledge has existed on the degree of misreporting. Misreporting in marriage histories is shown to be non-negligible and could potentially affect analyses. PMID:27152090
Towards a global historical biogeography of Palms
NASA Astrophysics Data System (ADS)
Couvreur, Thomas; Baker, William J.; Frigerio, Jean-Marc; Sepulchre, Pierre; Franc, Alain
2017-04-01
Four mechanisms are at work in the historical biogeography of plants: speciation, extinction, migration, and drift (a sort of neutral speciation). The first three mechanisms are under selection pressure from the environment, mainly the climate and the connectivity of land masses. Hence, an accurate history of climate and of connectivity or non-connectivity between landmasses, as well as of orogenic processes, can shed new light on the most likely speciation events and migration routes driven by paleogeography and paleoclimatology. Currently, some models exist (like DIVA) to infer the most parsimonious history (in the number of migration events) knowing the speciation history given by phylogenies (extinctions are mostly unknown), in a given setting of climate and landmass connectivity. In a previous project, we built in collaboration with LSCE a series of paleogeographic and paleoclimatic maps since the Early Cretaceous. We have developed a program, called Aran, which enables us to extend DIVA to a time series of varying paleoclimatic and paleogeographic conditions. We apply these new methods and data to unravel the biogeographic history of palms (Arecaceae), a pantropical family of 182 genera and >2600 species whose divergence is dated to the Late Cretaceous (100 My). Based on a robust dated molecular phylogeny and novel paleoclimatic and paleogeographic maps, we will generate an updated biogeographic history of Arecaceae inferred from the most parsimonious history using Aran. We will discuss the results and put them in context with what is known and needed to provide a global biogeographic history of tropical palms.
A case study in Gantt charts as historiophoty: A century of psychology at the University of Alberta.
Dawson, Michael R W
2013-05-01
History is typically presented as historiography, where historians communicate via the written word. However, some historians have suggested alternative formats for communicating and thinking about historical information. One such format is known as historiophoty, which involves using a variety of visual images to represent history. The current article proposes that a particular type of graph, known as a Gantt chart, is well suited for conducting historiophoty. When used to represent history, Gantt charts provide a tremendous amount of information. Furthermore, the spatial nature of Gantt charts permits other kinds of spatial operations to be performed on them. This is illustrated with a case study of the history of a particular psychology department. The academic year 2009-2010 marked the centennial of psychology at the University of Alberta. This centennial was marked by compiling a list of its full-time faculty members for each year of its history. This historiography was converted into historiophoty by using it as the source for the creation of a Gantt chart. The current article shows how the history of psychology at the University of Alberta is revealed by examining this Gantt chart in a variety of different ways. This includes computing simple descriptive statistics from the chart, creating smaller versions of the Gantt to explore departmental demographics, and using image processing methods to provide measures of departmental stability throughout its history. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
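One of the "simple descriptive statistics" a faculty Gantt chart supports is the headcount per year, read directly off each member's tenure bar. A minimal sketch; the names and tenure years below are invented, not the University of Alberta data.

```python
def headcount_by_year(tenures, first, last):
    """tenures: dict of name -> (start_year, end_year), both years inclusive.

    Returns a dict mapping each year in [first, last] to the number of
    faculty members whose tenure bar spans that year on the Gantt chart.
    """
    counts = {}
    for year in range(first, last + 1):
        counts[year] = sum(1 for s, e in tenures.values() if s <= year <= e)
    return counts

# Hypothetical department roster for illustration
tenures = {
    "A": (1910, 1935),
    "B": (1920, 1950),
    "C": (1930, 1940),
}
counts = headcount_by_year(tenures, 1910, 1950)
```

Summing or differencing these yearly counts gives growth and turnover measures of the kind the article derives from the chart.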
Polanski, A; Kimmel, M; Chakraborty, R
1998-05-12
Distribution of pairwise differences of nucleotides from data on a sample of DNA sequences from a given segment of the genome has been used in the past to draw inferences about the history of population size changes. However, all earlier methods assume a given model of population size changes (such as sudden expansion), parameters of which (e.g., time and amplitude of expansion) are fitted to the observed distributions of nucleotide differences among pairwise comparisons of all DNA sequences in the sample. Our theory indicates that for any time-dependent population size, N(tau) (in which time tau is counted backward from present), a time-dependent coalescence process yields the distribution, p(tau), of the time of coalescence between two DNA sequences randomly drawn from the population. Prediction of p(tau) and N(tau) requires the use of an inverse Laplace transform, which is known to be unstable. Nevertheless, simulated data obtained from three models of monotone population change (stepwise, exponential, and logistic) indicate that the pattern of a past population size change leaves its signature on the pattern of DNA polymorphism. Application of the theory to the published mtDNA sequences indicates that the current mtDNA sequence variation is not inconsistent with a logistic growth of the human population.
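The forward problem described here is straightforward to sketch numerically: for a given N(tau), the coalescence-time density is p(tau) = (1/N(tau))·exp(-∫₀^tau ds/N(s)), and the pairwise-difference distribution is a mixture of Poisson(2·mu·tau) over p(tau). The stepwise history and mutation rate below are hypothetical, chosen only to illustrate the calculation.

```python
import math

def pairwise_diff_dist(N, mu, kmax, tau_max, dt):
    """P(k pairwise differences): mixture of Poisson(2*mu*tau) over p(tau)."""
    probs = [0.0] * (kmax + 1)
    cum = 0.0                        # accumulated coalescence hazard
    t = 0.0
    while t < tau_max:
        mid = t + dt / 2.0           # midpoint rule for the integral
        p_tau = math.exp(-(cum + (dt / 2.0) / N(mid))) / N(mid)
        lam = 2.0 * mu * mid         # expected differences given coalescence at mid
        pois = math.exp(-lam)        # Poisson pmf at k = 0, recursed upward
        for k in range(kmax + 1):
            probs[k] += pois * p_tau * dt
            pois *= lam / (k + 1)
        cum += dt / N(mid)
        t += dt
    return probs

# Hypothetical stepwise expansion: N grew from 1,000 to 10,000 at tau = 1,000
# generations ago; mu is the mutation rate per sequence per generation.
N = lambda tau: 10_000.0 if tau < 1_000.0 else 1_000.0
probs = pairwise_diff_dist(N, mu=1e-4, kmax=10, tau_max=20_000.0, dt=5.0)
```

Inverting this mapping, i.e. recovering N(tau) from probs, is the unstable inverse-transform step the abstract refers to.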
Cultural and climatic changes shape the evolutionary history of the Uralic languages.
Honkola, T; Vesakoski, O; Korhonen, K; Lehtinen, J; Syrjänen, K; Wahlberg, N
2013-06-01
Quantitative phylogenetic methods have been used to study the evolutionary relationships and divergence times of biological species, and recently, these have also been applied to linguistic data to elucidate the evolutionary history of language families. In biology, the factors driving macroevolutionary processes are assumed to be either mainly biotic (the Red Queen model) or mainly abiotic (the Court Jester model) or a combination of both. The applicability of these models is assumed to depend on the temporal and spatial scale observed as biotic factors act on species divergence faster and in smaller spatial scale than the abiotic factors. Here, we used the Uralic language family to investigate whether both 'biotic' interactions (i.e. cultural interactions) and abiotic changes (i.e. climatic fluctuations) are also connected to language diversification. We estimated the times of divergence using Bayesian phylogenetics with a relaxed-clock method and related our results to climatic, historical and archaeological information. Our timing results paralleled the previous linguistic studies but suggested a later divergence of Finno-Ugric, Finnic and Saami languages. Some of the divergences co-occurred with climatic fluctuation and some with cultural interaction and migrations of populations. Thus, we suggest that both 'biotic' and abiotic factors contribute either directly or indirectly to the diversification of languages and that both models can be applied when studying language evolution. © 2013 The Authors. Journal of Evolutionary Biology © 2013 European Society For Evolutionary Biology.
NASA Astrophysics Data System (ADS)
Koh, E. H.; Lee, E.; Kaown, D.; Lee, K. K.; Green, C. T.
2017-12-01
The timing and magnitude of nitrate contamination are determined by factors such as contaminant loading, recharge characteristics, and the geologic system. The elapsed time for recharged water to travel to a given outlet location, defined as the groundwater age, provides indirect information on the hydrologic characteristics of the aquifer system. Three major methods are used to date groundwater (apparent ages, lumped parameter models, and numerical models), and they characterize differently the groundwater mixing that results from the various flow pathways in a heterogeneous aquifer system. In this study, we therefore compared the three age models in a complex aquifer system using observed age-tracer data and a history of nitrate contamination reconstructed from long-term source loading. The 3H-3He and CFC-12 apparent ages, which do not account for groundwater mixing, estimated the most delayed response time, implying that the peak period of nitrate loading had not yet arrived. The lumped parameter model, in contrast, produced a more recent loading response than the apparent ages, with the peak loading period influencing water quality. The numerical model could delineate the various groundwater mixing components and their different impacts on nitrate dynamics in the complex aquifer system. The different age-estimation methods thus lead to variations in the estimated contaminant loading history, with the discrepancies most pronounced in the complex aquifer system.
NASA Astrophysics Data System (ADS)
Li, Yuankai; Ding, Liang; Zheng, Zhizhong; Yang, Qizhi; Zhao, Xingang; Liu, Guangjun
2018-05-01
For motion control of wheeled planetary rovers traversing deformable terrain, real-time terrain parameter estimation is critical for modeling the wheel-terrain interaction and compensating for the effect of wheel slip. A multi-mode real-time estimation method is proposed in this paper to achieve accurate terrain parameter estimation. The proposed method is composed of an inner layer for real-time filtering and an outer layer for online update. In the inner layer, the sinkage exponent and internal friction angle, which have higher sensitivity to the wheel-terrain interaction forces than the other terrain parameters, are estimated in real time by using an adaptive robust extended Kalman filter (AREKF), whereas the other parameters are fixed at nominal values. The inner-layer result can help synthesize the current wheel-terrain contact forces with adequate precision, but has limited prediction capability for time-variable wheel slip. To improve the estimation accuracy of the inner-layer result, an outer layer based on a recursive Gauss-Newton (RGN) algorithm is introduced to refine the result of real-time filtering according to the innovation contained in the history data. With the two-layer structure, the proposed method can work in three fundamental estimation modes: EKF, REKF and RGN, making the method applicable to flat, rough and non-uniform terrains. Simulations have demonstrated the effectiveness of the proposed method under three terrain types, showing the advantages of introducing the two-layer structure.
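The inner-layer idea, tracking a sensitive terrain parameter with an extended Kalman filter while other parameters stay at nominal values, can be sketched for a single parameter. This is a generic scalar EKF, not the paper's AREKF; the measurement model z = c·s^n + noise (force as a power of sinkage s, with exponent n to estimate) and all values are hypothetical stand-ins.

```python
import math
import random

def ekf_track(zs, ss, c, n0=1.0, P0=1.0, q=1e-4, r=0.05**2):
    """Random-walk EKF for one parameter n: process noise q, measurement noise r."""
    n, P = n0, P0
    estimates = []
    for z, s in zip(zs, ss):
        P += q                           # predict (random-walk parameter model)
        h = c * s ** n                   # predicted measurement
        H = c * s ** n * math.log(s)     # Jacobian dh/dn, evaluated at current n
        S = H * P * H + r                # innovation covariance
        K = P * H / S                    # Kalman gain
        n += K * (z - h)                 # measurement update of the parameter
        P *= (1.0 - K * H)               # covariance update
        estimates.append(n)
    return estimates

# Simulated noisy force measurements with true exponent n_true = 1.3
random.seed(0)
n_true, c = 1.3, 2.0
ss = [0.2 + 0.6 * random.random() for _ in range(400)]          # varying sinkage
zs = [c * s ** n_true + random.gauss(0.0, 0.05) for s in ss]    # noisy forces
est = ekf_track(zs, ss, c)
```

The paper's outer RGN layer would then re-fit such filtered estimates against the accumulated history; that refinement step is omitted here.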
Making Pictures as a Method of Teaching Art History
ERIC Educational Resources Information Center
Martikainen, Jari
2017-01-01
Inspired by the affective and sensory turns in the paradigm of art history, this article discusses making pictures as a method of teaching art history in Finnish Upper Secondary Vocational Education and Training (Qualification in Visual Expression, Study Programmes in Visual and Media Arts and Photography). A total of 25 students majoring in…
Khani, Mehdi; Ziaee, Vahid; Moradinejad, Mohamad-Hassan; Parvaneh, Nima
2013-01-01
Objective To compare Juvenile Idiopathic Arthritis (JIA) patients with and without a family history of autoimmune disease with respect to clinical features and laboratory data. Methods Sixteen JIA patients with a family history of autoimmune disease were identified during the study; 32 patients referred to the rheumatology clinic were chosen for the comparison group according to the date of referral. The two groups were compared with respect to age of onset, sex, subtype, disease activity, duration of active disease and laboratory variables. Findings The age of onset was significantly lower in JIA patients with a family history of autoimmunity (4.7 years vs. 7.0 years; P=0.02), the polyarticular subtype was more frequent in patients with a positive family history (50% vs. 25%; P=0.04), most JIA patients with a positive family history were in the active phase at the time of the study (64% vs. 25%; P=0.02), and they had a longer duration of active disease (21.0 months vs. 12.3 months; P=0.04). Patients with a positive family history were also more often ANA-positive (43.5% vs. 12.5%; P=0.01) and ADA-positive (75% vs. 20.8%; P=0.002). The two groups were similar with respect to sex and the other laboratory variables. Conclusion JIA patients with a family history of autoimmune disease appear to have more severe disease than patients without such a history: they are younger at onset, mostly have the polyarticular subtype, and show more ANA and ADA positivity. These findings differ from familial JIA case-control studies with respect to active disease duration, subtype, and ANA positivity. PMID:24800019
Edmond, Sara N.; Shelby, Rebecca A.; Keefe, Francis J.; Fisher, Hannah M.; Schmidt, John; Soo, Mary Scott; Skinner, Celette Sugg; Ahrendt, Gretchen M.; Manculich, Jessica; Sumkin, Jules H.; Zuley, Margarita L.; Bovbjerg, Dana H.
2016-01-01
Objectives This study compared persistent breast pain among women who received breast-conserving surgery for breast cancer and women without a history of breast cancer. Methods Breast cancer survivors (n=200) were recruited at their first post-surgical surveillance mammogram (6-15 months post-surgery). Women without a breast cancer history (n=150) were recruited at the time of a routine screening mammogram. All women completed measures of breast pain, pain interference with daily activities and intimacy, worry about breast pain, anxiety symptoms, and depression symptoms. Demographic and medical information were also collected. Results Persistent breast pain (duration ≥ 6 months) was reported by 46.5% of breast cancer survivors and 12.7% of women without a breast cancer history (p<0.05). Breast cancer survivors also had significantly higher rates of clinically significant persistent breast pain (pain intensity score ≥3/10), as well as higher average breast pain intensity and unpleasantness scores. Breast cancer survivors with persistent breast pain had significantly higher levels of depressive symptoms, as well as pain worry and interference, compared to survivors without persistent breast pain or women without a breast cancer history. Anxiety symptoms were significantly higher in breast cancer survivors with persistent breast pain compared to women without a breast cancer history. Discussion Results indicate that persistent breast pain negatively impacts women with a history of breast conserving cancer surgery compared to women without that history. Strategies to ameliorate persistent breast pain and to improve adjustment among women with persistent breast pain should be explored for incorporation into standard care for breast cancer survivors. PMID:27922843
Characterize kinematic rupture history of large earthquakes with Multiple Haskell sources
NASA Astrophysics Data System (ADS)
Jia, Z.; Zhan, Z.
2017-12-01
Earthquakes are often regarded as continuous rupture along a single fault, but the occurrence of complex large events involving multiple faults and dynamic triggering challenges this view. Such rupture complexities cause difficulties in existing finite fault inversion algorithms, because they rely on specific parameterizations and regularizations to obtain physically meaningful solutions. Furthermore, it is difficult to assess reliability and uncertainty of obtained rupture models. Here we develop a Multi-Haskell Source (MHS) method to estimate rupture process of large earthquakes as a series of sub-events of varying location, timing and directivity. Each sub-event is characterized by a Haskell rupture model with uniform dislocation and constant unilateral rupture velocity. This flexible yet simple source parameterization allows us to constrain first-order rupture complexity of large earthquakes robustly. Additionally, relatively few parameters in the inverse problem yields improved uncertainty analysis based on Markov chain Monte Carlo sampling in a Bayesian framework. Synthetic tests and application of MHS method on real earthquakes show that our method can capture major features of large earthquake rupture process, and provide information for more detailed rupture history analysis.
Inferring sex-specific demographic history from SNP data
Gautier, Mathieu
2018-01-01
The relative female and male contributions to demography are of great importance to better understand the history and dynamics of populations. While earlier studies relied on uniparental markers to investigate sex-specific questions, the increasing amount of sequence data now enables us to take advantage of tens to hundreds of thousands of independent loci from autosomes and the X chromosome. Here, we develop a novel method to estimate effective sex ratios or ESR (defined as the female proportion of the effective population) from allele count data for each branch of a rooted tree topology that summarizes the history of the populations of interest. Our method relies on Kimura’s time-dependent diffusion approximation for genetic drift, and is based on a hierarchical Bayesian model to integrate over the allele frequencies along the branches. We show via simulations that parameters are inferred robustly, even under scenarios that violate some of the model assumptions. Analyzing bovine SNP data, we infer a strongly female-biased ESR in both dairy and beef cattle, as expected from the underlying breeding scheme. Conversely, we observe a strongly male-biased ESR in early domestication times, consistent with an easier taming and management of cows, and/or introgression from wild auroch males, that would both cause a relative increase in male effective population size. In humans, analyzing a subsample of non-African populations, we find a male-biased ESR in Oceanians that may reflect complex marriage patterns in Aboriginal Australians. Because our approach relies on allele count data, it may be applied on a wide range of species. PMID:29385127
Ornelas, Juan Francisco; Sosa, Victoria; Soltis, Douglas E.; Daza, Juan M.; González, Clementina; Soltis, Pamela S.; Gutiérrez-Rodríguez, Carla; de los Monteros, Alejandro Espinosa; Castoe, Todd A.; Bell, Charles; Ruiz-Sanchez, Eduardo
2013-01-01
Comparative phylogeography can elucidate the influence of historical events on current patterns of biodiversity and can identify patterns of co-vicariance among unrelated taxa that span the same geographic areas. Here we analyze temporal and spatial divergence patterns of cloud forest plant and animal species and relate them to the evolutionary history of naturally fragmented cloud forests–among the most threatened vegetation types in northern Mesoamerica. We used comparative phylogeographic analyses to identify patterns of co-vicariance in taxa that share geographic ranges across cloud forest habitats and to elucidate the influence of historical events on current patterns of biodiversity. We document temporal and spatial genetic divergence of 15 species (including seed plants, birds and rodents), and relate them to the evolutionary history of the naturally fragmented cloud forests. We used fossil-calibrated genealogies, coalescent-based divergence time inference, and estimates of gene flow to assess the permeability of putative barriers to gene flow. We also used the hierarchical Approximate Bayesian Computation (HABC) method implemented in the program msBayes to test simultaneous versus non-simultaneous divergence of the cloud forest lineages. Our results show shared phylogeographic breaks that correspond to the Isthmus of Tehuantepec, Los Tuxtlas, and the Chiapas Central Depression, with the Isthmus representing the most frequently shared break among taxa. However, dating analyses suggest that the phylogeographic breaks corresponding to the Isthmus occurred at different times in different taxa. Current divergence patterns are therefore consistent with the hypothesis of broad vicariance across the Isthmus of Tehuantepec derived from different mechanisms operating at different times. 
This study, coupled with existing data on divergence of cloud forest species, indicates that the evolutionary history of contemporary cloud forest lineages is complex and often lineage-specific, and thus difficult to capture in a simple conservation strategy. PMID:23409165
NASA Astrophysics Data System (ADS)
Klokočník, J.; Kostelecký, J.; Böhm, V.; Böhm, B.; Vondrák, J.; Vítek, F.
2008-05-01
The Maya used their own very precise calendar. When transforming dates from the Mayan calendar to ours, or vice versa, a surprisingly large uncertainty is found. The relationship between the two calendars has been investigated by many researchers during the last century, and about 50 different values of the transformation coefficient, known as the correlation, have been deduced. They can differ by centuries, potentially yielding an incredibly large error in the relation of Mayan history to the history of other civilizations. The most frequently used correlation is the GMT one (of Goodman-Martínez-Thompson), based largely on historical evidence from colonial times. Astronomy (celestial mechanics) may resolve the problem of the correlation, provided that historians have correctly decoded the records of various astronomical phenomena discovered, namely, in one extremely important and rare Mayan book, the Dresden Codex (DC). This describes (among other matters) observations of various astronomical phenomena (eclipses, conjunctions, maximum elongations, heliacal aspects, etc.) made by the Maya. Modern celestial mechanics enables us to compute exactly when the phenomena occurred in the sky for a given place on the Earth, even far back in time. Here we check, by a completely independent method, the value of the correlation obtained by Böhm & Böhm (1996, 1999), and confirm it. In view of these tests, we advocate rejecting the GMT correlation and replacing it with the Böhm correlation. We also comment on the criticism of GMT by some investigators. The replacement of GMT by another correlation seems, however, unacceptable to many Mayanists, as they would need to rewrite the whole history of Mesoamerica. With the Böhm correlation, for example, Mayan history would be closer to our time by 104 years.
Watters, Anna J; Gotlib, Ian H; Harris, Anthony W F; Boyce, Philip M; Williams, Leanne M
2013-09-05
Unaffected relatives (URs) of individuals with major depressive disorder (MDD) are biologically more vulnerable to depression. We compare healthy URs and controls at the level of phenotype (symptoms and functioning) and endophenotype (negative emotion bias), and further investigate the interrelation between these and the contribution of environmental early life stress. URs (n=101), identified using Family History Screen interview methods, and matched controls completed written and interview questions assessing symptoms of depression and anxiety, negative cognitive style, life functioning and early life stress. Biases in emotion processing were measured using a facial expression of emotion identification paradigm. Compared to controls, URs reported higher levels of depression and anxiety, a stronger negative cognitive bias, poorer functioning and lower satisfaction with life. URs were slower to correctly identify fearful and sad facial expressions. A slower response time to identify sad faces was correlated with lower quality of life in the social domain. Early life stress (ELS) did not contribute significantly to any outcome. The methodology relies on accurate reporting of participants' own psychiatric history and that of their family members. The degree of vulnerability varies among URs. A family history of depression accounts for subtle differences in symptom levels and functioning without a necessary role of ELS. A negative emotion bias in processing emotion may be one vulnerability marker for MDD. Biological markers may affect measures of functioning before symptoms emerge at the level of experience. Copyright © 2013 Elsevier B.V. All rights reserved.
1976-01-01
AGAJ74) of C-81. The program computes aircraft trim, stability derivatives and control power, and time histories of aircraft and blade motions and...activated. The quasi-static, time-variant trim was used for the main rotor for camel where either time history solutions or steady-state blade loads...of the maneuver since test data were not recorded for the start of the maneuver. The time histories for the test data which were available
Statistics of Crack Growth in Engine Materials. Volume 2. Spectrum Loading and advanced Techniques
1984-02-01
[Figure-list fragments from the report: sample functions of crack-propagation time histories computed for WPB fastener holes.]
ERIC Educational Resources Information Center
Christian, David
1991-01-01
Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)
Just methods in revolting times
Fine, Michelle
2016-01-01
ABSTRACT This article takes up the challenge of critical methods in “revolting times,” as we conduct qualitative research on (in)justice festering within repulsive inequality gaps, and yet surrounded by the thrill of radical social movements dotting the globe. I introduce a call for “critical bifocality,” a term coined by Lois Weis and myself, to argue for research designs that interrogate how history, structures, and lives shape, reveal, and refract the conditions we study. Borrowing from critical researchers long gone, W. E. B. Du Bois in his text The Philadelphia Negro and Marie Jahoda in her stunning case study Marienthal, I offer up a set of epistemological muddles and methodological experiments, hoping to incite a conversation about our responsibilities as critical psychologists in deeply contentious times, refusing downstream analyses and resurrecting instead what Edward Said called “lost causes.” PMID:27812314
Method for tracking the location of mobile agents using stand-off detection technique
Schmitt, Randal L [Tijeras, NM; Bender, Susan Fae Ann [Tijeras, NM; Rodacy, Philip J [Albuquerque, NM; Hargis, Jr., Philip J.; Johnson, Mark S [Albuquerque, NM
2006-12-26
A method for tracking the movement and position of mobile agents using light detection and ranging (LIDAR) as a stand-off optical detection technique. The positions of the agents are tracked by analyzing the time-history of a series of optical measurements made over the field of view of the optical system. This provides a (time+3-D) or (time+2-D) mapping of the location of the mobile agents. Repeated pulses of a laser beam impinge on a mobile agent, such as a bee, and are backscattered from the agent into a LIDAR detection system. Alternatively, the incident laser pulses excite fluorescence or phosphorescence from the agent, which is detected using a LIDAR system. Analysis of the spatial location of signals from the agents produced by repeated pulses generates a multidimensional map of agent location.
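The (time + 3-D) mapping described above reduces each detected return to a range from the pulse's round-trip time and then to Cartesian coordinates from the scan angles, yielding a time-stamped track of the agent. A minimal sketch; the field layout and values are hypothetical, not from the patent.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def returns_to_track(returns):
    """returns: list of (t, azimuth_rad, elevation_rad, round_trip_s) tuples.

    Converts each backscattered return to a (t, x, y, z) point, giving the
    time-history of agent positions over the field of view.
    """
    track = []
    for t, az, el, rt in returns:
        rng = 0.5 * C * rt                       # one-way range, m
        x = rng * math.cos(el) * math.cos(az)
        y = rng * math.cos(el) * math.sin(az)
        z = rng * math.sin(el)
        track.append((t, x, y, z))
    return track

# Two pulses returning from an agent ~150 m away, slightly above the horizon
returns = [
    (0.0, 0.00, 0.02, 1.0e-6),   # 1 microsecond round trip -> ~150 m range
    (0.5, 0.01, 0.02, 1.0e-6),
]
track = returns_to_track(returns)
```

Repeating this over many pulses and differencing successive points gives the movement analysis the patent describes.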
A stochastic model of particle dispersion in turbulent reacting gaseous environments
NASA Astrophysics Data System (ADS)
Sun, Guangyuan; Lignell, David; Hewson, John
2012-11-01
We are performing fundamental studies of dispersive transport and time-temperature histories of Lagrangian particles in turbulent reacting flows. The particle-flow statistics including the full particle temperature PDF are of interest. A challenge in modeling particle motions is the accurate prediction of fine-scale aerosol-fluid interactions. A computationally affordable stochastic modeling approach, one-dimensional turbulence (ODT), is a proven method that captures the full range of length and time scales, and provides detailed statistics of fine-scale turbulent-particle mixing and transport. Limited results of particle transport in ODT have been reported in non-reacting flow. Here, we extend ODT to particle transport in reacting flow. The results of particle transport in three flow configurations are presented: channel flow, homogeneous isotropic turbulence, and jet flames. We investigate the functional dependence of the statistics of particle-flow interactions including (1) parametric study with varying temperatures, Reynolds numbers, and particle Stokes numbers; (2) particle temperature histories and PDFs; (3) time scale and the sensitivity of initial and boundary conditions. Flow statistics are compared to both experimental measurements and DNS data.
Li, Jing; Zipper, Carl E; Donovan, Patricia F; Wynne, Randolph H; Oliphant, Adam J
2015-09-01
Surface mining disturbances have attracted attention globally due to their extensive influence on topography, land use, ecosystems, and human populations in mineral-rich regions. We analyzed a time series of Landsat satellite imagery to produce a 28-year disturbance history for surface coal mining in a segment of the eastern USA's central Appalachian coalfield, southwestern Virginia. The method was developed and applied as a three-step sequence: vegetation index selection, persistent vegetation identification, and mined-land delineation by year of disturbance. The overall classification accuracy and kappa coefficient were 0.9350 and 0.9252, respectively. Most surface coal mines were identified correctly by location and by time of initial disturbance. More than 8% of southwestern Virginia's >4000 km² coalfield area was disturbed by surface coal mining over the 28-year period. Approximately 19.5% of the Appalachian coalfield surface within the most intensively mined county (Wise County) has been disturbed by mining. Mining disturbances expanded steadily and progressively over the study period. The information generated can be applied to gain further insight concerning mining influences on ecosystems and other essential environmental features.
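The overall accuracy and kappa coefficient quoted above are standard summaries of a classification confusion matrix: observed agreement versus the agreement expected by chance from the row and column marginals. The 2x2 matrix below (mined vs. persistent vegetation) is hypothetical, chosen only to illustrate the calculation.

```python
def accuracy_and_kappa(cm):
    """cm: square confusion matrix, rows = reference class, cols = mapped class."""
    n = sum(sum(row) for row in cm)
    observed = sum(cm[i][i] for i in range(len(cm))) / n
    # chance agreement from the row and column marginal totals
    expected = sum(
        sum(cm[i]) * sum(row[i] for row in cm) for i in range(len(cm))
    ) / (n * n)
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

cm = [[920, 30],    # reference: mined land
      [ 25, 850]]   # reference: persistent vegetation
acc, kappa = accuracy_and_kappa(cm)
```

Kappa is below overall accuracy whenever chance agreement is non-zero, which is why both are reported together in accuracy assessments like this one.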
Vandeleur, C L; Rothen, S; Lustenberger, Y; Glaus, J; Castelao, E; Preisig, M
2015-01-15
The use of the family history method is recommended in family studies as a type of proxy interview of non-participating relatives. However, using different sources of information can result in bias, as direct interviews may provide a higher likelihood of assigning diagnoses than family history reports. The aims of the present study were to: (1) compare diagnoses for threshold and subthreshold mood syndromes from interviews to those relying on information from relatives; (2) test the appropriateness of lowering the diagnostic threshold and combining multiple reports from the family history method to obtain prevalence estimates comparable to the interviews; (3) identify factors that influence the likelihood of agreement and reporting of disorders by informants. Within a family study, 1621 informant-index subject pairs were identified. DSM-5 diagnoses from direct interviews of index subjects were compared to those derived from family history information provided by their first-degree relatives. (1) Inter-informant agreement was acceptable for mania, but low for all other mood syndromes. (2) Except for mania and subthreshold depression, the family history method provided significantly lower prevalence estimates. The gap narrowed for all other syndromes after lowering the threshold of the family history method. (3) Individuals who had a history of depression themselves were more likely to report depression in their relatives. Limitations include the low proportion of individuals affected by manic syndromes and the lack of independence of the data. The higher likelihood of reporting disorders by affected informants entails the risk of overestimating the extent of familial aggregation of depression. Copyright © 2014 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Geschwind, Nicole; Peeters, Frenk; Drukker, Marjan; van Os, Jim; Wichers, Marieke
2011-01-01
Objective: To examine whether mindfulness-based cognitive therapy (MBCT) increases momentary positive emotions and the ability to make use of natural rewards in daily life. Method: Adults with a lifetime history of depression and current residual depressive symptoms (mean age = 43.9 years, SD = 9.6; 75% female; all Caucasian) were randomized to…
A Survey of Quantum Programming Languages: History, Methods, and Tools
2008-01-01
and entanglement, to achieve computational solutions to certain problems in less time (fewer computational cycles) than is possible using classical... superposition of quantum bits, entanglement, destructive measurement, and the no-cloning theorem. These differences must be thoroughly understood and even... computers using well-known languages such as C, C++, Java, and rapid prototyping languages such as Maple, Mathematica, and Matlab. A good on-line
Reconfigurable Control Design with Neural Network Augmentation for a Modified F-15 Aircraft
NASA Technical Reports Server (NTRS)
Burken, John J.
2007-01-01
The viewgraphs present background information about reconfigurable control design, the design methods used in the paper, control failure survivability results, and time histories of tests. Topics examined include control reconfiguration, general information about adaptive controllers, model reference adaptive control (MRAC), the utility of neural networks, radial basis function (RBF) neural network outputs, neurons, and results of investigations of failures.
Operational considerations to reduce solar array loads
NASA Technical Reports Server (NTRS)
Gerstenmaier, W.
1992-01-01
The key parameters associated with solar array plume loads are examined, and operational considerations aimed at minimizing the effect of the Shuttle plumes on the Space Station solar arrays are discussed. These include solar array pointing to reduce loads and restrictions on Shuttle piloting. Particular attention is given to the method used to obtain the forcing functions (thruster time firing histories) for solar array plume calculation.
Compressible Navier-Stokes equations: A study of leading edge effects
NASA Technical Reports Server (NTRS)
Hariharan, S. I.; Karbhari, P. R.
1987-01-01
A computational method is developed that allows numerical calculation of the time-dependent compressible Navier-Stokes equations. The current results concern a study of flow past a semi-infinite flat plate. Flow develops from given inflow conditions upstream and passes over the flat plate to leave the computational domain without reflecting at the downstream boundary. Leading edge effects are included in this paper. In addition, specification of a heated region which gets convected with the flow is considered. The time history of this convection is obtained, and it exhibits a wave phenomenon.
Method and Apparatus for Monitoring of Daily Activity in Terms of Ground Reaction Forces
NASA Technical Reports Server (NTRS)
Whalen, Robert T. (Inventor); Breit, Gregory A. (Inventor)
2001-01-01
A device to record and analyze habitual daily activity in terms of the history of gait-related musculoskeletal loading is disclosed. The device consists of a pressure-sensing insole placed into the shoe or embedded in a shoe sole, which detects contact of the foot with the ground. The sensor is coupled to a portable battery-powered digital data logger clipped to the shoe or worn around the ankle or waist. During the course of normal daily activity, the system maintains a record of time-of-occurrence of all non-spurious foot-down and lift-off events. Off line, these data are filtered and converted to a history of foot-ground contact times, from which measures of cumulative musculoskeletal loading, average walking- and running-specific gait speed, total time spent walking and running, total number of walking steps and running steps, and total gait-related energy expenditure are estimated from empirical regressions of various gait parameters to the contact time reciprocal. Data are available as cumulative values or as daily averages by menu selection. The data provided by this device are useful for assessment of musculoskeletal and cardiovascular health and risk factors associated with habitual patterns of daily activity.
Klieve, Helen; Sveticic, Jerneja; De Leo, Diego
2009-01-01
Background The 1996 Australian National Firearms Agreement introduced strict access limitations. However, reports on the effectiveness of the new legislation are conflicting. This study, accessing all cases of suicide 1997-2004, explores factors which may impact on the choice of firearms as a suicide method, including current licence possession and previous history of legal access. Methods Detailed information on all Queensland suicides (1997-2004) was obtained from the Queensland Suicide Register, with additional details of firearm licence history accessed from the Firearm Registry (Queensland Police Service). Cases were compared against licence history and method choice (firearms or other method). Odds ratios (OR) assessed the risk of firearms suicide and suicide by any method against licence history. A logistic regression was undertaken to identify factors significant in those most likely to use firearms in suicide. Results The rate of suicide using firearms in those with a current licence (10.92 per 100,000) far exceeded the rate in those with no licence history (1.03 per 100,000). Those with a licence history had a far higher rate of suicide (30.41 per 100,000) compared to that of all suicides (15.39 per 100,000). Additionally, a history of firearms licence (current or past) was found to more than double the risk of suicide by any means (OR = 2.09, P < 0.001). The group with the highest risk of selecting firearms for suicide were older males from rural locations. Conclusion Accessibility and familiarity with firearms represent critical elements in determining the choice of method. Further licensing restrictions and the implementation of more stringent secure storage requirements are likely to reduce the overall familiarity with firearms in the community and contribute to reductions in rates of suicide. PMID:19778414
Reduced Order Methods for Prediction of Thermal-Acoustic Fatigue
NASA Technical Reports Server (NTRS)
Przekop, A.; Rizzi, S. A.
2004-01-01
The goal of this investigation is to assess the quality of high-cycle-fatigue life estimation via a reduced order method, for structures undergoing random nonlinear vibrations in the presence of thermal loading. Modal reduction is performed with several different suites of basis functions. After numerically solving the reduced order system equations of motion, the physical displacement time history is obtained by an inverse transformation and stresses are recovered. Stress ranges obtained through the rainflow counting procedure are used in a linear damage accumulation method to yield fatigue estimates. Fatigue life estimates obtained using various basis functions in the reduced order method are compared with those obtained from numerical simulation in physical degrees-of-freedom.
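The last step of the pipeline in the abstract above, feeding rainflow-counted stress ranges into a linear damage accumulation rule (Palmgren-Miner), can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Basquin-type S-N curve and its constants `s_f` and `m` are assumed values.

```python
# Hedged sketch: linear (Palmgren-Miner) damage accumulation over
# rainflow-counted stress ranges, assuming a Basquin-type S-N curve
# N(S) = (s_f / S) ** m. The constants s_f and m are illustrative.

def cycles_to_failure(s, s_f=900.0, m=5.0):
    """Allowable cycles at stress range s under the assumed S-N curve."""
    return (s_f / s) ** m

def miner_damage(stress_ranges, s_f=900.0, m=5.0):
    """Sum the per-cycle damage 1/N(S); failure is predicted at 1.0."""
    return sum(1.0 / cycles_to_failure(s, s_f, m) for s in stress_ranges)

# hypothetical rainflow-counted stress ranges (MPa), one entry per cycle
ranges = [200.0, 350.0, 500.0]
damage = miner_damage(ranges)
```

A fatigue life estimate then follows by scaling: if one repetition of the counted history accrues damage `d`, roughly `1/d` repetitions are sustained before the accumulated damage reaches 1.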
A Nonlinear Reduced Order Method for Prediction of Acoustic Fatigue
NASA Technical Reports Server (NTRS)
Przekop, Adam; Rizzi, Stephen A.
2006-01-01
The goal of this investigation is to assess the quality of high-cycle-fatigue life estimation via a reduced order method, for structures undergoing geometrically nonlinear random vibrations. Modal reduction is performed with several different suites of basis functions. After numerically solving the reduced order system equations of motion, the physical displacement time history is obtained by an inverse transformation and stresses are recovered. Stress ranges obtained through the rainflow counting procedure are used in a linear damage accumulation method to yield fatigue estimates. Fatigue life estimates obtained using various basis functions in the reduced order method are compared with those obtained from numerical simulation in physical degrees-of-freedom.
NASA Technical Reports Server (NTRS)
Campbell, John P; Mckinney, Marion O
1952-01-01
A summary of methods for making dynamic lateral stability and response calculations and for estimating the aerodynamic stability derivatives required for use in these calculations is presented. The processes of performing calculations of the time histories of lateral motions, of the period and damping of these motions, and of the lateral stability boundaries are presented as a series of simple straightforward steps. Existing methods for estimating the stability derivatives are summarized and, in some cases, simple new empirical formulas are presented. Detailed estimation methods are presented for low-subsonic-speed conditions but only a brief discussion and a list of references are given for transonic and supersonic speed conditions.
Genetic effects on life-history traits in the Glanville fritillary butterfly
Corander, Jukka
2017-01-01
Background Adaptation to local habitat conditions may lead to the natural divergence of populations in life-history traits such as body size, time of reproduction, mate signaling or dispersal capacity. Given enough time and strong enough selection pressures, populations may experience local genetic differentiation. The genetic basis of many life-history traits, and their evolution according to different environmental conditions, remain however poorly understood. Methods We conducted an association study on the Glanville fritillary butterfly, using material from five populations along a latitudinal gradient within the Baltic Sea region, which show different degrees of habitat fragmentation. We investigated variation in 10 principal components, together comprising 21 life-history traits, according to two environmental types, and 33 genetic SNP markers from 15 candidate genes. Results We found that nine SNPs from five genes showed a strong trend toward trait associations (p-values under 0.001 before correction). Although non-significant after correction for multiple testing (1,086 tests in total), these associations were consistent across the study populations. Additionally, these nine SNPs also showed an allele frequency difference between the populations from the northern fragmented versus the southern continuous landscape. Discussion Our study provides further support for previously described trait associations within the Glanville fritillary butterfly species across different spatial scales. Although our results alone are inconclusive, they are concordant with previous studies that identified these associations to be related to climatic changes or habitat fragmentation within the Åland population. PMID:28560112
Patient adherence to prescribed antimicrobial drug dosing regimens.
Vrijens, Bernard; Urquhart, John
2005-05-01
The aim of this article is to review current knowledge about the clinical impact of patients' variable adherence to prescribed anti-infective drug dosing regimens, with the aim of renewing interest and exploration of this important but largely neglected area of therapeutics. Central to the estimation of a patient's adherence to a prescribed drug regimen is a reliably compiled drug dosing history. Electronic monitoring methods have emerged as the virtual 'gold standard' for compiling drug dosing histories in ambulatory patients. Reliably compiled drug dosing histories are consistently downwardly skewed, with varying degrees of under-dosing. In particular, the consideration of time intervals between protease inhibitor doses has revealed that ambulatory patients' variable execution of prescribed dosing regimens is a leading source of variance in viral response. Such analyses reveal the need for a new discipline, called pharmionics, which is the study of how ambulatory patients use prescription drugs. Properly analysed, reliable data on the time-course of patients' actual intake of prescription drugs can eliminate a major source of unallocated variance in drug responses, including the non-response that occurs and is easily misinterpreted when a patient's complete non-execution of a prescribed drug regimen is unrecognized clinically. As such, reliable compilation of ambulatory patients' drug dosing histories has the promise of being a key step in reducing unallocated variance in drug response and in improving the informational yield of clinical trials. It is also the basis for sound, measurement-guided steps taken to improve a patient's execution of a prescribed dosing regimen.
From Dates to Rates: The Emergence of Integrated Geochronometry (Invited)
NASA Astrophysics Data System (ADS)
Hodges, K. V.; Adams, B. A.; Bohon, W.; Cooper, F. J.; Tripathy-Lang, A.; Van Soest, M. C.; Watson, E. B.; Young, K. E.
2013-12-01
Many applications of isotope geochemistry to telling time have involved geochronology - the measurement of the crystallization age of a mineral - or thermochronology, the measurement of the time at which a mineral cooled through an estimated closure temperature. The resulting data typically provide one or two points along an evolving temperature-time (Tt) path. Unfortunately, many problems require a richer knowledge of longer portions of the Tt path and thus the integrated application of multiple chronometers to individual minerals or suites of minerals from a particular sample or outcrop. In this presentation, we review some of the most recent advances in geochronometry, the direct dating of rates of a wide range of geologic processes on timescales ranging from seconds (in the case of bolide impact on Earth and elsewhere in the Solar System) to hundreds of millions of years (in the case of very slowly cooled Precambrian terrains). For all chronometers except those based on the production of fission tracks, our capacity to extract precise and accurate Tt paths depends on a good understanding of the kinetics of diffusive loss of radiogenic daughter isotopes. Laboratory experiments have substantially improved our understanding of nominal kinetic parameters in recent years, but our increased use of new methods for their determination (e.g., Rutherford backscattering spectroscopy, nuclear reaction analysis, and laser depth profiling) have demonstrated complexities related to compositional variations and asymmetric diffusion. At the same time, a growing number of geologic applications of these chronometers illustrate the importance of deformation history and radiation damage in modifying effective diffusion parameters. Such factors have two important implications for geochronometry. 
First, they suggest that studies of multiple minerals employing multiple isotopic methods - integrated geochronometry - are likely to produce more robust constraints on Tt paths than those involving the application of a single geochronometer. Second, they suggest that characterization of the chemistry and structure of minerals prior to dating may become standard procedure in most laboratories. Some of the most valuable constraints on the cooling histories of individual crystals come from microanalytical techniques that illuminate natural diffusive loss profiles, either directly (e.g., laser and ion microprobe mapping) or indirectly (e.g., 40Ar/39Ar and 4He/3He incremental heating experimentation). For most materials and most cooling histories, direct microanalytical approaches yield less spatial resolution and thus a poorer resolution of the cooling history. On the other hand, the extraction of cooling histories based on data obtained through indirect techniques requires significant simplifying assumptions regarding the three-dimensional distribution of parent isotopes that are not always warranted. Studies that integrate such techniques, rare in the literature thus far, are ushering in a new era of quantitative geochronometry.
Smith, Christopher Irwin; Tank, Shantel; Godsoe, William; Levenick, Jim; Strand, Eva; Esque, Todd C.; Pellmyr, Olle
2011-01-01
Comparative phylogeographic studies have had mixed success in identifying common phylogeographic patterns among co-distributed organisms. Whereas some have found broadly similar patterns across a diverse array of taxa, others have found that the histories of different species are more idiosyncratic than congruent. The variation in the results of comparative phylogeographic studies could indicate that the extent to which sympatrically-distributed organisms share common biogeographic histories varies depending on the strength and specificity of ecological interactions between them. To test this hypothesis, we examined demographic and phylogeographic patterns in a highly specialized, coevolved community – Joshua trees (Yucca brevifolia) and their associated yucca moths. This tightly-integrated, mutually interdependent community is known to have experienced significant range changes at the end of the last glacial period, so there is a strong a priori expectation that these organisms will show common signatures of demographic and distributional changes over time. Using a database of >5000 GPS records for Joshua trees, and multi-locus DNA sequence data from the Joshua tree and four species of yucca moth, we combined palaeodistribution modeling with coalescent-based analyses of demographic and phylogeographic history. We extensively evaluated the power of our methods to infer past population size and distributional changes by evaluating the effect of different inference procedures on our results, comparing our palaeodistribution models to Pleistocene-aged packrat midden records, and simulating DNA sequence data under a variety of alternative demographic histories. Together the results indicate that these organisms have shared a common history of population expansion, and that these expansions were broadly coincident in time.
However, contrary to our expectations, none of our analyses indicated significant range or population size reductions at the end of the last glacial period, and the inferred demographic changes substantially predate Holocene climate changes.
Charm: Cosmic history agnostic reconstruction method
NASA Astrophysics Data System (ADS)
Porqueres, Natalia; Ensslin, Torsten A.
2017-03-01
Charm (cosmic history agnostic reconstruction method) reconstructs the cosmic expansion history in the framework of Information Field Theory. The reconstruction is performed via the iterative Wiener filter from an agnostic or from an informative prior. The charm code allows one to test the compatibility of several different data sets with the LambdaCDM model in a non-parametric way.
Adaptive controller for a strength testbed for aircraft structures
NASA Astrophysics Data System (ADS)
Laperdin, A. I.; Yurkevich, V. D.
2017-07-01
The problem of control system design for a strength testbed of aircraft structures is considered. A method for calculating the parameters of a proportional-integral controller (control algorithm) using the time-scale separation method for the testbed taking into account the dead time effect in the control loop is presented. An adaptive control algorithm structure is proposed which limits the amplitude of high-frequency oscillations in the control system with a change in the direction of motion of the rod of the hydraulic cylinders and provides the desired accuracy and quality of transients at all stages of structural loading history. The results of tests of the developed control system with the adaptive control algorithm on an experimental strength testbed for aircraft structures are given.
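As a toy illustration of the proportional-integral law discussed above (not the testbed's tuned controller, and ignoring the dead-time effect and the adaptive amplitude limiting), a discrete PI loop driving an assumed first-order plant might look like:

```python
# Minimal discrete PI control sketch: the law u = kp*e + ki*integral(e)
# drives a first-order plant dy/dt = -y + u toward the setpoint.
# Gains and the plant model are illustrative assumptions.

def simulate_pi(kp=2.0, ki=1.0, setpoint=1.0, dt=0.01, steps=2000):
    y, integral = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - y           # tracking error
        integral += e * dt         # accumulate the integral term
        u = kp * e + ki * integral # PI control law
        y += dt * (-y + u)         # forward-Euler step of the plant
    return y

final = simulate_pi()              # settles near the setpoint
```

With both gains positive, the closed loop here is a stable second-order system and the integral term removes the steady-state error, which is the property the testbed controller relies on at each loading stage.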
Development and evaluation of the impulse transfer function technique
NASA Technical Reports Server (NTRS)
Mantus, M.
1972-01-01
The development of the test/analysis technique known as the impulse transfer function (ITF) method is discussed. This technique, when implemented with proper data processing systems, should become a valuable supplement to conventional dynamic testing and analysis procedures that will be used in the space shuttle development program. The method can relieve many of the problems associated with extensive and costly testing of the shuttle for transient loading conditions. In addition, the time history information derived from impulse testing has the potential for being used to determine modal data for the structure under investigation. The technique could be very useful in determining the time-varying modal characteristics of structures subjected to thermal transients, where conventional mode surveys are difficult to perform.
Data assimilation using a GPU accelerated path integral Monte Carlo approach
NASA Astrophysics Data System (ADS)
Quinn, John C.; Abarbanel, Henry D. I.
2011-09-01
The answers to data assimilation questions can be expressed as path integrals over all possible state and parameter histories. We show how these path integrals can be evaluated numerically using a Markov Chain Monte Carlo method designed to run in parallel on a graphics processing unit (GPU). We demonstrate the application of the method to an example with a transmembrane voltage time series of a simulated neuron as an input, and using a Hodgkin-Huxley neuron model. By taking advantage of GPU computing, we gain a parallel speedup factor of up to about 300, compared to an equivalent serial computation on a CPU, with performance increasing as the length of the observation time used for data assimilation increases.
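The sampler at the core of such a path-integral evaluation is Markov chain Monte Carlo. A minimal serial Metropolis sketch for a single parameter, standing in for the full state-and-parameter history and assuming a unit-variance Gaussian measurement model with a flat prior (not the paper's Hodgkin-Huxley setup or its GPU parallelism), is:

```python
import math
import random

# Minimal Metropolis sketch: sample a parameter's posterior given noisy
# observations, assuming a unit-variance Gaussian likelihood and a flat
# prior. Serial stand-in for a GPU-parallel path-integral sampler.

def log_posterior(theta, data):
    return -0.5 * sum((y - theta) ** 2 for y in data)

def metropolis(data, n_steps=5000, step=0.5, seed=1):
    random.seed(seed)
    theta = 0.0
    lp = log_posterior(theta, data)
    samples = []
    for _ in range(n_steps):
        proposal = theta + random.gauss(0.0, step)
        lp_prop = log_posterior(proposal, data)
        if math.log(random.random()) < lp_prop - lp:  # accept/reject
            theta, lp = proposal, lp_prop
        samples.append(theta)
    return samples

data = [0.9, 1.1, 1.0, 1.2, 0.8]                  # made-up observations
chain = metropolis(data)
estimate = sum(chain[1000:]) / len(chain[1000:])  # mean after burn-in
```

On a GPU, many such chains (or many path segments) are advanced in parallel, which is where the reported speedup of up to about 300x comes from.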
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2016-08-01
This paper proposes a new output-only element-level system identification and input estimation technique, towards the simultaneous identification of modal parameters, input excitation time history and structural features at the element-level by adopting earthquake-induced structural response signals. The method, named Full Dynamic Compound Inverse Method (FDCIM), releases strong assumptions of earlier element-level techniques, by working with a two-stage iterative algorithm. Jointly, a Statistical Average technique, a modification process and a parameter projection strategy are adopted at each stage to achieve stronger convergence for the identified estimates. The proposed method works in a deterministic way and is completely developed in State-Space form. Further, it does not require continuous- to discrete-time transformations and does not depend on initialization conditions. Synthetic earthquake-induced response signals from different shear-type buildings are generated to validate the implemented procedure, also with noise-corrupted cases. The achieved results provide a necessary condition to demonstrate the effectiveness of the proposed identification method.
A century of enzyme kinetic analysis, 1913 to 2013.
Johnson, Kenneth A
2013-09-02
This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
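The "numerical integration of rate equations" approach mentioned above can be illustrated with a minimal sketch; the constants Vmax, Km and S0 below are assumed for illustration, not Michaelis and Menten's 1913 data:

```python
# Sketch: generate a full progress curve by numerically integrating the
# Michaelis-Menten rate law dS/dt = -Vmax * S / (Km + S) with a simple
# forward-Euler step. Vmax, Km and s0 are illustrative values.

def progress_curve(s0=10.0, vmax=1.0, km=2.0, dt=0.001, t_end=20.0):
    s, t, curve = s0, 0.0, []
    while t <= t_end:
        curve.append((t, s))
        s -= dt * vmax * s / (km + s)  # Euler step of the rate law
        t += dt
    return curve

curve = progress_curve()
# substrate decays monotonically toward zero but never goes negative
```

In practice one would use a robust ODE solver (e.g. `scipy.integrate.solve_ivp`) and fit Vmax and Km by least squares against the measured progress curve, which is the modern workflow the review describes.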
NASA Astrophysics Data System (ADS)
Li, Kebin; Li, Xiaojie; Yan, Honghao; Wang, Xiaohong; Miao, Yusong
2017-12-01
A new velocity probe which permits recording the time history of detonation and shock waves has been developed by improving on a commercial probe in principle and structure. A method based on the probe is then designed to measure the detonation velocity and near-field shock parameters in a single underwater explosion, by which the oblique shock wave front of cylindrical charges and the peak pressure attenuation curve of a spherical explosive are obtained. A further derivation of detonation pressure, adiabatic exponent, and other shock parameters is conducted. The present method offers a novel and reliable parameter determination for near-field underwater explosions.
ERIC Educational Resources Information Center
Blois, B. A.
1976-01-01
History as a social science does not need to be dull. A method which has had great success in a survey course of modern American history has been the use of oral history techniques in studying the Depression. Students responded enthusiastically and their projects formed a valuable nucleus of the school's oral history project. (Author/JDS)
De Matteis, Sara; Jarvis, Deborah; Young, Heather; Young, Alan; Allen, Naomi; Potts, James; Darnton, Andrew; Rushton, Lesley; Cullinan, Paul
2017-03-01
Objectives The standard approach to the assessment of occupational exposures is through the manual collection and coding of job histories. This method is time-consuming and costly and makes it potentially unfeasible to perform high quality analyses on occupational exposures in large population-based studies. Our aim was to develop a novel, efficient web-based tool to collect and code lifetime job histories in the UK Biobank, a population-based cohort of over 500 000 participants. Methods We developed OSCAR (occupations self-coding automatic recording) based on the hierarchical structure of the UK Standard Occupational Classification (SOC) 2000, which allows individuals to collect and automatically code their lifetime job histories via a simple decision-tree model. Participants were asked to find each of their jobs by selecting appropriate job categories until they identified their job title, which was linked to a hidden 4-digit SOC code. For each occupation a job title in free text was also collected to estimate Cohen's kappa (κ) inter-rater agreement between SOC codes assigned by OSCAR and an expert manual coder. Results OSCAR was administered to 324 653 UK Biobank participants with an existing email address between June and September 2015. Complete 4-digit SOC-coded lifetime job histories were collected for 108 784 participants (response rate: 34%). Agreement between the 4-digit SOC codes assigned by OSCAR and the manual coder for a random sample of 400 job titles was moderately good [κ=0.45, 95% confidence interval (95% CI) 0.42-0.49], and improved when broader job categories were considered (κ=0.64, 95% CI 0.61-0.69 at a 1-digit SOC-code level). Conclusions OSCAR is a novel, efficient, and reasonably reliable web-based tool for collecting and automatically coding lifetime job histories in large population-based studies. Further application in other research projects for external validation purposes is warranted.
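The agreement statistic used above, Cohen's kappa, corrects observed agreement for the agreement expected by chance from each rater's label frequencies. A minimal sketch, with made-up category labels rather than actual 4-digit SOC codes, is:

```python
from collections import Counter

# Sketch: Cohen's kappa between two raters' categorical assignments,
# the statistic used to compare automatically assigned codes with an
# expert coder's. The labels below are made up for illustration.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    # chance agreement from each rater's marginal label frequencies
    p_chance = sum(count_a[k] * count_b[k] for k in count_a) / (n * n)
    return (p_obs - p_chance) / (1 - p_chance)

auto = ["A", "A", "B", "B", "C", "C", "A", "B"]
manual = ["A", "A", "B", "C", "C", "C", "A", "A"]
kappa = cohens_kappa(auto, manual)  # about 0.63 on this toy data
```

Coarsening the categories (here, merging labels; in the study, truncating SOC codes to 1 digit) raises observed agreement faster than chance agreement, which is why the reported kappa improves from 0.45 to 0.64 at the broader level.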
Experimental test of entangled histories
NASA Astrophysics Data System (ADS)
Cotler, Jordan; Duan, Lu-Ming; Hou, Pan-Yu; Wilczek, Frank; Xu, Da; Yin, Zhang-Qi; Zu, Chong
2017-12-01
Entangled histories arise when a system partially decoheres in such a way that its past cannot be described by a sequence of states, but rather a superposition of sequences of states. Such entangled histories have not been previously observed. We propose and demonstrate the first experimental scheme to create entangled history states of the Greenberger-Horne-Zeilinger (GHZ) type. In our experiment, the polarization states of a single photon at three different times are prepared as a GHZ entangled history state. We define a GHZ functional which attains a maximum value of 1 on the ideal GHZ entangled history state and is bounded above by 1/16 for any three-time history state lacking tripartite entanglement. We have measured the GHZ functional on a state we have prepared experimentally, yielding a value of 0.656 ± 0.005, clearly demonstrating the contribution of entangled histories.
NASA Astrophysics Data System (ADS)
Kim, Y.; Johnson, M. S.
2017-12-01
Spectral entropy (Hs) is an index which can be used to measure the structural complexity of time series data. When a time series is made up of one periodic function, the Hs value is small, while Hs becomes larger when a time series is composed of several periodic functions. We hypothesized that this characteristic of Hs could be used to quantify the water stress history of vegetation. Under the ideal condition in which sufficient water is supplied to an agricultural crop or natural vegetation, there should be a single distinct phenological cycle represented in a vegetation index time series (e.g., NDVI and EVI). However, time series data for a vegetation area that repeatedly experiences water stress may show several fluctuations in addition to the predominant phenological cycle. This is because the process of experiencing water stress and recovering from it generates small fluctuations in phenological characteristics. Consequently, the value of Hs increases when vegetation experiences several water shortages. Therefore, Hs could be used as an indicator of water stress history. To test this hypothesis, we analyzed Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) data for a natural area in comparison to a nearby sugarcane area in seasonally-dry western Costa Rica. In this presentation we will illustrate the use of spectral entropy to evaluate the vegetative responses of natural vegetation (dry tropical forest) and sugarcane under three different irrigation techniques (center pivot irrigation, drip irrigation and flood irrigation). Through this comparative analysis, the utility of Hs as an indicator will be tested. Furthermore, crop response to the different irrigation methods will be discussed in terms of Hs, NDVI and yield.
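The hypothesized behavior can be sketched with a minimal spectral-entropy calculation. The normalization by the log of the number of frequency bins, and the synthetic "phenology" series, are assumptions for illustration; the abstract does not fix a convention:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized spectral entropy of a 1-D series: near 0 for a single
    tone, approaching 1 for a flat (noise-like) spectrum."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    psd = np.abs(np.fft.rfft(x)) ** 2
    psd = psd[1:]                       # drop the DC bin
    p = psd / psd.sum()                 # normalize to a probability distribution
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(len(psd))

t = np.arange(365)
annual = np.sin(2 * np.pi * t / 365)                  # one clean phenological cycle
stressed = annual + 0.5 * np.sin(2 * np.pi * t / 30)  # extra stress-recovery wiggles
print(spectral_entropy(annual) < spectral_entropy(stressed))  # → True
```

The stressed series carries power at more frequencies, so its spectral probability distribution is more spread out and its entropy is higher, exactly the signature the authors propose to exploit in NDVI time series.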
Seo, Kwang Won; Ahn, Jong-Joon; Ra, Seung Won; Kwon, Woon-Jung; Jegal, Yangjin
2014-06-01
The interferon gamma (IFN-γ) release assays (IGRAs) are the best method of detecting Mycobacterium tuberculosis infection. However, reports on IGRA results obtained during and right after treatment of tuberculosis (TB) have presented differing results. Some studies have shown declining responses, whereas other reports described persistent, fluctuating, or increasing responses. We postulated that IGRA positivity would decrease or revert a long time after treatment of TB, and thus evaluated the IGRA response in subjects with a history of pulmonary TB. Seventy subjects (M:F = 51:19; age = 53.2 ± 11.8 years) underwent tuberculin skin tests (TSTs) and IGRAs. The interval of time elapsed after the completion of anti-TB treatment was < 10 years for 16 subjects, 10-20 years for 13 subjects, 20-30 years for 16 subjects, and ≥ 30 years for 25 subjects. The TST was positive in 49 subjects (74%) and negative in 17 subjects (26%). The IGRA was positive in 52 subjects (74%) and negative in 18 subjects (26%). The IFN-γ level and the size of induration showed good correlation (r = 0.525, P < 0.001). However, neither the correlation between time elapsed after the completion of anti-TB treatment and the size of induration, nor that between time and the IFN-γ level, was significant. The TST and IGRA were positive in 72.7% and 68.0% of subjects ≥ 30 years after the treatment of pulmonary TB. In conclusion, the majority of subjects with a history of pulmonary TB are IGRA-positive, even a few decades after the completion of anti-TB treatment.
Turbulent Statistics From Time-Resolved PIV Measurements of a Jet Using Empirical Mode Decomposition
NASA Technical Reports Server (NTRS)
Dahl, Milo D.
2013-01-01
Empirical mode decomposition is an adaptive signal processing method that when applied to a broadband signal, such as that generated by turbulence, acts as a set of band-pass filters. This process was applied to data from time-resolved, particle image velocimetry measurements of subsonic jets prior to computing the second-order, two-point, space-time correlations from which turbulent phase velocities and length and time scales could be determined. The application of this method to large sets of simultaneous time histories is new. In this initial study, the results are relevant to acoustic analogy source models for jet noise prediction. The high frequency portion of the results could provide the turbulent values for subgrid scale models for noise that is missed in large-eddy simulations. The results are also used to infer that the cross-correlations between different components of the decomposed signals at two points in space, neglected in this initial study, are important.
Turbulent Statistics from Time-Resolved PIV Measurements of a Jet Using Empirical Mode Decomposition
NASA Technical Reports Server (NTRS)
Dahl, Milo D.
2012-01-01
Empirical mode decomposition is an adaptive signal processing method that when applied to a broadband signal, such as that generated by turbulence, acts as a set of band-pass filters. This process was applied to data from time-resolved, particle image velocimetry measurements of subsonic jets prior to computing the second-order, two-point, space-time correlations from which turbulent phase velocities and length and time scales could be determined. The application of this method to large sets of simultaneous time histories is new. In this initial study, the results are relevant to acoustic analogy source models for jet noise prediction. The high frequency portion of the results could provide the turbulent values for subgrid scale models for noise that is missed in large-eddy simulations. The results are also used to infer that the cross-correlations between different components of the decomposed signals at two points in space, neglected in this initial study, are important.
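EMD itself is not reproduced here; the sketch below emulates one decomposed (band-passed) component with a broadband random signal, and shows the second-order two-point correlation step from which a convective phase velocity follows as probe separation over the lag of the correlation peak. All signal parameters are assumptions for illustration:

```python
import numpy as np

def space_time_correlation(u1, u2, dt):
    """Normalized two-point cross-correlation R(tau) between velocity
    histories u1(t) and u2(t) recorded at two probe locations."""
    u1 = u1 - u1.mean()
    u2 = u2 - u2.mean()
    r = np.correlate(u2, u1, mode="full") / (len(u1) * u1.std() * u2.std())
    lags = (np.arange(len(r)) - (len(u1) - 1)) * dt
    return lags, r

# Synthetic "turbulence": a disturbance convecting past two probes 0.02 m
# apart at 100 m/s, so probe 2 sees probe 1's signal delayed by 0.2 ms.
rng = np.random.default_rng(0)
dt, sep, uc = 1e-5, 0.02, 100.0
s = rng.standard_normal(4096)
delay = int(round(sep / uc / dt))      # 20 samples of pure convection delay
u1, u2 = s[delay:], s[:-delay]         # u2 lags u1 by `delay` samples
lags, r = space_time_correlation(u1, u2, dt)
tau_peak = lags[np.argmax(r)]
print(f"estimated convection velocity: {sep / tau_peak:.0f} m/s")  # → 100 m/s
```

Repeating this for each intrinsic mode function would give the per-band phase velocities and scales described in the abstract; the neglected cross-correlations between different modes are exactly the terms the authors flag as important.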
Liu, P.; Archuleta, R.J.; Hartzell, S.H.
2006-01-01
We present a new method for calculating broadband time histories of ground motion based on a hybrid low-frequency/high-frequency approach with correlated source parameters. Using a finite-difference method we calculate low-frequency synthetics (< ∼1 Hz) in a 3D velocity structure. We also compute broadband synthetics in a 1D velocity model using a frequency-wavenumber method. The low frequencies from the 3D calculation are combined with the high frequencies from the 1D calculation by using matched filtering at a crossover frequency of 1 Hz. The source description, common to both the 1D and 3D synthetics, is based on correlated random distributions for the slip amplitude, rupture velocity, and rise time on the fault. This source description allows for the specification of source parameters independent of any a priori inversion results. In our broadband modeling we include correlation between slip amplitude, rupture velocity, and rise time, as suggested by dynamic fault modeling. The method of using correlated random source parameters is flexible and can be easily modified to adjust to our changing understanding of earthquake ruptures. A realistic attenuation model is common to both the 3D and 1D calculations that form the low- and high-frequency components of the broadband synthetics. The value of Q is a function of the local shear-wave velocity. To produce more accurate high-frequency amplitudes and durations, the 1D synthetics are corrected with a randomized, frequency-dependent radiation pattern. The 1D synthetics are further corrected for local site and nonlinear soil effects by using a 1D nonlinear propagation code and generic velocity structure appropriate for the site’s National Earthquake Hazards Reduction Program (NEHRP) site classification. The entire procedure is validated by comparison with the 1994 Northridge, California, strong ground motion data set.
The bias and error found here for response spectral acceleration are similar to the best results that have been published by others for the Northridge rupture.
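The crossover merge can be sketched with complementary zero-phase Butterworth filters. The abstract does not specify the matched-filtering details, so the filter family, order, and the synthetic inputs below are all assumptions:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def combine_broadband(syn_3d, syn_1d, fs, fc=1.0, order=4):
    """Merge low-frequency 3-D synthetics with high-frequency 1-D synthetics
    using complementary zero-phase Butterworth filters at crossover fc (Hz)."""
    sos_lo = butter(order, fc, btype="lowpass", fs=fs, output="sos")
    sos_hi = butter(order, fc, btype="highpass", fs=fs, output="sos")
    return sosfiltfilt(sos_lo, syn_3d) + sosfiltfilt(sos_hi, syn_1d)

fs = 50.0                                # samples per second
t = np.arange(0, 40, 1 / fs)
syn_3d = np.sin(2 * np.pi * 0.3 * t)     # stand-in long-period 3-D motion (0.3 Hz)
syn_1d = np.sin(2 * np.pi * 5.0 * t)     # stand-in short-period 1-D motion (5 Hz)
bb = combine_broadband(syn_3d, syn_1d, fs)
```

Zero-phase (forward-backward) filtering keeps the two bands time-aligned at the 1 Hz crossover, so the merged trace preserves the arrival times of both components.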
Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce
Pratx, Guillem; Xing, Lei
2011-01-01
Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively-parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258 × speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
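The map/reduce split can be sketched in plain Python. This is a toy 1-D absorption tally, not the MC321 physics, and Hadoop's distribution is replaced by a local list of map tasks; coefficients and the weight cutoff are assumed values:

```python
import random
from collections import Counter
from functools import reduce

MU_A, MU_S = 1.0, 9.0            # toy absorption / scattering coefficients (1/cm)
ALBEDO = MU_S / (MU_A + MU_S)

def map_task(seed, n_photons, shell_width=0.05):
    """Map: simulate photon histories independently, scoring the weight
    absorbed in each radial shell (a 1-D surrogate for photon migration)."""
    rng = random.Random(seed)
    tally = Counter()
    for _ in range(n_photons):
        r, w = 0.0, 1.0
        while w > 1e-4:                          # roulette-free weight cutoff
            r += rng.expovariate(MU_A + MU_S)    # sample a free path length
            tally[int(r / shell_width)] += w * (1 - ALBEDO)   # deposit
            w *= ALBEDO                          # survive with the albedo
    return tally

def reduce_task(a, b):
    """Reduce: merge per-task absorption tallies (Counter sums shared keys)."""
    a.update(b)
    return a

# Independent map tasks (these would run on separate nodes), then one reduce.
partials = [map_task(seed, 1000) for seed in range(4)]
total = reduce(reduce_task, partials, Counter())
print(sum(total.values()))   # total absorbed weight, just under 4000
```

Because each map task owns its own seeded generator and the reduce is a pure merge, losing a node only loses that task's photons, which mirrors the fault tolerance the authors report.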
Cryopreservation of Fish Spermatogonial Cells: The Future of Natural History Collections.
Hagedorn, Mary M; Daly, Jonathan P; Carter, Virginia L; Cole, Kathleen S; Jaafar, Zeehan; Lager, Claire V A; Parenti, Lynne R
2018-04-18
As global biodiversity declines, the value of biological collections increases. Cryopreserved diploid spermatogonial cells meet two goals: to yield high-quality molecular sequence data; and to regenerate new individuals, hence potentially countering species extinction. Cryopreserved spermatogonial cells that allow for such mitigative measures are not currently in natural history museum collections because there are no standard protocols to collect them. Vertebrate specimens, especially fishes, are traditionally formalin-fixed and alcohol-preserved which makes them ideal for morphological studies and as museum vouchers, but inadequate for molecular sequence data. Molecular studies of fishes routinely use tissues preserved in ethanol; yet tissues preserved in this way may yield degraded sequences over time. As an alternative to tissue fixation methods, we assessed and compared previously published cryopreservation methods by gating and counting fish testicular cells with flow cytometry to identify presumptive spermatogonia A-type cells. Here we describe a protocol to cryopreserve tissues that yields a high percentage of viable spermatogonial cells from the testes of Asterropteryx semipunctata, a marine goby. Material cryopreserved using this protocol represents the first frozen and post-thaw viable spermatogonial cells of fishes archived in a natural history museum to provide better quality material for re-derivation of species and DNA preservation and analysis.
Life histories in occupational therapy clinical practice.
Frank, G
1996-04-01
This article defines and compares several narrative methods used to describe and interpret patients' lives. The biographical methods presented are case histories, life-charts, life histories, life stories, assisted autobiography, hermeneutic case reconstruction, therapeutic emplotment, volitional narratives, and occupational storytelling and story making. Emphasis is placed on the clinician as a collaborator and interpreter of the patient's life through ongoing interactions and dialogue.
ERIC Educational Resources Information Center
Voet, Michiel; De Wever, Bram
2017-01-01
The present study explores secondary school history teachers' knowledge of inquiry methods. To do so, a process model, outlining five core cognitive processes of inquiry in the history classroom, was developed based on a review of the literature. This process model was then used to analyze think-aloud protocols of 20 teachers' reasoning during an…
Composition measurements of binary mixture droplets by rainbow refractometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilms, J.; Weigand, B
2007-04-10
So far, refractive index measurements by rainbow refractometry have been used to determine the temperature of single droplets and ensembles of droplets. Here, rainbow refractometry is, for the first time to the best of our knowledge, applied to measure composition histories of evaporating, binary mixture droplets. An evaluation method is presented that makes use of Airy theory and the simultaneous size measurement by Mie scattering imaging. The method further includes an empirical correction function for a certain diameter and refractive index range. The measurement uncertainty was investigated by numerical simulations with Lorenz-Mie theory. For the experiments, an optical levitation setup was used, allowing for long measurement periods. Temperature measurements of single-component droplets at different temperature levels are shown to demonstrate the accuracy of rainbow refractometry. Measurements of size and composition histories of binary mixture droplets are presented for two different mixtures. Experimental results show good agreement with numerical results using a rapid-mixing model.
'Cutting for the stone': the ancient art of lithotomy.
Herr, Harry W
2008-05-01
Bladder stone was a common ailment plaguing mankind from antiquity to the 20th century. Largely forgotten today, lithotomy relieved the anguish of bladder stones and identified urology as a medical specialty nearly 2500 years ago. The historical literature pertaining to lithotomy was reviewed. Translated and original documents describing operative techniques and developments pertaining to the history of lithotomy were obtained through the internet and library sources. The ancient art of lithotomy was first recorded by the Greeks and evolved through five phases: the Celsian method, or 'lesser operation'; the Marian, or 'greater operation'; the lateral operation; suprapubic cystotomy, or 'high operation'; and proctocystotomy. The practice of open lithotomy ceased owing to better minimally invasive alternatives and, most notably, the virtual disappearance of bladder stones in modern man. The history of lithotomy is a fascinating story of how early surgeons, constrained by the culture and customs of their time, dealt with common but devastating bladder stones. Out of their efforts, urology was born.
Genome dynamics and evolution in yeasts: A long-term yeast-bacteria competition experiment
Katz, Michael; Knecht, Wolfgang; Compagno, Concetta; Piškur, Jure
2018-01-01
There is an enormous genetic diversity evident in modern yeasts, but our understanding of the ecological basis of such diversifications in nature remains at best fragmented so far. Here we report a long-term experiment mimicking a primordial competitive environment, in which yeast and bacteria co-exist and compete against each other. Eighteen yeasts covering a wide phylogenetic background spanning approximately 250 million years of evolutionary history were used to establish independent evolution lines for at most 130 passages. Our collection of hundreds of modified strains generated through such a rare two-species cross-kingdom competition experiment re-created the appearance of large-scale genomic rearrangements and altered phenotypes important in the diversification history of yeasts. At the same time, the methodology employed in this evolutionary study would also be a non-gene-technological method of reprogramming yeast genomes and then selecting yeast strains with desired traits. Cross-kingdom competition may therefore be a method of significant value to generate industrially useful yeast strains with new metabolic traits. PMID:29624585
Morán Fagúndez, Luis Juan; Rivera Torres, Alejandra; González Sánchez, María Eugenia; de Torres Aured, Mari Lourdes; López-Pardo Martínez, Mercedes; Irles Rocamora, José Antonio
2015-02-26
Food consumption assessment methods are used in nutrition and health population surveys and are the basis for the development of guidelines, nutritional recommendations and health plans. The study of these issues is one of the major tasks of research and health policy in developed countries. Major advances in this area have been made nationally since 1940, both in the reliability of the data and in the standardization of studies, the latter being a necessary condition for comparing changes over time. In this article the history and application of different dietary surveys, dietary histories and food frequency records are analyzed. Besides information from surveys conducted at a national level, the main data currently available for public health planning in nutrition come from nutritional analysis of household budget surveys and food balance sheets, based on data provided by the Ministry of Agriculture. Copyright AULA MEDICA EDICIONES 2015. Published by AULA MEDICA. All rights reserved.
Palacios, Julia A; Minin, Vladimir N
2013-03-01
Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human influenza A viruses. In both cases, we recover more of the generally accepted features of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.
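For intuition on why coalescence times alone inform population size, here is the constant-size special case: while k lineages remain, the waiting time to the next coalescence is exponential with rate k(k-1)/2N, so smaller populations coalesce faster. This toy grid-search MLE is purely illustrative and far simpler than the paper's nonparametric GP machinery; the function name and grid are assumptions:

```python
import math
import random

def coalescent_loglik(intervals, N):
    """Log-likelihood of inter-coalescence waiting times under a constant
    population size N; intervals[i] is the waiting time while k = n - i
    lineages remain, with n = len(intervals) + 1 sampled individuals."""
    n = len(intervals) + 1
    ll = 0.0
    for i, w in enumerate(intervals):
        k = n - i
        rate = k * (k - 1) / 2 / N
        ll += math.log(rate) - rate * w   # exponential waiting-time density
    return ll

# Simulate a genealogy of n = 50 samples at true size N = 1000, then
# recover N by maximizing the likelihood over a coarse grid.
rng = random.Random(42)
n = 50
intervals = [rng.expovariate(k * (k - 1) / 2 / 1000) for k in range(n, 1, -1)]
best = max(range(200, 3001, 100), key=lambda N: coalescent_loglik(intervals, N))
print(best)   # best grid value, near the true N = 1000
```

Replacing the constant N with a function N(t) turns this likelihood into the conditional intensity of a point process, which is exactly the object the GP and GMRF methods estimate nonparametrically.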
Stockman, Jamila K; Syvertsen, Jennifer L; Robertson, Angela M; Ludwig-Barron, Natasha T; Bergmann, Julie N; Palinkas, Lawrence A
2014-01-01
Female-initiated barrier methods for the prevention of HIV may be an effective alternative for drug-using women who are unable to negotiate safe sex, often as a result of physical and/or sexual partner violence. Utilizing a SAVA (substance abuse, violence, and AIDS) syndemic framework, we qualitatively examined perspectives on female condoms and vaginal microbicides among 18 women with histories of methamphetamine abuse and partner violence in San Diego, California. Most women were not interested in female condoms owing to perceived discomfort, difficulty of insertion, time-intensive effort, and unappealing appearance. Alternatively, most women viewed vaginal microbicides as a useful method. Positive aspects included convenience, ability to disguise as a lubricant, and a sense of control and empowerment. Concerns included possible side effects, timing of application, and unfavorable characteristics of the gel. Acceptability of female-initiated barrier methods was context dependent (i.e., partner type, level of drug use and violence that characterized the sexual relationship). Findings indicate that efforts are needed to address barriers identified for vaginal microbicides to increase its uptake in future HIV prevention trials and marketing of future Food and Drug Administration-approved products. Strategies should address gender-based inequalities (e.g., partner violence) experienced by drug-using women and promote female empowerment. Education on female-initiated barrier methods is also needed for women who use drugs, as well as health care providers and other professionals providing sexual health care and contraception to women with histories of drug use and partner violence. Copyright © 2014 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y M; Bush, K; Han, B
Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4 based “localized Monte Carlo” (LMC) method that isolates MC dose calculations only to volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence, and transported deterministically. By matching boundary conditions at both interfaces, deterministic dose calculations account for dose perturbations “downstream” of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%–15% could be observed in heterogeneous phantoms. The saving in computational time (a factor ∼4–7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region.
Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of deterministic methods and the accuracy of MC, providing a practical tool for high-performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.
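The deterministic/MC hand-off can be caricatured in one dimension: attenuate a fluence analytically through homogeneous regions, sample it into discrete histories only inside a region flagged heterogeneous, then convert the survivors back into fluence. This is not the Geant4-based LMC implementation (geometry, scattering, and dose scoring are all stripped away); every number below is an assumed toy value:

```python
import math
import random

def hybrid_transmission(regions, n_hist=20000, seed=1):
    """1-D sketch: transport a unit fluence through a stack of slabs.
    regions = list of (thickness_cm, mu_per_cm, is_heterogeneous)."""
    rng = random.Random(seed)
    fluence = 1.0
    for thickness, mu, hetero in regions:
        if not hetero:
            fluence *= math.exp(-mu * thickness)   # deterministic transport
        else:
            # Sample the incoming fluence into discrete histories, transport
            # them by MC (purely absorbing slab), then back into a fluence.
            survived = sum(rng.expovariate(mu) > thickness
                           for _ in range(n_hist))
            fluence *= survived / n_hist
    return fluence

analytic = math.exp(-(0.2 * 5 + 0.05 * 3 + 0.2 * 5))
hybrid = hybrid_transmission([(5, 0.2, False), (3, 0.05, True), (5, 0.2, False)])
print(abs(hybrid - analytic) / analytic < 0.05)  # → True
```

Only the middle slab carries MC statistical noise, which mirrors the paper's observation that the extra cost scales with the volume of the heterogeneous region.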
Introducing TreeCollapse: a novel greedy algorithm to solve the cophylogeny reconstruction problem.
Drinkwater, Benjamin; Charleston, Michael A
2014-01-01
Cophylogeny mapping is used to uncover deep coevolutionary associations between two or more phylogenetic histories at a macro coevolutionary scale. As cophylogeny mapping is NP-Hard, this technique relies heavily on heuristics to solve all but the most trivial cases. One notable approach utilises a metaheuristic to search only a subset of the exponential number of fixed node orderings possible for the phylogenetic histories in question. This is of particular interest as it is the only known heuristic that guarantees biologically feasible solutions. This has enabled research to focus on larger coevolutionary systems, such as coevolutionary associations between figs and their pollinator wasps, including over 200 taxa. Although able to converge on solutions for problem instances of this size, a reduction from the current cubic running time is required to handle larger systems, such as Wolbachia and their insect hosts. Rather than solving this underlying problem optimally this work presents a greedy algorithm called TreeCollapse, which uses common topological patterns to recover an approximation of the coevolutionary history where the internal node ordering is fixed. This approach offers a significant speed-up compared to previous methods, running in linear time. This algorithm has been applied to over 100 well-known coevolutionary systems converging on Pareto optimal solutions in over 68% of test cases, even where in some cases the Pareto optimal solution has not previously been recoverable. Further, while TreeCollapse applies a local search technique, it can guarantee solutions are biologically feasible, making this the fastest method that can provide such a guarantee. As a result, we argue that the newly proposed algorithm is a valuable addition to the field of coevolutionary research. 
Not only does it offer a significantly faster method to estimate the cost of cophylogeny mappings but by using this approach, in conjunction with existing heuristics, it can assist in recovering a larger subset of the Pareto front than has previously been possible.
Demographic history of an elusive carnivore: using museums to inform management
Holbrook, Joseph D; DeYoung, Randy W; Tewes, Michael E; Young, John H
2012-01-01
Elusive carnivores present a challenge to managers because traditional survey methods are not suitable. We applied a genetic approach using museum specimens to examine how historical and recent conditions influenced the demographic history of Puma concolor in western and southern Texas, USA. We used 10 microsatellite loci and indexed population trends by estimating historical and recent genetic diversity, genetic differentiation and effective population size. Mountain lions in southern Texas exhibited a 9% decline in genetic diversity, whereas diversity remained stable in western Texas. Genetic differentiation between western and southern Texas was minimal historically (FST = 0.04, P < 0.01), but increased 2–2.5 times in our recent sample. An index of genetic drift for southern Texas was seven to eight times that of western Texas, presumably contributing to the current differentiation between western and southern Texas. Furthermore, southern Texas exhibited a >50% temporal decline in effective population size, whereas western Texas showed no change. Our results illustrate that population declines and genetic drift have occurred in southern Texas, likely because of contemporary habitat loss and predator control. Population monitoring may be needed to ensure the persistence of mountain lions in the southern Texas region. This study highlights the utility of sampling museum collections to examine demographic histories and inform wildlife management. PMID:23028402
Leonardo da Vinci and Andreas Vesalius; the shoulder girdle and the spine, a comparison.
Ganseman, Y; Broos, P
2008-01-01
Leonardo da Vinci and Andreas Vesalius were two important Renaissance figures: Vesalius was a surgeon-anatomist who delivered innovative work on the study of the human body, while Leonardo da Vinci was an artist who produced strikingly accurate and beautiful drawings of the human body. We compare the two masters with regard to their knowledge of the working of the muscles, their method and system of dissection, and their system and presentation of the drawings. The investigation consisted of a comparison between both anatomists, in particular concerning their study of the shoulder girdle and spine, by reviewing their original work as well as the existing literature on this subject. The investigation led to the conclusion that the drawings mentioned marked a change in history and were of high quality, centuries ahead of their time. Both were anatomists, both were revolutionary; only one changed history in his own moment, while the other changed history centuries later. Leonardo made beautiful drawings that match, or even surpass, the drawings of today. Vesalius laid the foundation for medicine as a science as it remains to this day. Their lives differed as strongly as their impact. In the light of their time, the achievement they made was extraordinary.
Hosseini, Mostafa; Heidari, Afshin; Jafarnejad, Babak
2013-10-01
This study is a comparison between three methods that are frequently used worldwide for the surgical treatment of pilonidal disease: modified excision and repair, wide excision and secondary repair, and wide excision and flap. The first technique was performed by our group for the first time and has not been described previously in the literature. This is an interventional study performed at Omid, Sadr, and Rasoul Akram hospitals on patients who underwent operation for pilonidal sinus disease and met the inclusion criteria between 2004 and 2007. Exclusion criteria were (1) acute pilonidal sinus disease, (2) history of pilonidal sinus surgery, (3) history of systemic diseases (DM, malignancy, etc.), and (4) pilonidal abscess. Essential information was extracted from complete medical archives. Any data not available in files or during follow-up visits (all patients were to be followed for at least 1 year) were gathered by telephone interview. A total of 194 patients met the criteria and had complete archived files. A longer duration of hospital stay was found in the "wide excision and closing with flap" method compared with the two other methods (P < 0.05). Length of incapacity for work was not different between the "wide excision and modified repair" and "wide excision" methods (P > 0.5), but was longer for "wide excision and flap" in comparison with the two others (P < 0.05). Healing time was significantly longer in the "wide excision" method in comparison with the two other methods (P < 0.05). However, the "wide excision and modified repair" method had the shortest healing time among all the above techniques, except for length of leaving the office. All three recurrences (1.5 %) occurred in the wide excision and flap method (P < 0.05).
The frequency of postoperative complications was 2 (3.3 %) in wide excision and modified repair, 15 (18.5 %) in wide excision, and 17 (32.7 %) in wide excision and flap closure; these differences in complications were statistically significant (P < 0.05). Our results show that the wide excision and modified repair technique, described here for the first time, is an acceptable method owing to a low recurrence rate and better wound outcomes compared with the wide excision alone and wide excision and flap techniques for the surgical treatment of pilonidal sinus disease.
Monitoring of Volcanic Activity by Sub-mm Geodetic Analyses
NASA Astrophysics Data System (ADS)
Miura, S.; Mare, Y.; Ichiki, M.; Demachi, T.; Tachibana, K.; Nishimura, T.
2017-12-01
Volcanic earthquakes have been occurring beneath Zao volcano in northern Honshu, Japan since 2013, following the increase of deep low-frequency earthquakes from 2012. On account of a burst of seismicity initiated in April 2015, the JMA announced a warning of eruption; however, the seismicity gradually decreased over the next two months and the warning was canceled in June. In the same time period, minor expansive deformation was observed by GNSS. Small earthquakes are still occurring, and low-frequency earthquakes (LPE) sometimes occur accompanied by static tilt changes. In this study, we try to extract the sub-mm displacements from the LPE waveforms observed by broadband seismometers (BBS) and utilize them for geodetic inversion to monitor volcanic activities. Thun et al. (2015, 2016) devised an efficient method using a running median filter (RMF) to remove long-period (LP) noise, which contaminates displacement waveforms. They demonstrated the reproducibility of the waveforms corresponding to experimentally given sub-mm displacements in the laboratory. They also applied the method to field LPE data obtained from several volcanoes to show static displacements. The procedure is outlined as follows: (1) Unfiltered removal of the instrument response. (2) LP noise estimation by low-pass filtering (LPF) with a corner frequency of 5/M, where M (seconds) is the time window of the RMF and should be at least three times the length of the rise time. (3) Subtraction of the noise estimated in step (2). (4) Integration to obtain displacement waveforms. We apply the method to the BBS waveform at a distance of about 1.5 km ESE from the summit crater of Zao volcano associated with an LPE on April 1, 2017. Assuming a time window M of 300 seconds, we successfully obtained the displacement history: over a rise time of about 2 minutes, the site was gradually uplifted by about 50-60 µm and then subsided with high-frequency (HF) displacements in the next 2 minutes, resulting in a static upheaval of about 20-30 µm.
Comparing the waveforms obtained by the RMF and conventional methods, the onset of the uplift is slightly delayed in the former, while the latter is consistent with the tilt change observed by the tiltmeter installed at the same site. This delay may be caused by the characteristics of the RMF, and attention should be paid to it in detailed discussions of the source time histories.
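Steps (2)-(4) of the outlined procedure can be sketched as follows, using the running median filter itself as the long-period noise estimator. This is an illustrative reconstruction, not Thun et al.'s exact implementation, and the synthetic signal parameters (uplift amplitude, rise time, drift) are assumptions chosen to mimic the numbers in the abstract:

```python
import numpy as np

def running_median(x, m):
    """Running median over a window of m samples (m odd); edges use reflection padding."""
    h = m // 2
    xp = np.pad(x, h, mode="reflect")
    return np.array([np.median(xp[i:i + m]) for i in range(len(x))])

def displacement_from_velocity(vel, fs, m_seconds):
    """Sketch of steps (2)-(4): estimate the long-period noise with a running
    median of window M seconds, subtract it, and integrate the residual
    velocity to displacement (rectangle rule)."""
    m = int(m_seconds * fs)
    if m % 2 == 0:
        m += 1
    noise = running_median(vel, m)
    return np.cumsum(vel - noise) / fs

# Synthetic check: a 50 um cosine-ramp uplift (rise time 120 s, onset at 200 s)
# plus a slow instrumental drift -- all parameters assumed for illustration.
fs = 10.0
t = np.arange(6000) / fs
d_true = np.where(t < 200, 0.0,
         np.where(t > 320, 50e-6, 25e-6 * (1 - np.cos(np.pi * (t - 200) / 120.0))))
vel = np.gradient(d_true, 1.0 / fs) + 2e-7 * np.sin(2 * np.pi * t / 3000.0)
disp = displacement_from_velocity(vel, fs, 300.0)
```

With M = 300 s, a pulse whose rise time is well under M/3 survives the median subtraction, while the slow drift is removed before integration.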
[When history meets molecular medicine: molecular history of human tuberculosis].
Ottini, Laura; Falchetti, Mario
2010-01-01
Tuberculosis represents one of humankind's most socially devastating diseases. Despite a long history of medical research and the development of effective therapies, this disease remains a global health danger even in the 21st century. Tuberculosis may cause death, but infected people with effective immunity may remain healthy for years, suggesting long-term host-pathogen co-existence. Because of its antiquity, its supposed association with human settlements, and its tendency to leave typical lesions on skeletal and mummified remains, tuberculosis has been the object of intensive multidisciplinary studies, including paleo-pathological research. During the past 10 years molecular paleo-pathology has developed as a new scientific discipline allowing the study of ancient pathogens by direct detection of their DNA. In this work, we reviewed the evidence for tuberculosis in ancient human remains and current methods for identifying ancient mycobacterial DNA, and explored current theories of Mycobacterium tuberculosis evolution and their implications for the global development of tuberculosis, looking into the past and present at the same time.
Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)
NASA Technical Reports Server (NTRS)
DeMott, Diana L.; Bigler, Mark A.
2017-01-01
NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration of environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value, or placeholder, as a preliminary estimate. This preliminary estimate, or screening value, is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment needs to consider more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists, and internal human stresses. The more detailed assessment is expected to be more realistic than one based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what is expected of future operational parameters, input from individuals who were familiar with the systems and processes previously implemented by NASA was used to provide the best available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. 
Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules, and operational requirements are developed and then finalized.
Human Reliability Assessments: Using the Past (Shuttle) to Predict the Future (Orion)
NASA Technical Reports Server (NTRS)
DeMott, Diana; Bigler, Mark
2016-01-01
NASA (National Aeronautics and Space Administration) Johnson Space Center (JSC) Safety and Mission Assurance (S&MA) uses two human reliability analysis (HRA) methodologies. The first is a simplified method based on how much time is available to complete the action, with consideration of environmental and personal factors that could influence the human's reliability. This method is expected to provide a conservative value, or placeholder, as a preliminary estimate. This preliminary estimate, or screening value, is used to determine which placeholders need a more detailed assessment. The second methodology is used to develop a more detailed human reliability assessment of the performance of critical human actions. This assessment needs to consider more than the time available; it includes factors such as the importance of the action, the context, environmental factors, potential human stresses, previous experience, training, physical design interfaces, available procedures/checklists, and internal human stresses. The more detailed assessment is expected to be more realistic than one based primarily on time available. When performing an HRA on a system or process that has an operational history, we have information specific to the task based on this history and experience. In the case of a Probabilistic Risk Assessment (PRA) that is based on a new design and has no operational history, providing a "reasonable" assessment of potential crew actions becomes more challenging. To determine what is expected of future operational parameters, input from individuals who were familiar with the systems and processes previously implemented by NASA was used to provide the best available data. Personnel from Flight Operations, Flight Directors, Launch Test Directors, Control Room Console Operators, and Astronauts were all interviewed to provide a comprehensive picture of previous NASA operations. 
Verification of the assumptions and expectations expressed in the assessments will be needed when the procedures, flight rules and operational requirements are developed and then finalized.
Vig, Hetal S; McCarthy, Anne Marie; Liao, Kaijun; Demeter, Mirar Bristol; Fredericks, Tracey; Armstrong, Katrina
2013-10-01
Standard BRCA genetic testing criteria include young age at diagnosis, family history, and Jewish ancestry. The purpose of this study was to assess the effect of these criteria on BRCA test utilization in breast cancer patients. Breast cancer patients aged 18 to 64 years living in Pennsylvania in 2007 completed a survey on family history of breast and ovarian cancer and BRCA testing (N = 2,213). Multivariate logistic regression was used to estimate the odds of BRCA testing by patient characteristics, and predicted probabilities of testing were calculated for several clinical scenarios. Young age at diagnosis (<50 years) was strongly associated with BRCA testing, with women diagnosed before age 50 having nearly five times the odds of receiving BRCA testing compared to women diagnosed at age 50 or older (OR = 4.81; 95% CI, 3.85-6.00; P < 0.001). Despite a similar BRCA mutation prevalence estimate (8-10%), a young Jewish patient <50 years with no family history had a markedly higher predicted probability of testing (63%) compared with an older, non-Jewish breast cancer patient with more than one first-degree relative (43%). Age at diagnosis, Jewish ancestry, and both maternal and paternal family history are strongly predictive of BRCA testing. However, among women diagnosed at age 50 or older, family history may be an underused criterion that may benefit from targeted intervention. Robust methods for ascertaining detailed family history, such as through electronic medical records, are needed to accurately identify patients for BRCA testing.
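The analysis pattern described, fitting a multivariate logistic regression and then evaluating predicted probabilities for clinical scenarios, can be sketched on synthetic data. Everything below is illustrative: the cohort, the coefficients, and the three binary predictors are invented, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort (hypothetical coefficients, not the study's):
# log-odds of testing = -1.0 + 1.57*young + 1.2*jewish + 0.6*famhx
n = 5000
X = rng.integers(0, 2, size=(n, 3)).astype(float)   # young<50, Jewish ancestry, family history
beta_true = np.array([1.57, 1.2, 0.6])
logits = -1.0 + X @ beta_true
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Fit logistic regression by plain gradient ascent on the log-likelihood."""
    Xb = np.hstack([np.ones((len(X), 1)), X])       # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(Xb @ w)))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

w = fit_logistic(X, y)

def predicted_probability(young, jewish, famhx):
    """Predicted probability of testing for a clinical scenario."""
    eta = w[0] + w[1] * young + w[2] * jewish + w[3] * famhx
    return 1 / (1 + np.exp(-eta))

# e.g. young Jewish patient with no family history vs older non-Jewish patient with family history
p_young_jewish = predicted_probability(1, 1, 0)
p_older_famhx = predicted_probability(0, 0, 1)
```

The predicted-probability step is what turns odds ratios into the scenario-level percentages reported in the abstract.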
A Portable Platform for Evaluation of Visual Performance in Glaucoma Patients
Rosen, Peter N.; Boer, Erwin R.; Gracitelli, Carolina P. B.; Abe, Ricardo Y.; Diniz-Filho, Alberto; Marvasti, Amir H.; Medeiros, Felipe A.
2015-01-01
Purpose: To propose a new tablet-enabled test for evaluation of visual performance in glaucoma, the PERformance CEntered Portable Test (PERCEPT), and to evaluate its ability to predict history of falls and motor vehicle crashes. Design: Cross-sectional study. Methods: The study involved 71 patients with glaucomatous visual field defects on standard automated perimetry (SAP) and 59 control subjects. The PERCEPT was based on the concept of increasing visual task difficulty to improve detection of central visual field losses in glaucoma patients. Subjects had to perform a foveal 8-alternative forced-choice orientation discrimination task while detecting a simultaneously presented peripheral stimulus within a limited presentation time. Subjects also underwent testing with the Useful Field of View (UFOV) divided attention test. The ability to predict history of motor vehicle crashes and falls was investigated by odds ratios and incident-rate ratios, respectively. Results: When adjusted for age, only the PERCEPT processing speed parameter showed significantly larger values in glaucoma compared to controls (difference: 243 ms; P<0.001). PERCEPT results had a stronger association with history of motor vehicle crashes and falls than UFOV. Each 1 standard deviation increase in PERCEPT processing speed was associated with an odds ratio of 2.69 (P = 0.003) for predicting history of motor vehicle crashes and with an incident-rate ratio of 1.95 (P = 0.003) for predicting history of falls. Conclusion: A portable platform for testing visual function was able to detect functional deficits in glaucoma, and its results were significantly associated with history of involvement in motor vehicle crashes and history of falls. PMID:26445501
Traumatic Brain Injury History is Associated with Earlier Age of Onset of Alzheimer Disease
LoBue, Christian; Wadsworth, Hannah; Wilmoth, Kristin; Clem, Matthew; Hart, John; Womack, Kyle B.; Didehbani, Nyaz; Lacritz, Laura H.; Rossetti, Heidi C.; Cullum, C. Munro
2016-01-01
Objective: This study examined whether a history of traumatic brain injury (TBI) is associated with earlier onset of Alzheimer disease (AD), independent of apolipoprotein ε4 status (Apoe4) and gender. Method: Participants with a clinical diagnosis of AD (n=7625) were obtained from the National Alzheimer’s Coordinating Center Uniform Data Set and categorized based on self-reported lifetime TBI with loss of consciousness (LOC) (TBI+ vs TBI-) and presence of Apoe4. ANCOVAs, controlling for gender, race, and education, were used to examine the association between history of TBI, presence of Apoe4, and an interaction of both risk factors on estimated age of AD onset. Results: Estimated AD onset differed by TBI history and Apoe4 independently (p’s <.001). The TBI+ group had a mean age of onset 2.5 years earlier than the TBI- group. Likewise, Apoe4 carriers had a mean age of onset 2.3 years earlier than non-carriers. While the interaction was non-significant (p = .34), participants with both a history of TBI and Apoe4 had the earliest mean age of onset compared to those with a TBI history or Apoe4 alone (mean difference = 2.8 and 2.7 years, respectively). These results remained unchanged when stratified by gender. Conclusions: A history of self-reported TBI can be associated with an earlier onset of AD-related cognitive decline, regardless of Apoe4 status and gender. TBI may be related to an underlying neurodegenerative process in AD, but the implications of age at time of injury, severity, and repetitive injuries remain unclear. PMID:27855547
2014-01-01
Background: Miscarriage, the unexpected loss of pregnancy before 20 weeks gestation, may have a negative effect on a mother’s perception of herself as a capable woman and on her emotional health when she is pregnant again subsequent to the miscarriage. As such, a mother with a history of miscarriage may be at greater risk for difficulties navigating the process of becoming a mother and achieving positive maternal-infant bonding with an infant born subsequent to the loss. The aim of this study was to examine the effect of miscarriage history on maternal-infant bonding after the birth of a healthy infant, to test the hypothesis that women with a history of miscarriage have decreased maternal-infant bonding compared to women without a history of miscarriage. Methods: We completed a secondary analysis of the First Baby Study, a longitudinal cohort study, to examine the effect of a history of miscarriage on maternal-infant bonding at 1 month, 6 months, and 12 months after women experienced the birth of their first live-born baby. In a sample of 2798 women living in Pennsylvania, USA, we tested our hypothesis using linear regression analysis of Shortened Postpartum Bonding Questionnaire (S-PBQ) scores, followed by longitudinal analysis using a generalized estimating equations model with repeated measures. Results: We found that women with a history of miscarriage had similar S-PBQ scores as women without a history of miscarriage at each of the three postpartum time points. Likewise, longitudinal analysis revealed no difference in the pattern of maternal-infant bonding scores between women with and without a history of miscarriage. Conclusions: Women in the First Baby Study with a history of miscarriage did not differ from women without a history of miscarriage in their reported level of bonding with their subsequently born infants. 
It is important for clinicians to recognize that even though some women may experience impaired bonding related to a history of miscarriage, the majority of women form a healthy bond with their infant despite this history. PMID:25028056
Chlyeh, G; Henry, P Y; Jarne, P
2003-09-01
The population biology of the schistosome-vector snail Bulinus truncatus was studied in an irrigation area near Marrakech, Morocco, using demographic approaches, in order to estimate life-history parameters. The survey was conducted using 2 capture-mark-recapture analyses in 2 separate sites from the irrigation area, the first in 1999 and the second in 2000. Individuals larger than 5 mm were considered. The capture probability varied through time and space in both analyses. Apparent survival (from 0.7 to 1 per period of 2-3 days) varied with time and space (a series of sinks was considered), as well as with a square function of size. These results suggest variation in the population's intrinsic rate of increase. They also suggest that results from more classical analyses of population demography, aiming, for example, at estimating population size, should be interpreted with caution. Together with other results obtained in the same irrigation area, they also lead to some suggestions for population control.
Surface temperature/heat transfer measurement using a quantitative phosphor thermography system
NASA Technical Reports Server (NTRS)
Buck, G. M.
1991-01-01
A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge-Coupled Device) video camera and digital recording system. A history of the technique's development at Langley is discussed. The latest developments include a phosphor mixture with a greater range of temperature sensitivity and the use of castable ceramics for inexpensive test models. A method of calculating surface heat transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch-diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface and a comparison with prediction for the hemisphere heating distribution.
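The appendix's heat-transfer calculation is not reproduced in the abstract. A common way to compute surface heat flux from a measured temperature-time history, assuming one-dimensional semi-infinite conduction with constant material properties, is the Cook-Felderman discretization; this is an assumed stand-in for the standard approach, not necessarily the report's exact method:

```python
import math

def cook_felderman(times, temps, rho_c_k):
    """Surface heat flux q(t_n) from a surface temperature history T(t) for a
    1-D semi-infinite wall with constant properties (Cook-Felderman scheme,
    piecewise-linear T between samples). rho_c_k = density * specific heat *
    thermal conductivity of the wall material."""
    coef = 2.0 * math.sqrt(rho_c_k / math.pi)
    q = [0.0]
    for n in range(1, len(times)):
        total = 0.0
        for i in range(1, n + 1):
            total += (temps[i] - temps[i - 1]) / (
                math.sqrt(times[n] - times[i]) + math.sqrt(times[n] - times[i - 1]))
        q.append(coef * total)
    return q

# Check case: a constant flux q0 = 1 on a wall with rho_c_k = 1 has the exact
# solution T(t) = 2 * q0 * sqrt(t / (pi * rho_c_k)), so the scheme should recover q ~ 1.
times = [0.01 * i for i in range(201)]
temps = [2.0 * math.sqrt(s / math.pi) for s in times]
q = cook_felderman(times, temps, 1.0)
```

The scheme is exact for temperature histories that are piecewise linear between samples, so its error comes only from curvature of T within each sample interval.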
Stoll, S; Roelcke, V; Raspe, H
2005-07-29
The article addresses the history of evidence-based medicine in Germany. Its aim was to reconstruct the standard of clinical-therapeutic investigation in Germany at the beginning of the 20th century. Through a historical survey of five important German general medical journals for the period between 1918 and 1932, an overview of the situation of clinical investigation is given. 268 clinical trials are identified and analysed with regard to their methodological design. Heterogeneous results are found: while a few examples of sophisticated methodology exist, the design of the majority of the studies is poor. A response to the situation described can be seen in Paul Martini's book "Methodology of Therapeutic Investigation", first published in 1932. Paul Martini's biography, his criticism of the clinical-therapeutic investigation of his time, the major points of his methodology, and the reception of the book in Germany and abroad are described.
Local and Community History: Some Cautionary Remarks on an Idea Whose Time Has Returned.
ERIC Educational Resources Information Center
Gerber, David A.
1979-01-01
Analyzes past local history movements and addresses benefits and problems for historical studies and history teaching in the current upsurge of interest in local and community history. Concludes that local history must transcend parochialism in order to see the larger picture. (KC)
Management of surgical instruments with radio frequency identification tags.
Kusuda, Kaori; Yamashita, Kazuhiko; Ohnishi, Akiko; Tanaka, Kiyohito; Komino, Masaru; Honda, Hiroshi; Tanaka, Shinichi; Okubo, Takashi; Tripette, Julien; Ohta, Yuji
2016-01-01
To prevent malpractice, medical staff have adopted inventory time-outs and/or checklists. Accurate inventory and maintenance of surgical instruments decreases the risk of operating-room miscounting and malfunction. In our previous study, individual management of surgical instruments was accomplished using Radio Frequency Identification (RFID) tags. The purpose of this paper is to evaluate a new management method for RFID-tagged instruments. The management system of RFID-tagged surgical instruments was used for 27 months in clinical areas. In total, 13 study participants assembled surgical trays in the central sterile supply department. While using the management system, trays were assembled 94 times, and no assembly errors occurred. Instrument malfunctions occurred after the 19th, 56th, and 73rd uses; none were caused by the RFID tags, and the usage history was recorded. Additionally, the time it took to assemble surgical trays was recorded, and the long-term usability of the management system was evaluated. The system could record the number of uses and the malfunction history of each surgical instrument, as well as the history of instruments being transferred from one tray to another. The results suggest that our system can be used to manage instruments safely, that it supports a learning effect, and that it is usable in daily maintenance. This finding suggests that the management system examined here ensures surgical instrument and tray assembly quality.
Characterization and Simulation of Transient Vibrations Using Band Limited Temporal Moments
Smallwood, David O.
1994-01-01
A method is described to characterize shocks (transient time histories) in terms of the Fourier energy spectrum and the temporal moments of the shock passed through a contiguous set of band-pass filters. The product model is then used to generate realizations of a random process that, in the mean, have the same energy and moments as the characterization of the transient event.
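The characterization step, temporal moments of the shock after band-pass filtering, can be sketched as follows; the ideal FFT-mask filter and the example pulse are illustrative assumptions, not Smallwood's exact filter bank:

```python
import numpy as np

def band_limited_moments(x, fs, f_lo, f_hi):
    """Energy, temporal centroid, and RMS duration of a transient after an
    ideal band-pass filter (FFT mask) -- one band's worth of the band-limited
    temporal moments used to characterize a shock."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(f < f_lo) | (f > f_hi)] = 0.0
    xb = np.fft.irfft(X, n=len(x))
    t = np.arange(len(x)) / fs
    w = xb ** 2
    energy = w.sum() / fs                        # 0th temporal moment
    centroid = (t * w).sum() / w.sum()           # 1st moment: central time
    rms_dur = np.sqrt(((t - centroid) ** 2 * w).sum() / w.sum())
    return energy, centroid, rms_dur

# Example: a 100 Hz tone burst centered at t = 0.7 s (illustrative values)
fs = 1000.0
t = np.arange(2000) / fs
x = np.exp(-0.5 * ((t - 0.7) / 0.02) ** 2) * np.sin(2 * np.pi * 100.0 * t)
e_in, t_c, dur = band_limited_moments(x, fs, 80.0, 120.0)
e_out, _, _ = band_limited_moments(x, fs, 300.0, 400.0)
```

Repeating this over a contiguous set of bands yields the compact per-band description (energy plus low-order moments) that the simulation step then matches in the mean.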
ERIC Educational Resources Information Center
Zavalo, Lauro
This paper explains that materials on the teaching of Latin-American literature are sparse, even though most researchers in the field will dedicate much of their time to teaching. The paper adds that, in scholarly journals, little attention is given to teaching literature, and the topic is also absent from most academic congresses. The paper then…
Jose M. Iniguez; Thomas W. Swetnam; Christopher H. Baisan
2016-01-01
Aim: The purpose of this study was to examine the influence of moisture and fire on historical ponderosa pine (Pinus ponderosa Dougl. ex Laws.) age structure patterns. Location: We used a natural experiment created over time by the unique desert island geography of southern Arizona. Methods: We sampled tree establishment dates in two sites on Rincon Peak and...
ERIC Educational Resources Information Center
Arias Ortiz, Elena; Dehon, Catherine
2013-01-01
In this paper we study the factors that influence both dropout and (4-year) degree completion throughout university by applying the set of discrete-time methods for competing risks in event history analysis, as described in Scott and Kennedy (2005). In the French-speaking Belgian community, participation rates are very high given that higher…
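Discrete-time event history methods for competing risks, as in the Scott and Kennedy (2005) approach cited above, are typically fit by expanding each student's record into person-period rows and modeling the per-period outcome. A minimal sketch of that expansion step, with hypothetical field names and outcomes (dropout vs degree completion):

```python
def person_periods(records):
    """Expand (id, years_observed, outcome) records into person-period rows.
    outcome: 'dropout', 'degree', or 'censored' (the event applies in the
    final observed year). Each row is (id, year, event), with event 'none'
    for every year before the final one."""
    rows = []
    for sid, years, outcome in records:
        for year in range(1, years + 1):
            event = outcome if (year == years and outcome != "censored") else "none"
            rows.append((sid, year, event))
    return rows

students = [
    ("s1", 2, "dropout"),    # dropped out in year 2
    ("s2", 4, "degree"),     # graduated in year 4
    ("s3", 3, "censored"),   # still enrolled when observation ended
]
rows = person_periods(students)
```

A multinomial model of the event column on year (and covariates) over these rows then gives the discrete-time hazards of each competing risk.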
The Shock and Vibration Digest. Volume 14, Number 12
1982-12-01
Fragmentary entries (OCR-damaged): one evaluates the uses of statistical energy analysis for determining sound transmission performance, with coupling loss factors measured and compared; another reports measurements for artificial cracks in mild-steel test pieces (also see No. 2623); entry 82-2676, "Improvement of the Method of Statistical Energy Analysis," applies statistical energy analysis theory using a large number of free-response time histories simultaneously in one analysis.
Tracking composite material damage evolution using Bayesian filtering and flash thermography data
NASA Astrophysics Data System (ADS)
Gregory, Elizabeth D.; Holland, Steve D.
2016-05-01
We propose a method for tracking the condition of a composite part using Bayesian filtering of flash thermography data over the lifetime of the part. In this demonstration, composite panels were fabricated; impacted to induce subsurface delaminations; and loaded in compression over multiple time steps, causing the delaminations to grow in size. Flash thermography data was collected between each damage event to serve as a time history of the part. The flash thermography indicated some areas of damage but provided little additional information as to the exact nature or depth of the damage. Computed tomography (CT) data was also collected after each damage event and provided a high resolution volume model of damage that acted as truth. After each cycle, the condition estimate, from the flash thermography data and the Bayesian filter, was compared to 'ground truth'. The Bayesian process builds on the lifetime history of flash thermography scans and can give better estimates of material condition as compared to the most recent scan alone, which is common practice in the aerospace industry. Bayesian inference provides probabilistic estimates of damage condition that are updated as each new set of data becomes available. The method was tested on simulated data and then on an experimental data set.
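A minimal sketch of the Bayesian-filtering idea, assuming a discretized damage state, a monotone growth model, and a simple noisy-scan likelihood; all numbers are invented for illustration, since the paper's actual models are not given in the abstract:

```python
import numpy as np

def bayes_update(prior, transition, likelihood_row):
    """One cycle of a discrete Bayes filter: propagate the damage-state
    belief through the growth model, then condition on the new scan."""
    predicted = transition.T @ prior            # damage can only stay or grow
    posterior = predicted * likelihood_row
    return posterior / posterior.sum()

n = 5                                           # damage levels 0..4 (0 = pristine)
# Monotone transition model: per load cycle, stay with p=0.6, grow one level with p=0.4
T = np.zeros((n, n))
for s in range(n):
    T[s, s] = 0.6
    if s + 1 < n:
        T[s, s + 1] = 0.4
    else:
        T[s, s] = 1.0                           # worst level is absorbing

def likelihood(z):
    """Measurement model: a scan reports the true level, or a neighbor, with noise."""
    L = np.full(n, 0.05)
    L[z] = 0.7
    if z > 0:
        L[z - 1] = 0.15
    if z + 1 < n:
        L[z + 1] = 0.15
    return L

belief = np.zeros(n)
belief[0] = 1.0                                 # part starts undamaged
for z in [0, 1, 1, 2, 3]:                       # scans after successive load cycles
    belief = bayes_update(belief, T, likelihood(z))
```

Because each posterior carries the whole scan history forward, the final belief is sharper than what the last noisy scan alone would support, which is the advantage the abstract claims over single-scan practice.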
Hartzell, S.; Harmsen, S.; Frankel, A.; Larsen, S.
1999-01-01
This article compares techniques for calculating broadband time histories of ground motion in the near field of a finite fault by comparing synthetics with the strong-motion data set for the 1994 Northridge earthquake. Based on this comparison, a preferred methodology is presented. Ground-motion-simulation techniques are divided into two general methods: kinematic- and composite-fault models. Green's functions of three types are evaluated: stochastic, empirical, and theoretical. A hybrid scheme is found to give the best fit to the Northridge data. High frequencies (> 1 Hz) are calculated using a composite-fault model with a fractal subevent size distribution and stochastic, bandlimited, white-noise Green's functions. At frequencies below 1 Hz, theoretical elastic-wave-propagation synthetics introduce proper seismic-phase arrivals of body waves and surface waves. The 3D velocity structure more accurately reproduces record durations for the deep sedimentary basin structures found in the Los Angeles region. At frequencies above 1 Hz, scattering effects become important and wave propagation is more accurately represented by stochastic Green's functions. A fractal subevent size distribution for the composite fault model ensures an ω^-2 spectral shape over the entire frequency band considered (0.1-20 Hz).
How Old? Tested and Trouble-Free Ways to Convey Geologic Time
ERIC Educational Resources Information Center
Clary, Renee
2009-01-01
Geologic time, or the time frame of our planet's history, is several orders of magnitude greater than general human understanding of "time." When students hear that our planet has a 4.6-billion-year history, they do not necessarily comprehend the magnitude of deep time, the huge expanse of time that has passed from the origin of Earth through the…
A Kinematic Model of Slow Slip Constrained by Tremor-Derived Slip Histories in Cascadia
NASA Astrophysics Data System (ADS)
Schmidt, D. A.; Houston, H.
2016-12-01
We explore new ways to constrain the kinematic slip distributions for large slow slip events using constraints from tremor. Our goal is to prescribe one or more slip pulses that propagate across the fault and scale appropriately to satisfy the observations. Recent work (Houston, 2015) inferred a crude representative stress time history at an average point using the tidal stress history, the static stress drop, and the timing of the evolution of tidal sensitivity of tremor over several days of slip. To convert a stress time history into a slip time history, we use simulations to explore the stressing history of a small locked patch due to an approaching rupture front. We assume that the locked patch releases strain through a series of tremor bursts whose activity rate is related to the stressing history. To test whether the functional form of a slip pulse is reasonable, we assume a hypothetical slip time history (Ohnaka pulse) timed with the occurrence of tremor to create a rupture front that propagates along the fault. The duration of the rupture front for a fault patch is constrained by the observed tremor catalog for the 2010 ETS event. The slip amplitude is scaled appropriately to match the observed surface displacements from GPS. Through a forward simulation, we evaluate the ability of the tremor-derived slip history to accurately predict the pattern of surface displacements observed by GPS. We find that the temporal progression of surface displacements is well modeled by a 2-4 day slip pulse, suggesting that some of the longer duration of slip typically found in time-dependent GPS inversions is biased by the temporal smoothing. However, at some locations on the fault, the tremor lingers beyond the passage of the slip pulse. A small percentage (5-10%) of the tremor appears to be activated ahead of the approaching slip pulse, and tremor asperities experience a driving stress on the order of 10 kPa/day. 
Tremor amplitude, rather than just tremor counts, is needed to better refine the pattern of slip across the fault.
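A minimal sketch of a propagating slip pulse of the kind described, using a cosine ramp as a stand-in for the Ohnaka pulse; the functional form, rupture speed, rise time, and slip amplitude below are all assumptions for illustration, not values from the study:

```python
import numpy as np

def slip_history(t, t_arrival, rise_time, final_slip):
    """Cosine-ramp slip pulse: zero before the rupture front arrives, a smooth
    ramp over the rise time, then constant at the final slip (a stand-in for
    the Ohnaka pulse; the exact functional form is an assumption)."""
    tau = np.clip((t - t_arrival) / rise_time, 0.0, 1.0)
    return final_slip * 0.5 * (1.0 - np.cos(np.pi * tau))

# Rupture front sweeping along strike (speed, rise time, and slip are assumed,
# chosen to resemble a multi-day ETS episode with a 2-4 day local slip pulse)
x = np.linspace(0.0, 80.0, 9)        # km along strike
t = np.linspace(0.0, 14.0, 141)      # days
v_rupt, rise, D = 8.0, 3.0, 0.02     # km/day, days, meters
slip = np.array([slip_history(t, xi / v_rupt, rise, D) for xi in x])
```

Convolving each patch's slip history with elastic Green's functions would then give the predicted GPS time series for a forward test of this kind.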
Is xenodontine snake reproduction shaped by ancestry, more than by ecology?
Bellini, Gisela P; Arzamendia, Vanesa; Giraudo, Alejandro R
2017-01-01
One of the current challenges of evolutionary ecology is to understand the effects of phylogenetic history (PH) and/or ecological factors (EF) on the life-history traits of species. Here, the effects of environment and phylogeny are tested for the first time on the reproductive biology of South American xenodontine snakes. We studied 60% of the tribes of this endemic and most representative clade in a temperate region of South America. A comparative method (canonical phylogenetic ordination, CPO) was used to find the relative contributions of EF and PH to life-history aspects of snakes, comparing the reproductive mode, mean fecundity, reproductive potential, and frequency of nearly 1,000 specimens. CPO analysis showed that PH, or ancestry, explained most of the variation in reproduction, whereas EF explained little of it. The reproductive traits under study are suggested to have a strong phylogenetic signal in this clade, with ancestry playing a major role in reproduction. The EF also influenced the reproduction of South American xenodontines, although to a lesser extent. Our findings provide new evidence of how evolutionary history is embodied in the traits of living species.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y. M., E-mail: ymingy@gmail.com; Bednarz, B.; Svatos, M.
Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and efficiency of well-developed optimization methods, by precalculating the fluence to dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time being spent on beamlets that weakly contribute to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories. 
A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms, and one clinical patient geometry to examine the capability of this platform to generate conformal plans as well as assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach, which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead.
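The concurrent-optimization idea, gradient steps built from very few histories combined with gradient rescaling/renormalization and momentum, can be sketched with a toy linear beamlet-to-voxel dose model. Everything below is synthetic and far simpler than the authors' MC platform; the sampled voxels stand in for the stochastic information produced by a handful of transported histories:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy beamlet-to-voxel dose matrix and an achievable target dose (all synthetic)
n_voxels, n_beamlets = 200, 30
A = rng.random((n_voxels, n_beamlets)) * 0.1
d_target = A @ rng.uniform(0.5, 1.5, n_beamlets)

def rel_loss(w):
    return np.linalg.norm(A @ w - d_target) / np.linalg.norm(d_target)

w = np.ones(n_beamlets)                    # fluence weights, all beams on
v = np.zeros(n_beamlets)                   # momentum buffer
beta, batch = 0.9, 10                      # a few sampled "histories" per step
loss0 = rel_loss(w)

for step in range(3000):
    lr = 0.5 / (1.0 + 0.01 * step)               # decaying step size
    idx = rng.integers(0, n_voxels, batch)       # stochastic sample of voxels
    resid = A[idx] @ w - d_target[idx]
    g = A[idx].T @ resid / batch                 # noisy gradient estimate
    g = g / (np.linalg.norm(g) + 1e-12)          # gradient rescaling/renormalization
    v = beta * v + (1.0 - beta) * g              # momentum smooths the stochastic noise
    w = np.maximum(w - lr * v, 0.0)              # physical fluence stays nonnegative

loss = rel_loss(w)
```

The renormalization keeps the step size meaningful despite the high variance of few-sample gradients, and the momentum term averages over successive noisy directions, the same two obstacles the abstract names.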
Svatos, M.; Zankowski, C.; Bednarz, B.
2016-01-01
Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and efficiency of well-developed optimization methods, by precalculating the fluence to dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces wasted computation time being spent on beamlets that weakly contribute to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories.
A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms, and one clinical patient geometry to examine the capability of this platform to generate conformal plans as well as assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach, which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead. PMID:27277051
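The optimization core described here can be sketched as stochastic gradient descent with momentum on nonnegative fluence weights, with per-step noise standing in for the few-history MC estimates. Everything below (the dose matrix, noise level, and step sizes) is an illustrative assumption, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 5 beamlets, 20 voxels. D[i, j] is the dose to voxel i
# per unit weight of beamlet j (in the paper this comes from MC transport).
D = rng.uniform(0.0, 1.0, size=(20, 5))
w_true = np.array([1.0, 0.5, 0.0, 2.0, 0.8])
d_target = D @ w_true                      # a recoverable target dose

w = np.ones(5)                             # fluence weights, kept nonnegative
v = np.zeros(5)                            # momentum buffer
lr, beta = 0.005, 0.9                      # assumed step size and momentum

for step in range(2000):
    # Noisy gradient of 0.5 * ||D w - d_target||^2, mimicking the highly
    # stochastic information from transporting very few histories.
    Dn = D + rng.normal(0.0, 0.02, size=D.shape)
    grad = Dn.T @ (Dn @ w - d_target)
    v = beta * v - lr * grad               # momentum update
    w = np.clip(w + v, 0.0, None)          # project onto w >= 0

residual = np.linalg.norm(D @ w - d_target)
```

The nonnegativity projection after each step plays the role of a physical constraint (fluence cannot be negative); momentum smooths the noisy gradient estimates so useful progress is made even when each individual estimate is poor.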
Powe, Camille E; Tobias, Deirdre K; Michels, Karin B; Chen, Wendy Y; Eliassen, A Heather; Manson, JoAnn E; Rosner, Bernard; Willett, Walter C; Hu, Frank B; Zhang, Cuilin; Rich-Edwards, Janet W; Rexrode, Kathryn M
2017-03-01
Background: Type II diabetes is associated with breast cancer in epidemiologic studies. Pregnancy also modifies breast cancer risk. We hypothesized that women with a history of gestational diabetes mellitus (GDM), which shares pathogenesis and risk factors with type II diabetes, would have greater invasive breast cancer risk than parous women without a history of GDM. Methods: We conducted a prospective analysis among parous women in the Nurses' Health Study II, with mean age 35 years in 1989. Multivariate Cox proportional hazards models were used to compare risks of incident invasive breast cancer in women with and without a history of GDM. Results: Among 86,972 women studied, 5,188 women reported a history of GDM and 2,377 developed invasive breast cancer (100 with history of GDM, 2,277 without GDM) over 22 years of prospective follow-up. History of GDM was inversely associated with incident invasive breast cancer [HR, 0.68; 95% confidence interval (CI), 0.55-0.84; P = 0.0004], compared with no history of GDM, after adjustment for body mass index, reproductive history, and other breast cancer risk factors. Findings were similar by menopausal status, although observed person-time was predominantly premenopausal (premenopausal: HR, 0.73; 95% CI, 0.56-0.96; P = 0.03; postmenopausal: HR, 0.63; 95% CI, 0.43-0.92; P = 0.02). Restricting to women undergoing mammography screening modestly attenuated the relationship (HR, 0.74; 95% CI, 0.57-0.96; P = 0.02). Conclusions: Among a large cohort of U.S. women, history of GDM was not associated with an elevated risk of subsequent invasive breast cancer. Impact: Our findings highlight the need to further investigate GDM's role in breast cancer development. Cancer Epidemiol Biomarkers Prev; 26(3); 321-7. ©2016 AACR . ©2016 American Association for Cancer Research.
Yuan, Hui; Dryden, Jefferson K.; Strehl, Kristen E.; Cywinski, Jacek B.; Ehrenfeld, Jesse M.; Bromley, Pamela
2017-01-01
BACKGROUND: It has been suggested that longer-term postsurgical outcome may be adversely affected by less than severe hypotension under anesthesia. However, evidence-based guidelines are unavailable. The present study was designed to develop a method for identifying patients at increased risk of death within 30 days in association with the severity and duration of intraoperative hypotension. METHODS: Intraoperative mean arterial blood pressure recordings of 152,445 adult patients undergoing noncardiac surgery were analyzed for periods of time accumulated below each one of the 31 thresholds between 75 and 45 mm Hg (hypotensive exposure times). In a development cohort of 35,904 patients, the associations were sought between each of these 31 cumulative hypotensive exposure times and 30-day postsurgical mortality. On the basis of covariable-adjusted percentage increases in the odds of mortality per minute elapsed of hypotensive exposure time, certain sets of exposure time limits were calculated that portended certain percentage increases in the odds of mortality. A novel risk-scoring method was conceived by counting the number of exposure time limits that had been exceeded within each respective set, one of them being called the SLUScore. The validity of this new method in identifying patients at increased risk was tested in a multicenter validation cohort consisting of 116,541 patients from Cleveland Clinic, Vanderbilt and Saint Louis Universities. Data were expressed as 95% confidence interval, P < .05 considered significant. RESULTS: Progressively greater hypotensive exposures were associated with greater 30-day mortality. In the development cohort, covariable-adjusted (age, Charlson score, case duration, history of hypertension) exposure limits were identified for time accumulated below each of the thresholds that portended certain identical (5%–50%) percentage expected increases in the odds of mortality. 
These exposure time limit sets were shorter in patients with a history of hypertension. A novel risk score, the SLUScore (range 0–31), was conceived as the number of exposure limits exceeded for one of these sets (20% set). A SLUScore > 0 (average 13.8) was found in 40% of patients who had twice the mortality, adjusted odds increasing by 5% per limit exceeded. When tested in the validation cohort, a SLUScore > 0 (average 14.1) identified 35% of patients who had twice the mortality, each incremental limit exceeded portending a 5% compounding increase in adjusted odds of mortality, independent of age and Charlson score (C = 0.73, 0.72–0.74, P < .05). CONCLUSIONS: The SLUScore represents a novel method for identifying nearly 1 in every 3 patients experiencing greater 30-day mortality portended by more severe intraoperative hypotensive exposures. PMID:28107274
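The scoring rule lends itself to a very small sketch: a SLUScore is simply a count of per-threshold exposure-time limits exceeded. The limit values here are hypothetical placeholders, not the study's regression-derived limits:

```python
# Hypothetical exposure-time limits (minutes) for the 31 MAP thresholds from
# 75 down to 45 mm Hg; chosen only to be stricter at lower pressures.
thresholds = list(range(75, 44, -1))              # 31 thresholds, mm Hg
limits = {t: 2.0 * (t - 44) for t in thresholds}

def slu_score(minutes_below):
    """Count per-threshold exposure-time limits exceeded (range 0-31)."""
    return sum(1 for t in thresholds if minutes_below.get(t, 0.0) > limits[t])

# A patient who spent 5 min below every threshold from 59 down to 45 mm Hg
# exceeds only the 2- and 4-minute limits at 45 and 46 mm Hg:
patient = {t: 5.0 for t in range(45, 60)}
print(slu_score(patient))   # → 2
```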
Teaching Electronic Health Record Communication Skills.
Palumbo, Mary Val; Sandoval, Marie; Hart, Vicki; Drill, Clarissa
2016-06-01
This pilot study investigated nurse practitioner students' communication skills when utilizing the electronic health record during history taking. The nurse practitioner students (n = 16) were videotaped utilizing the electronic health record while taking health histories with standardized patients. The students were videotaped during two separate sessions during one semester. Two observers recorded the time spent (1) typing and talking, (2) typing only, and (3) looking at the computer without talking. Total history taking time, computer placement, and communication skills were also recorded. During the formative session, mean history taking time was 11.4 minutes, with 3.5 minutes engaged with the computer (30.6% of visit). During the evaluative session, mean history taking time was 12.4 minutes, with 2.95 minutes engaged with the computer (24% of visit). The percentage of time individuals spent changed over the two visits: typing and talking, -3.1% (P = .3); typing only, +12.8% (P = .038); and looking at the computer, -9.6% (P = .039). This study demonstrated that time spent engaged with the computer during a patient encounter does decrease with student practice and education. Therefore, students benefit from instruction on electronic health record-specific communication skills, and use of a simple mnemonic to reinforce this is suggested.
Joint Inference of Population Assignment and Demographic History
Choi, Sang Chul; Hey, Jody
2011-01-01
A new approach to assigning individuals to populations using genetic data is described. Most existing methods work by maximizing Hardy–Weinberg and linkage equilibrium within populations, neither of which will apply for many demographic histories. By including a demographic model, within a likelihood framework based on coalescent theory, we can jointly study demographic history and population assignment. Genealogies and population assignments are sampled from a posterior distribution using a general isolation-with-migration model for multiple populations. A measure of partition distance between assignments facilitates not only the summary of a posterior sample of assignments, but also the estimation of the posterior density for the demographic history. It is shown that joint estimates of assignment and demographic history are possible, including estimation of population phylogeny for samples from three populations. The new method is compared to results of a widely used assignment method, using simulated and published empirical data sets. PMID:21775468
A Computer Program for the Computation of Running Gear Temperatures Using Green's Function
NASA Technical Reports Server (NTRS)
Koshigoe, S.; Murdock, J. W.; Akin, L. S.; Townsend, D. P.
1996-01-01
A new technique has been developed to study two-dimensional heat transfer problems in gears. This technique consists of transforming the heat equation into a line integral equation with the use of Green's theorem. The equation is then expressed in terms of eigenfunctions that satisfy the Helmholtz equation, and their corresponding eigenvalues for an arbitrarily shaped region of interest. The eigenfunctions are obtained by solving an integral equation. Once the eigenfunctions are found, the temperature is expanded in terms of the eigenfunctions with unknown time-dependent coefficients that can be solved for by using Runge-Kutta methods. The time integration is extremely efficient. Therefore, any changes in the time-dependent coefficients or source terms in the boundary conditions do not impose a great computational burden on the user. The method is demonstrated by applying it to a sample gear tooth. Temperature histories at representative surface locations are given.
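The coefficient equations produced by such an eigenfunction expansion are ordinary differential equations in time, which is why the time integration is cheap. A minimal 1-D analogue (with assumed sine eigenfunctions, diffusivity, and source; not the gear geometry of the paper) looks like:

```python
import numpy as np

# On a rod of length L with zero-temperature ends, the eigenfunctions are
# sin(n*pi*x/L) and each expansion coefficient a_n(t) obeys the ODE
#   da_n/dt = -alpha*(n*pi/L)^2 * a_n + s_n(t).
L, alpha, N = 1.0, 0.1, 8
lam = alpha * (np.arange(1, N + 1) * np.pi / L) ** 2

def rhs(t, a):
    s = np.zeros(N)
    s[0] = 1.0          # assumed steady heat input, projected onto mode 1
    return -lam * a + s

def rk4_step(f, t, a, h):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(t, a)
    k2 = f(t + h / 2, a + h / 2 * k1)
    k3 = f(t + h / 2, a + h / 2 * k2)
    k4 = f(t + h, a + h * k3)
    return a + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

a, h = np.zeros(N), 0.01
for i in range(5000):
    a = rk4_step(rhs, i * h, a, h)

# After long times, a_1 approaches its steady value s_1 / lambda_1.
```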
A Developmental History of Polymer Mass Spectrometry
ERIC Educational Resources Information Center
Vergne, Matthew J.; Hercules, David M.; Lattimer, Robert P.
2007-01-01
The history of the development of mass spectrometric methods used to characterize polymers is discussed. The continued improvements in methods and instrumentation will offer new and better ways for the mass spectral characterization of polymers, and mass spectrometry (MS) should be recognized as a complementary polymer characterization method along…
A Handbook for Literacy Teachers.
ERIC Educational Resources Information Center
McKilliam, K.R.
The methods described in this handbook can be adapted for use in any language which can be written phonetically. Chapters cover the value of adult literacy, history of the alphabet, history of methods of teaching reading and writing, principles of teaching, sounds as symbols, lesson construction, letter construction, the method of teaching…
Population momentum across vertebrate life histories
Koons, D.N.; Grand, J.B.; Arnold, J.M.
2006-01-01
Population abundance is critically important in conservation, management, and demographic theory. Thus, to better understand how perturbations to the life history affect long-term population size, we examined population momentum for four vertebrate classes with different life history strategies. In a series of demographic experiments we show that population momentum generally has a larger effect on long-term population size for organisms with long generation times than for organisms with short generation times. However, patterns between population momentum and generation time varied across taxonomic groups and according to the life history parameter that was changed. Our findings indicate that momentum may be an especially important aspect of population dynamics for long-lived vertebrates, and deserves greater attention in life history studies. Further, we discuss the importance of population momentum in natural resource management, pest control, and conservation arenas. © 2006 Elsevier B.V. All rights reserved.
Ruttenber, A J; McCrea, J S; Wade, T D; Schonbeck, M F; LaMontagne, A D; Van Dyke, M V; Martyny, J W
2001-02-01
We outline methods for integrating epidemiologic and industrial hygiene data systems for the purpose of exposure estimation, exposure surveillance, worker notification, and occupational medicine practice. We present examples of these methods from our work at the Rocky Flats Plant--a former nuclear weapons facility that fabricated plutonium triggers for nuclear weapons and is now being decontaminated and decommissioned. The weapons production processes exposed workers to plutonium, gamma photons, neutrons, beryllium, asbestos, and several hazardous chemical agents, including chlorinated hydrocarbons and heavy metals. We developed a job exposure matrix (JEM) for estimating exposures to 10 chemical agents in 20 buildings for 120 different job categories over a production history spanning 34 years. With the JEM, we estimated lifetime chemical exposures for about 12,000 of the 16,000 former production workers. We show how the JEM database is used to estimate cumulative exposures over different time periods for epidemiological studies and to provide notification and determine eligibility for a medical screening program developed for former workers. We designed an industrial hygiene data system for maintaining exposure data for current cleanup workers. We describe how this system can be used for exposure surveillance and linked with the JEM and databases on radiation doses to develop lifetime exposure histories and to determine appropriate medical monitoring tests for current cleanup workers. We also present time-line-based graphical methods for reviewing and correcting exposure estimates and reporting them to individual workers.
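The core job-exposure-matrix computation, cumulative exposure as intensity times duration summed over a work history, can be sketched as follows. The agents, buildings, jobs, and intensity values below are invented for illustration; the real JEM covers 10 agents, 20 buildings, and 120 jobs over 34 years:

```python
# Hypothetical JEM entries: (agent, building, job) -> exposure intensity.
jem = {
    ("TCE", "B771", "machinist"): 2.5,   # illustrative intensity units
    ("TCE", "B771", "inspector"): 0.4,
}

def cumulative_exposure(work_history, agent):
    """Sum intensity x duration (years) over a worker's job history."""
    total = 0.0
    for building, job, years in work_history:
        total += jem.get((agent, building, job), 0.0) * years
    return total

history = [("B771", "machinist", 10), ("B771", "inspector", 5)]
print(cumulative_exposure(history, "TCE"))   # → 27.0  (2.5*10 + 0.4*5)
```

Restricting the summed jobs to those held within a given calendar window gives the period-specific cumulative exposures used in the epidemiological analyses.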
High speed imaging for assessment of impact damage in natural fibre biocomposites
NASA Astrophysics Data System (ADS)
Ramakrishnan, Karthik Ram; Corn, Stephane; Le Moigne, Nicolas; Ienny, Patrick; Leger, Romain; Slangen, Pierre R.
2017-06-01
The use of Digital Image Correlation has been generally limited to the estimation of mechanical properties and fracture behaviour at low to moderate strain rates. High speed cameras dedicated to ballistic testing are often used to measure the initial and residual velocities of the projectile but rarely for damage assessment. The evaluation of impact damage is frequently achieved post-impact using visual inspection, ultrasonic C-scan or other NDI methods. Ultra-high speed cameras and developments in image processing have made possible the measurement of surface deformations and stresses in real time during dynamic cracking. In this paper, a method is presented to correlate the force-displacement data from the sensors to the slow motion tracking of the transient failure cracks using real-time high speed imaging. A natural fibre reinforced composite made of flax fibres and a polypropylene matrix was chosen for the study. The creation of macro-cracks during the impact results in the loss of stiffness and a corresponding drop in the force history. However, optical instrumentation shows that the initiation of damage is not always evident, and so the assessment of damage requires the use of a local approach. Digital Image Correlation is used to study the strain history of the composite and to identify the initiation and progression of damage. The effect of fly-speckled texture on strain measurement by image correlation is also studied. The developed method can be used for the evaluation of impact damage for different composite materials.
Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk
2018-04-06
Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real-time. The state-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM, until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.
Dynamic response analysis of a 24-story damped steel structure
NASA Astrophysics Data System (ADS)
Feng, Demin; Miyama, Takafumi
2017-10-01
In Japanese and Chinese building codes, a two-stage design philosophy, damage limitation (small earthquake, Level 1) and life safety (extreme large earthquake, Level 2), is adopted. It is very interesting to compare the design method of a damped structure based on the two building codes. In the Chinese code, in order to be consistent with the conventional seismic design method, the damped structure is also designed at the small earthquake level. The effect of damper systems is considered by the additional damping ratio concept. The design force will be obtained from the damped design spectrum considering the reduction due to the additional damping ratio. The additional damping ratio by the damper system is usually calculated by a time history analysis method at the small earthquake level. The velocity dependent type dampers such as viscous dampers can function well even in the small earthquake level. But, if steel damper is used, which usually remains elastic in the small earthquake, there will be no additional damping ratio achieved. On the other hand, a time history analysis is used in Japan both for small earthquake and extreme large earthquake level. The characteristics of damper system and ductility of the structure can be modelled well. An existing 24-story steel frame is modified to demonstrate the design process of the damped structure based on the two building codes. Viscous wall type damper and low yield steel panel dampers are studied as the damper system.
A method for decoding the neurophysiological spike-response transform.
Stern, Estee; García-Crescioni, Keyla; Miller, Mark W; Peskin, Charles S; Brezina, Vladimir
2009-11-15
Many physiological responses elicited by neuronal spikes (intracellular calcium transients, synaptic potentials, muscle contractions) are built up of discrete, elementary responses to each spike. However, the spikes occur in trains of arbitrary temporal complexity, and each elementary response not only sums with previous ones, but can itself be modified by the previous history of the activity. A basic goal in system identification is to characterize the spike-response transform in terms of a small number of functions (the elementary response kernel and additional kernels or functions that describe the dependence on previous history) that will predict the response to any arbitrary spike train. Here we do this by developing further and generalizing the "synaptic decoding" approach of Sen et al. (1996). Given the spike times in a train and the observed overall response, we use least-squares minimization to construct the best estimated response and at the same time best estimates of the elementary response kernel and the other functions that characterize the spike-response transform. We avoid the need for any specific initial assumptions about these functions by using techniques of mathematical analysis and linear algebra that allow us to solve simultaneously for all of the numerical function values treated as independent parameters. The functions are such that they may be interpreted mechanistically. We examine the performance of the method as applied to synthetic data. We then use the method to decode real synaptic and muscle contraction transforms.
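In discrete time, the least-squares step described here amounts to solving a linear system whose unknowns are the kernel's sample values. A sketch with an assumed kernel shape, spike-train statistics, and noise level (history-dependence terms omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(1)

# The observed response is a sum of an unknown elementary kernel K placed at
# every spike time; the kernel's sample values are treated as free parameters
# and recovered by least squares.
T, Klen = 1000, 20
true_K = np.exp(-np.arange(Klen) / 5.0)           # hypothetical kernel
spikes = (rng.random(T) < 0.05).astype(float)     # Bernoulli spike train

# Design matrix: column j is the spike train delayed by j samples.
X = np.column_stack([np.roll(spikes, j) for j in range(Klen)])
for j in range(Klen):
    X[:j, j] = 0.0                                # remove wrap-around
r = X @ true_K + rng.normal(0.0, 0.01, T)         # observed response

K_hat, *_ = np.linalg.lstsq(X, r, rcond=None)     # estimated kernel samples
```

Because no parametric form is assumed for K, the recovered samples can be inspected directly for mechanistic interpretation, exactly in the spirit of the decoding approach above.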
History in the Schools. National Council for the Social Studies Bulletin 74.
ERIC Educational Resources Information Center
Downey, Matthew T., Ed.
This examination of the condition of history instruction in the public schools offers chapters on four major areas of concern to educators: (1) the status of history in the schools; (2) problem areas in the history curriculum; (3) students, methods, and instructional materials; and (4) preparation and certification of history teachers. In the…
Teacher Candidates' Attitudes to Using Oral History in History Education
ERIC Educational Resources Information Center
Demircioglu, Ebru
2016-01-01
The aim of this research is to determine the views of history teacher candidates towards an oral history project carried out in the Special Teaching Method Course of the history pedagogy program of the Fatih Faculty of Education (FFE) at Karadeniz Technical University in Turkey. An open-ended questionnaire and semi-structured interview were the…
Wilson, Richard A.; Chapman, Wendy W.; DeFries, Shawn J.; Becich, Michael J.; Chapman, Brian E.
2010-01-01
Background: Clinical records are often unstructured, free-text documents that create information extraction challenges and costs. Healthcare delivery and research organizations, such as the National Mesothelioma Virtual Bank, require the aggregation of both structured and unstructured data types. Natural language processing offers techniques for automatically extracting information from unstructured, free-text documents. Methods: Five hundred and eight history and physical reports from mesothelioma patients were split into development (208) and test sets (300). A reference standard was developed and each report was annotated by experts with regard to the patient’s personal history of ancillary cancer and family history of any cancer. The Hx application was developed to process reports, extract relevant features, perform reference resolution and classify them with regard to cancer history. Two methods, Dynamic-Window and ConText, for extracting information were evaluated. Hx’s classification responses using each of the two methods were measured against the reference standard. The average Cohen’s weighted kappa served as the human benchmark in evaluating the system. Results: Hx had a high overall accuracy, with each method, scoring 96.2%. F-measures using the Dynamic-Window and ConText methods were 91.8% and 91.6%, which were comparable to the human benchmark of 92.8%. For the personal history classification, Dynamic-Window scored highest with 89.2% and for the family history classification, ConText scored highest with 97.6%, in which both methods were comparable to the human benchmark of 88.3% and 97.2%, respectively. Conclusion: We evaluated an automated application’s performance in classifying a mesothelioma patient’s personal and family history of cancer from clinical reports. 
To do so, the Hx application must process reports, identify cancer concepts, distinguish the known mesothelioma from ancillary cancers, recognize negation, perform reference resolution and determine the experiencer. Results indicated that both information extraction methods tested were dependent on the domain-specific lexicon and negation extraction. We showed that the more general method, ConText, performed as well as our task-specific method. Although Dynamic-Window could be modified to retrieve other concepts, ConText is more robust and performs better on inconclusive concepts. Hx could greatly improve and expedite the process of extracting data from free-text, clinical records for a variety of research or healthcare delivery organizations. PMID:21031012
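The human benchmark in this record is an inter-annotator agreement statistic. As a hedged illustration, unweighted Cohen's kappa (the study uses a weighted variant) can be computed as:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Unweighted Cohen's kappa between two annotators' label sequences."""
    assert len(a) == len(b)
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n            # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / n ** 2   # chance agreement
    return (po - pe) / (1 - pe)

# Two annotators labelling ten reports for positive/negative cancer history:
r1 = ["pos"] * 6 + ["neg"] * 4
r2 = ["pos"] * 5 + ["neg"] * 5
print(round(cohens_kappa(r1, r2), 3))   # → 0.8
```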
Fong, Simon; Zhuang, Yan
2012-01-01
User authentication has been widely used by biometric applications that work on unique bodily features, such as fingerprints, retina scan, and palm vessels recognition. This paper proposes a novel concept of biometric authentication by exploiting a user's medical history. Although medical history may not be absolutely unique to every individual person, the chances of having two persons who share an exactly identical trail of medical and prognosis history are slim. Therefore, in addition to common biometric identification methods, medical history can be used as an ingredient for generating Q&A challenges upon user authentication. This concept is motivated by a recent advancement in smart-card technology: future identity cards will be able to carry patients' medical history like a mobile database. Privacy, however, may be a concern when medical history is used for authentication. Therefore in this paper, a new method is proposed for abstracting the medical data by using attribute value taxonomies, into a hierarchical data tree (h-Data). Questions can be abstracted to various levels of resolution (hence sensitivity of private data) for use in the authentication process. The method is described and a case study is given in this paper.
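A minimal sketch of the h-Data idea: walk a value up an attribute value taxonomy so that an authentication question can be posed at a chosen sensitivity level. The taxonomy below is invented for illustration:

```python
# Hypothetical attribute value taxonomy: child value -> parent (coarser) value.
taxonomy = {
    "type 2 diabetes": "endocrine disorder",
    "endocrine disorder": "chronic condition",
    "chronic condition": "medical history",
}

def abstract(value, levels):
    """Walk `levels` steps up the hierarchy; coarser = less sensitive."""
    for _ in range(levels):
        value = taxonomy.get(value, value)   # stop at the root
    return value

print(abstract("type 2 diabetes", 2))   # → chronic condition
```

A challenge generated at level 2 ("Have you been treated for a chronic condition?") reveals far less private detail than one generated at level 0, which is the privacy trade-off the hierarchy is meant to expose.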
US Public Cord Blood Banking Practices: Recruitment, Donation, and the Timing of Consent
Broder, Sherri; Ponsaran, Roselle; Goldenberg, Aaron
2012-01-01
BACKGROUND Cord blood has moved rapidly from an experimental stem cell source to an accepted and important source of hematopoietic stem cells. There has been no comprehensive assessment of US public cord blood banking practices since the Institute of Medicine study in 2005. STUDY DESIGN AND METHODS Of 34 US public cord blood banks identified, 16 participated in our qualitative survey of public cord blood banking practices. Participants took part in in-depth telephone interviews in which they were asked structured and open-ended questions regarding recruitment, donation, and the informed consent process at these banks. RESULTS 13 of 16 participants reported a variably high percentage of women who consented to public cord blood donation. 15 banks offered donor registration at the time of hospital admission for labor and delivery. 7 obtained full informed consent and medical history during early labor and 8 conducted some form of phased consent and/or phased medical screening and history. 9 participants identified initial selection of the collection site location as the chief mode by which they recruited minority donors. CONCLUSION Since 2005, more public banks offer cord blood donor registration at the time of admission for labor and delivery. That, and the targeted location of cord blood collection sites, are the main methods used to increase access to donation and HLA diversity of banked units. Currently, the ability to collect and process donations, rather than donor willingness, is the major barrier to public cord blood banking. PMID:22803637
Unsteady Aerodynamic Testing Using the Dynamic Plunge Pitch and Roll Model Mount
NASA Technical Reports Server (NTRS)
Lutze, Frederick H.; Fan, Yigang
1999-01-01
A final report on the DyPPiR tests that were run is presented. Essentially it consists of two parts: a description of the data reduction techniques, and the results. The data reduction techniques include three methods that were considered: 1) signal processing of wind-on minus wind-off data; 2) using wind-on data in conjunction with accelerometer measurements; and 3) using a dynamic model of the sting to predict the sting oscillations and determining the aerodynamic inputs using an optimization process. After trying all three, we ended up using method 1, mainly because of its simplicity and our confidence in its accuracy. The results section consists of time history plots of the input variables (angle of attack, roll angle, and/or plunge position) and the corresponding time histories of the output variables, C(sub L), C(sub D), C(sub m), C(sub l), C(sub m), C(sub n). Also included are some phase plots of one or more of the output variables vs. an input variable. Typically of interest is the pitching moment coefficient vs. angle of attack for an oscillatory motion, where the hysteresis loops can be observed. These plots are useful for determining the "more interesting" cases. Samples of the data as they appear on the disk are presented at the end of the report. The last maneuver, a rolling pull up, is indicative of the unique capabilities of the DyPPiR, allowing combinations of motions to be exercised at the same time.
NASA Technical Reports Server (NTRS)
Dunn, H. J.
1981-01-01
A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate power density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL generated simulation variables. An example of the calculation of the PDS of a Van de Pol oscillator is presented.
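A periodogram-style PDS computation of the kind described can be sketched with NumPy; the normalization and test signal below are assumptions for illustration, not details of the ACSL-interfaced program:

```python
import numpy as np

fs = 100.0                        # assumed sample rate, Hz
t = np.arange(1000) / fs          # 10 s of samples
x = np.sin(2 * np.pi * 5.0 * t)   # a 5 Hz test "simulation variable"

X = np.fft.rfft(x)                          # real-input FFT
psd = np.abs(X) ** 2 / (fs * len(x))        # one-sided power density
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

print(freqs[np.argmax(psd)])      # peak sits at the 5 Hz bin
```

For a Van der Pol oscillator, as in the report's example, the same spectrum would instead show the limit-cycle fundamental and its odd harmonics.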
Driven Metadynamics: Reconstructing Equilibrium Free Energies from Driven Adaptive-Bias Simulations
2013-01-01
We present a novel free-energy calculation method that constructively integrates two distinct classes of nonequilibrium sampling techniques, namely, driven (e.g., steered molecular dynamics) and adaptive-bias (e.g., metadynamics) methods. By employing nonequilibrium work relations, we design a biasing protocol with an explicitly time- and history-dependent bias that uses on-the-fly work measurements to gradually flatten the free-energy surface. The asymptotic convergence of the method is discussed, and several relations are derived for free-energy reconstruction and error estimation. Isomerization reaction of an atomistic polyproline peptide model is used to numerically illustrate the superior efficiency and faster convergence of the method compared with its adaptive-bias and driven components in isolation. PMID:23795244
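The nonequilibrium work relations invoked here are typified by the Jarzynski equality, which recovers an equilibrium free-energy difference from an exponential average of the work $W$ performed over repeated driven realizations (the paper builds its time- and history-dependent bias on relations of this family; the exact estimator it uses may differ):

```latex
\bigl\langle e^{-\beta W} \bigr\rangle = e^{-\beta \,\Delta F},
\qquad \beta = \frac{1}{k_{\mathrm{B}} T},
\quad\Longrightarrow\quad
\Delta F = -\beta^{-1} \ln \bigl\langle e^{-\beta W} \bigr\rangle .
```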
Western Civ., Multiculturalism and the Problem of a Unified World History.
ERIC Educational Resources Information Center
Dunn, Ross E.
This paper traces the development of the concept of a unified world history and applies that concept to the present curriculum. World history became more European-centered over time as other cultures were viewed as backward. The exclusion of so much of humanity from the "known world of progress" made less and less sense over time as global…
The Chicanos: A History of Mexican Americans. American Century Series.
ERIC Educational Resources Information Center
Meier, Matt S.; Rivera, Feliciano
To identify the Mexican American as a member of a unique cultural group is the purpose of this history of the Chicanos. The history of the Mexican American is divided into 5 broad time periods: the Indo-Hispanic period, during which there was a blending of the Indian and Spanish cultures; the Mexican period, a time of political activity which…
Recursive Deadbeat Controller Design
NASA Technical Reports Server (NTRS)
Juang, Jer-Nan; Phan, Minh Q.
1997-01-01
This paper presents a recursive algorithm for deadbeat predictive controller design. The method combines the concepts of system identification and deadbeat controller design. It starts with the multi-step output prediction equation and derives the control force in terms of past input and output time histories. The resulting formulation simultaneously satisfies the requirements of system identification and deadbeat controller design. As soon as the coefficient matrices satisfying the output prediction equation are identified, no further work is required to compute the deadbeat control gain matrices. The method can be implemented recursively, just like typical recursive system identification techniques.
Gong, Qi; Schaubel, Douglas E
2017-03-01
Treatments are frequently evaluated in terms of their effect on patient survival. In settings where randomization of treatment is not feasible, observational data are employed, necessitating correction for covariate imbalances. Treatments are usually compared using a hazard ratio. Most existing methods which quantify the treatment effect through the survival function are applicable to treatments assigned at time 0. In the data structure of our interest, subjects typically begin follow-up untreated; time-until-treatment, and the pretreatment death hazard are both heavily influenced by longitudinal covariates; and subjects may experience periods of treatment ineligibility. We propose semiparametric methods for estimating the average difference in restricted mean survival time attributable to a time-dependent treatment, the average effect of treatment among the treated, under current treatment assignment patterns. The pre- and posttreatment models are partly conditional, in that they use the covariate history up to the time of treatment. The pre-treatment model is estimated through recently developed landmark analysis methods. For each treated patient, fitted pre- and posttreatment survival curves are projected out, then averaged in a manner which accounts for the censoring of treatment times. Asymptotic properties are derived and evaluated through simulation. The proposed methods are applied to liver transplant data in order to estimate the effect of liver transplantation on survival among transplant recipients under current practice patterns. © 2016, The International Biometric Society.
Paraskevopoulou, Sivylla E; Wu, Di; Eftekhar, Amir; Constandinou, Timothy G
2014-09-30
This work presents a novel unsupervised algorithm for real-time adaptive clustering of neural spike data (spike sorting). The proposed Hierarchical Adaptive Means (HAM) clustering method combines centroid-based clustering with hierarchical cluster connectivity to classify incoming spikes using groups of clusters. It is described how the proposed method can adaptively track the incoming spike data without requiring any past history, iteration, or training, and autonomously determines the number of spike classes. Its classification accuracy has been tested on multiple datasets (both simulated and recorded), achieving near-identical accuracy to k-means (run with 10 iterations and provided with the number of spike classes). Its robustness across different feature extraction methods has also been demonstrated, with classification accuracies above 80% on multiple datasets. Finally, and crucially, its low complexity, quantified through both memory and computation requirements, makes this method highly attractive for future hardware implementation. Copyright © 2014 Elsevier B.V. All rights reserved.
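The history-free, adaptive flavor of clustering the abstract describes can be illustrated with a minimal online centroid sketch. This is not the authors' HAM implementation; the fixed `radius` threshold and running-mean centroid update are assumptions made for illustration only.

```python
import numpy as np

def online_cluster(spikes, radius):
    """Assign feature vectors to clusters one at a time, keeping no history.

    Each incoming spike joins the nearest centroid if it lies within
    `radius`, otherwise it seeds a new cluster; centroids are updated by a
    running mean, so the number of classes emerges from the data.
    """
    centroids, counts, labels = [], [], []
    for s in spikes:
        s = np.asarray(s, dtype=float)
        if centroids:
            d = [np.linalg.norm(s - c) for c in centroids]
            k = int(np.argmin(d))
            if d[k] <= radius:
                counts[k] += 1
                centroids[k] += (s - centroids[k]) / counts[k]  # running mean
                labels.append(k)
                continue
        centroids.append(s.copy())
        counts.append(1)
        labels.append(len(centroids) - 1)
    return labels, centroids
```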
1991-01-01
hours, days, etc. Direction of movement is indicated by what Stephen Hawking calls "time arrows" (Stephen W. Hawking, A Brief History of Time, New York: Bantam Books, 1988).
Arrows of time in the bouncing universes of the no-boundary quantum state
NASA Astrophysics Data System (ADS)
Hartle, James; Hertog, Thomas
2012-05-01
We derive the arrows of time of our universe that follow from the no-boundary theory of its quantum state (NBWF) in a minisuperspace model. Arrows of time are viewed four-dimensionally as properties of the four-dimensional Lorentzian histories of the universe. Probabilities for these histories are predicted by the NBWF. For histories with a regular “bounce” at a minimum radius fluctuations are small at the bounce and grow in the direction of expansion on either side. For recollapsing classical histories with big bang and big crunch singularities the fluctuations are small near one singularity and grow through the expansion and recontraction to the other singularity. The arrow of time defined by the growth in fluctuations thus points in one direction over the whole of a recollapsing spacetime but is bidirectional in a bouncing spacetime. We argue that the electromagnetic, thermodynamic, and psychological arrows of time are aligned with the fluctuation arrow. The implications of a bidirectional arrow of time for causality are discussed.
Using the entire history in the analysis of nested case cohort samples.
Rivera, C L; Lumley, T
2016-08-15
Countermatching designs can provide more efficient estimates than simple matching or case-cohort designs in certain situations such as when good surrogate variables for an exposure of interest are available. We extend pseudolikelihood estimation for the Cox model under countermatching designs to models where time-varying covariates are considered. We also implement pseudolikelihood with calibrated weights to improve efficiency in nested case-control designs in the presence of time-varying variables. A simulation study is carried out, which considers four different scenarios including a binary time-dependent variable, a continuous time-dependent variable, and the case including interactions in each. Simulation results show that pseudolikelihood with calibrated weights under countermatching offers large gains in efficiency if compared to case-cohort. Pseudolikelihood with calibrated weights yielded more efficient estimators than pseudolikelihood estimators. Additionally, estimators were more efficient under countermatching than under case-cohort for the situations considered. The methods are illustrated using the Colorado Plateau uranium miners cohort. Furthermore, we present a general method to generate survival times with time-varying covariates. Copyright © 2016 John Wiley & Sons, Ltd.
Prediction of Quality Change During Thawing of Frozen Tuna Meat by Numerical Calculation I
NASA Astrophysics Data System (ADS)
Murakami, Natsumi; Watanabe, Manabu; Suzuki, Toru
A numerical calculation method has been developed to determine the optimum thawing method for minimizing the increase of metmyoglobin content (metMb%), an indicator of color change in frozen tuna meat during thawing. The calculation method comprises the following two steps: a) calculation of the temperature history in each part of the frozen tuna meat during thawing by the control volume method, under the assumption of one-dimensional heat transfer; and b) calculation of metMb% from the combination of the calculated temperature history, the Arrhenius equation, and a first-order reaction equation for the rate of increase of metMb%. Thawing experiments measuring the temperature history of frozen tuna meat were carried out under rapid-thawing and slow-thawing conditions to compare the experimental data with the calculated temperature history as well as the increase of metMb%. The calculated results agreed with the experimental data. The proposed simulation method would be useful for predicting the optimum thawing conditions in terms of metMb%.
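Step (b) above, advancing a first-order reaction along a temperature history with an Arrhenius rate constant, can be sketched as follows. The pre-exponential factor `A` and activation energy `Ea` below are hypothetical placeholders, not the values fitted for tuna meat in the study.

```python
import math

A = 1.0e8    # pre-exponential factor, 1/s (assumed, for illustration)
Ea = 6.0e4   # activation energy, J/mol (assumed, for illustration)
R = 8.314    # gas constant, J/(mol K)

def metmb_fraction(temps_K, dt, m0=0.05, m_max=1.0):
    """Advance the metMb fraction m along a temperature history temps_K (K),
    one explicit time step of length dt (s) per temperature sample, using
    first-order kinetics dm/dt = k(T) * (m_max - m) with Arrhenius k(T)."""
    m = m0
    for T in temps_K:
        k = A * math.exp(-Ea / (R * T))  # Arrhenius rate constant, 1/s
        m += k * (m_max - m) * dt
        m = min(m, m_max)
    return m
```

A slow thaw (long exposure to intermediate temperatures) accumulates more metMb than a rapid one, which is exactly the comparison the abstract's simulation makes.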
Using single cell sequencing data to model the evolutionary history of a tumor.
Kim, Kyung In; Simon, Richard
2014-01-24
The introduction of next-generation sequencing (NGS) technology has made it possible to detect genomic alterations within tumor cells on a large scale. However, most applications of NGS show the genetic content of mixtures of cells. Recently developed single cell sequencing technology can identify variation within a single cell. Characterization of multiple samples from a tumor using single cell sequencing can potentially provide information on the evolutionary history of that tumor. This may facilitate understanding how key mutations accumulate and evolve in lineages to form a heterogeneous tumor. We provide a computational method to infer an evolutionary mutation tree based on single cell sequencing data. Our approach differs from traditional phylogenetic tree approaches in that our mutation tree directly describes temporal order relationships among mutation sites. Our method also accommodates sequencing errors. Furthermore, we provide a method for estimating the proportion of time from the earliest mutation event of the sample to the most recent common ancestor of the sample of cells. Finally, we discuss current limitations on modeling with single cell sequencing data and possible improvements under those limitations. Inferring the temporal ordering of mutational sites using current single cell sequencing data is a challenge. Our proposed method may help elucidate relationships among key mutations and their role in tumor progression.
Increment-borer methods for determining fire history in coniferous forests
Stephen W. Barrett; Stephen F. Arno
1988-01-01
Describes use of increment borers for interpreting fire history in coniferous forests. These methods are intended for use in wildernesses, parks, and other natural areas where sawing cross-sections from fire-scarred trees is prohibited.
Hsieh, Chia-Hung; Ko, Chiun-Cheng; Chung, Cheng-Han; Wang, Hurng-Yi
2014-07-01
The sweet potato whitefly, Bemisia tabaci, is a highly differentiated species complex. Despite its consisting of several morphologically indistinguishable entities and its frequent invasions on all continents, with important associated economic losses, the phylogenetic relationships, species status, and evolutionary history of this species complex are still debated. We sequenced and analyzed one mitochondrial and three single-copy nuclear genes from 9 of the 12 genetic groups of B. tabaci and 5 closely related species. Bayesian species delimitation was applied to investigate the speciation events of B. tabaci. The species statuses of the different genetic groups were strongly supported under different prior settings and phylogenetic scenarios. Divergence histories were estimated by a multispecies coalescence approach implemented in (*)BEAST. Based on the mitochondrial locus, B. tabaci originated 6.47 million years ago (MYA); based on nuclear loci, however, the estimate was 1.25 MYA. According to the method of approximate Bayesian computation, this difference is probably due to different degrees of migration among loci; i.e., although the mitochondrial locus had differentiated, gene flow at nuclear loci was still possible, a scenario similar to a parapatric mode of speciation. This is the first study in whiteflies using multilocus data and incorporating Bayesian coalescence approaches, both of which provide a more biologically realistic framework for delimiting species status and delineating the divergence history of B. tabaci. Our study illustrates that gene flow during species divergence should not be overlooked and has a great impact on divergence time estimation. Copyright © 2014 Elsevier Inc. All rights reserved.
Early symptom burden predicts recovery after sport-related concussion
Mannix, Rebekah; Monuteaux, Michael C.; Stein, Cynthia J.; Bachur, Richard G.
2014-01-01
Objective: To identify independent predictors of and use recursive partitioning to develop a multivariate regression tree predicting symptom duration greater than 28 days after a sport-related concussion. Methods: We conducted a prospective cohort study of patients in a sports concussion clinic. Participants completed questionnaires that included the Post-Concussion Symptom Scale (PCSS). Participants were asked to record the date on which they last experienced symptoms. Potential predictor variables included age, sex, score on symptom inventories, history of prior concussions, performance on computerized neurocognitive assessments, loss of consciousness and amnesia at the time of injury, history of prior medical treatment for headaches, history of migraines, and family history of concussion. We used recursive partitioning analysis to develop a multivariate prediction model for identifying athletes at risk for a prolonged recovery from concussion. Results: A total of 531 patients ranged in age from 7 to 26 years (mean 14.6 ± 2.9 years). The mean PCSS score at the initial visit was 26 ± 26; mean time to presentation was 12 ± 5 days. Only total score on symptom inventory was independently associated with symptoms lasting longer than 28 days (adjusted odds ratio 1.044; 95% confidence interval [CI] 1.034, 1.054 for PCSS). No other potential predictor variables were independently associated with symptom duration or useful in developing the optimal regression decision tree. Most participants (86%; 95% CI 80%, 90%) with an initial PCSS score of <13 had resolution of their symptoms within 28 days of injury. Conclusions: The only independent predictor of prolonged symptoms after sport-related concussion is overall symptom burden. PMID:25381296
Cole, K.L.; Taylor, R.S.
1995-01-01
The history of a rapidly changing mosaic of prairie and oak savanna in northern Indiana was reconstructed using several methods emphasizing different time scales ranging from annual to millennial. Vegetation change was monitored for 8 yr using plots and for 30 yr using aerial photographs. A 20th century fire history was reconstructed from the stand structure of multiple-stemmed trees and fire scars. General Land Office Survey data were used to reconstruct the forest of A.D. 1834. Fossil pollen and charcoal records were used to reconstruct the last 4000 yr of vegetation and fire history. Since its deposition along the shore of Lake Michigan about 4000 yr ago, the area has followed a classical primary dune successional sequence, gradually changing from pine forest to prairie/oak savanna between A.D. 264 and 1007. This successional trend, predicted in the models of Henry Cowles, occurred even though the climate cooled and prairies elsewhere in the region retreated. Severe fires in the 19th century reduced most tree species but led to a temporary increase in Populus tremuloides. During the last few decades, the prairie has been invaded by oaks and other woody species, primarily because of fire suppression since A.D. 1972. The rapid and complex changes now occurring are a response to the compounded effects of plant succession, intense burning and logging in the 19th century, recent fire suppression, and possibly increased airborne deposition of nitrates. The compilation of several historical research techniques emphasizing different time scales allows this study of the interactions between multiple disturbance variables
Kareken, David A.; Dzemidzic, Mario; Wetherill, Leah; Eiler, William; Oberlin, Brandon G.; Harezlak, Jaroslaw; Wang, Yang; O’Connor, Sean J.
2013-01-01
Rationale Impulsive behavior is associated with both alcohol use disorders and a family history of alcoholism (FHA). One operational definition of impulsive behavior is the stop signal task (SST), which measures the time needed to stop a ballistic hand movement. Objective Employ functional magnetic resonance imaging (fMRI) to study right frontal responses to stop signals in heavy drinking subjects with and without FHA, and as a function of alcohol exposure. Methods Twenty two family history positive (FHP; age = 22.7 years, SD= 1.9) and 18 family history negative (FHN; age = 23.7, SD= 1.8) subjects performed the SST in fMRI in two randomized visits: once during intravenous infusion of alcohol, clamped at a steady-state breath alcohol (BrAC) concentration of 60mg%, and once during infusion of placebo saline. An independent reference group (n= 13, age= 23.7, SD= 1.8) was used to identify a priori right prefrontal regions activated by successful inhibition (Inh) trials, relative to ‘Go’ trials that carried no need for inhibition (Inh > Go). Results FHA interacted with alcohol exposure in right prefrontal cortex, where alcohol reduced [Inh > Go] activation in FHN subjects, but not in FHP subjects. Within this right frontal cortical region, stop signal reaction time (SSRT) also correlated negatively with [Inh > Go] activation, suggesting that the [Inh > Go] activity was related to inhibitory behavior. Conclusions The results are consistent with the low level of response theory (Schuckit, 1980; Quinn & Fromme, 2011), with FHP being less sensitive to alcohol’s effects. PMID:23468100
A Quantitative Measure of Handgrip Myotonia in Non-dystrophic Myotonia
Statland, Jeffrey M; Bundy, Brian N; Wang, Yunxia; Trivedi, Jaya R; Rayan, Dipa Raja; Herbelin, Laura; Donlan, Merideth; McLin, Rhonda; Eichinger, Katy J; Findlater, Karen; Dewar, Liz; Pandya, Shree; Martens, William B; Venance, Shannon L; Matthews, Emma; Amato, Anthony A; Hanna, Michael G; Griggs, Robert C; Barohn, Richard J
2012-01-01
Introduction Non-dystrophic Myotonia (NDM) is characterized by myotonia without muscle wasting. A standardized quantitative myotonia assessment (QMA) is important for clinical trials. Methods Myotonia was assessed in 91 individuals enrolled in a natural history study using a commercially available computerized handgrip myometer and automated software. Average peak force and 90% to 5% relaxation times were compared to historical normal controls studied with identical methods. Results 30 subjects had chloride channel mutations, 31 sodium channel mutations, 6 DM2, and 24 no identified mutation. Chloride channel mutations were associated with prolonged 1st handgrip relaxation times, and warm up on subsequent handgrips. Sodium channel mutations were associated with prolonged 1st handgrip relaxation times and paradoxical myotonia or warm-up, depending on underlying mutations. DM2 subjects had normal relaxation times but decreased peak force. Sample size estimates are provided for clinical trial planning. Conclusion QMA is an automated, non-invasive technique for evaluating myotonia in NDM. PMID:22987687
NASA Technical Reports Server (NTRS)
Clements, P. A.; Borutzki, S. E.; Kirk, A.
1984-01-01
The Deep Space Network (DSN), managed by the Jet Propulsion Laboratory for NASA, must maintain time and frequency within specified limits in order to accurately track the spacecraft engaged in deep space exploration. Various methods are used to coordinate the clocks among the three tracking complexes. These methods include Loran-C, TV Line 10, Very Long Baseline Interferometry (VLBI), and the Global Positioning System (GPS). Calculations are made to obtain frequency offsets and Allan variances. These data are analyzed and used to monitor the performance of the hydrogen masers that provide the reference frequencies for the DSN Frequency and Timing System (DFT). Areas of discussion are: (1) a brief history of the GPS timing receivers in the DSN, (2) a description of the data and information flow, (3) data on the performance of the DSN master clocks and GPS measurement system, and (4) a description of hydrogen maser frequency steering using these data.
Computer-assisted self interviewing in sexual health clinics.
Fairley, Christopher K; Sze, Jun Kit; Vodstrcil, Lenka A; Chen, Marcus Y
2010-11-01
This review describes the published information on what constitutes the elements of a core sexual history and the use of computer-assisted self interviewing (CASI) within sexually transmitted disease clinics. We searched OVID Medline from 1990 to February 2010 using the terms "computer assisted interviewing" and "sex," and to identify published articles on a core sexual history, we used the term "core sexual history." Since 1990, 3 published articles used a combination of expert consensus, formal clinician surveys, and the Delphi technique to decide on what questions form a core sexual health history. Sexual health histories from 4 countries mostly ask about the sex of the partners, the number of partners (although the time period varies), the types of sex (oral, anal, and vaginal) and condom use, pregnancy intent, and contraceptive methods. Five published studies in the United States, Australia, and the United Kingdom compared CASI with in person interviews in sexually transmitted disease clinics. In general, CASI identified higher risk behavior more commonly than clinician interviews, although there were substantial differences between studies. CASI was found to be highly acceptable and individuals felt it allowed more honest reporting. Currently, there are insufficient data to determine whether CASI results in differences in sexually transmitted infection testing, diagnosis, or treatment or if CASI improves the quality of sexual health care or its efficiency. The potential public health advantages of the widespread use of CASI are discussed.
Grace Under Fire: The Army Nurses of Pearl Harbor, 1941.
Milbrath, Gwyneth R
2016-01-01
Much has been written about the military events of December 7, 1941; however, little has been documented about the nurses' work and experience at Pearl Harbor, Hawaii. The aerial assault on Pearl Harbor was the first time in US history that Army nurses had been on the front line of battle. Nurses quickly triaged and stabilized those who could be saved, and provided compassion and comfort to those who were dying, in an environment where the nurses were unsure of their own survival. Traditional historical methods and a social history framework were used in this investigation. Primary sources included oral histories from the US Army Medical Department Center of History and Heritage and the State of Hawaii's website, Hawaii Aviation. Secondary sources included published books, newspaper articles, military websites, and history texts. Due to the limited bed capacity, Hickam Field Hospital converted to an evacuation hospital. Nurses, physicians, and medical corpsman triaged, stabilized, and transported those likely to survive, while staging the dead behind the building. The emergency room at Tripler Hospital was quickly flooded with patients from the battlefield, but the staff was able to sort patients appropriately to the wards, to the operating room, or provide comfort care as they died. At Schofield Hospital, collaboration between tireless doctors, nurses, and corpsmen was key to providing life-saving surgery and care.
DeVito, E. E.; Jiantonio, R. E.; Meda, S. A.; Stevens, M. C.; Potenza, M. N.; Krystal, J. H.; Pearlson, G. D.
2013-01-01
Rationale Individuals with a family history of alcoholism (family history positive [FHP]) show higher alcoholism rates and are more impulsive than those without such a family history (family history negative [FHN]), possibly due to altered N-methyl-D-aspartate (NMDA) receptor function. Objectives We investigated whether memantine, an NMDA receptor antagonist, differentially influences impulsivity measures and Go/No-Go behavior and fMRI activity in matched FHP and FHN individuals. Methods On separate days, participants received a single dose of 40 mg memantine or identical-appearing placebo. Results No group performance differences were observed on placebo for Go correct hit or No-Go false alarm reaction time on the Go/No-Go task. During fMRI, right cingulate activation differed for FHP vs. FHN subjects during No-Go correct rejects. Memantine had attenuated effects in FHP vs. FHN subjects: For No-Go false alarms, memantine was associated with limited reduction in subcortical, cingulate, and temporal regions in FHP subjects and reduced activity in fronto-striatal–parietal networks in FHN subjects. For No-Go correct rejects, memantine (relative to placebo) reduced activity in left cingulate and caudate in FHP but not FHN subjects. Conclusions Lower sensitivity to the effects of memantine in FHP subjects is consistent with greater NMDA receptor function in this group. PMID:22311382
ERIC Educational Resources Information Center
Mugleston, William F.
2000-01-01
Believes that by focusing on the recurrent situations and problems, or parallels, throughout history, students will understand the relevance of history to their own times and lives. Provides suggestions for parallels in history that may be introduced within lectures or as a means to class discussions. (CMK)
Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.
Donaldson, G
1996-04-01
An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche.
NASA Technical Reports Server (NTRS)
Patterson, J. C., Jr.; Jordan, F. L., Jr.
1975-01-01
A recently proposed method of flow visualization was investigated at the National Aeronautics and Space Administration's Langley Research Center. This method of flow visualization is particularly applicable to the study of lift-induced wing tip vortices, for which it is possible to record the entire life span of the vortex. To accomplish this, a vertical screen of smoke was produced perpendicular to the flight path and allowed to become stationary. A model was then driven through the screen of smoke, producing the circular vortex motion, which was made visible as the smoke was induced along the path taken by the flow and was recorded by high-speed motion pictures.
Hughes, Lucinda; Ruth, Karen; Rebbeck, Timothy R.; Giri, Veda N.
2013-01-01
Background Men with a family history of prostate cancer and African American men are at high risk for prostate cancer and in need of personalized risk estimates to inform screening decisions. This study evaluated genetic variants in genes encoding microRNA (miRNA) binding sites for informing of time to prostate cancer diagnosis among ethnically-diverse, high-risk men undergoing prostate cancer screening. Methods The Prostate Cancer Risk Assessment Program (PRAP) is a longitudinal screening program for high-risk men. Eligibility includes men ages 35-69 with a family history of prostate cancer or African descent. Participants with ≥ 1 follow-up visit were included in the analyses (n=477). Genetic variants in regions encoding miRNA binding sites in four target genes (ALOX15, IL-16, IL-18, and RAF1) previously implicated in prostate cancer development were evaluated. Genotyping methods included Taqman® SNP Genotyping Assay (Applied Biosystems) or pyrosequencing. Cox models were used to assess time to prostate cancer diagnosis by risk genotype. Results Among 256 African Americans with ≥ one follow-up visit, the TT genotype at rs1131445 in IL-16 was significantly associated with earlier time to prostate cancer diagnosis vs. the CC/CT genotypes (p=0.013), with a suggestive association after correction for false-discovery (p=0.065). Hazard ratio after controlling for age and PSA for TT vs. CC/CT among African Americans was 3.0 (95% CI 1.26-7.12). No association to time to diagnosis was detected among Caucasians by IL-16 genotype. No association to time to prostate cancer diagnosis was found for the other miRNA target genotypes. Conclusions Genetic variation in IL-16 encoding miRNA target site may be informative of time to prostate cancer diagnosis among African American men enrolled in prostate cancer risk assessment, which may inform individualized prostate cancer screening strategies in the future. PMID:24061634
NASA Astrophysics Data System (ADS)
Hahn, K. E.; Turner, E. C.; Kontak, D. J.; Fayek, M.
2018-02-01
Ancient carbonate rocks commonly contain numerous post-depositional phases (carbonate minerals; quartz) recording successive diagenetic events that can be deciphered and tied to known or inferred geological events using a multi-pronged in situ analytical protocol. The framework voids of large, deep-water microbial carbonate seep-mounds in Arctic Canada (Mesoproterozoic Ikpiarjuk Formation) contain multiple generations of synsedimentary and late cement. An in situ analytical study of the post-seafloor cements used optical and cathodoluminescence petrography, SEM-EDS analysis, fluid inclusion (FI) microthermometry and evaporate mound analysis, LA-ICP-MS analysis, and SIMS δ18O to decipher the mounds' long-term diagenetic history. The six void-filling late cements include, in paragenetic order: inclusion-rich euhedral dolomite (ED), finely crystalline clear dolomite (FCD), hematite-bearing dolomite (HD), coarsely crystalline clear dolomite (CCD), quartz (Q), replacive calcite (RC) and late calcite (LC). Based on the combined analytical results, the following fluid-flow history is defined: (1) ED precipitation by autocementation during shallow burial (fluid 1; Mesoproterozoic); (2) progressive mixing of Ca-rich hydrothermal fluid with the connate fluid, resulting in precipitation of FCD followed by HD (fluid 2; also Mesoproterozoic); (3) precipitation of hydrothermal dolomite (CCD) from high-Ca and K-rich fluids (fluid 3; possibly Mesoproterozoic, but timing unclear); (4) hydrothermal Q precipitation (fluid 4; timing unclear), and (5) RC and LC precipitation from a meteoric-derived water (fluid 5) in or since the Mesozoic. Fluids associated with FCD, HD, and CCD may have been mobilised during deposition of the upper Bylot Supergroup; this time interval was the most tectonically active episode in the region's Mesoproterozoic to Recent history. 
The entire history of intermittent fluid migration and cement precipitation recorded in seemingly unimportant void-filling mineral phases spans over 1 billion years, and was decipherable only because of the in situ protocol used. The multiple-method in situ analytical protocol employed in this study substantially augments the knowledge of an area's geological history, parts of which cannot be discerned by means other than meticulous study of diagenetic phases, and should become routine in similar studies.
NASA Astrophysics Data System (ADS)
Macomber, B.; Woollands, R. M.; Probe, A.; Younes, A.; Bai, X.; Junkins, J.
2013-09-01
Modified Chebyshev Picard Iteration (MCPI) is an iterative numerical method for approximating solutions of linear or non-linear Ordinary Differential Equations (ODEs) to obtain time histories of system state trajectories. Unlike step-by-step differential equation solvers (the Runge-Kutta family of numerical integrators, for example), MCPI approximates long arcs of the state trajectory with an iterative path approximation approach, and is ideally suited to parallel computation. Orthogonal Chebyshev polynomials are used as basis functions during each path iteration; the integrations of the Picard iteration are then done analytically. Due to the orthogonality of the Chebyshev basis functions, the least-squares approximations are computed without matrix inversion; the coefficients are computed robustly from discrete inner products. As a consequence of the discrete sampling and weighting adopted for the inner product definition, Runge-phenomenon errors are minimized near the ends of the approximation intervals. The MCPI algorithm utilizes a vector-matrix framework for computational efficiency. Additionally, all Chebyshev coefficients and integrand function evaluations are independent, meaning they can be computed simultaneously in parallel for further decreased computational cost. Over an order of magnitude speedup over traditional methods is achieved in serial processing, and an additional order of magnitude is achievable in parallel architectures. This paper presents a new MCPI library, a modular toolset designed to allow MCPI to be easily applied to a wide variety of ODE systems. Library users will not have to concern themselves with the underlying mathematics behind the MCPI method. The inputs are the boundary conditions of the dynamical system, the integrand function governing system behavior, and the desired time interval of integration; the output is a time history of the system states over the interval of interest. 
Examples from the field of astrodynamics are presented to compare the output from the MCPI library to current state-of-practice numerical integration methods. It is shown that MCPI is capable of out-performing the state-of-practice in terms of computational cost and accuracy.
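The core idea summarized in the abstract above, Picard iteration over a Chebyshev representation of the trajectory, can be sketched compactly. The following is a simplified illustration, not the paper's MCPI library: it uses NumPy's Chebyshev fit at the sample nodes rather than MCPI's matrix-free discrete inner products, and the function name `picard_chebyshev` and its parameters are assumptions made for this example.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def picard_chebyshev(f, x0, t0, tf, n=32, iters=50, tol=1e-12):
    """Approximate x'(t) = f(t, x) on [t0, tf] by Picard iteration,
    representing the trajectory in a Chebyshev basis.

    A simplified sketch of the MCPI idea: each iteration samples the
    integrand along the current path approximation and integrates the
    resulting Chebyshev series analytically."""
    # Chebyshev extreme points on [-1, 1], mapped to [t0, tf]
    tau = np.cos(np.pi * np.arange(n + 1) / n)
    t = 0.5 * (tf - t0) * (tau + 1.0) + t0
    x = np.full_like(t, x0, dtype=float)   # initial guess: constant path
    for _ in range(iters):
        # Fit a degree-n Chebyshev series to the integrand samples
        g = C.chebfit(tau, f(t, x), n)
        # Integrate the series analytically; scl rescales d(tau) -> dt
        G = C.chebint(g, scl=0.5 * (tf - t0))
        # Picard update: x_new(t) = x0 + integral from t0 to t
        x_new = x0 + C.chebval(tau, G) - C.chebval(-1.0, G)
        if np.max(np.abs(x_new - x)) < tol:
            x = x_new
            break
        x = x_new
    return t, x
```

For example, for x′ = x with x(0) = 1 on [0, 1], the iteration should converge to the exponential, with the node at t = 1 approaching e. Note that the nodes (and hence the returned time history) run from tf down to t0.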
Madigan, Sheri; Wade, Mark; Plamondon, Andre; Jenkins, Jennifer
2015-01-01
The current study examined a temporal cascade linking mothers' history of abuse with their children's internalizing difficulties through proximal processes such as maternal postnatal depressive symptoms and responsive parenting. Participants consisted of 490 mother-child dyads assessed at three time points when children were, on average, 2 months old at Time 1 (T1), 18 months at Time 2 (T2), and 36 months at Time 3 (T3). Maternal abuse history and depressive symptoms were assessed via questionnaires at T1. Observations of responsive parenting were collected at T2 and were coded using a validated coding scheme. Children's internalizing difficulties were assessed in the preschool period using averaged parental reports. Path analysis revealed that maternal physical abuse was associated with depressive symptoms postnatally, which were in turn associated with children's internalizing behavior at 36 months of age. We also found that the association between physical abuse history and responsive parenting operated indirectly through maternal depressive symptoms. These findings remained after controlling for covariates including socioeconomic status, child gender, and age. After accounting for physical abuse history, sexual abuse history was not associated with child internalizing problems either directly or indirectly through maternal depressive symptoms and/or parenting behavior. Thus, mothers' physical abuse history is a risk factor for relatively poor mental health, which is itself predictive of both later parenting behavior and children's internalizing problems. © 2015 Michigan Association for Infant Mental Health.
History as Storytelling. Voices from the Past.
ERIC Educational Resources Information Center
Osborne, Ken
2000-01-01
Focuses on the use of storytelling as a means of teaching history. Explores the ideas presented by Charles McMurry in his handbook "Special Method in History" that addresses the use of stories in teaching. States that McMurry thought history could be interesting and tangible for even the youngest child. (CMK)
NASA Astrophysics Data System (ADS)
Olson, Donald
2009-10-01
How do astronomical methods make it possible to calculate dates and times for Vincent van Gogh's night-sky paintings? Why is there a blood-red sky in Edvard Munch's The Scream? On what dates did Ansel Adams create his moonrise photographs in Yosemite? How can the 18.6-year cycle of the lunar nodes and the Moon's declination on the night of August 29-30, 1857, explain a long-standing mystery about Abraham Lincoln's honesty in the murder case known as the almanac trial? Why is a bright star described in Act 1, Scene 1, of Hamlet? To answer questions like these, our Texas State group has published a series of articles over the last two decades, applying astronomy to art, history, and literature.
Surveillance of women with a personal history of breast cancer by tumour subtype.
Benveniste, A P; Dryden, M J; Bedrosian, I; Morrow, P K; Bassett, R L; Yang, W
2017-03-01
To determine if the rate and timing of a second breast cancer event (SBCE) in women with a personal history of breast cancer varies by disease subtype or breast imaging method. A retrospective review was performed of women with a SBCE from January 2006 to December 2010 at a single institution. Data analysed included oestrogen receptor (ER), progesterone receptor (PR), human epidermal growth factor receptor 2 (HER2) status of the primary and second breast cancers; mammographic and ultrasound (US) features from SBCE; and the time interval between both events. Of 207 patients diagnosed with a SBCE, the median age at first diagnosis was 50.6 years, range 24.8 to 80.2; at second diagnosis was 56.2 years, range 25.8 to 87.9. Eleven percent of SBCE were diagnosed >10 years after the primary cancer diagnosis. The median time between the first and second diagnosis for ER-positive patients was 2.7 years (range 0.7-17.4 years); and 1.9 years for ER-negative patients, (range 0.4-23.4 years; p<0.002). Patients with triple-negative breast cancer (TNBC) had a shorter time between diagnoses than others (p=0.0003). At 3, 5, and 10 years, 85%, 92%, and 97% of ER-negative and 54%, 81%, and 95% of ER-positive tumours, respectively, had recurred. ER-negative tumours and TNBC were more likely to be visible at US. There may be a role for customised imaging surveillance of women with a personal history of breast cancer (PHBC) after 10 years. Further studies are necessary to determine if US may be valuable in the surveillance of patients with ER-negative and TNBC tumours. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Sun Exposure and Protection Habits in Pediatric Patients with a History of Malignancy
Levy-Shraga, Yael; Cohen, Rinat; Ben Ami, Michal; Yeshayahu, Yonatan; Temam, Vered; Modan-Moses, Dalit
2015-01-01
Background Survivors of childhood cancer are at high risk for developing non-melanoma skin cancer and therefore are firmly advised to avoid or minimize sun exposure and adopt skin protection measures. We aimed to compare sun exposure and protection habits in a cohort of pediatric patients with a history of malignancy to those of healthy controls. Methods Case-control study of 143 pediatric patients with a history of malignancy (aged 11.2±4.6 y, Male = 68, mean interval from diagnosis 4.4±3.8 y) and 150 healthy controls (aged 10.4±4.8 y, Male = 67). Sun exposure and protection habits were assessed using validated questionnaires. Results Patients and controls reported similar sun exposure time during weekdays (94±82 minutes/day vs. 81±65 minutes/day; p = 0.83), while during weekends patients spent significantly less time outside compared to controls (103±85 minutes/day vs. 124±87 minutes/day; p = 0.02). Time elapsed from diagnosis positively correlated with time spent outside both during weekdays (r = 0.194, p = 0.02) and weekends (r = 0.217, p = 0.01), and there was a step-up in sun exposure starting three years after diagnosis. There was no significant difference in the composite sun protection score between patients and controls. Age was positively correlated with the number of sunburns per year and with sun exposure for the purpose of tanning, and was negatively correlated with the use of sun protection measures. Conclusions Although childhood cancer survivors are firmly instructed to adopt sun protection habits, adherence to these instructions is incomplete, and more attention should be paid to improving these habits throughout their lives. Since sunlight avoidance may result in vitamin D deficiency, dietary supplementation will likely be needed. PMID:26348212
Learning from samples of one or fewer*
March, J; Sproull, L; Tamuz, M
2003-01-01
Organizations learn from experience. Sometimes, however, history is not generous with experience. We explore how organizations convert infrequent events into interpretations of history, and how they balance the need to achieve agreement on interpretations with the need to interpret history correctly. We ask what methods are used, what problems are involved, and what improvements might be made. Although the methods we observe are not guaranteed to lead to consistent agreement on interpretations, valid knowledge, improved organizational performance, or organizational survival, they provide possible insights into the possibilities for and problems of learning from fragments of history. PMID:14645764
ERIC Educational Resources Information Center
HANDCOCK, ALAN; ROBINSON, JOHN
A combination of television, group study, private study, correspondence study, and practical work provided an extensive introduction to the nature and methods of social work, principally for volunteer and part-time workers, through 16 half-hour programs on social work and administration broadcast on BBC-2 between October 5, 1965 and February 8,…
Prospective study of risk factors for suicidal behavior in individuals with anxiety disorders.
Uebelacker, L A; Weisberg, R; Millman, M; Yen, S; Keller, M
2013-07-01
Anxiety disorders are very common and increase the risk for suicide attempts. Little is known about predictors of increased risk specifically among individuals with anxiety disorders. The purpose of this study was to investigate whether specific anxiety disorders and other co-morbid psychiatric disorders, physical health, or work or social functioning increased the future likelihood of a suicide attempt among individuals with anxiety disorders. Method: In this prospective study, 676 individuals with an anxiety disorder were followed for an average of 12 years. As hypothesized, we found that post-traumatic stress disorder, major depressive disorder (MDD), intermittent depressive disorder (IDD), epilepsy, pain, and poor work and social functioning all predicted a shorter time to a suicide attempt in univariate analyses. In multivariate analyses, baseline MDD and IDD were independent predictors of time to suicide attempt, even when controlling for a past history of suicide attempts. No specific anxiety disorder was an independent predictor of time to attempt in this anxiety-disordered sample. Adding baseline physical health variables and social functioning did not improve the ability of the model to predict time to suicide attempt. Mood disorders and a past history of suicide attempts are the most powerful predictors of a future suicide attempt in this sample of individuals, all of whom have an anxiety disorder.