Sample records for modeling rigorous decline

  1. Teachers' Perspectives Regarding the Decline in Boys' Participation in Post-Compulsory Rigorous Mathematics Subjects

    ERIC Educational Resources Information Center

    Easey, Michael

    2013-01-01

    This paper explores the decline in boys' participation in post-compulsory rigorous mathematics using the perspectives of eight experienced teachers at an independent, boys' College located in Brisbane, Queensland. This study coincides with concerns regarding the decline in suitably qualified tertiary graduates with requisite mathematical skills…

  2. Post mortem rigor development in the Egyptian goose (Alopochen aegyptiacus) breast muscle (pectoralis): factors which may affect the tenderness.

    PubMed

    Geldenhuys, Greta; Muller, Nina; Frylinck, Lorinda; Hoffman, Louwrens C

    2016-01-15

    Baseline research on the toughness of Egyptian goose meat is required. This study therefore investigates the post mortem pH and temperature decline (15 min-4 h 15 min post mortem) in the pectoralis muscle (breast portion) of this gamebird species. It also explores the enzyme activity of the Ca(2+)-dependent protease (calpain system) and the lysosomal cathepsins during the rigor mortis period. No differences were found for any of the variables between genders. The pH decline in the pectoralis muscle occurs quite rapidly (c = -0.806; ultimate pH ∼ 5.86) compared with other species and it is speculated that the high rigor temperature (>20 °C) may contribute to the increased toughness. No calpain I was found in Egyptian goose meat and the µ/m-calpain activity remained constant during the rigor period, while a decrease in calpastatin activity was observed. The cathepsin B, B & L and H activity increased over the rigor period. Further research into the connective tissue content and myofibrillar breakdown during aging is required in order to know if the proteolytic enzymes do in actual fact contribute to tenderisation. © 2015 Society of Chemical Industry.

  3. Sustaining a Vision of Rigor

    ERIC Educational Resources Information Center

    Williamson, Ronald; Blackburn, Barbara R.

    2010-01-01

    Even with the best planning and supportive implementation, one's school will experience challenges to achieving its vision of increased rigor. One of today's most serious issues is how schools can improve when resources are stagnant or even declining. Virtually every school faces dwindling resources and is caught between the expectation that…

  4. Rate decline curves analysis of multiple-fractured horizontal wells in heterogeneous reservoirs

    NASA Astrophysics Data System (ADS)

    Wang, Jiahang; Wang, Xiaodong; Dong, Wenxiu

    2017-10-01

    In heterogeneous reservoirs with multiple-fractured horizontal wells (MFHWs), the dense network of artificial hydraulic fractures causes the fluid flow around fracture tips to behave as non-linear flow, and the production behaviors of the individual hydraulic fractures also differ. A rigorous semi-analytical model for MFHWs in heterogeneous reservoirs is presented by combining source functions with the boundary element method. The model is first validated against both an analytical model and a numerical simulation model. New Blasingame type curves are then established, and the effects of critical parameters on the rate decline characteristics of MFHWs are discussed. The results show that heterogeneity has a significant influence on the rate decline characteristics of MFHWs; parameters related to the MFHWs, such as fracture conductivity and length, also affect the rate characteristics. One novelty of this model is that it accounts for the elliptical flow around artificial hydraulic fracture tips, so it can predict rate performance more accurately for MFHWs in heterogeneous reservoirs. The other novelty is the ability to model the different production behavior at different fracture stages. Compared with numerical and analytical methods, this model not only reduces computational cost but also shows high accuracy.
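
    The paper's semi-analytical model and Blasingame type curves are beyond a short sketch, but the classical decline-curve relation that type-curve analysis builds on can be illustrated with the Arps equation; the well parameters below are hypothetical, not values from the study:

```python
from math import exp

def arps_rate(qi, di, b, t):
    """Arps decline-curve rate at time t.

    qi: initial rate, di: initial decline rate (1/time),
    b:  decline exponent (0 = exponential, 1 = harmonic,
        0 < b < 1 = hyperbolic).
    """
    if b == 0:
        return qi * exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Hypothetical well: 1000 stb/d initial rate, 0.1/yr initial decline
rates = [arps_rate(1000.0, 0.1, 0.5, t) for t in range(6)]  # monotonically declining
```

    Type-curve methods such as Blasingame's normalize rate and time so that field data can be matched against families of such decline stems.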

  5. Comments in reply: new directions in migration research.

    PubMed

    Shaw, R P

    1986-01-01

    The author comments on a review of his recent book NEW DIRECTIONS IN MIGRATION RESEARCH and reflects on theory and model specification, problems of estimation and statistical inference, realities of temporal and spatial heterogeneity, choices of explanatory variables, and the importance of broader political issues in migration studies. A core hypothesis is that market forces have declined as influences on internal migration in Canada over the last 30 years. Theoretical underpinnings include declining relevance of wage considerations in the decision to migrate on the assumption that marginal utility of money diminishes and marginal utility of leisure increases as society becomes wealthier. The author perceives the human capital model to have limitations and is especially troubled by the "as if" clause--that all migrants behave "as if" they calculate benefits and risks with equal rigor. The author has "shadowed" and not quantified the costs involved. He implies that normative frameworks for future migration research and planning should be established.

  6. Rigor force responses of permeabilized fibres from fast and slow skeletal muscles of aged rats.

    PubMed

    Plant, D R; Lynch, G S

    2001-09-01

    1. Ageing is generally associated with a decline in skeletal muscle mass and strength and a slowing of muscle contraction, factors that impact upon the quality of life for the elderly. The mechanisms underlying this age-related muscle weakness have not been fully resolved. The purpose of the present study was to determine whether the decrease in muscle force as a consequence of age could be attributed partly to a decrease in the number of cross-bridges participating during contraction. 2. Given that the rigor force is proportional to the approximate total number of interacting sites between the actin and myosin filaments, we tested the null hypothesis that the rigor force of permeabilized muscle fibres from young and old rats would not be different. 3. Permeabilized fibres from the extensor digitorum longus (fast-twitch; EDL) and soleus (predominantly slow-twitch) muscles of young (6 months of age) and old (27 months of age) male F344 rats were activated in Ca2+-buffered solutions to determine force-pCa characteristics (where pCa = -log(10)[Ca2+]) and then in solutions lacking ATP and Ca2+ to determine rigor force levels. 4. The rigor forces for EDL and soleus muscle fibres were not different between young and old rats, indicating that the approximate total number of cross-bridges that can be formed between filaments did not decline with age. We conclude that the age-related decrease in force output is more likely attributed to a decrease in the force per cross-bridge and/or decreases in the efficiency of excitation-contraction coupling.

  7. Value of the distant future: Model-independent results

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.

    2017-01-01

    This paper shows that a model-independent account of correlations in an interest rate process or a log-consumption growth process leads to declining long-term tails of discount curves. Under the assumption of an exponentially decaying memory in fluctuations of risk-free real interest rates, I derive an analytical expression for the value of the long-run discount factor and provide a detailed comparison of the obtained result with the outcomes of the benchmark risk-free interest rate models. Utilizing the standard consumption-based model with an isoelastic power utility of the representative economic agent, I derive a non-Markovian generalization of the Ramsey discounting formula. The obtained analytical results, which allow simple calibration, may augment rigorous cost-benefit and regulatory impact analysis of long-term environmental and infrastructure projects.
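
    The paper's exponential-memory derivation is not reproduced here, but the basic mechanism behind declining long-term discount rates can be sketched with simple scenario averaging over uncertain rates (a Weitzman-style illustration under hypothetical numbers, not the paper's model):

```python
from math import exp, log

def ce_discount_factor(rates, probs, t):
    """Certainty-equivalent discount factor E[exp(-r*t)]
    over discrete, permanent interest-rate scenarios."""
    return sum(p * exp(-r * t) for r, p in zip(rates, probs))

def effective_rate(rates, probs, t):
    """Implied average discount rate -ln(DF(t)) / t."""
    return -log(ce_discount_factor(rates, probs, t)) / t

# Two equally likely permanent rate scenarios: 1% and 7% per year
scenarios, weights = [0.01, 0.07], [0.5, 0.5]
r_short = effective_rate(scenarios, weights, 10.0)   # near the mean rate
r_long = effective_rate(scenarios, weights, 200.0)   # drifts toward the lowest rate
```

    Because the low-rate scenario dominates the expectation at long horizons, the effective discount rate declines with the horizon, which is the qualitative behavior the paper derives rigorously.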

  8. Examination of the effect of ageing and temperature at rigor on colour stability of lamb meat.

    PubMed

    Hopkins, D L; Lamb, T A; Kerr, M J; van de Ven, R J; Ponnampalam, E N

    2013-10-01

    A study of factors (ageing period, rigor temperature and vitamin E level) impacting on the colour stability of lamb m. longissimus thoracis et lumborum (LL) during 3 days of simulated retail display was undertaken. The LL were taken from 84 lambs from 3 slaughters. Slices of LL were measured fresh (24h post-mortem) or after ageing for 5 days in vacuum packaging. The oxy/met ratio (630/580 nm), declined with display time, and increased with increasing temperature at pH6.0. Redness (a*) values also declined with display time and a reduction in redness values was observed as LL pH at 24h post-mortem and/or pH at 18°C increased. There was no effect of ageing period or vitamin E level on the oxy/met ratio or a* values when the vitamin E level averaged 3.76 mg/kg LL. These results suggest that maximising vitamin E levels in lambs and achieving a moderate rate of pH decline will optimise colour stability irrespective of ageing period. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  9. Changes in the contractile state, fine structure and metabolism of cardiac muscle cells during the development of rigor mortis.

    PubMed

    Vanderwee, M A; Humphrey, S M; Gavin, J B; Armiger, L C

    1981-01-01

    Transmural slices from the left anterior papillary muscle of dog hearts were maintained for 120 min in a moist atmosphere at 37 degrees C. At 15-min intervals tissue samples were taken for estimation of adenosine triphosphate (ATP) and glucose-6-phosphate (G6P) and for electron microscopic examination. At the same time the deformability under standard load of comparable regions of an adjacent slice of tissue was measured. ATP levels fell rapidly during the first 45 to 75 min after excision of the heart. During a subsequent further decline in ATP, the mean deformability of myocardium fell from 30 to 12% indicating the development of rigor mortis. Conversely, G6P levels increased during the first decline in adenosine triphosphate but remained relatively steady thereafter. Whereas many of the myocardial cells fixed after 5 min contracted on contact with glutaraldehyde, all cells examined after 15 to 40 min were relaxed. A progressive increase in the proportion of contracted cells was observed during the rapid increase in myocardial rigidity. During this late contraction the cells showed morphological evidence of irreversible injury. These findings suggest that ischaemic myocytes contract just before actin and myosin become strongly linked to maintain the state of rigor mortis.

  10. Multiscale sagebrush rangeland habitat modeling in southwest Wyoming

    USGS Publications Warehouse

    Homer, Collin G.; Aldridge, Cameron L.; Meyer, Debra K.; Coan, Michael J.; Bowen, Zachary H.

    2009-01-01

    Sagebrush-steppe ecosystems in North America have experienced dramatic elimination and degradation since European settlement. As a result, sagebrush-steppe dependent species have experienced drastic range contractions and population declines. Coordinated ecosystem-wide research, integrated with monitoring and management activities, would improve the ability to maintain existing sagebrush habitats. However, current data only identify resource availability locally, with rigorous spatial tools and models that accurately model and map sagebrush habitats over large areas still unavailable. Here we report on an effort to produce a rigorous large-area sagebrush-habitat classification and inventory with statistically validated products and estimates of precision in the State of Wyoming. This research employs a combination of significant new tools, including (1) modeling sagebrush rangeland as a series of independent continuous field components that can be combined and customized by any user at multiple spatial scales; (2) collecting ground-measured plot data on 2.4-meter imagery in the same season the satellite imagery is acquired; (3) effective modeling of ground-measured data on 2.4-meter imagery to maximize subsequent extrapolation; (4) acquiring multiple seasons (spring, summer, and fall) of an additional two spatial scales of imagery (30 meter and 56 meter) for optimal large-area modeling; (5) using regression tree classification technology that optimizes data mining of multiple image dates, ratios, and bands with ancillary data to extrapolate ground training data to coarser resolution sensors; and (6) employing rigorous accuracy assessment of model predictions to enable users to understand the inherent uncertainties. First-phase results modeled eight rangeland components (four primary targets and four secondary targets) as continuous field predictions. The primary targets included percent bare ground, percent herbaceousness, percent shrub, and percent litter. 
The four secondary targets included percent sagebrush (Artemisia spp.), percent big sagebrush (Artemisia tridentata), percent Wyoming sagebrush (Artemisia tridentata wyomingensis), and sagebrush height (centimeters). Results were validated by an independent accuracy assessment with root mean square error (RMSE) values ranging from 6.38 percent for bare ground to 2.99 percent for sagebrush at the QuickBird scale and RMSE values ranging from 12.07 percent for bare ground to 6.34 percent for sagebrush at the full Landsat scale. Subsequent project phases are now in progress, with plans to deliver products that improve accuracies of existing components, model new components, complete models over larger areas, track changes over time (from 1988 to 2007), and ultimately model wildlife population trends against these changes. We believe these results offer significant improvement in sagebrush rangeland quantification at multiple scales and offer users products that have been rigorously validated.
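
    The accuracy assessment above reports root mean square error between predicted and field-measured component values; a minimal sketch of that metric (the plot values below are hypothetical, not project data):

```python
from math import sqrt

def rmse(predicted, observed):
    """Root mean square error between paired predictions and observations."""
    if len(predicted) != len(observed):
        raise ValueError("inputs must be paired")
    n = len(predicted)
    return sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# Hypothetical percent-cover predictions vs. ground-measured plots
predicted = [12.0, 30.0, 7.5, 22.0]
observed = [10.0, 33.0, 8.0, 20.0]
error = rmse(predicted, observed)
```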

  11. Why aren't there more Atlantic salmon (Salmo salar)?

    USGS Publications Warehouse

    Parrish, D.L.; Behnke, R.J.; Gephard, S.R.; McCormick, S.D.; Reeves, G.H.

    1998-01-01

    Numbers of wild anadromous Atlantic salmon (Salmo salar) have declined demonstrably throughout their native range. The current status of runs on rivers that historically supported salmon indicates widespread declines and extirpations in Europe and North America, primarily in the southern portions of the range. Many of these declines or extirpations can be attributed to the construction of mainstem dams, pollution (including acid rain), and total dewatering of streams. Purported causes of the declines from the 1960s through the 1990s include overfishing and, more recently, changing ocean conditions and intensive aquaculture. Most factors affecting salmon numbers act not singly but in concert, which masks the relative contribution of each factor. Salmon researchers and managers should not look for a single culprit in declining numbers of salmon, but rather seek solutions through rigorous data gathering and testing of multiple effects integrated across space and time.

  12. The influence of low temperature, type of muscle and electrical stimulation on the course of rigor mortis, ageing and tenderness of beef muscles.

    PubMed

    Olsson, U; Hertzman, C; Tornberg, E

    1994-01-01

    The course of rigor mortis, ageing and tenderness have been evaluated for two beef muscles, M. semimembranosus (SM) and M. longissimus dorsi (LD), when entering rigor at constant temperatures in the cold-shortening region (1, 4, 7 and 10°C). The influence of electrical stimulation (ES) was also examined. Post-mortem changes were registered by shortening and isometric tension and by following the decline of pH, ATP and creatine phosphate. The effect of ageing on tenderness was recorded by measuring shear-force (2, 8 and 15 days post mortem) and the sensory properties were assessed 15 days post mortem. It was found that shortening increased with decreasing temperature, resulting in decreased tenderness. Tenderness for LD, but not for SM, was improved by ES at 1 and 4°C, whereas ES did not give rise to any decrease in the degree of shortening during rigor mortis development. This suggests that ES influences tenderization more than it prevents cold-shortening. The samples with a pre-rigor mortis temperature of 1°C could not be tenderized, when stored up to 15 days, whereas this was the case for the muscles entering rigor mortis at the other higher temperatures. The results show that under the conditions used in this study, the course of rigor mortis is more important for the ultimate tenderness than the course of ageing. Copyright © 1994. Published by Elsevier Ltd.

  13. Accuracy and performance of 3D mask models in optical projection lithography

    NASA Astrophysics Data System (ADS)

    Agudelo, Viviana; Evanschitzky, Peter; Erdmann, Andreas; Fühner, Tim; Shao, Feng; Limmer, Steffen; Fey, Dietmar

    2011-04-01

    Different mask models have been compared: rigorous electromagnetic field (EMF) modeling, rigorous EMF modeling with decomposition techniques and the thin mask approach (Kirchhoff approach) to simulate optical diffraction from different mask patterns in projection systems for lithography. In addition, each rigorous model was tested for two different formulations for partially coherent imaging: The Hopkins assumption and rigorous simulation of mask diffraction orders for multiple illumination angles. The aim of this work is to closely approximate results of the rigorous EMF method by the thin mask model enhanced with pupil filtering techniques. The validity of this approach for different feature sizes, shapes and illumination conditions is investigated.

  14. Comprehensive methods for earlier detection and monitoring of forest decline

    Treesearch

    Jennifer Pontius; Richard Hallett

    2014-01-01

    Forested ecosystems are threatened by invasive pests, pathogens, and unusual climatic events brought about by climate change. Earlier detection of incipient forest health problems and a quantitatively rigorous assessment method is increasingly important. Here, we describe a method that is adaptable across tree species and stress agents and practical for use in the...

  15. Dinosaurs in decline tens of millions of years before their final extinction

    PubMed Central

    Sakamoto, Manabu; Benton, Michael J.; Venditti, Chris

    2016-01-01

    Whether dinosaurs were in a long-term decline or whether they were reigning strong right up to their final disappearance at the Cretaceous–Paleogene (K-Pg) mass extinction event 66 Mya has been debated for decades with no clear resolution. The dispute has continued unresolved because of a lack of statistical rigor and appropriate evolutionary framework. Here, for the first time to our knowledge, we apply a Bayesian phylogenetic approach to model the evolutionary dynamics of speciation and extinction through time in Mesozoic dinosaurs, properly taking account of previously ignored statistical violations. We find overwhelming support for a long-term decline across all dinosaurs and within all three dinosaurian subclades (Ornithischia, Sauropodomorpha, and Theropoda), where speciation rate slowed down through time and was ultimately exceeded by extinction rate tens of millions of years before the K-Pg boundary. The only exceptions to this general pattern are the morphologically specialized herbivores, the Hadrosauriformes and Ceratopsidae, which show rapid species proliferations throughout the Late Cretaceous instead. Our results highlight that, despite some heterogeneity in speciation dynamics, dinosaurs showed a marked reduction in their ability to replace extinct species with new ones, making them vulnerable to extinction and unable to respond quickly to and recover from the final catastrophic event. PMID:27092007
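
    The study's Bayesian phylogenetic machinery is not reproducible in a few lines, but its central quantity, net diversification turning negative when a slowing speciation rate is overtaken by extinction, can be sketched under a hypothetical linear slowdown (illustrative rates only, not estimates from the paper):

```python
def crossover_time(lam0, slowdown, mu):
    """Time at which a linearly slowing speciation rate
    lambda(t) = lam0 - slowdown * t drops below a constant
    extinction rate mu, i.e. net diversification turns negative."""
    if slowdown <= 0:
        raise ValueError("speciation rate must be slowing")
    return (lam0 - mu) / slowdown

# Hypothetical rates in events per lineage per Myr
t_star = crossover_time(lam0=0.20, slowdown=0.002, mu=0.15)  # 25.0 Myr
```

    After t_star, lineages are lost faster than they are replaced, which is the sense in which a clade can be "in decline" long before its final extinction.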

  18. Measuring Critical Thinking: Results from an Art Museum Field Trip Experiment

    ERIC Educational Resources Information Center

    Kisida, Brian; Bowen, Daniel H.; Greene, Jay P.

    2016-01-01

    Research shows that participation in school-based arts education has declined over the past decade. A problem for the arts' role in education has been a lack of rigorous scholarship that demonstrates educational benefits. A component of this problem has been a lack of available data. In this study, we use original data collected through a…

  19. Designing End-of-Year Exams: Trials and Tribulations

    ERIC Educational Resources Information Center

    Stanford, Matt

    2017-01-01

    Since the decline of the National Curriculum Level Descriptions, schools in England have been asked to design their own forms of assessment at Key Stage 3. This has led to a great deal of creativity, but also a number of challenges. In this article Matt Stanford reflects on his department's attempts to develop a rigorous end-of-year assessment. In…

  20. Counting Cats: Spatially Explicit Population Estimates of Cheetah (Acinonyx jubatus) Using Unstructured Sampling Data

    PubMed Central

    Broekhuis, Femke; Gopalaswamy, Arjun M.

    2016-01-01

    Many ecological theories and species conservation programmes rely on accurate estimates of population density. Accurate density estimation, especially for species facing rapid declines, requires the application of rigorous field and analytical methods. However, obtaining accurate density estimates of carnivores can be challenging as carnivores naturally exist at relatively low densities and are often elusive and wide-ranging. In this study, we employ an unstructured spatial sampling field design along with a Bayesian sex-specific spatially explicit capture-recapture (SECR) analysis, to provide the first rigorous population density estimates of cheetahs (Acinonyx jubatus) in the Maasai Mara, Kenya. We estimate adult cheetah density to be between 1.28 ± 0.315 and 1.34 ± 0.337 individuals/100km2 across four candidate models specified in our analysis. Our spatially explicit approach revealed ‘hotspots’ of cheetah density, highlighting that cheetah are distributed heterogeneously across the landscape. The SECR models incorporated a movement range parameter which indicated that male cheetah moved four times as much as females, possibly because female movement was restricted by their reproductive status and/or the spatial distribution of prey. We show that SECR can be used for spatially unstructured data to successfully characterise the spatial distribution of a low density species and also estimate population density when sample size is small. Our sampling and modelling framework will help determine spatial and temporal variation in cheetah densities, providing a foundation for their conservation and management. Based on our results we encourage other researchers to adopt a similar approach in estimating densities of individually recognisable species. PMID:27135614

  2. No association between dietary patterns and risk for cognitive decline in older women with nine-year follow-up: data from the Women’s Health Initiative Memory Study

    PubMed Central

    Haring, Bernhard; Wu, Chunyuan; Mossavar-Rahmani, Yasmin; Snetselaar, Linda; Brunner, Robert; Wallace, Robert B.; Neuhouser, Marian L.; Wassertheil-Smoller, Sylvia

    2015-01-01

    Background: Data on the association between dietary patterns and age-related cognitive decline are inconsistent. Objective: To determine whether dietary patterns assessed by the alternate Mediterranean diet score (aMED), the Healthy Eating Index (HEI) 2010, the Alternate Healthy Eating Index (AHEI) 2010 or the Dietary Approaches to Stop Hypertension (DASH) diet score are associated with cognitive decline in older women, and to examine whether dietary patterns modify the risk for cognitive decline in hypertensive women. Design: Prospective, longitudinal cohort study. Food frequency questionnaires (FFQs) were used to derive dietary patterns at baseline. Hypertension was defined as self-report of current drug therapy for hypertension or a clinic measurement of SBP ≥ 140 mmHg or DBP ≥ 90 mmHg. Participants/setting: Postmenopausal women (N=6,425) aged 65 to 79 years who participated in the Women's Health Initiative Memory Study (WHIMS) and were cognitively intact at baseline. Main outcome measures: Cognitive decline was defined as cases of mild cognitive impairment (MCI) or probable dementia (PD). Cases were identified through rigorous screening and expert adjudication. Statistical analyses performed: Cox proportional hazards models with multivariable adjustment were used to estimate the relative risk for developing MCI or PD. Results: During a median follow-up of 9.11 years, we documented 499 cases of MCI and 390 of PD. In multivariable analyses we did not detect any statistically significant relationships across quintiles of aMED, HEI-2010, DASH and AHEI-2010 scores and MCI or PD (p-trend = 0.30, 0.44, 0.23 and 0.45). In hypertensive women we found no significant association between dietary patterns and cognitive decline (p-trend = 0.19, 0.08, 0.07 and 0.60). Conclusions: Dietary patterns characterized by the aMED, HEI-2010, AHEI-2010 or DASH dietary score were not associated with cognitive decline in older women. Adherence to a healthy dietary pattern did not modify the risk for cognitive decline in hypertensive women. PMID:27050728

  3. Muscle structure, sarcomere length and influences on meat quality: A review.

    PubMed

    Ertbjerg, Per; Puolanne, Eero

    2017-10-01

    The basic contractile unit of muscle, the sarcomere, will contract as the muscle goes into rigor post-mortem. Depending on the conditions, such as the rate of pH decline, the cooling rate and the mechanical restraints on the muscles, this longitudinal shortening will result in various post-mortem sarcomere lengths as well as lateral differences in the distances between the myosin and actin filaments. This shortening underlies the phenomena described as rigor contraction, thaw rigor, cold shortening and heat shortening. The shortening, in combination with the molecular architecture of the sarcomere as defined by the myosin filaments and their S-1 and S-2 units, the interaction with the actin filaments, and the boundaries formed by the Z-disks, subsequently influences basic meat quality traits including tenderness and water-holding capacity. Biochemical reactions from proteolysis and glycogen metabolism interrelate with the sarcomere length in a complex manner. The sarcomere length also influences the eating quality of cooked meat and the water-holding capacity of meat products. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Misfortunes of War. Press and Public Reactions to Civilian Deaths in Wartime

    DTIC Science & Technology

    2006-01-01

    Although declines in reporting were evident even before the incident, the shootings at Columbine High School in Littleton, Colorado, on April 20... RAND monographs undergo rigorous peer review to ensure high standards for research quality and objectivity. Prepared for the United States Air Force... 5.13. High-Altitude Diagram Used by MG McChrystal; 5.14. Reporting on Civilian Casualties, Marketplace Incident

  5. Near infrared spectroscopy as an on-line method to quantitatively determine glycogen and predict ultimate pH in pre rigor bovine M. longissimus dorsi.

    PubMed

    Lomiwes, D; Reis, M M; Wiklund, E; Young, O A; North, M

    2010-12-01

    The potential of near infrared (NIR) spectroscopy as an on-line method to quantify glycogen and predict ultimate pH (pH(u)) of pre rigor beef M. longissimus dorsi (LD) was assessed. NIR spectra (538 to 1677 nm) of pre rigor LD from steers, cows and bulls were collected early post mortem, and measurements were made of pre rigor glycogen concentration and pH(u). Spectral and measured data were combined to develop models to quantify glycogen and predict the pH(u) of pre rigor LD. Predicted values obtained from the quantitative models were poorly correlated with measured glycogen and pH(u) (r(2)=0.23 and 0.20, respectively). Qualitative models developed to categorize each muscle according to its pH(u) were able to correctly categorize only 42% of high pH(u) samples. Thus, even the optimum qualitative and quantitative models derived from NIR spectra showed low correlation between predicted values and reference measurements. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.

  6. Combined effects of climate, predation, and density dependence on Greater and Lesser Scaup population dynamics

    USGS Publications Warehouse

    Ross, Beth E.; Hooten, Mevin B.; DeVink, Jean-Michel; Koons, David N.

    2015-01-01

    An understanding of species relationships is critical in the management and conservation of populations facing climate change, yet few studies address how climate alters species interactions and other population drivers. We use a long-term, broad-scale data set of relative abundance to examine the influence of climate, predators, and density dependence on the population dynamics of declining scaup (Aythya) species within the core of their breeding range. The state-space modeling approach we use applies to a wide range of wildlife species, especially populations monitored over broad spatiotemporal extents. Using this approach, we found that immediate snow cover extent in the preceding winter and spring had the strongest effects, with increases in mean snow cover extent having a positive effect on the local surveyed abundance of scaup. The direct effects of mesopredator abundance on scaup population dynamics were weaker, but the results still indicated a potential interactive process between climate and food web dynamics (mesopredators, alternative prey, and scaup). By considering climate variables and other potential effects on population dynamics, and using a rigorous estimation framework, we provide insight into complex ecological processes for guiding conservation and policy actions aimed at mitigating and reversing the decline of scaup.
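    A minimal illustration of the state-space idea referenced above: a univariate random walk for log abundance observed with survey noise, filtered with the Kalman recursions. All numbers are hypothetical, and the authors' model is far richer (covariates, predators, density dependence); this only shows the predict/update skeleton.

    ```python
    def kalman_filter(ys, q, r, m0=0.0, p0=10.0):
        """Filtered means for observations ys; q = process var, r = obs var."""
        m, p, means = m0, p0, []
        for y in ys:
            p = p + q                 # predict: random-walk state
            k = p / (p + r)           # Kalman gain
            m = m + k * (y - m)       # update toward the new observation
            p = (1 - k) * p
            means.append(m)
        return means

    # Hypothetical log-scale survey indices showing a gradual decline.
    obs = [4.0, 4.1, 3.9, 3.7, 3.8, 3.5]
    filtered = kalman_filter(obs, q=0.01, r=0.05)
    ```

    The filtered trajectory smooths out survey noise while tracking the downward trend, which is exactly the property that makes state-space models attractive for long-term monitoring data.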

  7. Environmental drivers of crocodyliform extinction across the Jurassic/Cretaceous transition

    PubMed Central

    Mannion, Philip D.; Upchurch, Paul

    2016-01-01

    Crocodyliforms have a much richer evolutionary history than represented by their extant descendants, including several independent marine and terrestrial radiations during the Mesozoic. However, heterogeneous sampling of their fossil record has obscured their macroevolutionary dynamics, and obfuscated attempts to reconcile external drivers of these patterns. Here, we present a comprehensive analysis of crocodyliform biodiversity through the Jurassic/Cretaceous (J/K) transition using subsampling and phylogenetic approaches and apply maximum-likelihood methods to fit models of extrinsic variables to assess what mediated these patterns. A combination of fluctuations in sea-level and episodic perturbations to the carbon and sulfur cycles was primarily responsible for both a marine and non-marine crocodyliform biodiversity decline through the J/K boundary, primarily documented in Europe. This was tracked by high extinction rates at the boundary and suppressed origination rates throughout the Early Cretaceous. The diversification of Eusuchia and Notosuchia likely emanated from the easing of ecological pressure resulting from the biodiversity decline, which also culminated in the extinction of the marine thalattosuchians in the late Early Cretaceous. Through application of rigorous techniques for estimating biodiversity, our results demonstrate that it is possible to tease apart the complex array of controls on diversification patterns in major archosaur clades. PMID:26962137
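    The subsampling logic invoked above can be illustrated with classical rarefaction, a simpler relative of the quorum-based subsampling methods common in this literature. The expected number of taxa observed in a random subsample of size n from occurrence counts has a closed form; the counts below are invented.

    ```python
    from math import comb

    def rarefied_richness(counts, n):
        """counts: occurrences per taxon; returns E[taxa observed in n draws]."""
        total = sum(counts)
        if n > total:
            raise ValueError("subsample larger than the pool")
        # P(taxon with c occurrences is missed) = C(total - c, n) / C(total, n)
        return sum(1 - comb(total - c, n) / comb(total, n) for c in counts)

    # Two faunas with equal raw richness but different evenness rarefy differently,
    # which is why raw counts alone mislead when sampling is heterogeneous.
    even = [5, 5, 5, 5]
    skewed = [17, 1, 1, 1]
    ```

    Here `rarefied_richness(skewed, 10)` is exactly 2.5 while the even fauna retains nearly all four taxa, mirroring how subsampled diversity estimates correct for uneven fossil sampling.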

  8. Troyer Syndrome

    MedlinePlus


  9. Transient Ischemic Attack

    MedlinePlus


  10. Modeling the relationship between water level, wild rice abundance, and waterfowl abundance at a central North American wetland

    USGS Publications Warehouse

    Aagaard, Kevin; Eash, Josh D.; Ford, Walt; Heglund, Patricia J.; McDowell, Michelle; Thogmartin, Wayne E.

    2018-01-01

    Recent evidence suggests wild rice (Zizania palustris), an important resource for migrating waterfowl, is declining in parts of central North America, providing motivation to rigorously quantify the relationship between waterfowl and wild rice. A hierarchical mixed-effects model was applied to data on waterfowl abundance for 16 species, wild rice stem density, and two measures of water depth (true water depth at vegetation sampling locations and water surface elevation). Results provide evidence for an effect of true water depth (TWD) on wild rice abundance (posterior mean estimate for the TWD coefficient β_TWD = 0.92; 95% interval: 0.11 to 1.74), but not for an effect of wild rice stem density or water surface elevation on local waterfowl abundance (the intervals for the relevant parameters overlapped 0). Refined protocols for sampling design and more consistent sampling frequency to increase data quality should be pursued to overcome issues that may have obscured the relationships evaluated here.

  11. Matter Gravitates, but Does Gravity Matter?

    ERIC Educational Resources Information Center

    Groetsch, C. W.

    2011-01-01

    The interplay of physical intuition, computational evidence, and mathematical rigor in a simple trajectory model is explored. A thought experiment based on the model is used to elicit student conjectures on the influence of a physical parameter; a mathematical model suggests a computational investigation of the conjectures, and rigorous analysis…

  12. Academic Rigor in General Education, Introductory Astronomy Courses for Nonscience Majors

    ERIC Educational Resources Information Center

    Brogt, Erik; Draeger, John D.

    2015-01-01

    We discuss a model of academic rigor and apply it to a general education introductory astronomy course. We argue that even without one of the central tenets of professional astronomy, the use of mathematics, the course can still be considered academically rigorous when expectations, goals, assessments, and curriculum are properly aligned.

  13. Wildfire, climate, and invasive grass interactions negatively impact an indicator species by reshaping sagebrush ecosystems.

    PubMed

    Coates, Peter S; Ricca, Mark A; Prochazka, Brian G; Brooks, Matthew L; Doherty, Kevin E; Kroger, Travis; Blomberg, Erik J; Hagen, Christian A; Casazza, Michael L

    2016-10-25

    Iconic sagebrush ecosystems of the American West are threatened by larger and more frequent wildfires that can kill sagebrush and facilitate invasion by annual grasses, creating a cycle that alters sagebrush ecosystem recovery post disturbance. Thwarting this accelerated grass-fire cycle is at the forefront of current national conservation efforts, yet its impacts on wildlife populations inhabiting these ecosystems have not been quantified rigorously. Within a Bayesian framework, we modeled 30 y of wildfire and climatic effects on population rates of change of a sagebrush-obligate species, the greater sage-grouse, across the Great Basin of western North America. Importantly, our modeling also accounted for variation in sagebrush recovery time post fire as determined by underlying soil properties that influence ecosystem resilience to disturbance and resistance to invasion. Our results demonstrate that the cumulative loss of sagebrush to direct and indirect effects of wildfire has contributed strongly to declining sage-grouse populations over the past 30 y at large spatial scales. Moreover, long-lasting effects from wildfire nullified pulses of sage-grouse population growth that typically follow years of higher precipitation. If wildfire trends continue unabated, model projections indicate sage-grouse populations will be reduced to 43% of their current numbers over the next three decades. Our results provide a timely example of how altered fire regimes are disrupting recovery of sagebrush ecosystems and leading to substantial declines of a widespread indicator species. Accordingly, we present scenario-based stochastic projections to inform conservation actions that may help offset the adverse effects of wildfire on sage-grouse and other wildlife populations.
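    The projected reduction to 43% of current numbers over three decades implies, under a simple geometric-decline reading (our simplification, not the authors' stochastic Bayesian projection), a mean annual population growth rate of about 0.972, i.e. roughly a 2.8% loss per year:

    ```python
    # Back-of-envelope check of the quoted 30-year projection.
    lam = 0.43 ** (1 / 30)  # implied mean annual growth rate ≈ 0.972

    def project(n0, lam, years):
        """Deterministic geometric projection N_t = N_0 * lambda ** t."""
        traj = [n0]
        for _ in range(years):
            traj.append(traj[-1] * lam)
        return traj

    trajectory = project(100.0, lam, 30)  # ends near 43.0
    ```

    The authors' scenario-based projections add stochasticity and fire/recovery dynamics on top of this basic multiplicative structure.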

  14. Wildfire, climate, and invasive grass interactions negatively impact an indicator species by reshaping sagebrush ecosystems

    USGS Publications Warehouse

    Coates, Peter S.; Ricca, Mark; Prochazka, Brian; Brooks, Matthew L.; Doherty, Kevin E.; Kroger, Travis; Blomberg, Erik J.; Hagen, Christian A.; Casazza, Michael L.

    2016-01-01

    Iconic sagebrush ecosystems of the American West are threatened by larger and more frequent wildfires that can kill sagebrush and facilitate invasion by annual grasses, creating a cycle that alters sagebrush ecosystem recovery post disturbance. Thwarting this accelerated grass–fire cycle is at the forefront of current national conservation efforts, yet its impacts on wildlife populations inhabiting these ecosystems have not been quantified rigorously. Within a Bayesian framework, we modeled 30 y of wildfire and climatic effects on population rates of change of a sagebrush-obligate species, the greater sage-grouse, across the Great Basin of western North America. Importantly, our modeling also accounted for variation in sagebrush recovery time post fire as determined by underlying soil properties that influence ecosystem resilience to disturbance and resistance to invasion. Our results demonstrate that the cumulative loss of sagebrush to direct and indirect effects of wildfire has contributed strongly to declining sage-grouse populations over the past 30 y at large spatial scales. Moreover, long-lasting effects from wildfire nullified pulses of sage-grouse population growth that typically follow years of higher precipitation. If wildfire trends continue unabated, model projections indicate sage-grouse populations will be reduced to 43% of their current numbers over the next three decades. Our results provide a timely example of how altered fire regimes are disrupting recovery of sagebrush ecosystems and leading to substantial declines of a widespread indicator species. Accordingly, we present scenario-based stochastic projections to inform conservation actions that may help offset the adverse effects of wildfire on sage-grouse and other wildlife populations.

  15. Pipeline issues

    NASA Technical Reports Server (NTRS)

    Eisley, Joe T.

    1990-01-01

    The declining pool of graduates, the lack of rigorous preparation in science and mathematics, and the declining interest in science and engineering careers at the precollege level promise a shortage of technically educated personnel at the college level for industry, government, and the universities in the next several decades. The educational process, which starts out with a large number of students at the elementary level but with an ever smaller number preparing for science and engineering at each more advanced educational level, is in a state of crisis. These pipeline issues, so called because the educational process is likened to a series of ever smaller constrictions in a pipe, were examined in a workshop at the Space Grant Conference; a summary of the presentations, the results of the discussion, and the conclusions of the workshop participants are reported.

  16. Tri-critical behavior of the Blume-Emery-Griffiths model on a Kagomé lattice: Effective-field theory and Rigorous bounds

    NASA Astrophysics Data System (ADS)

    Santos, Jander P.; Sá Barreto, F. C.

    2016-01-01

    Spin correlation identities for the Blume-Emery-Griffiths model on the Kagomé lattice are derived and, combined with rigorous correlation inequalities, lead to upper bounds on the critical temperature. From the spin correlation identities, the mean-field approximation and effective-field approximation results for the magnetization, the critical frontiers and the tricritical points are obtained. The rigorous upper bounds on the critical temperature improve on those of effective-field-type theories.

  17. Inflammation and immune system activation in aging: a mathematical approach.

    PubMed

    Nikas, Jason B

    2013-11-19

    Memory and learning declines are consequences of normal aging. Since those functions are associated with the hippocampus, I analyzed the global gene expression data from post-mortem hippocampal tissue of 25 old (age ≥ 60 yrs) and 15 young (age ≤ 45 yrs) cognitively intact human subjects. By employing a rigorous, multi-method bioinformatic approach, I identified 36 genes that were the most significant in terms of differential expression; and by employing mathematical modeling, I demonstrated that 7 of the 36 genes were able to discriminate between the old and young subjects with high accuracy. Remarkably, 90% of the known genes from those 36 most significant genes are associated with either inflammation or immune system activation. This suggests that chronic inflammation and immune system over-activity may underlie the aging process of the human brain, and that potential anti-inflammatory treatments targeting those genes may slow down this process and alleviate its symptoms.

  18. Developing a Student Conception of Academic Rigor

    ERIC Educational Resources Information Center

    Draeger, John; del Prado Hill, Pixita; Mahler, Ronnie

    2015-01-01

    In this article we describe models of academic rigor from the student point of view. Drawing on a campus-wide survey, focus groups, and interviews with students, we found that students explained academic rigor in terms of workload, grading standards, level of difficulty, level of interest, and perceived relevance to future goals. These findings…

  19. Long-term effects of wildfire on greater sage-grouse - integrating population and ecosystem concepts for management in the Great Basin

    USGS Publications Warehouse

    Coates, Peter S.; Ricca, Mark A.; Prochazka, Brian G.; Doherty, Kevin E.; Brooks, Matthew L.; Casazza, Michael L.

    2015-09-10

    Greater sage-grouse (Centrocercus urophasianus; hereinafter, sage-grouse) are a sagebrush-obligate species that has declined concomitantly with the loss and fragmentation of sagebrush ecosystems across most of its geographical range. The species currently is listed as a candidate for federal protection under the Endangered Species Act (ESA). Increasing wildfire frequency and changing climate frequently are identified as two environmental drivers that contribute to the decline of sage-grouse populations, yet few studies have rigorously quantified their effects on sage-grouse populations across broad spatial scales and long time periods. To help inform a threat assessment within the Great Basin for listing sage-grouse in 2015 under the ESA, we conducted an extensive analysis of wildfire and climatic effects on sage-grouse population growth derived from 30 years of lek-count data collected across the hydrographic Great Basin of western North America. Annual (1984–2013) patterns of wildfire were derived from an extensive dataset of remotely sensed 30-meter imagery, and precipitation from locally downscaled, spatially explicit data. In the sagebrush ecosystem, underlying soil conditions also contribute strongly to variation in resilience to disturbance and resistance to plant community changes (R&R). Thus, we developed predictions from models of post-wildfire recovery and chronic effects of wildfire based on three spatially explicit R&R classes derived from soil moisture and temperature regimes. We found evidence of an interaction between the effects of wildfire (chronically affected burned area within 5 kilometers of a lek) and climatic conditions (spring through fall precipitation) after accounting for a consistent density-dependent effect. Specifically, burned areas near leks nullify the population growth that normally follows years with relatively high precipitation. 
In models, this effect results in long-term population declines for sage-grouse despite cyclic periods of high precipitation. Based on 30-year projections of burn and recovery rates, our population model predicted steady and substantial long-term declines in population size across the Great Basin. Further, example management scenarios that may help offset adverse wildfire effects are provided by models of varying levels of fire suppression and post-wildfire restoration that focus on areas especially important to sage-grouse populations. These models illustrate how sage-grouse population persistence likely will be compromised as sagebrush ecosystems and sage-grouse habitat are degraded by wildfire, especially in a warmer and drier climate, and by invasion of annual grasses that can increase wildfire frequency and size in the Great Basin.

  20. Changing perspectives on pearly mussels, North America's most imperiled animals

    USGS Publications Warehouse

    Strayer, David L.; Downing, John A.; Haag, Wendell R.; King, Timothy L.; Layzer, James B.; Newton, Teresa J.; Nichols, S. Jerrine

    2004-01-01

    Pearly mussels (Unionacea) are widespread, abundant, and important in freshwater ecosystems around the world. Catastrophic declines in pearly mussel populations in North America and other parts of the world have led to a flurry of research on mussel biology, ecology, and conservation. Recent research on mussel feeding, life history, spatial patterning, and declines has augmented, modified, or overturned long-held ideas about the ecology of these animals. Pearly mussel research has begun to benefit from and contribute to current ideas about suspension feeding, life-history theory, metapopulations, flow refuges, spatial patterning and its effects, and management of endangered species. At the same time, significant gaps in understanding and apparent paradoxes in pearly mussel ecology have been exposed. To conserve remaining mussel populations, scientists and managers must simultaneously and aggressively pursue both rigorous research and conservation actions.

  1. Perceived decline in intimate partner violence against women in Bangladesh: qualitative evidence.

    PubMed

    Schuler, Sidney Ruth; Lenzi, Rachel; Nazneen, Sohela; Bates, Lisa M

    2013-09-01

    The Bangladesh government, nongovernmental organizations, donors, and advocacy groups have attempted various interventions to promote gender equality and reduce intimate partner violence (IPV) against women, but rigorous evaluations of these interventions are rare, and published studies have yet to show that any of them has had a substantial impact. This study presents qualitative evidence from four villages in central and northern Bangladesh drawn from 11 group discussions (6 with men, 5 with women), 16 open-ended interviews with men, and 62 women's life history narratives. The findings strongly suggest that IPV is declining in these villages as women's economic roles expand and women gain a stronger sense of their rights. Periodic surveys are recommended to measure trends in the incidence of IPV in settings where transitions in gender systems are under way. © 2013 The Population Council, Inc.

  2. A model comparison approach shows stronger support for economic models of fertility decline

    PubMed Central

    Shenk, Mary K.; Towner, Mary C.; Kress, Howard C.; Alam, Nurul

    2013-01-01

    The demographic transition is an ongoing global phenomenon in which high fertility and mortality rates are replaced by low fertility and mortality. Despite intense interest in the causes of the transition, especially with respect to decreasing fertility rates, the underlying mechanisms motivating it are still subject to much debate. The literature is crowded with competing theories, including causal models that emphasize (i) mortality and extrinsic risk, (ii) the economic costs and benefits of investing in self and children, and (iii) the cultural transmission of low-fertility social norms. Distinguishing between models, however, requires more comprehensive, better-controlled studies than have been published to date. We use detailed demographic data from recent fieldwork to determine which models produce the most robust explanation of the rapid, recent demographic transition in rural Bangladesh. To rigorously compare models, we use an evidence-based statistical approach using model selection techniques derived from likelihood theory. This approach allows us to quantify the relative evidence the data give to alternative models, even when model predictions are not mutually exclusive. Results indicate that fertility, measured as either total fertility or surviving children, is best explained by models emphasizing economic factors and related motivations for parental investment. Our results also suggest important synergies between models, implicating multiple causal pathways in the rapidity and degree of recent demographic transitions. PMID:23630293
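    The likelihood-based model comparison described above typically reduces to computing AIC differences and Akaike weights, which quantify the relative evidence for each candidate model. The log-likelihoods and parameter counts below are invented for illustration, not the study's values.

    ```python
    import math

    def akaike_weights(loglik_and_k):
        """loglik_and_k: dict name -> (log-likelihood, number of parameters)."""
        aic = {name: 2 * k - 2 * ll for name, (ll, k) in loglik_and_k.items()}
        best = min(aic.values())
        # Relative likelihood of each model given the data (delta-AIC form).
        raw = {name: math.exp(-(a - best) / 2) for name, a in aic.items()}
        z = sum(raw.values())
        return {name: w / z for name, w in raw.items()}

    # Hypothetical fits for the three causal families named in the abstract.
    models = {
        "mortality_risk": (-1250.0, 4),
        "economic_investment": (-1241.0, 5),
        "cultural_transmission": (-1248.0, 4),
    }
    weights = akaike_weights(models)
    ```

    With these invented numbers the economic model carries almost all the weight, mirroring the kind of conclusion the authors draw; weights near parity would instead signal that the data cannot discriminate among the models.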

  3. Rigorous mathematical modelling for a Fast Corrector Power Supply in TPS

    NASA Astrophysics Data System (ADS)

    Liu, K.-B.; Liu, C.-Y.; Chien, Y.-C.; Wang, B.-S.; Wong, Y. S.

    2017-04-01

    To enhance the stability of the beam orbit, a Fast Orbit Feedback System (FOFB) eliminating undesired disturbances was installed and tested in the third-generation synchrotron light source of Taiwan Photon Source (TPS) at the National Synchrotron Radiation Research Center (NSRRC). The effectiveness of the FOFB depends greatly on the output performance of the Fast Corrector Power Supply (FCPS); therefore, the design and implementation of an accurate FCPS is essential. A rigorous mathematical model is very useful for shortening the design time and improving the performance of a FCPS. This paper therefore proposes a rigorous mathematical model, derived by the state-space averaging method, for a full-bridge FCPS in the FOFB of TPS. The MATLAB/SIMULINK software is used to construct the proposed mathematical model and to conduct simulations of the FCPS. The effects of different ADC resolutions on the output accuracy of the FCPS are investigated in simulation. A FCPS prototype is realized to demonstrate the effectiveness of the proposed rigorous mathematical model. Simulation and experimental results show that the proposed model is helpful for selecting appropriate components to meet the accuracy requirements of a FCPS.
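    As an illustration of what a state-space-averaged model provides, the sketch below simulates the averaged dynamics of a full-bridge stage driving an R-L corrector-magnet load: averaging over a switching period gives L di/dt = d·Vdc − R·i, with the effective duty ratio d in [−1, 1] for bipolar output. All component values are invented, not the TPS hardware.

    ```python
    def simulate_average_model(d, vdc, R, L, dt=1e-6, t_end=0.05):
        """Forward-Euler integration of the averaged inductor current."""
        i = 0.0
        for _ in range(int(t_end / dt)):
            di = (d * vdc - R * i) / L  # averaged model: L di/dt = d*Vdc - R*i
            i += dt * di
        return i

    # Steady-state current should approach d * Vdc / R = 0.25 * 48 / 2 = 6 A.
    i_final = simulate_average_model(d=0.25, vdc=48.0, R=2.0, L=5e-3)
    ```

    Because the averaged model is a smooth ODE rather than a switched circuit, it simulates orders of magnitude faster and is amenable to standard control design, which is the practical payoff of the averaging step.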

  4. Comparison of different cooling regimes within a shortened liquid cooling/warming garment on physiological and psychological comfort during exercise

    NASA Technical Reports Server (NTRS)

    Leon, Gloria R.; Koscheyev, Victor S.; Coca, Aitor; List, Nathan

    2004-01-01

    The aim of this study was to compare the effectiveness of different cooling regime intensities in maintaining physiological and subjective comfort during physical exertion comparable to that of extravehicular activities (EVA) in space. We studied eight subjects (six males, two females) wearing our newly developed, physiologically based, shortened liquid cooling/warming garment (SLCWG). Rigorous (condition 1) and mild (condition 2) water temperature cooling regimes were compared at physical exertion levels comparable to those performed during EVA, to ascertain whether a lesser intensity of cooling could maintain thermal comfort and thus reduce energy consumption in the portable life support system. Exercise intensity was varied across stages of the session. Finger temperature, rectal temperature, and subjective perception of overall body and hand comfort were assessed. Finger temperature was significantly higher in the rigorous cooling condition and showed a consistent increase across exercise stages, likely because the intense cold restricted heat extraction. In the mild cooling condition, finger temperature exhibited an overall decline with cooling, indicating greater heat extraction from the body. Rectal temperature was not significantly different between conditions and showed a steady increase over exercise stages in both the rigorous and mild cooling conditions. Ratings of overall comfort were 30% higher (more positive) and more stable in mild cooling (p<0.001). The mild cooling regime was more effective than rigorous cooling in allowing the process of heat exchange to occur, thus maintaining thermal homeostasis and subjective comfort during physical exertion.

  5. The effect of temperature on the mechanical aspects of rigor mortis in a liquid paraffin model.

    PubMed

    Ozawa, Masayoshi; Iwadate, Kimiharu; Matsumoto, Sari; Asakura, Kumiko; Ochiai, Eriko; Maebashi, Kyoko

    2013-11-01

    Rigor mortis is an important phenomenon to estimate the postmortem interval in forensic medicine. Rigor mortis is affected by temperature. We measured stiffness of rat muscles using a liquid paraffin model to monitor the mechanical aspects of rigor mortis at five temperatures (37, 25, 10, 5 and 0°C). At 37, 25 and 10°C, the progression of stiffness was slower in cooler conditions. At 5 and 0°C, the muscle stiffness increased immediately after the muscles were soaked in cooled liquid paraffin and then muscles gradually became rigid without going through a relaxed state. This phenomenon suggests that it is important to be careful when estimating the postmortem interval in cold seasons. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Postoperative cognitive dysfunction and its relationship to cognitive reserve in elderly total joint replacement patients.

    PubMed

    Scott, J E; Mathias, J L; Kneebone, A C; Krishnan, J

    2017-06-01

    Whether total joint replacement (TJR) patients are susceptible to postoperative cognitive dysfunction (POCD) remains unclear due to inconsistencies in research methodologies. Moreover, cognitive reserve may moderate the development of POCD after TJR, but has not been investigated in this context. The current study investigated POCD after TJR, and its relationship with cognitive reserve, using a more rigorous methodology than has previously been utilized. Fifty-three older adults (aged 50+) scheduled for TJR were assessed pre and post surgery (6 months). Forty-five healthy controls matched for age, gender, and premorbid IQ were re-assessed after an equivalent interval. Cognition, cognitive reserve, and physical and mental health were all measured. Standardized regression-based methods were used to assess cognitive changes, while controlling for the confounding effect of repeated cognitive testing. TJR patients only demonstrated a significant decline in Trail Making Test Part B (TMT B) performance, compared to controls. Cognitive reserve only predicted change in TMT B scores among a subset of TJR patients. Specifically, patients who showed the most improvement pre to post surgery had significantly higher reserve than those who showed the greatest decline. The current study provides limited evidence of POCD after TJR when examined using a rigorous methodology, which controlled for practice effects. Cognitive reserve only predicted performance within a subset of the TJR sample. However, the role of reserve in more cognitively compromised patients remains to be determined.
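    The "standardized regression-based" change method mentioned above can be sketched briefly: fit a regression of retest score on baseline score in healthy controls (which absorbs practice effects), then express each patient's observed retest as a z-score relative to that prediction. The control scores and patient below are invented.

    ```python
    def fit_srb(controls):
        """controls: list of (t1, t2) scores; returns (slope, intercept, resid_sd)."""
        n = len(controls)
        mx = sum(t1 for t1, _ in controls) / n
        my = sum(t2 for _, t2 in controls) / n
        sxx = sum((t1 - mx) ** 2 for t1, _ in controls)
        sxy = sum((t1 - mx) * (t2 - my) for t1, t2 in controls)
        slope = sxy / sxx
        intercept = my - slope * mx
        resid = [t2 - (slope * t1 + intercept) for t1, t2 in controls]
        sd = (sum(r * r for r in resid) / (n - 2)) ** 0.5
        return slope, intercept, sd

    def srb_z(t1, t2, slope, intercept, sd):
        """Negative z = decline beyond the practice effect seen in controls."""
        return (t2 - (slope * t1 + intercept)) / sd

    # Hypothetical control retest data (note the small practice gain).
    controls = [(50, 54), (60, 63), (55, 58), (45, 50), (65, 67), (52, 55)]
    slope, intercept, sd = fit_srb(controls)
    z = srb_z(55, 48, slope, intercept, sd)  # patient declined from 55 to 48
    ```

    A patient who merely fails to improve can still register a negative z, which is how the method controls for repeated-testing effects.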

  7. Understanding the low uptake of bone-anchored hearing aids: a review.

    PubMed

    Powell, R; Wearden, A; Pardesi, S M; Green, K

    2017-03-01

    Bone-anchored hearing aids improve hearing for patients for whom conventional behind-the-ear aids are problematic. However, uptake of bone-anchored hearing aids is low and it is important to understand why this is the case. A narrative review was conducted. Studies examining why people accept or decline bone-anchored hearing aids and satisfaction levels of people with bone-anchored hearing aids were reviewed. Reasons for declining bone-anchored hearing aids included limited perceived benefits, concerns about surgery, aesthetic concerns and treatment cost. No studies providing in-depth analysis of the reasons for declining or accepting bone-anchored hearing aids were identified. Studies of patient satisfaction showed that most participants reported benefits with bone-anchored hearing aids. However, most studies used cross-sectional and/or retrospective designs and only included people with bone-anchored hearing aids. Important avenues for further research are in-depth qualitative research designed to fully understand the decision-making process for bone-anchored hearing aids and rigorous quantitative research comparing satisfaction of people who receive bone-anchored hearing aids with those who receive alternative (or no) treatments.

  8. Peer Assessment with Online Tools to Improve Student Modeling

    ERIC Educational Resources Information Center

    Atkins, Leslie J.

    2012-01-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to…

  9. Spatial scaling and multi-model inference in landscape genetics: Martes americana in northern Idaho

    Treesearch

    Tzeidle N. Wasserman; Samuel A. Cushman; Michael K. Schwartz; David O. Wallin

    2010-01-01

    Individual-based analyses relating landscape structure to genetic distances across complex landscapes enable rigorous evaluation of multiple alternative hypotheses linking landscape structure to gene flow. We utilize two extensions to increase the rigor of the individual-based causal modeling approach to inferring relationships between landscape patterns and gene flow...

  10. Detection and attribution of temperature changes in the mountainous Western United States

    USGS Publications Warehouse

    Bonfils, Celine; Santer, B.D.; Pierce, D.W.; Hidalgo, H.G.; Bala, G.; Das, T.; Barnett, T.P.; Cayan, D.R.; Doutriaux, C.; Wood, A.W.; Mirin, A.; Nozawa, T.

    2008-01-01

Large changes in the hydrology of the western United States have been observed since the mid-twentieth century. These include a reduction in the amount of precipitation arriving as snow, a decline in snowpack at low and mid-elevations, and a shift toward earlier arrival of both snowmelt and the centroid (center of mass) of streamflows. To project future water supply reliability, it is crucial to obtain a better understanding of the underlying cause or causes for these changes. A regional warming is often posited as the cause of these changes without formal testing of different competitive explanations for the warming. In this study, a rigorous detection and attribution analysis is performed to determine the causes of the late winter/early spring changes in hydrologically relevant temperature variables over mountain ranges of the western United States. Natural internal climate variability, as estimated from two long control climate model simulations, is insufficient to explain the rapid increase in daily minimum and maximum temperatures, the sharp decline in frost days, and the rise in degree-days above 0 °C (a simple proxy for temperature-driven snowmelt). These observed changes are also inconsistent with the model-predicted responses to variability in solar irradiance and volcanic activity. The observations are consistent with climate simulations that include the combined effects of anthropogenic greenhouse gases and aerosols. It is found that, for each temperature variable considered, an anthropogenic signal is identifiable in observational fields. The results are robust to uncertainties in model-estimated fingerprints and natural variability noise, to the choice of statistical downscaling method, and to various processing options in the detection and attribution method. © 2008 American Meteorological Society.

  11. Reconstructing the historical distribution of the Amur Leopard (Panthera pardus orientalis) in Northeast China based on historical records

    PubMed Central

    Yang, Li; Huang, Mujiao; Zhang, Rui; Lv, Jiang; Ren, Yueheng; Jiang, Zhe; Zhang, Wei; Luan, Xiaofeng

    2016-01-01

The range of the Amur leopard (Panthera pardus orientalis) has decreased dramatically over the last 100 years. The species is still at extreme risk of extinction, and conservation efforts are rigorous. Understanding the long-term dynamics of the population decline offers insight into the mechanisms behind the decline and endangerment and can improve conservation perspectives and strategies. Collecting historical data has been the main challenge in reconstructing the historical distribution. In China, new gazetteers, which are systematically compiled and contain considerable local ecological data, can serve as an important complementary source for such reconstructions. We therefore assembled a data set (based mainly on the new gazetteers) to identify the historical range of the Amur leopard from the 1950s to 2014. The results show that the Amur leopard was historically widely distributed, with large populations in Northeastern China, but declined sharply after the 1970s. Since the 1950s the decline has progressed from the plains to the mountains and from northeast to southwest. Long-term historical data, drawn mainly from new gazetteers, demonstrate that such resources can track species change through time and offer an opportunity to reduce data shortages and enhance understanding in conservation. PMID:27408548

  12. Reconstructing the historical distribution of the Amur Leopard (Panthera pardus orientalis) in Northeast China based on historical records.

    PubMed

    Yang, Li; Huang, Mujiao; Zhang, Rui; Lv, Jiang; Ren, Yueheng; Jiang, Zhe; Zhang, Wei; Luan, Xiaofeng

    2016-01-01

The range of the Amur leopard (Panthera pardus orientalis) has decreased dramatically over the last 100 years. The species is still at extreme risk of extinction, and conservation efforts are rigorous. Understanding the long-term dynamics of the population decline offers insight into the mechanisms behind the decline and endangerment and can improve conservation perspectives and strategies. Collecting historical data has been the main challenge in reconstructing the historical distribution. In China, new gazetteers, which are systematically compiled and contain considerable local ecological data, can serve as an important complementary source for such reconstructions. We therefore assembled a data set (based mainly on the new gazetteers) to identify the historical range of the Amur leopard from the 1950s to 2014. The results show that the Amur leopard was historically widely distributed, with large populations in Northeastern China, but declined sharply after the 1970s. Since the 1950s the decline has progressed from the plains to the mountains and from northeast to southwest. Long-term historical data, drawn mainly from new gazetteers, demonstrate that such resources can track species change through time and offer an opportunity to reduce data shortages and enhance understanding in conservation.

  13. Catastrophic Decline of World's Largest Primate: 80% Loss of Grauer's Gorilla (Gorilla beringei graueri) Population Justifies Critically Endangered Status.

    PubMed

    Plumptre, Andrew J; Nixon, Stuart; Kujirakwinja, Deo K; Vieilledent, Ghislain; Critchlow, Rob; Williamson, Elizabeth A; Nishuli, Radar; Kirkby, Andrew E; Hall, Jefferson S

    2016-01-01

Grauer's gorilla (Gorilla beringei graueri), the world's largest primate, is confined to eastern Democratic Republic of Congo (DRC) and is threatened by civil war and insecurity. During the war, armed groups in mining camps relied on hunting bushmeat, including gorillas. Insecurity and the presence of several militia groups across Grauer's gorilla's range made it very difficult to assess their population size. Here we use a novel method that enables rigorous assessment of local community and ranger-collected data on gorilla occupancy to evaluate the impacts of civil war on Grauer's gorilla, which prior to the war was estimated to number 16,900 individuals. We show that gorilla numbers in their stronghold of Kahuzi-Biega National Park have declined by 87%. Encounter rate data of gorilla nests at 10 sites across its range indicate declines of 82-100% at six of these sites. Spatial occupancy analysis identifies three key areas as the most critical sites for the remaining populations of this ape and indicates that the range of this taxon is around 19,700 km². We estimate that only 3,800 Grauer's gorillas remain in the wild, a 77% decline in one generation, justifying its elevation to Critically Endangered status on the IUCN Red List of Threatened Species.
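The headline decline figure follows directly from the two population estimates quoted in the abstract; a quick arithmetic check (purely illustrative):

```python
# Pre-war and current population estimates quoted in the abstract
pre_war = 16_900
current = 3_800

# Fractional decline implied by the two estimates
decline = (pre_war - current) / pre_war
print(f"{decline:.1%}")  # ≈ 77.5%, consistent with the reported 77% decline in one generation
```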

  14. Intractable Seizures and Rehabilitation in Ciguatera Poisoning.

    PubMed

    Derian, Armen; Khurana, Seema; Rothenberg, Joshua; Plumlee, Charles

    2017-05-01

Ciguatera fish poisoning is the most frequently reported seafood toxin illness associated with the ingestion of contaminated tropical fish. Diagnosis relies on a history of recent tropical fish ingestion and subsequent development of gastrointestinal, cardiovascular, and neurological symptoms. Ciguatera poisoning usually has a self-limited time course, and its management involves symptomatic control and supportive care. This report presents an uncommon case of ciguatera poisoning with prolonged intractable seizures refractory to standard antiseizure medications. The patient also had significant functional decline that responded to rigorous inpatient rehabilitation, a course not previously described in the literature.

  15. Near Identifiability of Dynamical Systems

    NASA Technical Reports Server (NTRS)

    Hadaegh, F. Y.; Bekey, G. A.

    1987-01-01

    Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.

  16. Is women’s empowerment contributing to a decline in intimate partner violence against women in Bangladesh? Evidence from a qualitative study

    PubMed Central

    Schuler, Sidney Ruth; Lenzi, Rachel; Nazneen, Sohela; Bates, Lisa M.

    2013-01-01

The Bangladesh government, nongovernmental organizations, donors, and advocacy groups have attempted various interventions to promote gender equality and reduce intimate partner violence (IPV) against women, but rigorous evaluations of these interventions are rare, and no published studies have yet shown that any of them has had a substantial impact. This study presents qualitative evidence from four villages in central and northern Bangladesh drawn from 11 group discussions (6 with men, 5 with women), 16 open-ended interviews with men, and 62 women’s life history narratives. The findings strongly suggest that IPV is declining in these villages as women’s economic roles expand and they gain a stronger sense of their rights. Periodic surveys are recommended to measure trends in the incidence of IPV in settings where transitions in gender systems are under way. PMID:24006072

  17. Boosting beauty in an economic decline: mating, spending, and the lipstick effect.

    PubMed

    Hill, Sarah E; Rodeheffer, Christopher D; Griskevicius, Vladas; Durante, Kristina; White, Andrew Edward

    2012-08-01

    Although consumer spending typically declines in economic recessions, some observers have noted that recessions appear to increase women's spending on beauty products--the so-called lipstick effect. Using both historical spending data and rigorous experiments, the authors examine how and why economic recessions influence women's consumer behavior. Findings revealed that recessionary cues--whether naturally occurring or experimentally primed--decreased desire for most products (e.g., electronics, household items). However, these cues consistently increased women's desire for products that increase attractiveness to mates--the first experimental demonstration of the lipstick effect. Additional studies show that this effect is driven by women's desire to attract mates with resources and depends on the perceived mate attraction function served by these products. In addition to showing how and why economic recessions influence women's desire for beauty products, this research provides novel insights into women's mating psychology, consumer behavior, and the relationship between the two.

  18. Inferring species interactions through joint mark–recapture analysis

    USGS Publications Warehouse

    Yackulic, Charles B.; Korman, Josh; Yard, Michael D.; Dzul, Maria C.

    2018-01-01

Introduced species are frequently implicated in declines of native species. In many cases, however, evidence linking introduced species to native declines is weak. Failure to make strong inferences regarding the role of introduced species can hamper attempts to predict population viability and delay effective management responses. For many species, mark–recapture analysis is the most rigorous form of demographic analysis. However, to our knowledge, there are no mark–recapture models that allow for joint modeling of interacting species. Here, we introduce a two‐species mark–recapture population model in which the vital rates (and capture probabilities) of one species are allowed to vary in response to the abundance of the other species. We use a simulation study to explore bias and choose an approach to model selection. We then use the model to investigate species interactions between endangered humpback chub (Gila cypha) and introduced rainbow trout (Oncorhynchus mykiss) in the Colorado River between 2009 and 2016. In particular, we test hypotheses about how two environmental factors (turbidity and temperature), intraspecific density dependence, and rainbow trout abundance are related to survival, growth, and capture of juvenile humpback chub. We also project the long‐term effects of different rainbow trout abundances on adult humpback chub abundances. Our simulation study suggests this approach has minimal bias under potentially challenging circumstances (i.e., low capture probabilities) that characterized our application and that model selection using indicator variables could reliably identify the true generating model even when process error was high. When the model was applied to rainbow trout and humpback chub, we identified negative relationships between rainbow trout abundance and the survival, growth, and capture probability of juvenile humpback chub.
Effects of interspecific interactions on survival and capture probability were strongly supported, whereas support for the growth effect was weaker. Environmental factors were also important, in many cases more so than interspecific interactions, and substantial unexplained variation in growth and survival rates remained. The general approach presented here for combining mark–recapture data for two species is applicable in many other systems and could be modified to model abundance of the invader via other modeling approaches.
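The negative relationship between rainbow trout abundance and juvenile humpback chub survival can be sketched with a logit-linear vital-rate model of the kind commonly used in mark–recapture analysis. The functional form and every parameter value below are invented for illustration and are not taken from the study:

```python
import math

def juvenile_survival(trout_abundance: float,
                      beta0: float = 1.2,
                      beta1: float = -0.002) -> float:
    """Survival probability of juvenile chub as a logit-linear function of
    rainbow trout abundance; beta1 < 0 encodes a negative interspecific
    effect (all parameter values are hypothetical)."""
    eta = beta0 + beta1 * trout_abundance
    return 1.0 / (1.0 + math.exp(-eta))

# Survival declines as trout abundance rises
low_trout = juvenile_survival(0)        # ≈ 0.77
high_trout = juvenile_survival(5_000)   # near zero
```

In a full mark–recapture model the same linear predictor would sit inside the likelihood for the capture histories, with the covariate (here, the other species' abundance) itself estimated jointly.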

  19. Effects of Chilling and Partial Freezing on Rigor Mortis Changes of Bighead Carp (Aristichthys nobilis) Fillets: Cathepsin Activity, Protein Degradation and Microstructure of Myofibrils.

    PubMed

    Lu, Han; Liu, Xiaochang; Zhang, Yuemei; Wang, Hang; Luo, Yongkang

    2015-12-01

To investigate the effects of chilling and partial freezing on rigor mortis changes in bighead carp (Aristichthys nobilis), pH, cathepsin B and cathepsin B+L activities, SDS-PAGE of sarcoplasmic and myofibrillar proteins, texture, and changes in microstructure of fillets at 4 °C and -3 °C were determined at 0, 2, 4, 8, 12, 24, 48, and 72 h after slaughter. The results indicated that the pH of fillets (6.50 to 6.80) was appropriate for cathepsin function during rigor mortis. For fillets that were chilled and partially frozen, cathepsin activity in lysosomes increased consistently during the first 12 h, followed by a decrease from 12 to 24 h, which paralleled an increase in activity in heavy mitochondria, myofibrils and sarcoplasm. There was no significant difference in cathepsin activity in lysosomes between fillets at 4 °C and -3 °C (P > 0.05). Partially frozen fillets had greater cathepsin activity in heavy mitochondria than chilled samples from 48 to 72 h. In addition, partially frozen fillets showed higher cathepsin activity in sarcoplasm and lower cathepsin activity in myofibrils compared with chilled fillets. Correspondingly, we observed degradation of α-actinin (105 kDa) by cathepsin L in chilled fillets and degradation of creatine kinase (41 kDa) by cathepsin B in partially frozen fillets during rigor mortis. The decline in hardness for both fillets might be attributed to the accumulation of cathepsin in myofibrils from 8 to 24 h. The lower cathepsin activity in myofibrils of partially frozen fillets might preserve a more intact cytoskeletal structure than in chilled fillets. © 2015 Institute of Food Technologists®

  20. Forward modelling of global gravity fields with 3D density structures and an application to the high-resolution (∼2 km) gravity fields of the Moon

    NASA Astrophysics Data System (ADS)

    Šprlák, M.; Han, S.-C.; Featherstone, W. E.

    2017-12-01

Rigorous modelling of the spherical gravitational potential spectra from the volumetric density and geometry of an attracting body is discussed. Firstly, we derive mathematical formulas for the spatial analysis of spherical harmonic coefficients. Secondly, we present a numerically efficient algorithm for rigorous forward modelling. We consider the finite-amplitude topographic modelling methods as special cases, with additional postulates on the volumetric density and geometry. Thirdly, we implement our algorithm in the form of computer programs and test their correctness with respect to the finite-amplitude topography routines. For this purpose, synthetic and realistic numerical experiments, applied to the gravitational field and geometry of the Moon, are performed. We also investigate the optimal choice of input parameters for the finite-amplitude modelling methods. Fourthly, we exploit the rigorous forward modelling for the determination of the spherical gravitational potential spectra inferred from lunar crustal models with uniform, laterally variable, radially variable, and spatially (3D) variable bulk density. We also analyse these four different crustal models in terms of their spectral characteristics and band-limited radial gravitation. We demonstrate the applicability of the rigorous forward modelling using currently available computational resources up to degree and order 2519 of the spherical harmonic expansion, which corresponds to a resolution of 2.2 km on the surface of the Moon. Computer codes, a user manual and scripts developed for the purposes of this study are publicly available to potential users.
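The "spherical gravitational potential spectra" discussed above are the coefficients of the standard exterior spherical harmonic expansion of the potential, which in one common (fully normalised) convention reads:

```latex
V(r,\vartheta,\lambda) \;=\; \frac{GM}{R}
\sum_{n=0}^{N}\left(\frac{R}{r}\right)^{n+1}
\sum_{m=-n}^{n}\bar{V}_{nm}\,\bar{Y}_{nm}(\vartheta,\lambda)
```

where $GM$ is the gravitational parameter of the attracting body, $R$ a reference radius, and $\bar{Y}_{nm}$ the fully normalised surface spherical harmonics; forward modelling computes the coefficients $\bar{V}_{nm}$ from an assumed density distribution and geometry.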

  1. A critical question for NEC researchers: Can we create a consensus definition of NEC that facilitates research progress?

    PubMed

    Gordon, Phillip V; Swanson, Jonathan R; MacQueen, Brianna C; Christensen, Robert D

    2017-02-01

In recent decades the reported incidence of preterm necrotizing enterocolitis (NEC) has been declining, in large part because of comprehensive NEC prevention initiatives, including breast milk feeding, standardized feeding protocols, transfusion guidelines, and antibiotic stewardship, and because of improved rigor in excluding non-NEC cases from NEC data. However, after more than 60 years of NEC research in animal models, the promise of a "magic bullet" to prevent NEC has yet to materialize. There are also serious issues in clinical NEC research. There is no common, comprehensive definition of NEC: national datasets each have their own definitions and staging criteria, and even within academia, randomized trials and single-center studies use widely disparate definitions. This makes NEC metadata of very limited value. The world of neonatology needs a comprehensive, universal, consensus definition of NEC. It also needs a de-identified, international data warehouse. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Turbulent particle transport in streams: can exponential settling be reconciled with fluid mechanics?

    PubMed

    McNair, James N; Newbold, J Denis

    2012-05-07

    Most ecological studies of particle transport in streams that focus on fine particulate organic matter or benthic invertebrates use the Exponential Settling Model (ESM) to characterize the longitudinal pattern of particle settling on the bed. The ESM predicts that if particles are released into a stream, the proportion that have not yet settled will decline exponentially with transport time or distance and will be independent of the release elevation above the bed. To date, no credible basis in fluid mechanics has been established for this model, nor has it been rigorously tested against more-mechanistic alternative models. One alternative is the Local Exchange Model (LEM), which is a stochastic advection-diffusion model that includes both longitudinal and vertical spatial dimensions and is based on classical fluid mechanics. The LEM predicts that particle settling will be non-exponential in the near field but will become exponential in the far field, providing a new theoretical justification for far-field exponential settling that is based on plausible fluid mechanics. We review properties of the ESM and LEM and compare these with available empirical evidence. Most evidence supports the prediction of both models that settling will be exponential in the far field but contradicts the ESM's prediction that a single exponential distribution will hold for all transport times and distances. Copyright © 2012 Elsevier Ltd. All rights reserved.
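The contrast between the two models can be illustrated with a minimal particle simulation: a vertical advection-diffusion random walk (an LEM-style description) versus the single exponential of the ESM. All parameter values below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (hypothetical, not from the paper)
n_particles = 5_000
dt = 0.01        # time step
w_s = 0.5        # settling velocity toward the bed
D = 0.1          # vertical turbulent diffusivity
z0 = 1.0         # release elevation above the bed
n_steps = 400

# LEM-style random walk: drift toward the bed plus diffusion; a particle
# "settles" the first time it reaches the bed (z <= 0) and is removed.
z = np.full(n_particles, z0)
alive = np.ones(n_particles, dtype=bool)
suspended = np.empty(n_steps)
for k in range(n_steps):
    z[alive] += -w_s * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(alive.sum())
    alive &= z > 0.0
    suspended[k] = alive.mean()   # fraction still in transport

# ESM for comparison: a single exponential, independent of release elevation
t = dt * np.arange(1, n_steps + 1)
esm = np.exp(-(w_s / z0) * t)
```

In the near field the random-walk curve departs from the exponential (no particle can settle before it has first travelled down to the bed), while its far-field tail decays roughly exponentially, which is the LEM prediction summarised above.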

  3. Rotation and anisotropy of galaxies revisited

    NASA Astrophysics Data System (ADS)

    Binney, James

    2005-11-01

    The use of the tensor virial theorem (TVT) as a diagnostic of anisotropic velocity distributions in galaxies is revisited. The TVT provides a rigorous global link between velocity anisotropy, rotation and shape, but the quantities appearing in it are not easily estimated observationally. Traditionally, use has been made of a centrally averaged velocity dispersion and the peak rotation velocity. Although this procedure cannot be rigorously justified, tests on model galaxies show that it works surprisingly well. With the advent of integral-field spectroscopy it is now possible to establish a rigorous connection between the TVT and observations. The TVT is reformulated in terms of sky-averages, and the new formulation is tested on model galaxies.

  4. Rigor of cell fate decision by variable p53 pulses and roles of cooperative gene expression by p53

    PubMed Central

    Murakami, Yohei; Takada, Shoji

    2012-01-01

Upon DNA damage, the cell fate decision between survival and apoptosis is largely regulated by p53-related networks. Recent experiments found a series of discrete p53 pulses in individual cells, which led to the hypothesis that the cell fate decision upon DNA damage is controlled by counting the number of p53 pulses. Under this hypothesis, Sun et al. (2009) modeled the Bax activation switch in the apoptosis signal transduction pathway that can rigorously "count" the number of uniform p53 pulses. Based on experimental evidence, here we use variable p53 pulses with Sun et al.’s model to investigate how the variability in p53 pulses affects the rigor of the cell fate decision by the pulse number. Our calculations showed that the experimentally anticipated variability in the pulse sizes reduces the rigor of the cell fate decision. In addition, we tested the role of cooperativity in PUMA expression by p53, finding that lower cooperativity makes a more rigorous cell fate decision plausible, because variability in p53 pulse height is amplified more strongly in PUMA expression when cooperativity is higher. PMID:27857606

  5. Catastrophic Decline of World's Largest Primate: 80% Loss of Grauer's Gorilla (Gorilla beringei graueri) Population Justifies Critically Endangered Status

    PubMed Central

    Nixon, Stuart; Kujirakwinja, Deo K.; Vieilledent, Ghislain; Critchlow, Rob; Williamson, Elizabeth A.; Nishuli, Radar; Kirkby, Andrew E.; Hall, Jefferson S.

    2016-01-01

Grauer’s gorilla (Gorilla beringei graueri), the world’s largest primate, is confined to eastern Democratic Republic of Congo (DRC) and is threatened by civil war and insecurity. During the war, armed groups in mining camps relied on hunting bushmeat, including gorillas. Insecurity and the presence of several militia groups across Grauer’s gorilla’s range made it very difficult to assess their population size. Here we use a novel method that enables rigorous assessment of local community and ranger-collected data on gorilla occupancy to evaluate the impacts of civil war on Grauer’s gorilla, which prior to the war was estimated to number 16,900 individuals. We show that gorilla numbers in their stronghold of Kahuzi-Biega National Park have declined by 87%. Encounter rate data of gorilla nests at 10 sites across its range indicate declines of 82–100% at six of these sites. Spatial occupancy analysis identifies three key areas as the most critical sites for the remaining populations of this ape and indicates that the range of this taxon is around 19,700 km². We estimate that only 3,800 Grauer’s gorillas remain in the wild, a 77% decline in one generation, justifying its elevation to Critically Endangered status on the IUCN Red List of Threatened Species. PMID:27760201

  6. Towards a Credibility Assessment of Models and Simulations

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Green, Lawrence L.; Luckring, James M.; Morrison, Joseph H.; Tripathi, Ram K.; Zang, Thomas A.

    2008-01-01

    A scale is presented to evaluate the rigor of modeling and simulation (M&S) practices for the purpose of supporting a credibility assessment of the M&S results. The scale distinguishes required and achieved levels of rigor for a set of M&S elements that contribute to credibility including both technical and process measures. The work has its origins in an interest within NASA to include a Credibility Assessment Scale in development of a NASA standard for models and simulations.

  7. Peer Review of EPA's Draft BMDS Document: Exponential ...

    EPA Pesticide Factsheets

BMDS is one of the Agency's premier tools for risk assessment; the validity and reliability of its statistical models are therefore of paramount importance. This page provides links to peer reviews of the BMDS applications and its models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.

  8. A Regional Seismic Travel Time Model for North America

    DTIC Science & Technology

    2010-09-01

velocity at the Moho, the mantle velocity gradient, and the average crustal velocity. After tomography across Eurasia, rigorous tests find that Pn travel time residuals are reduced... and S-wave velocity in the crustal layers and in the upper mantle. A good prior model is essential because the RSTT tomography inversion is invariably

  9. Pre rigor processing, ageing and freezing on tenderness and colour stability of lamb loins.

    PubMed

    Kim, Yuan H Brad; Luc, Genevieve; Rosenvold, Katja

    2013-10-01

Forty-eight lamb carcasses, with temperature and pH monitored, were obtained from two commercial plants. At 24 h post mortem, both loins (M. longissimus) from each carcass were randomly allocated to (a) frozen unaged at -18 °C, (b) aged at -1.5 °C for 2 weeks before freezing, (c) aged for 3 weeks before freezing, or (d) aged for 9 weeks without freezing. Shear force, colour stability and proteolysis were analysed. Carcasses with a slower temperature decline and a more rapid pH decline generally showed more calpain autolysis, slightly higher shear force and poorer colour stability than their counterparts (P<0.05). However, the shear force values of the loins were all acceptable (<6 kgF) regardless of pre rigor processing and ageing/freezing treatment. Furthermore, loins aged for 2 weeks and then frozen/thawed had a shear force similar to loins aged for 9 weeks, suggesting that ageing-then-freezing gives tenderness equivalent to ageing alone for long-term storage. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Anthropogenically-induced changes in temperatures and implications for water resources in the western United States.

    NASA Astrophysics Data System (ADS)

    Bonfils, C.; Santer, B.; Pierce, D.; Hidalgo, H.; Bala, G.; Dash, T.; Barnett, T.; Cayan, D.; Doutriaux, C.; Wood, A.; Mirin, A.; Nosawa, T.

    2008-12-01

    Large changes in the hydrology of the western United States have been observed since the mid-20th century. These include a reduction in the amount of precipitation arriving as snow, a decline in snowpack at low and mid-elevations, and a shift towards earlier arrival of both snowmelt and the center of mass of streamflows. In order to project future water supply reliability, it is crucial to obtain a better understanding of the underlying cause or causes for these long-term changes. A regional warming is often posited as the cause of these changes, without formal testing of different competitive explanations for the warming. In this study, we perform a rigorous detection and attribution analysis to determine the causes of the late winter/early spring changes in hydrologically-relevant temperature variables over mountain ranges of the western U.S. Natural internal climate variability, as estimated from two long control climate model simulations, is insufficient to explain the rapid increase in daily minimum and maximum temperatures, the sharp decline in frost days, and the rise in degree-days above 0°C (a simple proxy for temperature-driven snowmelt). The observations are however consistent with climate simulations that include the combined effects of anthropogenic greenhouse gases and aerosols. We also address the benefits of conducting multivariate versus univariate detection and attribution analysis, with, for instance, a focus on changes in snowmelt, streamflow peaks and minimum temperature. With models of climate change unanimously projecting an acceleration of warming in the western United States, serious implications for water infrastructures and water supply sustainability can be expected, increasing already the necessity of developing adaptation measures in water resources management.

  11. Rigorous simulations of a helical core fiber by the use of transformation optics formalism.

    PubMed

    Napiorkowski, Maciej; Urbanczyk, Waclaw

    2014-09-22

We report for the first time on rigorous numerical simulations of a helical-core fiber using a full vectorial method based on the transformation optics formalism. We modeled the dependence of the circular birefringence of the fundamental mode on the helix pitch and analyzed the birefringence increase caused by the mode displacement induced by a core twist. Furthermore, we analyzed the complex field evolution versus the helix pitch in the first-order modes, including polarization and intensity distribution. Finally, we show that the rigorous vectorial method predicts the confinement loss of the guided modes more accurately than approximate methods based on equivalent in-plane bending models.

  12. Temporal shifts in top-down vs. bottom-up control of epiphytic algae in a seagrass ecosystem

    USGS Publications Warehouse

    Whalen, Matthew A.; Duffy, J. Emmett; Grace, James B.

    2013-01-01

    In coastal marine food webs, small invertebrate herbivores (mesograzers) have long been hypothesized to occupy an important position facilitating dominance of habitat-forming macrophytes by grazing competitively superior epiphytic algae. Because of the difficulty of manipulating mesograzers in the field, however, their impacts on community organization have rarely been rigorously documented. Understanding mesograzer impacts has taken on increased urgency in seagrass systems due to declines in seagrasses globally, caused in part by widespread eutrophication favoring seagrass overgrowth by faster-growing algae. Using cage-free field experiments in two seasons (fall and summer), we present experimental confirmation that mesograzer reduction and nutrients can promote blooms of epiphytic algae growing on eelgrass (Zostera marina). In this study, nutrient additions increased epiphytes only in the fall following natural decline of mesograzers. In the summer, experimental mesograzer reduction stimulated a 447% increase in epiphytes, appearing to exacerbate seasonal dieback of eelgrass. Using structural equation modeling, we illuminate the temporal dynamics of complex interactions between macrophytes, mesograzers, and epiphytes in the summer experiment. An unexpected result emerged from investigating the interaction network: drift macroalgae indirectly reduced epiphytes by providing structure for mesograzers, suggesting that the net effect of macroalgae on seagrass depends on macroalgal density. Our results show that mesograzers can control proliferation of epiphytic algae, that top-down and bottom-up forcing are temporally variable, and that the presence of macroalgae can strengthen top-down control of epiphytic algae, potentially contributing to eelgrass persistence.

  13. Inferring the nature of anthropogenic threats from long-term abundance records.

    PubMed

    Shoemaker, Kevin T; Akçakaya, H Resit

    2015-02-01

    Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: southern bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
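
    The model-weighting step described above can be sketched in a toy form (an illustration of AIC-style weighting of candidate threat models, not the authors' actual Bayesian framework; the logistic model, parameter grids, and noise level are all assumptions made for this sketch):

```python
import math
import random

def project(n0, r, k0, steps, harvest=0.0, k_loss=0.0):
    """Logistic projection with optional constant harvest or linear habitat (K) loss."""
    n, traj = float(n0), []
    for t in range(steps):
        k = max(k0 - k_loss * t, 1.0)
        n = max(n + r * n * (1.0 - n / k) - harvest, 0.0)
        traj.append(n)
    return traj

def sse(obs, pred):
    return sum((o - p) ** 2 for o, p in zip(obs, pred))

# "Observed" series: exploitation is the true threatening process (plus noise).
rng = random.Random(42)
obs = [n + rng.gauss(0.0, 5.0) for n in project(1000, 0.3, 1000, 30, harvest=80.0)]

# Fit each candidate mechanism by a crude grid search over its free parameter.
best = {
    "exploitation": min(sse(obs, project(1000, 0.3, 1000, 30, harvest=h))
                        for h in (20, 40, 60, 80, 100)),
    "habitat_loss": min(sse(obs, project(1000, 0.3, 1000, 30, k_loss=d))
                        for d in (5, 10, 20, 30, 40)),
}

# AIC-style weights from SSE (both candidates have one free parameter,
# so the parameter-count penalty cancels).
n_obs = len(obs)
aic = {m: n_obs * math.log(s / n_obs) for m, s in best.items()}
a_min = min(aic.values())
raw = {m: math.exp(-0.5 * (a - a_min)) for m, a in aic.items()}
weights = {m: v / sum(raw.values()) for m, v in raw.items()}
```

    With the exploitation-driven series as input, the exploitation model should receive most of the weight, mirroring the high identification rates the abstract reports for severe threat scenarios.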

  14. Comparison of rigorous and simple vibrational models for the CO2 gasdynamic laser

    NASA Technical Reports Server (NTRS)

    Monson, D. J.

    1977-01-01

    The accuracy of a simple vibrational model for computing the gain in a CO2 gasdynamic laser is assessed by comparing results computed from it with results computed from a rigorous vibrational model. The simple model is that of Anderson et al. (1971), in which the vibrational kinetics are modeled by grouping the nonequilibrium vibrational degrees of freedom into two modes, to each of which there corresponds an equation describing vibrational relaxation. The two models agree fairly well in the computed gain at low temperatures, but the simple model predicts too high a gain at the higher temperatures of current interest. The sources of error contributing to the overestimation given by the simple model are determined by examining the simplified relaxation equations.

  15. Review of rigorous coupled-wave analysis and of homogeneous effective medium approximations for high spatial-frequency surface-relief gratings

    NASA Technical Reports Server (NTRS)

    Glytsis, Elias N.; Brundrett, David L.; Gaylord, Thomas K.

    1993-01-01

    A review of the rigorous coupled-wave analysis as applied to the diffraction of electromagnetic waves by gratings is presented. The analysis is valid for any polarization, angle of incidence, and conical diffraction. Cascaded and/or multiplexed gratings as well as material anisotropy can be incorporated under the same formalism. Small-period rectangular-groove gratings can also be modeled using approximately equivalent uniaxial homogeneous layers (effective media). The ordinary and extraordinary refractive indices of these layers depend on the grating filling factor, the refractive indices of the substrate and superstrate, and the ratio of the free-space wavelength to the grating period. Comparisons of the homogeneous effective medium approximations with the rigorous coupled-wave analysis are presented. Antireflection designs (single-layer or multilayer) using the effective medium models are presented and compared. These ultra-short period antireflection gratings can also be used to produce soft x-rays. Comparisons of the rigorous coupled-wave analysis with experimental results on soft x-ray generation by gratings are also included.

  16. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L

    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is in the development of approaches that can be used to rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end-user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
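
    The metamorphic-testing idea can be illustrated with a minimal sketch (a textbook frequency-dependent SIR model and a population-scaling relation chosen for illustration; this is not the paper's tooling): scaling every initial compartment by a constant must scale the whole trajectory by that constant, so a violation of the relation flags an implementation error without needing a known-correct reference output.

```python
def simulate_sir(s0, i0, r0, beta, gamma, dt, steps):
    """Forward-Euler SIR with frequency-dependent transmission beta*S*I/N."""
    s, i, r = float(s0), float(i0), float(r0)
    traj = [(s, i, r)]
    for _ in range(steps):
        n = s + i + r
        new_inf = beta * s * i / n * dt
        new_rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        traj.append((s, i, r))
    return traj

# Metamorphic relation: scaling all initial compartments by c must scale the
# entire trajectory by c (the dynamics are homogeneous of degree one).
c = 8.0
base = simulate_sir(990, 10, 0, beta=0.4, gamma=0.1, dt=0.1, steps=1000)
scaled = simulate_sir(990 * c, 10 * c, 0, beta=0.4, gamma=0.1, dt=0.1, steps=1000)
max_violation = max(abs(sv - c * bv)
                    for srow, brow in zip(scaled, base)
                    for sv, bv in zip(srow, brow))
```

    A buggy implementation (say, one that forgets the division by N) would break this relation even though no analytic solution was consulted.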

  17. High Astrometric Precision in the Calculation of the Coordinates of Orbiters in the GEO Ring

    NASA Astrophysics Data System (ADS)

    Lacruz, E.; Abad, C.; Downes, J. J.; Hernández-Pérez, F.; Casanova, D.; Tresaco, E.

    2018-04-01

    We present an astrometric method for calculating, with high precision, the positions of orbiters in the GEO ring, through a rigorous astrometric treatment of observations with a 1-m class telescope, which are part of the CIDA survey of the GEO ring. We compute the distortion pattern to correct for the systematic errors introduced by the optics and electronics of the telescope, resulting in absolute mean errors of 0.16″ and 0.12″ in right ascension and declination, respectively. These correspond to ≈ 25 m at the mean distance of the GEO ring and are thus results of good quality.

  18. Inbreeding effects on immune response in free-living song sparrows (Melospiza melodia).

    PubMed

    Reid, Jane M; Arcese, Peter; Keller, Lukas F; Elliott, Kyle H; Sampson, Laura; Hasselquist, Dennis

    2007-03-07

    The consequences of inbreeding for host immunity to parasitic infection have broad implications for the evolutionary and dynamical impacts of parasites on populations where inbreeding occurs. To rigorously assess the magnitude and the prevalence of inbreeding effects on immunity, multiple components of host immune response should be related to inbreeding coefficient (f) in free-living individuals. We used a pedigreed, free-living population of song sparrows (Melospiza melodia) to test whether individual responses to widely used experimental immune challenges varied consistently with f. The patagial swelling response to phytohaemagglutinin declined markedly with f in both females and males in both 2002 and 2003, although overall inbreeding depression was greater in males. The primary antibody response to tetanus toxoid declined with f in females but not in males in both 2004 and 2005. Primary antibody responses to diphtheria toxoid were low but tended to decline with f in 2004. Overall inbreeding depression did not solely reflect particularly strong immune responses in outbred offspring of immigrant-native pairings or weak responses in highly inbred individuals. These data indicate substantial and apparently sex-specific inbreeding effects on immune response, implying that inbred hosts may be relatively susceptible to parasitic infection to differing degrees in males and females.

  19. Internal medicine residency redesign: proposal of the Internal Medicine Working Group.

    PubMed

    Horwitz, Ralph I; Kassirer, Jerome P; Holmboe, Eric S; Humphrey, Holly J; Verghese, Abraham; Croft, Carol; Kwok, Minjung; Loscalzo, Joseph

    2011-09-01

    Concerned with the quality of internal medicine training, many leaders in the field assembled to assess the state of the residency, evaluate the decline in interest in the specialty, and create a framework for invigorating the discipline. Although many external factors are responsible, we also found ourselves culpable: allowing senior role models to opt out of important training activities, ignoring a progressive atrophy of bedside skills, and focusing on lock-step curricula, lectures, and compiled diagnostic and therapeutic strategies. The group affirmed its commitment to a vision of internal medicine rooted in science and learned with mentors at the bedside. Key factors for new emphasis include patient-centered small group teaching, greater incorporation of clinical epidemiology and health services research, and better schedule control for trainees. Because previous proposals were weakened by lack of evidence, we propose to organize the Cooperative Educational Studies Group, a pool of training programs that will collect a common data set describing their programs, design interventions to be tested rigorously in multi-methodological approaches, and at the same time produce knowledge about high-quality practice. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Reinventing the High School Government Course: Rigor, Simulations, and Learning from Text

    ERIC Educational Resources Information Center

    Parker, Walter C.; Lo, Jane C.

    2016-01-01

    The high school government course is arguably the main site of formal civic education in the country today. This article presents the curriculum that resulted from a multiyear study aimed at improving the course. The pedagogic model, called "Knowledge in Action," centers on a rigorous form of project-based learning where the projects are…

  1. All Rigor and No Play Is No Way to Improve Learning

    ERIC Educational Resources Information Center

    Wohlwend, Karen; Peppler, Kylie

    2015-01-01

    The authors propose and discuss their Playshop curricular model, which they developed with teachers. Their studies suggest a playful approach supports even more rigor than the Common Core State Standards require for preschool and early grade children. Children keep their attention longer when learning comes in the form of something they can play…

  2. Scientific rigor through videogames.

    PubMed

    Treuille, Adrien; Das, Rhiju

    2014-11-01

    Hypothesis-driven experimentation - the scientific method - can be subverted by fraud, irreproducibility, and lack of rigorous predictive tests. A robust solution to these problems may be the 'massive open laboratory' model, recently embodied in the internet-scale videogame EteRNA. Deploying similar platforms throughout biology could enforce the scientific method more broadly. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Rigorous Photogrammetric Processing of CHANG'E-1 and CHANG'E-2 Stereo Imagery for Lunar Topographic Mapping

    NASA Astrophysics Data System (ADS)

    Di, K.; Liu, Y.; Liu, B.; Peng, M.

    2012-07-01

    Chang'E-1 (CE-1) and Chang'E-2 (CE-2) are the two lunar orbiters of China's lunar exploration program. Topographic mapping using CE-1 and CE-2 images is of great importance for scientific research as well as for preparation of landing and surface operation of the Chang'E-3 lunar rover. In this research, we developed rigorous sensor models of the CE-1 and CE-2 CCD cameras based on the push-broom imaging principle with interior and exterior orientation parameters. Based on the rigorous sensor model, the 3D coordinate of a ground point in the lunar body-fixed (LBF) coordinate system can be calculated by space intersection from the image coordinates of conjugate points in stereo images, and the image coordinates can be calculated from 3D coordinates by back-projection. Due to uncertainties of the orbit and the camera, the back-projected image points are different from the measured points. In order to reduce these inconsistencies and improve precision, we proposed two methods to refine the rigorous sensor model: 1) refining the exterior orientation parameters (EOPs) by correcting the attitude angle bias, 2) refining the interior orientation model by calibration of the relative position of the two linear CCD arrays. Experimental results show that the mean back-projection residuals of CE-1 images are reduced to better than 1/100 pixel by method 1 and the mean back-projection residuals of CE-2 images are reduced from over 20 pixels to 0.02 pixel by method 2. Consequently, high precision DEM (Digital Elevation Model) and DOM (Digital Ortho Map) products are automatically generated.

  4. Digital morphogenesis via Schelling segregation

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2018-04-01

    Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012, Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
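
    The clustering behaviour discussed above can be played with in a toy one-dimensional variant (a simplification for illustration, not the paper's exact setting or parameters): agents of two types sit on a ring, an agent is unhappy when fewer than a fraction tau of its nearest neighbours share its type, and unhappy agents of opposite type swap positions.

```python
import random

def same_type_fraction(ring, idx, w=2):
    """Fraction of the 2*w nearest ring neighbours sharing the agent's type."""
    n = len(ring)
    neigh = [ring[(idx + d) % n] for d in range(-w, w + 1) if d != 0]
    return sum(1 for t in neigh if t == ring[idx]) / len(neigh)

def mean_segregation(ring, w=2):
    return sum(same_type_fraction(ring, i, w) for i in range(len(ring))) / len(ring)

def run_schelling(n=200, tau=0.5, w=2, sweeps=500, seed=1):
    rng = random.Random(seed)
    ring = [rng.choice([0, 1]) for _ in range(n)]
    start = mean_segregation(ring, w)
    for _ in range(sweeps):
        unhappy0 = [i for i in range(n)
                    if ring[i] == 0 and same_type_fraction(ring, i, w) < tau]
        unhappy1 = [i for i in range(n)
                    if ring[i] == 1 and same_type_fraction(ring, i, w) < tau]
        if not unhappy0 or not unhappy1:
            break  # no opposite-type pair left to swap
        rng.shuffle(unhappy0)
        rng.shuffle(unhappy1)
        for a, b in zip(unhappy0, unhappy1):  # swap unhappy agents of opposite type
            ring[a], ring[b] = ring[b], ring[a]
    end = mean_segregation(ring, w)
    return start, end

start, end = run_schelling()
```

    Starting from a random mix (mean same-type neighbour fraction near 0.5), the swap dynamics typically drive the ring toward larger homogeneous blocks, so the segregation measure rises, which is the qualitative behaviour whose parameter dependence the paper analyses rigorously.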

  5. Evaluating the links between climate, disease spread, and amphibian declines.

    PubMed

    Rohr, Jason R; Raffel, Thomas R; Romansic, John M; McCallum, Hamish; Hudson, Peter J

    2008-11-11

    Human alteration of the environment has arguably propelled the Earth into its sixth mass extinction event and amphibians, the most threatened of all vertebrate taxa, are at the forefront. Many of the worldwide amphibian declines have been caused by the chytrid fungus, Batrachochytrium dendrobatidis (Bd), and two contrasting hypotheses have been proposed to explain these declines. Positive correlations between global warming and Bd-related declines sparked the chytrid-thermal-optimum hypothesis, which proposes that global warming increased cloud cover in warm years that drove the convergence of daytime and nighttime temperatures toward the thermal optimum for Bd growth. In contrast, the spatiotemporal-spread hypothesis states that Bd-related declines are caused by the introduction and spread of Bd, independent of climate change. We provide a rigorous test of these hypotheses by evaluating (i) whether cloud cover, temperature convergence, and predicted temperature-dependent Bd growth are significant positive predictors of amphibian extinctions in the genus Atelopus and (ii) whether spatial structure in the timing of these extinctions can be detected without making assumptions about the location, timing, or number of Bd emergences. We show that there is spatial structure to the timing of Atelopus spp. extinctions but that the cause of this structure remains equivocal, emphasizing the need for further molecular characterization of Bd. We also show that the reported positive multi-decade correlation between Atelopus spp. extinctions and mean tropical air temperature in the previous year is indeed robust, but the evidence that it is causal is weak because numerous other variables, including regional banana and beer production, were better predictors of these extinctions. Finally, almost all of our findings were opposite to the predictions of the chytrid-thermal-optimum hypothesis. Although climate change is likely to play an important role in worldwide amphibian declines, more convincing evidence is needed of a causal link.

  6. Demography of the Pacific walrus (Odobenus rosmarus divergens): 1974-2006

    USGS Publications Warehouse

    Taylor, Rebecca L.; Udevitz, Mark S.

    2015-01-01

    Global climate change may fundamentally alter population dynamics of many species for which baseline population parameter estimates are imprecise or lacking. Historically, the Pacific walrus is thought to have been limited by harvest, but it may become limited by global warming-induced reductions in sea ice. Loss of sea ice, on which walruses rest between foraging bouts, may reduce access to food, thus lowering vital rates. Rigorous walrus survival rate estimates do not exist, and other population parameter estimates are out of date or have well-documented bias and imprecision. To provide useful population parameter estimates we developed a Bayesian, hidden process demographic model of walrus population dynamics from 1974 through 2006 that combined annual age-specific harvest estimates with five population size estimates, six standing age structure estimates, and two reproductive rate estimates. Median density independent natural survival was high for juveniles (0.97) and adults (0.99), and annual density dependent vital rates rose from 0.06 to 0.11 for reproduction, 0.31 to 0.59 for survival of neonatal calves, and 0.39 to 0.85 for survival of older calves, concomitant with a population decline. This integrated population model provides a baseline for estimating changing population dynamics resulting from changing harvests or sea ice.

  7. Assessing the Rigor of HS Curriculum in Admissions Decisions: A Functional Method, Plus Practical Advising for Prospective Students and High School Counselors

    ERIC Educational Resources Information Center

    Micceri, Theodore; Brigman, Leellen; Spatig, Robert

    2009-01-01

    An extensive, internally cross-validated analytical study using nested (within academic disciplines) Multilevel Modeling (MLM) on 4,560 students identified functional criteria for defining high school curriculum rigor and further determined which measures could best be used to help guide decision making for marginal applicants. The key outcome…

  8. A rigorous test of the accuracy of USGS digital elevation models in forested areas of Oregon and Washington.

    Treesearch

    Ward W. Carson; Stephen E. Reutebuch

    1997-01-01

    A procedure for performing a rigorous test of elevational accuracy of DEMs using independent ground coordinate data digitized photogrammetrically from aerial photography is presented. The accuracy of a sample set of 23 DEMs covering National Forests in Oregon and Washington was evaluated. Accuracy varied considerably between eastern and western parts of Oregon and...

  9. Accelerating Biomedical Discoveries through Rigor and Transparency.

    PubMed

    Hewitt, Judith A; Brown, Liliana L; Murphy, Stephanie J; Grieder, Franziska; Silberberg, Shai D

    2017-07-01

    Difficulties in reproducing published research findings have garnered a lot of press in recent years. As a funder of biomedical research, the National Institutes of Health (NIH) has taken measures to address underlying causes of low reproducibility. Extensive deliberations resulted in a policy, released in 2015, to enhance reproducibility through rigor and transparency. We briefly explain what led to the policy, describe its elements, provide examples and resources for the biomedical research community, and discuss the potential impact of the policy on translatability with a focus on research using animal models. Importantly, while increased attention to rigor and transparency may lead to an increase in the number of laboratory animals used in the near term, it will lead to more efficient and productive use of such resources in the long run. The translational value of animal studies will be improved through more rigorous assessment of experimental variables and data, leading to better assessments of the translational potential of animal models, for the benefit of the research community and society. Published by Oxford University Press on behalf of the Institute for Laboratory Animal Research 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  10. Image synthesis for SAR system, calibration and processor design

    NASA Technical Reports Server (NTRS)

    Holtzman, J. C.; Abbott, J. L.; Kaupp, V. H.; Frost, V. S.

    1978-01-01

    The Point Scattering Method of simulating radar imagery rigorously models all aspects of the imaging radar phenomena. Its computational algorithms operate on a symbolic representation of the terrain test site to calculate such parameters as range, angle of incidence, resolution cell size, etc. Empirical backscatter data and elevation data are utilized to model the terrain. Additionally, the important geometrical/propagation effects such as shadow, foreshortening, layover, and local angle of incidence are rigorously treated. Applications of radar image simulation to a proposed calibrated SAR system are highlighted: soil moisture detection and vegetation discrimination.

  11. Mathematical Rigor vs. Conceptual Change: Some Early Results

    NASA Astrophysics Data System (ADS)

    Alexander, W. R.

    2003-05-01

    Results from two different pedagogical approaches to teaching introductory astronomy at the college level will be presented. The first is a descriptive, conceptually based approach that emphasizes conceptual change; this descriptive class is typically an elective for non-science majors. The other is a mathematically rigorous treatment that emphasizes problem solving and is designed to prepare students for further study in astronomy; this class is typically taken by science majors, for whom it also fulfills an elective science requirement. The Astronomy Diagnostic Test version 2 (ADT 2.0) was used as the assessment instrument, since its validity and reliability have been investigated by previous researchers. The ADT 2.0 was administered as both a pre-test and a post-test to both groups. Initial results show no significant difference between the two groups on the post-test; however, the descriptive class improved slightly more between pre- and post-testing than the mathematically rigorous course. Great care was taken to account for variables, including selection of text, class format, and instructor differences. Results indicate that the mathematically rigorous model does not improve conceptual understanding any better than the conceptual-change model. Additional results indicate a gender bias in favor of males, similar to that measured by previous investigators. This research has been funded by the College of Science and Mathematics at James Madison University.

  12. Direct computation of orbital sunrise or sunset event parameters

    NASA Technical Reports Server (NTRS)

    Buglia, J. J.

    1986-01-01

    An analytical method is developed for determining the geometrical parameters which are needed to describe the viewing angles of the Sun relative to an orbiting spacecraft when the Sun rises or sets with respect to the spacecraft. These equations are rigorous and are frequently used for parametric studies relative to mission planning and for determining instrument parameters. The text is wholly self-contained in that no external reference to ephemerides or other astronomical tables is needed. Equations are presented which allow the computation of Greenwich sidereal time and right ascension and declination of the Sun generally to within a few seconds of arc, or a few tenths of a second in time.
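
    For comparison with the few-arcsecond precision quoted above, the Sun's right ascension, declination, and Greenwich sidereal time can be computed without ephemeris tables from the standard low-precision formulas (the Astronomical Almanac's approximation, good to roughly 0.01°; this sketch is an illustrative stand-in, not the report's equations):

```python
import math

def sun_ra_dec_gmst(jd):
    """Low-precision solar RA/dec (degrees) and Greenwich mean sidereal time (hours).

    Uses the Astronomical Almanac's approximate formulas; jd is the Julian date.
    """
    n = jd - 2451545.0                                    # days since J2000.0
    mean_lon = (280.460 + 0.9856474 * n) % 360.0          # mean longitude (deg)
    g = math.radians((357.528 + 0.9856003 * n) % 360.0)   # mean anomaly (rad)
    lam = math.radians(mean_lon + 1.915 * math.sin(g)     # ecliptic longitude
                       + 0.020 * math.sin(2.0 * g))
    eps = math.radians(23.439 - 4.0e-7 * n)               # obliquity of the ecliptic
    ra = math.degrees(math.atan2(math.cos(eps) * math.sin(lam),
                                 math.cos(lam))) % 360.0
    dec = math.degrees(math.asin(math.sin(eps) * math.sin(lam)))
    gmst = (18.697374558 + 24.06570982441908 * n) % 24.0
    return ra, dec, gmst

ra, dec, gmst = sun_ra_dec_gmst(2451545.0)  # J2000.0 epoch (2000 Jan 1, 12h TT)
```

    At the J2000.0 epoch this yields a solar declination near -23.0° and right ascension near 281.3°, consistent with the Sun's position shortly after the December solstice.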

  13. Conflict: Operational Realism versus Analytical Rigor in Defense Modeling and Simulation

    DTIC Science & Technology

    2012-06-14

    Campbell, Experimental and Quasi-Experimental Designs for Generalized Causal Inference, Boston: Houghton Mifflin Company, 2002. [7] R. T. Johnson, G… experimentation? In order for an experiment to be considered rigorous, and the results valid, the experiment should be designed using established… In addition to the interview, the pilots were administered a written survey, designed to capture their reactions regarding the level of realism present

  14. High-order computer-assisted estimates of topological entropy

    NASA Astrophysics Data System (ADS)

    Grote, Johannes

    The concept of Taylor Models is introduced, which offers highly accurate C0-estimates for the enclosures of functional dependencies, combining high-order Taylor polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified interval arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly nonlinear dynamical systems. A method to obtain sharp rigorous enclosures of Poincare maps for certain types of flows and surfaces is developed and numerical examples are presented. Differential algebraic techniques allow the efficient and accurate computation of polynomial approximations for invariant curves of certain planar maps around hyperbolic fixed points. Subsequently we introduce a procedure to extend these polynomial curves to verified Taylor Model enclosures of local invariant manifolds with C0-errors of size 10^-10 to 10^-14, and proceed to generate the global invariant manifold tangle up to comparable accuracy through iteration in Taylor Model arithmetic. Knowledge of the global manifold structure up to finite iterations of the local manifold pieces enables us to find all homoclinic and heteroclinic intersections in the generated manifold tangle. Combined with the mapping properties of the homoclinic points and their ordering, we are able to construct a subshift of finite type as a topological factor of the original planar system to obtain rigorous lower bounds for its topological entropy. This construction is fully automatic and yields homoclinic tangles with several hundred homoclinic points. As an example, rigorous lower bounds for the topological entropy of the Hénon map are computed, which, to the best knowledge of the authors, yield the largest such estimates published so far.

  15. Testability of evolutionary game dynamics based on experimental economics data

    NASA Astrophysics Data System (ADS)

    Wang, Yijia; Chen, Xiaojie; Wang, Zhijian

    2017-11-01

    Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
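
    The angular-momentum measurement can be sketched for standard replicator dynamics on Rock-Paper-Scissors (a textbook model, assumed here as one of the "typical dynamics models"; the projection, initial condition, and step size are illustrative): trajectories cycle around the simplex centroid, so successive states accumulate a rotational signal of consistent sign.

```python
# Replicator dynamics for zero-sum Rock-Paper-Scissors, integrated with RK4;
# the cumulative "angular momentum" of the projected trajectory about the
# simplex centroid serves as the measurement variable.
A = [[0.0, -1.0, 1.0],
     [1.0, 0.0, -1.0],
     [-1.0, 1.0, 0.0]]  # antisymmetric payoff matrix

def replicator(x):
    fitness = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
    avg = sum(x[i] * fitness[i] for i in range(3))
    return [x[i] * (fitness[i] - avg) for i in range(3)]

def rk4_step(x, dt):
    k1 = replicator(x)
    k2 = replicator([x[i] + 0.5 * dt * k1[i] for i in range(3)])
    k3 = replicator([x[i] + 0.5 * dt * k2[i] for i in range(3)])
    k4 = replicator([x[i] + dt * k3[i] for i in range(3)])
    return [x[i] + dt / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(3)]

x = [0.5, 0.25, 0.25]
ang_mom, dt = 0.0, 0.01
for _ in range(5000):
    nxt = rk4_step(x, dt)
    # z-component of the cross product of (state - centroid) and the step,
    # using the (x1, x2) projection of the simplex
    u, v = x[0] - 1.0 / 3.0, x[1] - 1.0 / 3.0
    du, dv = nxt[0] - x[0], nxt[1] - x[1]
    ang_mom += u * dv - v * du
    x = nxt
```

    A model without this persistent cycling (e.g. one converging straight to the centroid) would accumulate negligible angular momentum, which is the kind of qualitative discrepancy the goodness-of-fit test is designed to detect.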

  16. Effects of training and simulated combat stress on leg tourniquet application accuracy, time, and effectiveness.

    PubMed

    Schreckengaust, Richard; Littlejohn, Lanny; Zarow, Gregory J

    2014-02-01

    The lower extremity tourniquet failure rate remains significantly higher in combat than in preclinical testing, so we hypothesized that tourniquet placement accuracy, speed, and effectiveness would improve during training and decline during simulated combat. Navy Hospital Corpsmen (N = 89), enrolled in a Tactical Combat Casualty Care training course in preparation for deployment, applied the Combat Application Tourniquet (CAT) and the Special Operations Forces Tactical Tourniquet (SOFT-T) on day 1 and day 4 of classroom training, then under simulated combat, wherein participants ran an obstacle course to apply a tourniquet while wearing full body armor and avoiding simulated small-arms fire (paintballs). Application time and pulse elimination effectiveness improved from day 1 to day 4 (p < 0.005). Under simulated combat, application time slowed significantly (p < 0.001), whereas accuracy and effectiveness declined slightly. Pulse elimination was poor for the CAT (25% failure) and the SOFT-T (60% failure) even in classroom conditions following training. The CAT was more quickly applied (p < 0.005) and more effective (p < 0.002) than the SOFT-T. Training fostered fast and effective application of leg tourniquets, while performance declined under simulated combat. The inherent efficacy of tourniquet products contributes to high failure rates under combat conditions, pointing to the need for superior tourniquets and for rigorous deployment preparation training in simulated combat scenarios. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  17. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    PubMed

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Kinetics versus thermodynamics in materials modeling: The case of the di-vacancy in iron

    NASA Astrophysics Data System (ADS)

    Djurabekova, F.; Malerba, L.; Pasianot, R. C.; Olsson, P.; Nordlund, K.

    2010-07-01

    Monte Carlo models are widely used for the study of microstructural and microchemical evolution of materials under irradiation. However, they often link explicitly the relevant activation energies to the energy difference between local equilibrium states. We provide a simple example (di-vacancy migration in iron) in which a rigorous activation energy calculation, by means of both empirical interatomic potentials and density functional theory methods, clearly shows that such a link is not granted, revealing a migration mechanism that a thermodynamics-linked activation energy model cannot predict. Such a mechanism is, however, fully consistent with thermodynamics. This example emphasizes the importance of basing Monte Carlo methods on models where the activation energies are rigorously calculated, rather than deduced from widespread heuristic equations.
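    The distinction the authors draw (saddle-point barriers versus end-state energy differences) matters because jump rates in kinetic Monte Carlo models depend exponentially on the activation energy. A minimal sketch of the standard harmonic transition-state rate, with an assumed attempt frequency of 10^13 Hz and illustrative energies not taken from the paper:

    ```python
    import math

    KB_EV = 8.617333262e-5  # Boltzmann constant, eV/K

    def arrhenius_rate(e_act_ev, temp_k, prefactor_hz=1e13):
        """Thermally activated jump rate: nu0 * exp(-E_act / (kB * T)).

        In a kinetic Monte Carlo model, E_act should come from a rigorous
        saddle-point (barrier) calculation, not merely from the energy
        difference between the two end states."""
        return prefactor_hz * math.exp(-e_act_ev / (KB_EV * temp_k))

    # Illustrative (hypothetical) numbers: a 0.1 eV error in the assumed
    # barrier changes the 300 K rate by roughly a factor of 50.
    ```

    This exponential sensitivity is why a heuristic activation energy tied to the end-state energy difference can miss migration mechanisms entirely, as in the di-vacancy example above.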

  19. A Rigorous Sharp Interface Limit of a Diffuse Interface Model Related to Tumor Growth

    NASA Astrophysics Data System (ADS)

    Rocca, Elisabetta; Scala, Riccardo

    2017-06-01

    In this paper, we study the rigorous sharp interface limit of a diffuse interface model related to the dynamics of tumor growth, when a parameter ɛ, representing the interface thickness between the tumorous and non-tumorous cells, tends to zero. In particular, we analyze here a gradient-flow-type model arising from a modification of the recently introduced model for tumor growth dynamics in Hawkins-Daruud et al. (Int J Numer Math Biomed Eng 28:3-24, 2011) (cf. also Hilhorst et al. Math Models Methods Appl Sci 25:1011-1043, 2015). Exploiting the techniques related to both gradient flows and gamma convergence, we recover a condition on the interface Γ relating the chemical and double-well potentials, the mean curvature, and the normal velocity.

  20. A Mathematical Evaluation of the Core Conductor Model

    PubMed Central

    Clark, John; Plonsey, Robert

    1966-01-01

    This paper is a mathematical evaluation of the core conductor model where its three dimensionality is taken into account. The problem considered is that of a single, active, unmyelinated nerve fiber situated in an extensive, homogeneous, conducting medium. Expressions for the various core conductor parameters have been derived in a mathematically rigorous manner according to the principles of electromagnetic theory. The purpose of employing mathematical rigor in this study is to bring to light the inherent assumptions of the one dimensional core conductor model, providing a method of evaluating the accuracy of this linear model. Based on the use of synthetic squid axon data, the conclusion of this study is that the linear core conductor model is a good approximation for internal but not external parameters. PMID:5903155
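    The one-dimensional cable model evaluated here is conventionally characterized by its steady-state length constant. A minimal sketch using illustrative squid-axon-like parameter values (ours, not the paper's data):

    ```python
    import math

    def length_constant_cm(a_cm, rm_ohm_cm2, ri_ohm_cm):
        """Steady-state length constant of the one-dimensional (linear)
        cable model: lambda = sqrt(a * R_m / (2 * R_i)), for fiber radius
        a (cm), specific membrane resistance R_m (ohm*cm^2), and
        axoplasm resistivity R_i (ohm*cm)."""
        return math.sqrt(a_cm * rm_ohm_cm2 / (2.0 * ri_ohm_cm))

    # Illustrative squid-axon-like values: a = 0.025 cm,
    # R_m = 1000 ohm*cm^2, R_i = 30 ohm*cm give a length
    # constant of roughly 0.65 cm.
    ```

    The length constant sets the spatial scale over which subthreshold potentials decay along the fiber; the paper's point is that such one-dimensional parameters approximate the full three-dimensional problem well internally but not externally.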

  1. An Evidence-Based Construction of the Models of Decline of Functioning. Part 1: Two Major Models of Decline of Functioning

    ERIC Educational Resources Information Center

    Okawa, Yayoi; Nakamura, Shigemi; Kudo, Minako; Ueda, Satoshi

    2009-01-01

    The purpose of this study is to confirm the working hypothesis on two major models of functioning decline and two corresponding models of rehabilitation program in an older population through detailed interviews with the persons who have functioning declines and on-the-spot observations of key activities on home visits. A total of 542…

  2. ZY3-02 Laser Altimeter Footprint Geolocation Prediction

    PubMed Central

    Xie, Junfeng; Tang, Xinming; Mo, Fan; Li, Guoyuan; Zhu, Guangbin; Wang, Zhenming; Fu, Xingke; Gao, Xiaoming; Dou, Xianhui

    2017-01-01

    Successfully launched on 30 May 2016, ZY3-02 is the first Chinese surveying and mapping satellite equipped with a lightweight laser altimeter. Calibration is necessary before the laser altimeter becomes operational. Laser footprint location prediction is the first step in calibration based on ground infrared detectors, and it is difficult because the sample frequency of the ZY3-02 laser altimeter is 2 Hz and the distance between two adjacent laser footprints is about 3.5 km. In this paper, we build an on-orbit rigorous geometric prediction model by reference to the rigorous geometric model of optical remote sensing satellites. The model includes three kinds of data that must be predicted: pointing angle, orbit parameters, and attitude angles. The proposed method is verified by a ZY3-02 laser altimeter on-orbit geometric calibration test. Five laser footprint prediction experiments are conducted based on the model, and the laser footprint prediction accuracy is better than 150 m on the ground. The effectiveness and accuracy of the on-orbit rigorous geometric prediction model are confirmed by the test results. The geolocation is predicted precisely by the proposed method, and this provides a reference for geolocation prediction of future land laser detectors in other laser altimeter calibration tests. PMID:28934160

  3. ZY3-02 Laser Altimeter Footprint Geolocation Prediction.

    PubMed

    Xie, Junfeng; Tang, Xinming; Mo, Fan; Li, Guoyuan; Zhu, Guangbin; Wang, Zhenming; Fu, Xingke; Gao, Xiaoming; Dou, Xianhui

    2017-09-21

    Successfully launched on 30 May 2016, ZY3-02 is the first Chinese surveying and mapping satellite equipped with a lightweight laser altimeter. Calibration is necessary before the laser altimeter becomes operational. Laser footprint location prediction is the first step in calibration based on ground infrared detectors, and it is difficult because the sample frequency of the ZY3-02 laser altimeter is 2 Hz and the distance between two adjacent laser footprints is about 3.5 km. In this paper, we build an on-orbit rigorous geometric prediction model by reference to the rigorous geometric model of optical remote sensing satellites. The model includes three kinds of data that must be predicted: pointing angle, orbit parameters, and attitude angles. The proposed method is verified by a ZY3-02 laser altimeter on-orbit geometric calibration test. Five laser footprint prediction experiments are conducted based on the model, and the laser footprint prediction accuracy is better than 150 m on the ground. The effectiveness and accuracy of the on-orbit rigorous geometric prediction model are confirmed by the test results. The geolocation is predicted precisely by the proposed method, and this provides a reference for geolocation prediction of future land laser detectors in other laser altimeter calibration tests.

  4. Skill Assessment for Coupled Biological/Physical Models of Marine Systems.

    PubMed

    Stow, Craig A; Jolliff, Jason; McGillicuddy, Dennis J; Doney, Scott C; Allen, J Icarus; Friedrichs, Marjorie A M; Rose, Kenneth A; Wallhead, Philip

    2009-02-20

    Coupled biological/physical models of marine systems serve many purposes including the synthesis of information, hypothesis generation, and as a tool for numerical experimentation. However, marine system models are increasingly used for prediction to support high-stakes decision-making. In such applications it is imperative that a rigorous model skill assessment is conducted so that the model's capabilities are tested and understood. Herein, we review several metrics and approaches useful to evaluate model skill. The definition of skill and the determination of the skill level necessary for a given application is context specific and no single metric is likely to reveal all aspects of model skill. Thus, we recommend the use of several metrics, in concert, to provide a more thorough appraisal. The routine application and presentation of rigorous skill assessment metrics will also serve the broader interests of the modeling community, ultimately resulting in improved forecasting abilities as well as helping us recognize our limitations.
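    The kinds of metrics reviewed can be illustrated with a small sketch (generic metrics; the function name and the particular skill score are ours, not the paper's):

    ```python
    import math

    def skill_metrics(obs, mod):
        """Simple model-skill metrics for paired observations (obs) and
        model predictions (mod): root-mean-square error, mean bias, and a
        Murphy-style skill score relative to predicting the observed mean."""
        n = len(obs)
        mean_obs = sum(obs) / n
        rmse = math.sqrt(sum((m - o) ** 2 for o, m in zip(obs, mod)) / n)
        bias = sum(m - o for o, m in zip(obs, mod)) / n
        var_obs = sum((o - mean_obs) ** 2 for o in obs) / n
        # Skill: 1 is a perfect model; 0 is no better than the observed mean.
        skill = 1.0 - (rmse ** 2) / var_obs if var_obs > 0 else float("nan")
        return {"rmse": rmse, "bias": bias, "skill": skill}
    ```

    As the abstract stresses, no single number captures all aspects of skill: a model can have zero bias yet poor skill, so several metrics should be reported in concert.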

  5. Meat quality and rigor mortis development in broiler chickens with gas-induced anoxia and postmortem electrical stimulation.

    PubMed

    Sams, A R; Dzuik, C S

    1999-10-01

    This study was conducted to evaluate the combined rigor-accelerating effects of postmortem electrical stimulation (ES) and argon-induced anoxia (Ar) of broiler chickens. One hundred broilers were processed in the following treatments: untreated controls, ES, Ar, or Ar with ES (Ar + ES). Breast fillets were harvested at 1 h postmortem for all treatments or at 1 and 6 h postmortem for the control carcasses. Fillets were sampled for pH and ratio of inosine to adenosine (R-value) and were then individually quick frozen (IQF) or aged on ice (AOI) until 24 h postmortem. Color was measured in the AOI fillets at 24 h postmortem. All fillets were then cooked and evaluated for Allo-Kramer shear value. The Ar treatment accelerated the normal pH decline, whereas the ES and Ar + ES treatments yielded even lower pH values at 1 h postmortem. The Ar + ES treatment had a greater R-value than the ES treatment, which was greater than either the Ar or 1-h controls, which, in turn, were not different from each other. The ES treatment had the lowest L* value, and ES, Ar, and Ar + ES produced significantly higher a* values than the 1-h controls. For the IQF fillets, the ES and Ar + ES treatments were not different in shear value but were lower than Ar, which was lower than the 1-h controls. The same was true for the AOI fillets except that the ES and the Ar treatments were not different. These results indicated that although ES and Ar had rigor-accelerating and tenderizing effects, ES seemed to be more effective than Ar; there was little enhancement when Ar was added to the ES treatment and fillets were deboned at 1 h postmortem.

  6. Calibration and Stokes Imaging with Full Embedded Element Primary Beam Model for the Murchison Widefield Array

    NASA Astrophysics Data System (ADS)

    Sokolowski, M.; Colegate, T.; Sutinjo, A. T.; Ung, D.; Wayth, R.; Hurley-Walker, N.; Lenc, E.; Pindor, B.; Morgan, J.; Kaplan, D. L.; Bell, M. E.; Callingham, J. R.; Dwarakanath, K. S.; For, Bi-Qing; Gaensler, B. M.; Hancock, P. J.; Hindson, L.; Johnston-Hollitt, M.; Kapińska, A. D.; McKinley, B.; Offringa, A. R.; Procopio, P.; Staveley-Smith, L.; Wu, C.; Zheng, Q.

    2017-11-01

    The Murchison Widefield Array (MWA), located in Western Australia, is one of the low-frequency precursors of the international Square Kilometre Array (SKA) project. In addition to pursuing its own ambitious science programme, it is also a testbed for a wide range of future SKA activities, from hardware and software to data analysis. The key science programmes for the MWA and SKA require very high dynamic ranges, which challenge calibration and imaging systems. Correct calibration of the instrument and accurate measurements of source flux densities and polarisations require precise characterisation of the telescope's primary beam. Recent results from the MWA GaLactic Extragalactic All-sky Murchison Widefield Array (GLEAM) survey show that the previously implemented Average Embedded Element (AEE) model still leaves residual polarisation errors of up to 10-20% in Stokes Q. We present a new simulation-based Full Embedded Element (FEE) model, which is the most rigorous realisation yet of the MWA's primary beam model. It enables efficient calculation of the MWA beam response in arbitrary directions without the need for spatial interpolation. In the new model, every dipole in the MWA tile (4 × 4 bow-tie dipoles) is simulated separately, taking into account all mutual coupling, ground screen, and soil effects, and therefore accounts for the different properties of the individual dipoles within a tile. We have applied the FEE beam model to GLEAM observations at 200-231 MHz and used false Stokes parameter leakage as a metric to compare the models. We have determined that the FEE model reduced the magnitude and declination-dependent behaviour of false polarisation in Stokes Q and V while retaining low levels of false polarisation in Stokes U.

  7. Circular instead of hierarchical: methodological principles for the evaluation of complex interventions

    PubMed Central

    Walach, Harald; Falkenberg, Torkel; Fønnebø, Vinjar; Lewith, George; Jonas, Wayne B

    2006-01-01

    Background The reasoning behind evaluating medical interventions is that a hierarchy of methods exists which successively produces improved, and therefore more rigorous, evidence upon which to base clinical decisions. At the foundation of this hierarchy are case studies, retrospective and prospective case series, followed by cohort studies with historical and concomitant non-randomized controls. Open-label randomized controlled studies (RCTs), and finally blinded, placebo-controlled RCTs, which offer the most internal validity, are considered the most reliable evidence. Rigorous RCTs remove bias. Evidence from RCTs forms the basis of meta-analyses and systematic reviews. This hierarchy, founded on a pharmacological model of therapy, is generalized to other interventions which may be complex and non-pharmacological (healing, acupuncture and surgery). Discussion The hierarchical model is valid for limited questions of efficacy, for instance for regulatory purposes and newly devised products and pharmacological preparations. It is inadequate for the evaluation of complex interventions such as physiotherapy, surgery and complementary and alternative medicine (CAM). This has to do with the essential tension between internal validity (rigor and the removal of bias) and external validity (generalizability). Summary Instead of an Evidence Hierarchy, we propose a Circular Model. This would imply a multiplicity of methods, using different designs, counterbalancing their individual strengths and weaknesses to arrive at pragmatic but equally rigorous evidence which would provide significant assistance in clinical and health systems innovation. Such evidence would better inform national health care technology assessment agencies and promote evidence based health reform. PMID:16796762

  8. The KP Approximation Under a Weak Coriolis Forcing

    NASA Astrophysics Data System (ADS)

    Melinand, Benjamin

    2018-02-01

    In this paper, we study the asymptotic behavior of weakly transverse water-waves under a weak Coriolis forcing in the long wave regime. We derive the Boussinesq-Coriolis equations in this setting and we provide a rigorous justification of this model. Then, from these equations, we derive two other asymptotic models. When the Coriolis forcing is weak, we fully justify the rotation-modified Kadomtsev-Petviashvili equation (also called Grimshaw-Melville equation). When the Coriolis forcing is very weak, we rigorously justify the Kadomtsev-Petviashvili equation. This work provides the first mathematical justification of the KP approximation under a Coriolis forcing.
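    For context, the Kadomtsev-Petviashvili (KP) equation whose approximation is justified here has the standard weakly transverse form (a textbook normalization, not transcribed from the paper):

    ```latex
    \partial_x\left(\partial_t u + u\,\partial_x u + \partial_x^3 u\right) + \lambda\,\partial_y^2 u = 0,
    ```

    where the sign of \lambda distinguishes the KP-I and KP-II regimes, and the outer \partial_x reflects the weak dependence on the transverse variable y.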

  9. Imaging 2D optical diffuse reflectance in skeletal muscle

    NASA Astrophysics Data System (ADS)

    Ranasinghesagara, Janaka; Yao, Gang

    2007-04-01

    We discovered a unique pattern of optical reflectance from fresh prerigor skeletal muscles that cannot be described using existing theories. A numerical fitting function was developed to quantify the equi-intensity contours of acquired reflectance images. Using this model, we studied the changes in the reflectance profile during stretching and the rigor process. We found that the prominent anisotropic features diminished after rigor completion. These results suggest that muscle sarcomere structures play important roles in modulating light propagation in whole muscle. When incorporating sarcomere diffraction in a Monte Carlo model, we showed that the resulting reflectance profiles quantitatively resembled the experimental observations.

  10. Primates decline rapidly in unprotected forests: evidence from a monitoring program with data constraints.

    PubMed

    Rovero, Francesco; Mtui, Arafat; Kitegile, Amani; Jacob, Philipo; Araldi, Alessandro; Tenan, Simone

    2015-01-01

    Growing threats to primates in tropical forests make robust and long-term population abundance assessments increasingly important for conservation. Concomitantly, monitoring becomes particularly relevant in countries with primate habitat. Yet monitoring schemes in these countries often suffer from logistic constraints and/or poor rigor in data collection, and a lack of consideration of sources of bias in analysis. To address the need for feasible monitoring schemes and flexible analytical tools for robust trend estimates, we analyzed data collected by local technicians on abundance of three species of arboreal monkey in the Udzungwa Mountains of Tanzania (two Colobus species and one Cercopithecus), an area of international importance for primate endemism and conservation. We counted primate social groups along eight line transects in two forest blocks in the area, one protected and one unprotected, over a span of 11 years. We applied a recently proposed open metapopulation model to estimate abundance trends while controlling for confounding effects of observer, site, and season. Primate populations were stable in the protected forest, while the colobines, including the endemic Udzungwa red colobus, declined severely in the unprotected forest. Targeted hunting pressure at this second site is the most plausible explanation for the trend observed. The unexplained variability in detection probability among transects was greater than the variability due to observers, indicating consistency in data collection among observers. There were no significant differences in both primate abundance and detectability between wet and dry seasons, supporting the choice of sampling during the dry season only based on minimizing practical constraints. Results show that simple monitoring routines implemented by trained local technicians can effectively detect changes in primate populations in tropical countries. 
The hierarchical Bayesian model formulation adopted provides a flexible tool to determine temporal trends with full account for any imbalance in the data set and for imperfect detection.

  11. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    PubMed

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  12. Optical simulations of organic light-emitting diodes through a combination of rigorous electromagnetic solvers and Monte Carlo ray-tracing methods

    NASA Astrophysics Data System (ADS)

    Bahl, Mayank; Zhou, Gui-Rong; Heller, Evan; Cassarly, William; Jiang, Mingming; Scarmozzino, Rob; Gregory, G. Groot

    2014-09-01

    Over the last two decades there has been extensive research done to improve the design of Organic Light Emitting Diodes (OLEDs) so as to enhance light extraction efficiency, improve beam shaping, and allow color tuning through techniques such as the use of patterned substrates, photonic crystal (PCs) gratings, back reflectors, surface texture, and phosphor down-conversion. Computational simulation has been an important tool for examining these increasingly complex designs. It has provided insights for improving OLED performance as a result of its ability to explore limitations, predict solutions, and demonstrate theoretical results. Depending upon the focus of the design and scale of the problem, simulations are carried out using rigorous electromagnetic (EM) wave optics based techniques, such as finite-difference time-domain (FDTD) and rigorous coupled wave analysis (RCWA), or through ray optics based technique such as Monte Carlo ray-tracing. The former are typically used for modeling nanostructures on the OLED die, and the latter for modeling encapsulating structures, die placement, back-reflection, and phosphor down-conversion. This paper presents the use of a mixed-level simulation approach which unifies the use of EM wave-level and ray-level tools. This approach uses rigorous EM wave based tools to characterize the nanostructured die and generate both a Bidirectional Scattering Distribution function (BSDF) and a far-field angular intensity distribution. These characteristics are then incorporated into the ray-tracing simulator to obtain the overall performance. Such mixed-level approach allows for comprehensive modeling of the optical characteristic of OLEDs and can potentially lead to more accurate performance than that from individual modeling tools alone.

  13. Can innovative health financing policies increase access to MDG-related services? Evidence from Rwanda.

    PubMed

    Sekabaraga, Claude; Diop, Francois; Soucat, Agnes

    2011-11-01

    Ensuring financial access to health services is a critical challenge for poor countries if they are to reach the health Millennium Development Goals (MDGs). This article examines the case of Rwanda, a country which has championed innovative health care financing policies. Between 2000 and 2007, Rwanda improved financial access for the poor, increased utilization of health services and reduced out-of-pocket payments for health care. Poor groups' utilization has increased for all health services, sometimes dramatically. Use of assisted deliveries, for example, increased from 12.1% to 42.7% among the poorest quintile; payments at the point of delivery have also been reduced; and catastrophic expenditures have declined. Part of these achievements is likely linked to innovative health financing policies, particularly the expansion of micro-insurance ('mutuelles') and performance-based financing. The paper concludes that the Rwanda experience provides a useful example of effective implementation of policies that reduce the financial barrier to health services, thereby contributing to the health MDGs. The main challenge today is to make this system sustainable. Finally, the paper proposes a simple set of rigorous metrics to assess the impact of health financing policies and calls for implementing rigorous impact evaluation of health care financing policies in low-income countries.

  14. Rigorous high-precision enclosures of fixed points and their invariant manifolds

    NASA Astrophysics Data System (ADS)

    Wittig, Alexander N.

    The well established concept of Taylor Models is introduced, which offer highly accurate C0 enclosures of functional dependencies, combining high-order polynomial approximation of functions and rigorous estimates of the truncation error, performed using verified arithmetic. The focus of this work is on the application of Taylor Models in algorithms for strongly non-linear dynamical systems. A method is proposed to extend the existing implementation of Taylor Models in COSY INFINITY from double precision coefficients to arbitrary precision coefficients. Great care is taken to maintain the highest efficiency possible by adaptively adjusting the precision of higher order coefficients in the polynomial expansion. High precision operations are based on clever combinations of elementary floating point operations yielding exact values for round-off errors. An experimental high precision interval data type is developed and implemented. Algorithms for the verified computation of intrinsic functions based on the High Precision Interval datatype are developed and described in detail. The application of these operations in the implementation of High Precision Taylor Models is discussed. An application of Taylor Model methods to the verification of fixed points is presented by verifying the existence of a period 15 fixed point in a near standard Henon map. Verification is performed using different verified methods such as double precision Taylor Models, High Precision intervals and High Precision Taylor Models. Results and performance of each method are compared. An automated rigorous fixed point finder is implemented, allowing the fully automated search for all fixed points of a function within a given domain. It returns a list of verified enclosures of each fixed point, optionally verifying uniqueness within these enclosures. An application of the fixed point finder to the rigorous analysis of beam transfer maps in accelerator physics is presented. 
Previous work done by Johannes Grote is extended to compute very accurate polynomial approximations to invariant manifolds of discrete maps of arbitrary dimension around hyperbolic fixed points. The algorithm presented allows for automatic removal of resonances occurring during construction. A method for the rigorous enclosure of invariant manifolds of continuous systems is introduced. Using methods developed for discrete maps, polynomial approximations of invariant manifolds of hyperbolic fixed points of ODEs are obtained. These approximations are outfitted with a sharp error bound which is verified to rigorously contain the manifolds. While we focus on the three dimensional case, verification in higher dimensions is possible using similar techniques. Integrating the resulting enclosures using the verified COSY VI integrator, the initial manifold enclosures are expanded to yield sharp enclosures of large parts of the stable and unstable manifolds. To demonstrate the effectiveness of this method, we construct enclosures of the invariant manifolds of the Lorenz system and show pictures of the resulting manifold enclosures. To the best of our knowledge, these enclosures are the largest verified enclosures of manifolds in the Lorenz system in existence.
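    The core idea behind verified fixed-point finders of this kind is a self-mapping test: by the Brouwer fixed-point theorem, if an enclosure of f's image of an interval lies inside that interval, the interval contains a fixed point. A minimal sketch for f(x) = cos(x), whose fixed point lies near 0.739 (plain floating point is used here for illustration; a truly verified computation would evaluate cos with outward directed rounding, as Taylor Model arithmetic does):

    ```python
    import math

    def cos_image(lo, hi):
        """Image of [lo, hi] under cos, for intervals inside [0, pi],
        where cos is decreasing, so the image is [cos(hi), cos(lo)]."""
        return (math.cos(hi), math.cos(lo))

    def contains_fixed_point(lo, hi):
        """Brouwer self-mapping test: if cos([lo, hi]) is contained in
        [lo, hi], then cos has a fixed point in [lo, hi]."""
        img_lo, img_hi = cos_image(lo, hi)
        return lo <= img_lo and img_hi <= hi
    ```

    For example, `contains_fixed_point(0.73, 0.75)` succeeds, certifying (up to rounding) a fixed point of cos in that interval.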

  15. Novel Method of Production Decline Analysis

    NASA Astrophysics Data System (ADS)

    Xie, Shan; Lan, Yifei; He, Lei; Jiao, Yang; Wu, Yong

    2018-02-01

    Arps decline curves are the most commonly used tool in oil and gas fields because of their minimal data requirements and ease of application. However, prediction of production decline based on Arps analysis relies on knowing the decline type in advance, and when the decline-coefficient indices are very close under different decline types, it is difficult to recognize the decline trend directly from the matched curves. To overcome these difficulties, a new dynamic decline prediction model is introduced, built on multiple linear regression of the influencing factors identified from multi-factor response simulation experiments. First, based on a study of the factors affecting production decline, interaction experimental schemes are designed. From the simulated results, the annual decline rate is predicted by the decline model. The new method is then applied to gas field A of the Ordos Basin as an example to illustrate its reliability. The results show that the new model can predict the decline tendency directly, without needing to recognize the decline style, and achieves high accuracy. The new method improves the evaluation of gas well production decline in low-permeability gas reservoirs and provides technical support for further understanding of tight gas field development behavior.
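    The Arps relations that the abstract takes as its starting point can be sketched as follows (a standard textbook form, not the paper's regression model):

    ```python
    import math

    def arps_rate(qi, di, b, t):
        """Arps decline-curve flow rate q(t).

        qi: initial rate, di: initial decline rate (1/time unit of t),
        b: decline exponent. b = 0 gives exponential decline,
        0 < b < 1 hyperbolic decline, and b = 1 harmonic decline."""
        if b == 0:
            return qi * math.exp(-di * t)
        return qi / (1.0 + b * di * t) ** (1.0 / b)
    ```

    For instance, with qi = 100, di = 0.1 and b = 0.5, the rate after 10 time units is 100 / (1 + 0.5 * 0.1 * 10)**2, or about 44.4. The recognition problem the abstract describes arises because fitted curves with different b values can look nearly identical over the matched interval while diverging in their long-term forecasts.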

  16. Comparison of the Effectiveness of a Traditional Intermediate Algebra Course With That of a Less Rigorous Intermediate Algebra Course in Preparing Students for Success in a Subsequent Mathematics Course

    ERIC Educational Resources Information Center

    Sworder, Steven C.

    2007-01-01

    An experimental two-track intermediate algebra course was offered at Saddleback College, Mission Viejo, CA, between the Fall, 2002 and Fall, 2005 semesters. One track was modeled after the existing traditional California community college intermediate algebra course and the other track was a less rigorous intermediate algebra course in which the…

  17. Cognition in multiple sclerosis

    PubMed Central

    Benedict, Ralph; Enzinger, Christian; Filippi, Massimo; Geurts, Jeroen J.; Hamalainen, Paivi; Hulst, Hanneke; Inglese, Matilde; Leavitt, Victoria M.; Rocca, Maria A.; Rosti-Otajarvi, Eija M.; Rao, Stephen

    2018-01-01

    Cognitive decline is recognized as a prevalent and debilitating symptom of multiple sclerosis (MS), especially deficits in episodic memory and processing speed. The field aims to (1) incorporate cognitive assessment into standard clinical care and clinical trials, (2) utilize state-of-the-art neuroimaging to more thoroughly understand neural bases of cognitive deficits, and (3) develop effective, evidence-based, clinically feasible interventions to prevent or treat cognitive dysfunction, which are lacking. There are obstacles to these goals. Our group of MS researchers and clinicians with varied expertise took stock of the current state of the field, and we identify several important practical and theoretical challenges, including key knowledge gaps and methodologic limitations related to (1) understanding and measurement of cognitive deficits, (2) neuroimaging of neural bases and correlates of deficits, and (3) development of effective treatments. This is not a comprehensive review of the extensive literature, but instead a statement of guidelines and priorities for the field. For instance, we provide recommendations for improving the scientific basis and methodologic rigor for cognitive rehabilitation research. Toward this end, we call for multidisciplinary collaborations toward development of biologically based theoretical models of cognition capable of empirical validation and evidence-based refinement, providing the scientific context for effective treatment discovery. PMID:29343470

  18. An attempt to obtain a detailed declination chart from the United States magnetic anomaly map

    USGS Publications Warehouse

    Alldredge, L.R.

    1989-01-01

    Modern declination charts of the United States show almost no details. It was hoped that declination details could be derived from the information contained in the existing magnetic anomaly map of the United States. This could be realized only if all of the survey data were corrected to a common epoch, at which time a main-field vector model was known, before the anomaly values were computed. Because this was not done, accurate declination values cannot be determined. In spite of this conclusion, declination values were computed using a common main-field model for the entire United States to see how well they compared with observed values. The computed detailed declination values were found to compare less favourably with observed values of declination than declination values computed from the IGRF 1985 model itself. -from Author

  19. Sandhoff Disease

    MedlinePlus

    ... virus-delivered gene therapy seen in an animal model of Tay-Sachs and Sandhoff diseases for use ...

  20. Nonstructural carbon dynamics are best predicted by the combination of photosynthesis and plant hydraulics during both bark beetle induced mortality and herbaceous plant response to drought

    NASA Astrophysics Data System (ADS)

    Ewers, B. E.; Mackay, D. S.; Guadagno, C.; Peckham, S. D.; Pendall, E.; Borkhuu, B.; Aston, T.; Frank, J. M.; Massman, W. J.; Reed, D. E.; Yarkhunova, Y.; Weinig, C.

    2012-12-01

    Recent work has shown that nonstructural carbon (NSC) provides both a signal and consequence of water stress in plants. The dynamics of NSC are likely not solely a result of the balance of photosynthesis and respiration (carbon starvation hypothesis) but also the availability of NSC for plant functions due to hydraulic condition. Further, plant hydraulics regulates photosynthesis both directly through stomatal conductance and indirectly through leaf water status control over leaf biochemistry. To test these hypotheses concerning NSC in response to a wide variety of plant perturbations, we used a model that combines leaf biochemical controls over photosynthesis (Farquhar model) with dynamic plant hydraulic conductance (Sperry model). This model (Terrestrial Regional Ecosystem Exchange Simulator; TREES) simulates the dynamics of NSC through a carbon budget approach that responds to plant hydraulic status. We tested TREES on two dramatically different datasets. The first dataset is from lodgepole pine and Engelmann spruce trees dying from bark beetles that carry blue-stain fungi which block xylem and cause hydraulic failure. The second data set is from Brassica rapa, a small herbaceous plant whose accessions are used in a variety of crops. The Brassica rapa plants include two parents whose circadian clock periods are different; NSC is known to provide inputs to the circadian clock likely modified by drought. Thus, drought may interact with clock control to constrain how NSC changes over the day. The Brassica rapa plants were grown in growth chamber conditions where drought was precisely controlled. The connection between these datasets is that both provide rigorous tests of our understanding of plant NSC dynamics and use similar leaf and whole plant gas exchange and NSC laboratory methods. 
Our results show that NSC decline (<10% in the whole plant) is less precipitous than expected from carbon starvation alone because both C uptake and use are impacted by water stress. The model is able to capture this relatively small decline in NSC by limiting NSC utilization through loss of plant hydraulic conductance. Our findings imply that NSC dynamics in plants undergoing water stress cannot be explained solely by carbon starvation or hydraulic failure but rather from the combination of both hypotheses. Our future work will determine whether additional environmental factors such as seasonality and plant developmental state alter the response of NSC to water stress.
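
    The coupled budget described above can be illustrated with a deliberately minimal sketch. This is not the TREES model; the parameters and functional forms below are invented solely to show why throttling both carbon supply and carbon use yields a much smaller NSC decline than a supply-only "carbon starvation" accounting.

```python
# Toy NSC budget under progressive hydraulic loss. All parameters are invented;
# this is a conceptual sketch of the coupling described above, not TREES itself.

def simulate_nsc(days, supply0=1.0, use0=1.0, plc_per_day=0.02, couple_use=True):
    nsc = 100.0                                  # arbitrary starting NSC pool
    for day in range(days):
        k = max(0.0, 1.0 - plc_per_day * day)    # remaining hydraulic conductance
        supply = supply0 * k                     # stomatal closure cuts photosynthesis
        use = use0 * (k if couple_use else 1.0)  # hydraulics may also limit NSC use
        nsc += supply - use
    return nsc

print(simulate_nsc(40, couple_use=True))   # supply and use throttled together
print(simulate_nsc(40, couple_use=False))  # starvation-only accounting
```

    In this toy version the coupled run holds the NSC pool flat while the supply-only run loses roughly 15% of it; an observed small whole-plant decline, like the <10% above, sits between those extremes.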

  1. Elimination of the Out-of-Pocket Charge for Children's Primary Care Visits: An Application of Value-Based Insurance Design.

    PubMed

    Sepúlveda, Martín-J; Roebuck, M Christopher; Fronstin, Paul; Vidales-Calderon, Pablo; Parikh, Ashish; Rhee, Kyu

    2016-08-01

    To evaluate the impact of a value-based insurance design for primary care among children. A retrospective analysis of health care claims data on 25 950 children (<18 years of age) was conducted. Individuals were enrolled in a large employer's health plans when zero out-of-pocket cost for primary care physician visits was implemented. A rigorous propensity score matching process was used to generate a control group of equal size from a database of other employer-sponsored insurees. Multivariate difference-in-differences models estimated the effect of zero out-of-pocket cost on 21 health services and cost outcomes 24 months after intervention. Zero out-of-pocket cost for primary care was associated with significant increases (P < .01) in primary care physician visits (+32 per 100 children), as well as decreases in emergency department (-5 per 100 children) and specialist physician visits (-12 per 100 children). The number of prescription drug fills also declined (-20 per 100 children), yet medication adherence for 3 chronic conditions was unaffected. The receipt of well child visits and 4 recommended vaccinations were all significantly (P < .05) greater under the new plan design feature. Employer costs for primary care increased significantly (P < .01) in association with greater utilization ($29 per child), but specialist visit costs declined (-$12 per child) and total health care costs per child did not exhibit a statistically significant increase. This novel application of value-based insurance design warrants broader deployment and assessment of its longer term outcomes. As with recommended preventive services, policymakers should consider exempting primary care from health insurance cost-sharing. Copyright © 2016 Elsevier Inc. All rights reserved.
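
    The study design above pairs propensity-score matching with a difference-in-differences (DiD) estimator. The arithmetic core of DiD is small enough to sketch; the figures below are invented for illustration (only the +32 visits per 100 children echoes the abstract's reported effect), and the study's actual covariate adjustment is not reproduced.

```python
# Toy difference-in-differences (DiD) computation with hypothetical figures;
# the study's claims data and multivariate adjustment are not reproduced here.

def did_estimate(treat_pre, treat_post, control_pre, control_post):
    """DiD effect = (change in treated group) - (change in matched control group)."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical primary-care visits per 100 children, before vs. 24 months after
# the zero out-of-pocket-cost change
effect = did_estimate(treat_pre=150.0, treat_post=185.0,
                      control_pre=148.0, control_post=151.0)
print(effect)  # 32.0 extra visits per 100 children attributable to the change
```

    Subtracting the matched control group's change nets out secular trends that affect both groups, which is what lets the remaining difference be attributed to the benefit-design change.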

  2. Educating Family Caregivers for Older Adults About Delirium: A Systematic Review.

    PubMed

    Bull, Margaret J; Boaz, Lesley; Jermé, Martha

    2016-06-01

    Delirium in older adults is considered a medical emergency; it contributes to a cascade of functional decline and to increased mortality. Early recognition of delirium symptoms is critical to prevent these negative consequences. Family caregivers who are educated about delirium could partner with nurses and other healthcare professionals in early recognition of delirium symptoms. Before implementing such partnership models, it is important to examine the effectiveness of educating family caregivers about delirium. To examine whether providing education on delirium to family caregivers improved their knowledge, emotional state, or response in reducing the incidence of delirium in older adults. For this systematic review, we conducted literature searches in CINAHL, Cochrane Library, Medline, PsycINFO, Web of Science, Social Sciences in ProQuest, Dissertations and Theses, and the Virginia Henderson Global Nursing eRepository for studies published in the English language between January 2000 and June 2015. Criteria for inclusion were: (a) primary focus on educating family caregivers for older adults about delirium; (b) use of experimental, quasi-experimental, or comparative design; (c) measured family caregiver outcomes of delirium knowledge, emotional state, or response in reducing delirium incidence in older adults; and (d) published in the English language. Articles were appraised using Melnyk's rapid critical appraisal guides. Seven studies met the review criteria. Four studies found that family caregivers' delirium knowledge increased; two noted that delirium incidence in older adults declined; and one study reported less distress following receipt of education. Providing family caregivers with information about delirium can be beneficial for both family caregivers and older adults. However, rigorous evaluation of education programs for family caregivers about delirium is needed. © 2016 Sigma Theta Tau International.

  3. Effects of streptozotocin-induced diabetes on bladder and erectile (dys)function in the same rat in vivo.

    PubMed

    Christ, George J; Hsieh, Yi; Zhao, Weixin; Schenk, Gregory; Venkateswarlu, Karicheti; Wang, Hong-Zhan; Tar, Moses T; Melman, Arnold

    2006-05-01

    To establish the methods, feasibility and utility of evaluating the impact of diabetes on bladder and erectile function in the same rat, as more than half of diabetic patients have bladder dysfunction, and half of diabetic men have erectile dysfunction, but the severity of coincident disease has not been rigorously assessed. In all, 16 F-344 rats had diabetes induced by streptozotocin (STZ), and were divided into insulin-treated (five) and untreated (11), and compared with age-matched controls (10), all assessed in parallel. All STZ rats were diabetic for 8-11 weeks. Cystometric studies were conducted on all rats, with cavernosometric studies conducted on a subset of rats. There were insulin-reversible increases in the following cystometric variables: bladder weight, bladder capacity, micturition volume, residual volume, micturition pressure and spontaneous activity (P < 0.05, in all, one-way analysis of variance, anova). Cavernosometry showed a diabetes-related, insulin-reversible decline in the cavernosal nerve-stimulated intracavernosal pressure (ICP) response at all levels of current stimulation (P < 0.05, in all, one-way anova). Plotting erectile capacity (i.e. ICP) against bladder capacity showed no correlation between the extent of the decline in erectile capacity and the magnitude of the increase in bladder capacity. These studies extend previous work to indicate that the extent of diabetes-related bladder and erectile dysfunction can vary in the same rat. As such, these findings highlight the importance of evaluating the impact of diabetes on multiple organ systems in the lower urinary tract. Future studies using this model system should lead to a better understanding of the initiation, development, progression and coincidence of these common diabetic complications.

  4. A program to increase seat belt use along the Texas-Mexico border.

    PubMed

    Cohn, Lawrence D; Hernandez, Delia; Byrd, Theresa; Cortes, Miguel

    2002-12-01

    A school-based, bilingual intervention was developed to increase seat belt use among families living along the Texas-Mexico border. The intervention sought to increase seat belt use by changing perceived norms within the community (i.e., making the nonuse of seat belts less socially acceptable). The intervention was implemented in more than 110 classrooms and involved more than 2100 children. Blind coding, validity checks, and reliability estimates contributed to a rigorous program evaluation. Seat belt use increased by 10% among children riding in the front seat of motor vehicles in the intervention community, as compared with a small but nonsignificant decline in use among control community children. Seat belt use among drivers did not increase.

  5. Wernicke-Korsakoff Syndrome

    MedlinePlus

    ... Coordinating Committees CounterACT Rigor & Transparency Scientific Resources Animal Models Cell/Tissue/DNA Clinical and Translational Resources Gene ... modulation of certain nerve cells in a rodent model of amnesia produced by thiamine deficiency. The ...

  6. Interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations.

    PubMed

    Simic, Vladimir

    2016-06-01

    As the number of end-of-life vehicles (ELVs) is estimated to increase to 79.3 million units per year by 2020 (e.g., 40 million units were generated in 2010), there is strong motivation to effectively manage this fast-growing waste flow. Intensive work on management of ELVs is necessary in order to more successfully tackle this important environmental challenge. This paper proposes an interval-parameter chance-constraint programming model for end-of-life vehicles management under rigorous environmental regulations. The proposed model can incorporate various uncertainty information in the modeling process. The complex relationships between different ELV management sub-systems are successfully addressed. Particularly, the formulated model can help identify optimal patterns of procurement from multiple sources of ELV supply, production and inventory planning in multiple vehicle recycling factories, and allocation of sorted material flows to multiple final destinations under rigorous environmental regulations. A case study is conducted in order to demonstrate the potentials and applicability of the proposed model. Various constraint-violation probability levels are examined in detail. Influences of parameter uncertainty on model solutions are thoroughly investigated. Useful solutions for the management of ELVs are obtained under different probabilities of violating system constraints. The formulated model is able to tackle a hard ELV management problem under uncertainty. The presented model has advantages in providing bases for determining long-term ELV management plans with desired compromises between economic efficiency of the vehicle recycling system and system-reliability considerations. The results are helpful for supporting generation and improvement of ELV management plans. Copyright © 2016 Elsevier Ltd. All rights reserved.
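
    The core device of chance-constraint programming, used throughout the model above, is replacing a probabilistic constraint such as P(throughput ≤ capacity) ≥ 1 − α with a deterministic equivalent. A minimal sketch under an assumed normally distributed capacity (invented numbers; the paper's interval-parameter formulation is considerably richer):

```python
# Deterministic equivalent of a single chance constraint with normal capacity b:
# requiring P(x <= b) >= 1 - alpha is the same as x <= mu_b + sigma_b * z_alpha,
# where z_alpha is the alpha-quantile of the standard normal (negative for small
# alpha). Numbers are invented for illustration.
from statistics import NormalDist

def deterministic_capacity(mu_b, sigma_b, alpha):
    """Tightened right-hand side that guarantees the chance constraint."""
    return mu_b + sigma_b * NormalDist().inv_cdf(alpha)

# Hypothetical recycling-factory capacity: mean 1000 ELVs/week, s.d. 50
for alpha in (0.01, 0.05, 0.10):
    print(alpha, round(deterministic_capacity(1000, 50, alpha), 1))
```

    Smaller violation probabilities tighten the usable capacity (about 884, 918, and 936 ELVs/week here), which is exactly the economic-efficiency vs. system-reliability compromise the abstract describes.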

  7. Derivation of phase functions from multiply scattered sunlight transmitted through a hazy atmosphere

    NASA Technical Reports Server (NTRS)

    Weinman, J. A.; Twitty, J. T.; Browning, S. R.; Herman, B. M.

    1975-01-01

    The intensity of sunlight multiply scattered in model atmospheres is derived from the equation of radiative transfer by an analytical small-angle approximation. The approximate analytical solutions are compared to rigorous numerical solutions of the same problem. Results obtained from an aerosol-laden model atmosphere are presented. Agreement between the rigorous and the approximate solutions is found to be within a few per cent. The analytical solution to the problem which considers an aerosol-laden atmosphere is then inverted to yield a phase function which describes a single scattering event at small angles. The effect of noisy data on the derived phase function is discussed.

  8. Fast synthesis of topographic mask effects based on rigorous solutions

    NASA Astrophysics Data System (ADS)

    Yan, Qiliang; Deng, Zhijie; Shiely, James

    2007-10-01

    Topographic mask effects can no longer be ignored at technology nodes of 45 nm, 32 nm and beyond. As feature sizes become comparable to the mask topographic dimensions and the exposure wavelength, the popular thin mask model breaks down, because the mask transmission no longer follows the layout. A reliable mask transmission function has to be derived from Maxwell equations. Unfortunately, rigorous solutions of Maxwell equations are only manageable for limited field sizes, but impractical for full-chip optical proximity corrections (OPC) due to the prohibitive runtime. Approximation algorithms are in demand to achieve a balance between acceptable computation time and tolerable errors. In this paper, a fast algorithm is proposed and demonstrated to model topographic mask effects for OPC applications. The ProGen Topographic Mask (POTOMAC) model synthesizes the mask transmission functions out of small-sized Maxwell solutions from a finite-difference time-domain (FDTD) engine, an industry-leading rigorous simulator of topographic mask effects from SOLID-E. The integral framework presents a seamless solution to the end user. Preliminary results indicate the overhead introduced by POTOMAC is contained within the same order of magnitude in comparison to the thin mask approach.

  9. An intervention to help community-based organizations implement an evidence-based HIV prevention intervention: the Mpowerment Project technology exchange system.

    PubMed

    Kegeles, Susan M; Rebchook, Gregory; Pollack, Lance; Huebner, David; Tebbetts, Scott; Hamiga, John; Sweeney, David; Zovod, Benjamin

    2012-03-01

    Considerable resources have been spent developing and rigorously testing HIV prevention intervention models, but such models do not impact the AIDS pandemic unless they are implemented effectively by community-based organizations (CBOs) and health departments. The Mpowerment Project (MP) is being implemented by CBOs around the U.S. It is a multilevel, evidence-based HIV prevention program for young gay/bisexual men that targets individual, interpersonal, social, and structural issues by using empowerment and community mobilization methods. This paper discusses the development of an intervention to help CBOs implement the MP called the Mpowerment Project Technology Exchange System (MPTES); CBOs' uptake, utilization and perceptions of the MPTES components; and issues that arose during technical assistance. The seven-component MPTES was provided to 49 CBOs implementing the MP that were followed longitudinally for up to two years. Except for the widely used program manual, other program materials were used early in implementing the MP and then their use declined. In contrast, once technical assistance was proactively provided, its usage remained constant over time, as did requests for technical assistance. CBOs expressed substantial positive feedback about the MPTES, but felt that it needs more focus on diversity issues, describing real world implementation approaches, and providing guidance on how to adapt the MP to diverse populations.

  10. A METHODOLOGY FOR ESTIMATING UNCERTAINTY OF A DISTRIBUTED HYDROLOGIC MODEL: APPLICATION TO POCONO CREEK WATERSHED

    EPA Science Inventory

    Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...

  11. Mechanical properties of frog skeletal muscles in iodoacetic acid rigor.

    PubMed Central

    Mulvany, M J

    1975-01-01

    1. Methods have been developed for describing the length:tension characteristics of frog skeletal muscles which go into rigor at 4 degrees C following iodoacetic acid poisoning either in the presence of Ca2+ (Ca-rigor) or its absence (Ca-free-rigor). 2. Such rigor muscles showed less resistance to slow stretch (slow rigor resistance) than to fast stretch (fast rigor resistance). The slow and fast rigor resistances of Ca-free-rigor muscles were much lower than those of Ca-rigor muscles. 3. The slow rigor resistance of Ca-rigor muscles was proportional to the amount of overlap between the contractile filaments present when the muscles were put into rigor. 4. Withdrawing Ca2+ from Ca-rigor muscles (induced-Ca-free rigor) reduced their slow and fast rigor resistances. Readdition of Ca2+ (but not Mg2+, Mn2+ or Sr2+) reversed the effect. 5. The slow and fast rigor resistances of Ca-rigor muscles (but not of Ca-free-rigor muscles) decreased with time. 6. The sarcomere structure of Ca-rigor and induced-Ca-free rigor muscles stretched by 0.2 l0 was destroyed in proportion to the amount of stretch, but the lengths of the remaining intact sarcomeres were essentially unchanged. This suggests that there had been a successive yielding of the weakest sarcomeres. 7. The difference between the slow and fast rigor resistance and the effect of calcium on these resistances are discussed in relation to possible variations in the strength of crossbridges between the thick and thin filaments. PMID:1082023

  12. A Rigorous Investigation on the Ground State of the Penson-Kolb Model

    NASA Astrophysics Data System (ADS)

    Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi

    2003-05-01

    By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we shall investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both the models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002

  13. Modeling of profilometry with laser focus sensors

    NASA Astrophysics Data System (ADS)

    Bischoff, Jörg; Manske, Eberhard; Baitinger, Henner

    2011-05-01

    Metrology is of paramount importance in submicron patterning. Particularly, line width and overlay have to be measured very accurately. Appropriate metrology techniques are scanning electron microscopy and optical scatterometry. The latter is non-invasive, highly accurate and enables optical cross sections of layer stacks, but it requires periodic patterns. Scanning laser focus sensors are a viable alternative enabling the measurement of non-periodic features. Severe limitations are imposed by the diffraction limit determining the edge location accuracy. It will be shown that the accuracy can be greatly improved by means of rigorous modeling. To this end, a fully vectorial 2.5-dimensional model has been developed based on rigorous Maxwell solvers and combined with models for the scanning and various autofocus principles. The simulations are compared with experimental results. Moreover, the simulations are directly utilized to improve the edge location accuracy.

  14. Agricultural model intercomparison and improvement project: Overview of model intercomparisons

    USDA-ARS?s Scientific Manuscript database

    Improvement of crop simulation models to better estimate growth and yield is one of the objectives of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The overall goal of AgMIP is to provide an assessment of crop model through rigorous intercomparisons and evaluate future clim...

  15. Oak decline risk rating for the southeastern United States

    Treesearch

    S. Oak; F. Tainter; J. Williams; D. Starkey

    1996-01-01

    Oak decline risk rating models were developed for upland hardwood forests in the southeastern United States using data gathered during regional oak decline surveys. Stepwise discriminant analyses were used to relate 12 stand and site variables with major oak decline incidence for each of three subregions plus one incorporating all subregions. The best model for the...

  16. Impact of topographic mask models on scanner matching solutions

    NASA Astrophysics Data System (ADS)

    Tyminski, Jacek K.; Pomplun, Jan; Renwick, Stephen P.

    2014-03-01

    Of keen interest to the IC industry are advanced computational lithography applications such as Optical Proximity Correction of IC layouts (OPC), scanner matching by optical proximity effect matching (OPEM), and Source Optimization (SO) and Source-Mask Optimization (SMO) used as advanced reticle enhancement techniques. The success of these tasks is strongly dependent on the integrity of the lithographic simulators used in computational lithography (CL) optimizers. Lithographic mask models used by these simulators are key drivers impacting the accuracy of the image predictions, and as a consequence, determine the validity of these CL solutions. Much of the CL work involves Kirchhoff mask models, a.k.a. the thin-mask approximation, simplifying the treatment of the mask near-field images. On the other hand, imaging models for hyper-NA scanners require that the interactions of the illumination fields with the mask topography be rigorously accounted for, by numerically solving Maxwell's Equations. The simulators used to predict the image formation in the hyper-NA scanners must rigorously treat the mask topography and its interaction with the scanner illuminators. Such imaging models come at a high computational cost and pose challenging accuracy vs. compute time tradeoffs. Additional complication comes from the fact that the performance metrics used in computational lithography tasks show highly non-linear response to the optimization parameters. Finally, the number of patterns used for tasks such as OPC, OPEM, SO, or SMO ranges from tens to hundreds. These requirements determine the complexity and the workload of the lithography optimization tasks. The tools to build rigorous imaging optimizers based on first-principles governing imaging in scanners are available, but the quantifiable benefits they might provide are not very well understood. 
To quantify the performance of OPE matching solutions, we have compared the results of various imaging optimization trials obtained with Kirchhoff mask models to those obtained with rigorous models involving solutions of Maxwell's Equations. In both sets of trials, we used large sets of patterns, with specifications representative of CL tasks commonly encountered in hyper-NA imaging. In this report we present OPEM solutions based on various mask models and discuss the models' impact on hyper-NA scanner matching accuracy. We draw conclusions on the accuracy of results obtained with thin mask models vs. the topographic OPEM solutions. We present various examples of scanner image matching for patterns representative of the current generation of IC designs.

  17. Probability bounds analysis for nonlinear population ecology models.

    PubMed

    Enszer, Joshua A; Măceș, D Andrei; Stadtherr, Mark A

    2015-09-01

    Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
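
    The contrast with statistical sampling can be shown in miniature. The sketch below propagates an uncertain growth rate through a toy exponential model with interval endpoints, giving guaranteed bounds that any Monte Carlo sample must fall inside; it illustrates the general idea of rigorous bound propagation, not the paper's probability-bounds method or its food-web model.

```python
# Guaranteed output bounds from an uncertain parameter: r is only known to lie
# in [r_lo, r_hi], and x(t) = x0*exp(r*t) is monotone in r, so the interval
# endpoints bound every possible trajectory. Toy model with invented numbers.
import math
import random

def growth_bounds(x0, r_lo, r_hi, t):
    return x0 * math.exp(r_lo * t), x0 * math.exp(r_hi * t)

lo, hi = growth_bounds(x0=10.0, r_lo=0.1, r_hi=0.3, t=5.0)
print(round(lo, 2), round(hi, 2))  # 16.49 44.82

# Monte Carlo with any distribution supported on [0.1, 0.3] stays inside the
# bounds, but only samples the interior; it never certifies them.
samples = [10.0 * math.exp(random.uniform(0.1, 0.3) * 5.0) for _ in range(1000)]
assert lo <= min(samples) and max(samples) <= hi
```

    For non-monotone models the endpoint shortcut no longer works and genuine interval or p-box arithmetic is needed, which is where methods like the paper's earn their keep.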

  18. Advanced EUV mask and imaging modeling

    NASA Astrophysics Data System (ADS)

    Evanschitzky, Peter; Erdmann, Andreas

    2017-10-01

    The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.

  19. Single toxin dose-response models revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demidenko, Eugene, E-mail: eugened@dartmouth.edu

    The goal of this paper is to offer a rigorous analysis of the sigmoid-shaped single toxin dose-response relationship. The toxin efficacy function is introduced and four special points, including maximum toxin efficacy and inflection points, on the dose-response curve are defined. The special points define three phases of the toxin effect on mortality: (1) toxin concentrations smaller than the first inflection point or (2) larger than the second inflection point imply low mortality rate, and (3) concentrations between the first and the second inflection points imply high mortality rate. Probabilistic interpretation and mathematical analysis for each of the four models, Hill, logit, probit, and Weibull, is provided. Two general model extensions are introduced: (1) the multi-target hit model that accounts for the existence of several vital receptors affected by the toxin, and (2) a model with a nonzero mortality at zero concentration to account for natural mortality. Special attention is given to statistical estimation in the framework of the generalized linear model with the binomial dependent variable as the mortality count in each experiment, contrary to the widespread nonlinear regression treating the mortality rate as a continuous variable. The models are illustrated using standard EPA Daphnia acute (48 h) toxicity tests with mortality as a function of NiCl or CuSO4 toxin. - Highlights: • The paper offers a rigorous study of a sigmoid dose-response relationship. • The concentration with highest mortality rate is rigorously defined. • A table with four special points for five mortality curves is presented. • Two new sigmoid dose-response models have been introduced. • The generalized linear model is advocated for estimation of sigmoid dose-response relationship.
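
    For reference, the four sigmoid families named above can be written down in a few lines. Parameterizations vary across the literature; these are common textbook forms on the log-dose scale, not necessarily the paper's exact notation.

```python
# Four sigmoid dose-response families: mortality probability p(d) at dose d > 0.
# Common textbook parameterizations; the paper's notation may differ.
import math
from statistics import NormalDist

def hill(d, ec50, n):    # Hill: p = d^n / (ec50^n + d^n)
    return d**n / (ec50**n + d**n)

def logit(d, a, b):      # logistic in log-dose
    return 1.0 / (1.0 + math.exp(-(a + b * math.log(d))))

def probit(d, a, b):     # standard normal CDF in log-dose
    return NormalDist().cdf(a + b * math.log(d))

def weibull(d, a, b):    # complementary log-log link
    return 1.0 - math.exp(-math.exp(a + b * math.log(d)))

# The first three are symmetric: mortality is exactly 50% at the median dose.
print(round(hill(2.0, ec50=2.0, n=3), 3))              # 0.5
print(round(logit(2.0, a=-3 * math.log(2), b=3), 3))   # 0.5
print(round(probit(2.0, a=-3 * math.log(2), b=3), 3))  # 0.5
# The Weibull curve is asymmetric: at the same point it gives 1 - 1/e, not 0.5.
print(round(weibull(2.0, a=-3 * math.log(2), b=3), 3))  # 0.632
```

    The asymmetry of the Weibull form places its inflection points at different concentrations than the symmetric logit and probit curves, which is why the special points are tabulated per model.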

  20. Peer Assessment with Online Tools to Improve Student Modeling

    NASA Astrophysics Data System (ADS)

    Atkins, Leslie J.

    2012-11-01

    Introductory physics courses often require students to develop precise models of phenomena and represent these with diagrams, including free-body diagrams, light-ray diagrams, and maps of field lines. Instructors expect that students will adopt a certain rigor and precision when constructing these diagrams, but we want that rigor and precision to be an aid to sense-making rather than meeting seemingly arbitrary requirements set by the instructor. By giving students the authority to develop their own models and establish requirements for their diagrams, the sense that these are arbitrary requirements diminishes and students are more likely to see modeling as a sense-making activity. The practice of peer assessment can help students take ownership; however, it can be difficult for instructors to manage. Furthermore, it is not without risk: students can be reluctant to critique their peers, they may view this as the job of the instructor, and there is no guarantee that students will employ greater rigor and precision as a result of peer assessment. In this article, we describe one approach for peer assessment that can establish norms for diagrams in a way that is student driven, where students retain agency and authority in assessing and improving their work. We show that such an approach does indeed improve students' diagrams and abilities to assess their own work, without sacrificing students' authority and agency.

  1. Separating intrinsic from extrinsic fluctuations in dynamic biological systems

    PubMed Central

    Paulsson, Johan

    2011-01-01

    From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems. PMID:21730172
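    As a concrete (and deliberately simplified) illustration of the dual-reporter idea: with two identical reporters sharing one extrinsic environment, half the mean squared reporter difference estimates the intrinsic variance and the reporter covariance estimates the extrinsic variance. The toy below is a static example with made-up noise scales; the paper's point is precisely that interpreting these estimators for dynamic systems requires care.

    ```python
    import random

    # Toy static dual-reporter decomposition (illustrative, not the paper's models):
    # reporters x1, x2 share an extrinsic variable E and have independent
    # intrinsic fluctuations of unit variance.
    random.seed(1)
    N = 200_000
    x1, x2 = [], []
    for _ in range(N):
        E = random.gauss(10.0, 2.0)         # shared extrinsic state (var 4)
        x1.append(E + random.gauss(0, 1))   # independent intrinsic noise (var 1)
        x2.append(E + random.gauss(0, 1))

    mean1 = sum(x1) / N
    mean2 = sum(x2) / N
    # Standard dual-reporter estimators (in absolute variance units):
    intrinsic = sum((a - b) ** 2 for a, b in zip(x1, x2)) / (2 * N)   # ~1
    extrinsic = sum((a - mean1) * (b - mean2)
                    for a, b in zip(x1, x2)) / N                      # ~4
    ```

    Here the estimators recover the variances built into the simulation; the abstract's caveat is that for dynamically fluctuating systems the "intrinsic" term in particular need not correspond to a model that ignores extrinsic stochasticity.
    
    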

  2. Separating intrinsic from extrinsic fluctuations in dynamic biological systems.

    PubMed

    Hilfinger, Andreas; Paulsson, Johan

    2011-07-19

    From molecules in cells to organisms in ecosystems, biological populations fluctuate due to the intrinsic randomness of individual events and the extrinsic influence of changing environments. The combined effect is often too complex for effective analysis, and many studies therefore make simplifying assumptions, for example ignoring either intrinsic or extrinsic effects to reduce the number of model assumptions. Here we mathematically demonstrate how two identical and independent reporters embedded in a shared fluctuating environment can be used to identify intrinsic and extrinsic noise terms, but also how these contributions are qualitatively and quantitatively different from what has been previously reported. Furthermore, we show for which classes of biological systems the noise contributions identified by dual-reporter methods correspond to the noise contributions predicted by correct stochastic models of either intrinsic or extrinsic mechanisms. We find that for broad classes of systems, the extrinsic noise from the dual-reporter method can be rigorously analyzed using models that ignore intrinsic stochasticity. In contrast, the intrinsic noise can be rigorously analyzed using models that ignore extrinsic stochasticity only under very special conditions that rarely hold in biology. Testing whether the conditions are met is rarely possible and the dual-reporter method may thus produce flawed conclusions about the properties of the system, particularly about the intrinsic noise. Our results contribute toward establishing a rigorous framework to analyze dynamically fluctuating biological systems.

  3. Comparing an annual and daily time-step model for predicting field-scale P loss

    USDA-ARS?s Scientific Manuscript database

    Several models with varying degrees of complexity are available for describing P movement through the landscape. The complexity of these models is dependent on the amount of data required by the model, the number of model parameters needed to be estimated, the theoretical rigor of the governing equa...

  4. A rigorous approach to investigating common assumptions about disease transmission: Process algebra as an emerging modelling methodology for epidemiology.

    PubMed

    McCaig, Chris; Begon, Mike; Norman, Rachel; Shankland, Carron

    2011-03-01

    Changing scale, for example, the ability to move seamlessly from an individual-based model to a population-based model, is an important problem in many fields. In this paper, we introduce process algebra as a novel solution to this problem in the context of models of infectious disease spread. Process algebra, a technique from computer science, allows us to describe a system in terms of the stochastic behaviour of individuals. We review the use of process algebra in biological systems, and the variety of quantitative and qualitative analysis techniques available. The analysis illustrated here solves the changing-scale problem: from the individual behaviour we can rigorously derive equations to describe the mean behaviour of the system at the level of the population. The biological problem investigated is the transmission of infection, and how this relates to individual interactions.
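    The individual-to-population correspondence can be illustrated numerically (the paper derives it symbolically via process algebra; the sketch below merely simulates both levels for an assumed SIR infection model with illustrative rates and checks that the stochastic individual-level average tracks the mean-field ODE).

    ```python
    import random

    # Individual-level stochastic SIR (Gillespie) vs its population-level
    # mean-field ODE; beta, gamma, N are illustrative values.
    random.seed(42)
    beta, gamma, N = 0.3, 0.1, 500

    def gillespie(S, I, R, t_end=200.0):
        t = 0.0
        while I > 0 and t < t_end:
            rate_inf = beta * S * I / N
            rate_rec = gamma * I
            total = rate_inf + rate_rec
            t += random.expovariate(total)
            if random.random() < rate_inf / total:
                S, I = S - 1, I + 1        # infection event
            else:
                I, R = I - 1, R + 1        # recovery event
        return S, I, R

    def mean_field(S, I, R, dt=0.01, t_end=200.0):
        # Euler integration of the mean equations the scaling argument yields
        for _ in range(int(t_end / dt)):
            inf = beta * S * I / N
            rec = gamma * I
            S, I, R = S - inf * dt, I + (inf - rec) * dt, R + rec * dt
        return S, I, R

    final_stoch = [gillespie(N - 5, 5, 0) for _ in range(50)]
    S_ode, I_ode, R_ode = mean_field(N - 5, 5, 0)
    avg_R = sum(r for _, _, r in final_stoch) / 50
    ```

    With these rates the average final epidemic size of the individual-based runs sits close to the ODE prediction; the process-algebra contribution is that the mean-field equations are derived rigorously rather than posited.
    
    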

  5. New activity-based funding model for Australian private sector overnight rehabilitation cases: the rehabilitation Australian National Sub-Acute and Non-Acute Patient (AN-SNAP) model.

    PubMed

    Hanning, Brian; Predl, Nicolle

    2015-09-01

    Traditional overnight rehabilitation payment models in the private sector are not based on a rigorous classification system and vary greatly between contracts with no consideration of patient complexity. The payment rates are not based on relative cost and the length-of-stay (LOS) point at which a reduced rate applies (step downs) varies markedly. The rehabilitation Australian National Sub-Acute and Non-Acute Patient (AN-SNAP) model (RAM), which has been in place for over 2 years in some private hospitals, bases payment on a rigorous classification system, relative cost and industry LOS. RAM is in the process of being rolled out more widely. This paper compares and contrasts RAM with traditional overnight rehabilitation payment models. It considers the advantages of RAM for hospitals and Australian Health Service Alliance. It also considers payment model changes in the context of maintaining industry consistency with Electronic Claims Lodgement and Information Processing System Environment (ECLIPSE) and health reform generally.

  6. Random Matrix Theory and the Anderson Model

    NASA Astrophysics Data System (ADS)

    Bellissard, Jean

    2004-08-01

    This paper is devoted to a discussion of possible strategies to prove rigorously the existence of a metal-insulator Anderson transition for the Anderson model in dimension d≥3. The possible criteria used to define such a transition are presented. It is argued that at low disorder the lowest order in perturbation theory is described by a random matrix model. Various simplified versions for which rigorous results have been obtained in the past are discussed. These include a free probability approach, the Wegner n-orbital model and a class of models proposed by Disertori, Pinson, and Spencer, Comm. Math. Phys. 232:83-124 (2002). Finally, a recent work by Magnen, Rivasseau, and the author, Markov Process and Related Fields 9:261-278 (2003), is summarized: it gives a toy model describing the lowest-order approximation of the Anderson model, and it is proved that, for d=2, its density of states is given by the semicircle distribution. A short discussion of its extension to d≥3 follows.
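    The semicircle law mentioned at the end can be checked empirically without any eigen-solver: the moments of the empirical spectral distribution of a Wigner matrix, computable from traces of matrix powers, converge to the Catalan numbers (1, 2, 5, ...), which are the even moments of the semicircle distribution. A minimal sketch, with an assumed matrix size:

    ```python
    import random

    # Scaled symmetric Gaussian (Wigner) matrix: entries variance 1, scaled
    # by 1/sqrt(n) so the spectrum converges to the semicircle on [-2, 2].
    random.seed(0)
    n = 100
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            g = random.gauss(0, 1) / n ** 0.5
            A[i][j] = A[j][i] = g

    def matmul(X, Y):
        return [[sum(X[i][kk] * Y[kk][j] for kk in range(n)) for j in range(n)]
                for i in range(n)]

    A2 = matmul(A, A)
    A4 = matmul(A2, A2)
    m2 = sum(A2[i][i] for i in range(n)) / n   # -> 1 (2nd semicircle moment)
    m4 = sum(A4[i][i] for i in range(n)) / n   # -> 2 (4th semicircle moment)
    ```

    This is of course only the unperturbed random-matrix heuristic; the cited works are about proving such statements for models that approximate the Anderson Hamiltonian.
    
    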

  7. Finite state projection based bounds to compare chemical master equation models using single-cell data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, Zachary; Neuert, Gregor; Department of Pharmacology, School of Medicine, Vanderbilt University, Nashville, Tennessee 37232

    2016-08-21

    Emerging techniques now allow for precise quantification of distributions of biological molecules in single cells. These rapidly advancing experimental methods have created a need for more rigorous and efficient modeling tools. Here, we derive new bounds on the likelihood that observations of single-cell, single-molecule responses come from a discrete stochastic model, posed in the form of the chemical master equation. These strict upper and lower bounds are based on a finite state projection approach, and they converge monotonically to the exact likelihood value. These bounds allow one to discriminate rigorously between models and with a minimum level of computational effort. In practice, these bounds can be incorporated into stochastic model identification and parameter inference routines, which improve the accuracy and efficiency of endeavors to analyze and predict single-cell behavior. We demonstrate the applicability of our approach using simulated data for three example models as well as for experimental measurements of a time-varying stochastic transcriptional response in yeast.
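    The finite state projection idea itself is simple to sketch: truncate the chemical master equation to a finite set of states, let probability that leaves the set be discarded, and use the lost mass as a rigorous error bound. A minimal example for a birth-death (constitutive expression) process, with assumed rates:

    ```python
    import math

    # Minimal FSP sketch: birth-death master equation, birth rate k,
    # degradation rate g*n, truncated to states 0..K (rates are illustrative).
    k, g, K = 10.0, 1.0, 40
    p = [0.0] * (K + 1)
    p[0] = 1.0                      # start with zero molecules
    dt = 1e-3
    for _ in range(8000):           # Euler steps to t = 8 (>> 1/g)
        dp = [0.0] * (K + 1)
        for s in range(K + 1):
            dp[s] -= (k + g * s) * p[s]      # all outflow; birth at K leaks out
            if s < K:
                dp[s + 1] += k * p[s]        # birth s -> s+1
            if s > 0:
                dp[s - 1] += g * s * p[s]    # death s -> s-1
        p = [p[s] + dt * dp[s] for s in range(K + 1)]

    leaked = 1.0 - sum(p)           # FSP's rigorous bound on truncation error
    lam = k / g                     # stationary law is Poisson(k/g)
    poisson = [math.exp(-lam) * lam ** s / math.factorial(s)
               for s in range(K + 1)]
    err = max(abs(a - b) for a, b in zip(p, poisson))
    ```

    Here K = 40 is generous for a Poisson(10) stationary law, so the leaked mass is negligible; the paper builds likelihood bounds for observed single-cell data on top of exactly this kind of monotonically convergent truncation.
    
    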

  8. Modeling of geomagnetic field secular variations observed in the Balkan area for purposes of regional topographic mapping

    NASA Astrophysics Data System (ADS)

    Metodiev, Metodi; Trifonova, Petya; Buchvarov, Ivan

    2014-05-01

    The most significant of the Earth's magnetic field elements is the geomagnetic declination, which is widely used in geodesy, cartography and their associated navigational systems. The geomagnetic declination is incorporated in naval navigation maps and is used in the navigation process. It is also a very important factor for aviation, where declination data have major importance for every airport (civil or military). As the geomagnetic field changes with time but maps of the geomagnetic declination are not published annually and are reduced to an epoch in the past, it is necessary to define two additional parameters in the maps, needed to determine the value of the geomagnetic declination for a particular moment in the future: 1) the estimated value of the annual declination variation and 2) a table with the average diurnal variation of the declination for a given month and hour. The goal of our research is to analyze the annual mean values of geomagnetic declination on the territory of the Balkan Peninsula to obtain a best-fitting model of that parameter which can be used to predict the declination value for the next 10 years. The same study was performed in 1990 for the purposes of preparing the Bulgarian declination map. As a result, a linear model of the annual declination variation was obtained from the data of neighboring observatories and repeat stations, and a map of the obtained values for the Bulgarian territory was drawn. We use the latest version of the GFZ Reference Internal Magnetic Model (GRIMM-3.0) to compare the magnetic field evolution predicted by that model between 2001 and 2010 to the data collected in five independent geomagnetic observatories in the Balkan region (PAG, SUA, PEG, IZN, GCK) over the same time interval. We conclude that the geomagnetic core field secular variation in this area is well described by the global model. 
The observed small-scale differences might indicate induced lithospheric anomalies, but it is still an open question in geomagnetism whether induction by the slowly changing main field in conductive structures in the lithosphere is a measurable part of what is observed as secular variation at and above the Earth's surface. In our study we test different time-scale periods and different-order polynomials to create the most appropriate prediction model and to evaluate our results. We find that the linear models used to determine the annual declination variation in cartography provide sufficiently accurate information for the declination map's users.
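    The linear annual-variation model amounts to an ordinary least-squares fit of annual mean declination against year, extrapolated forward. A sketch with synthetic values (the numbers below are invented for illustration, not observatory data):

    ```python
    # Fit a linear trend to annual mean declination (synthetic values, in
    # degrees, drifting ~0.1 deg/yr) and extrapolate a decade ahead.
    years = list(range(2001, 2011))
    decl = [2.50, 2.61, 2.69, 2.82, 2.90, 3.01, 3.11, 3.19, 3.31, 3.40]

    n = len(years)
    xbar = sum(years) / n
    ybar = sum(decl) / n
    sxx = sum((x - xbar) ** 2 for x in years)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(years, decl))
    slope = sxy / sxx                  # annual declination variation, deg/yr
    intercept = ybar - slope * xbar

    predict_2020 = intercept + slope * 2020
    ```

    Higher-order polynomials can be fitted the same way with more regressors; the study's conclusion is that for cartographic purposes the linear term already captures the secular variation adequately over a 10-year horizon.
    
    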

  9. Parent Management Training-Oregon Model: Adapting Intervention with Rigorous Research.

    PubMed

    Forgatch, Marion S; Kjøbli, John

    2016-09-01

    Parent Management Training-Oregon Model (PMTO®) is a set of theory-based parenting programs with status as evidence-based treatments. PMTO has been rigorously tested in efficacy and effectiveness trials in different contexts, cultures, and formats. Parents, the presumed agents of change, learn core parenting practices, specifically skill encouragement, limit setting, monitoring/supervision, interpersonal problem solving, and positive involvement. The intervention effectively prevents and ameliorates children's behavior problems by replacing coercive interactions with positive parenting practices. Delivery format includes sessions with individual families in agencies or families' homes, parent groups, and web-based and telehealth communication. Mediational models have tested parenting practices as mechanisms of change for children's behavior and found support for the theory underlying PMTO programs. Moderating effects include children's age, maternal depression, and social disadvantage. The Norwegian PMTO implementation is presented as an example of how PMTO has been tailored to reach diverse populations as delivered by multiple systems of care throughout the nation. An implementation and research center in Oslo provides infrastructure and promotes collaboration between practitioners and researchers to conduct rigorous intervention research. Although evidence-based and tested within a wide array of contexts and populations, PMTO must continue to adapt to an ever-changing world. © 2016 Family Process Institute.

  10. Analytical Versus Numerical Estimates of Water-Level Declines Caused by Pumping, and a Case Study of the Iao Aquifer, Maui, Hawaii

    USGS Publications Warehouse

    Oki, Delwyn S.; Meyer, William

    2001-01-01

    Comparisons were made between model-calculated water levels from a one-dimensional analytical model referred to as RAM (Robust Analytical Model) and those from numerical ground-water flow models using a sharp-interface model code. RAM incorporates the horizontal-flow assumption and the Ghyben-Herzberg relation to represent flow in a one-dimensional unconfined aquifer that contains a body of freshwater floating on denser saltwater. RAM does not account for the presence of a low-permeability coastal confining unit (caprock), which impedes the discharge of fresh ground water from the aquifer to the ocean, nor for the spatial distribution of ground-water withdrawals from wells, which is significant because water-level declines are greatest in the vicinity of withdrawal wells. Numerical ground-water flow models can readily account for discharge through a coastal confining unit and for the spatial distribution of ground-water withdrawals from wells. For a given aquifer hydraulic-conductivity value, recharge rate, and withdrawal rate, model-calculated steady-state water-level declines from RAM can be significantly less than those from numerical ground-water flow models. The differences between model-calculated water-level declines from RAM and those from numerical models are partly dependent on the hydraulic properties of the aquifer system and the spatial distribution of ground-water withdrawals from wells. RAM invariably predicts the greatest water-level declines at the inland extent of the aquifer where the freshwater body is thickest and the potential for saltwater intrusion is lowest. For cases in which a low-permeability confining unit overlies the aquifer near the coast, however, water-level declines calculated from numerical models may exceed those from RAM even at the inland extent of the aquifer. Since 1990, RAM has been used by the State of Hawaii Commission on Water Resource Management for establishing sustainable-yield values for the State's aquifers. 
Data from the Iao aquifer, which lies on the northeastern flank of the West Maui Volcano and which is confined near the coast by caprock, are now available to evaluate the predictive capability of RAM for this system. In 1995 and 1996, withdrawal from the Iao aquifer reached the 20 million gallon per day sustainable-yield value derived using RAM. However, even before 1996, water levels in the aquifer had declined significantly below those predicted by RAM, and continued to decline in 1997. To halt the decline of water levels and to preclude the intrusion of salt-water into the four major well fields in the aquifer, it was necessary to reduce withdrawal from the aquifer system below the sustainable-yield value derived using RAM. In the Iao aquifer, the decline of measured water levels below those predicted by RAM is consistent with the results of the numerical model analysis. Relative to model-calculated water-level declines from numerical ground-water flow models, (1) RAM underestimates water-level declines in areas where a low-permeability confining unit exists, and (2) RAM underestimates water-level declines in the vicinity of withdrawal wells.
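    The abstract does not reproduce RAM's equations, but the flavor of such an analytical calculation can be sketched with a generic Dupuit/Ghyben-Herzberg strip aquifer (all parameter values below are assumed for illustration; this is not RAM itself): with the freshwater lens extending roughly 40 times the head below sea level, steady-state head at the inland divide scales with the square root of net recharge, so pumping that removes part of the recharge lowers the head accordingly.

    ```python
    # Hedged sketch of a RAM-style steady-state calculation (illustrative
    # parameters): 1D coastal strip, uniform recharge, Dupuit flow, and the
    # Ghyben-Herzberg relation (lens depth ~ 40 * head above sea level).
    K = 500.0     # hydraulic conductivity, m/day (assumed)
    L = 5000.0    # divide-to-coast distance, m (assumed)
    R = 0.001     # recharge rate, m/day (assumed)

    def divide_head(net_recharge):
        # q(x) = net_recharge * x and q = -(1 + 40) * K * h * dh/dx,
        # with h = 0 at the coast x = L, gives h(0) = L * sqrt(R / (41 K)).
        return L * (net_recharge / (41.0 * K)) ** 0.5

    h0 = divide_head(R)              # undisturbed head at the divide
    h_pumped = divide_head(0.7 * R)  # pumping consumes 30% of recharge
    decline = h0 - h_pumped
    ```

    Note that this lumped treatment, like RAM, spreads the withdrawal over the whole strip; the article's point is that real declines concentrate near wells and behind caprock, which is why the numerical models predict larger declines than such analytical formulas.
    
    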

  11. Determinants of the Rigor of State Protection Policies for Persons With Dementia in Assisted Living.

    PubMed

    Nattinger, Matthew C; Kaskie, Brian

    2017-01-01

    Continued growth in the number of individuals with dementia residing in assisted living (AL) facilities raises concerns about their safety and protection. However, unlike federally regulated nursing facilities, AL facilities are state-regulated and there is a high degree of variation among policies designed to protect persons with dementia. Despite the important role these protection policies have in shaping the quality of life of persons with dementia residing in AL facilities, little is known about their formation. In this research, we examined the adoption of AL protection policies pertaining to staffing, the physical environment, and the use of chemical restraints. For each protection policy type, we modeled policy rigor using an innovative point-in-time approach, incorporating variables associated with state contextual, institutional, political, and external factors. We found that the rate of state AL protection policy adoptions remained steady over the study period, with staffing policies becoming less rigorous over time. Variables reflecting institutional policy making, including legislative professionalism and bureaucratic oversight, were associated with the rigor of state AL dementia protection policies. As we continue to evaluate the mechanisms contributing to the rigor of AL protection policies, it seems that organized advocacy efforts might expand their role in educating state policy makers about the importance of protecting persons with dementia residing in AL facilities and moving to advance appropriate policies.

  12. Heterogeneous nucleation on convex spherical substrate surfaces: A rigorous thermodynamic formulation of Fletcher's classical model and the new perspectives derived.

    PubMed

    Qian, Ma; Ma, Jie

    2009-06-07

    Fletcher's spherical substrate model [J. Chem. Phys. 29, 572 (1958)] is a basic model for understanding the heterogeneous nucleation phenomena in nature. However, a rigorous thermodynamic formulation of the model has been missing due to the significant complexities involved. This has not only left the classical model deficient but also likely obscured its other important features, which would otherwise have helped to better understand and control heterogeneous nucleation on spherical substrates. This work presents a rigorous thermodynamic formulation of Fletcher's model using a novel analytical approach and discusses the new perspectives derived. In particular, it is shown that the use of an intermediate variable, a selected geometrical angle or pseudocontact angle between the embryo and spherical substrate, revealed extraordinary similarities between the first derivatives of the free energy change with respect to embryo radius for nucleation on spherical and flat substrates. Enlightened by the discovery, it was found that there exists a local maximum in the difference between the equivalent contact angles for nucleation on spherical and flat substrates due to the existence of a local maximum in the difference between the shape factors for nucleation on spherical and flat substrate surfaces. This helps to understand the complexity of the heterogeneous nucleation phenomena in a practical system. Also, it was found that the unfavorable size effect occurs primarily when R < 5r* (R: radius of substrate; r*: critical embryo radius) and diminishes rapidly with increasing value of R/r* beyond R/r* = 5. This finding provides a baseline for controlling the size effects in heterogeneous nucleation.
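    The shape factor at the heart of Fletcher's model is straightforward to evaluate numerically. The sketch below implements the commonly quoted form of Fletcher's f(m, x), where m is the cosine of the contact angle and x = R/r*, and checks its two limits: x → 0 recovers homogeneous nucleation (f = 1) and x → ∞ recovers the flat-substrate factor (2 − 3m + m³)/4. The chosen m is an arbitrary illustration.

    ```python
    def fletcher_f(m, x):
        # Fletcher (1958) shape factor for nucleation on a convex spherical
        # substrate; m = cos(contact angle), x = R / r*.
        g = (1 + x * x - 2 * m * x) ** 0.5
        a = (1 - m * x) / g
        b = (x - m) / g
        return 0.5 * (1 + a ** 3 + x ** 3 * (2 - 3 * b + b ** 3)
                      + 3 * m * x * x * (b - 1))

    m = 0.5
    flat = (2 - 3 * m + m ** 3) / 4    # flat-substrate limit
    small = fletcher_f(m, 1e-6)        # -> 1 (homogeneous limit)
    big = fletcher_f(m, 1e4)           # -> flat-substrate value
    mid = fletcher_f(m, 5.0)           # near the R = 5 r* threshold
    ```

    For a convex substrate f lies between the flat value and 1, i.e., the spherical geometry raises the nucleation barrier; evaluating f over a range of x makes the paper's R < 5r* size-effect threshold visible.
    
    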

  13. Extreme Response Style: Which Model Is Best?

    ERIC Educational Resources Information Center

    Leventhal, Brian

    2017-01-01

    More robust and rigorous psychometric models, such as multidimensional Item Response Theory models, have been advocated for survey applications. However, item responses may be influenced by construct-irrelevant variance factors such as preferences for extreme response options. Through empirical and simulation methods, this study evaluates the use…

  14. Rigorous derivation of the effective model describing a non-isothermal fluid flow in a vertical pipe filled with porous medium

    NASA Astrophysics Data System (ADS)

    Beneš, Michal; Pažanin, Igor

    2018-03-01

    This paper reports an analytical investigation of non-isothermal fluid flow in a thin (or long) vertical pipe filled with porous medium via asymptotic analysis. We assume that the fluid inside the pipe is cooled (or heated) by the surrounding medium and that the flow is governed by the prescribed pressure drop between the pipe's ends. Starting from the dimensionless Darcy-Brinkman-Boussinesq system, we formally derive a macroscopic model describing the effective flow at small Brinkman-Darcy number. The asymptotic approximation is given by the explicit formulae for the velocity, pressure and temperature clearly acknowledging the effects of the cooling (heating) and porous structure. The theoretical error analysis is carried out to indicate the order of accuracy and to provide a rigorous justification of the effective model.

  15. Development of rigor mortis is not affected by muscle volume.

    PubMed

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  16. Hypothesis testing of a change point during cognitive decline among Alzheimer's disease patients.

    PubMed

    Ji, Ming; Xiong, Chengjie; Grundman, Michael

    2003-10-01

    In this paper, we present a statistical hypothesis test for detecting a change point over the course of cognitive decline among Alzheimer's disease patients. The model under the null hypothesis assumes a constant rate of cognitive decline over time, and the model under the alternative hypothesis is a general bilinear model with an unknown change point. When the change point is unknown, however, the null distribution of the test statistic is not analytically tractable and has to be simulated by parametric bootstrap. When the alternative hypothesis that a change point exists is accepted, we propose an estimate of its location based on the Akaike Information Criterion. We applied our method to a data set from the Neuropsychological Database Initiative, using our hypothesis testing procedure to analyze Mini-Mental State Examination (MMSE) scores based on a random-slope and random-intercept model with a bilinear fixed effect. Our result shows that, despite a large amount of missing data, accelerated decline did occur for MMSE among AD patients. Our finding supports the clinical belief in the existence of a change point during cognitive decline among AD patients and suggests the use of change point models for the longitudinal modeling of cognitive decline in AD research.
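    A stripped-down version of the change-point estimation step can be sketched as follows. This is not the paper's mixed-effects model: it simulates a single bilinear trajectory with hypothetical parameters (baseline 28, slopes −0.5 then −2.0, change point at 4 years) and recovers the change point by grid search, fitting a separate least-squares line on each side of each candidate and minimizing the total squared error.

    ```python
    import random

    # Synthetic MMSE-like scores: decline steepens at tau_true (all values
    # hypothetical, chosen only to illustrate the grid-search idea).
    random.seed(3)
    tau_true = 4.0
    ts = [i * 0.5 for i in range(17)]                   # 0..8 years
    ys = [28.0 - 0.5 * min(t, tau_true) - 2.0 * max(t - tau_true, 0.0)
          + random.gauss(0.0, 0.2) for t in ts]

    def segment_sse(pts):
        n = len(pts)
        mx = sum(t for t, _ in pts) / n
        my = sum(y for _, y in pts) / n
        sxx = sum((t - mx) ** 2 for t, _ in pts)
        b = sum((t - mx) * (y - my) for t, y in pts) / sxx
        a = my - b * mx
        return sum((y - (a + b * t)) ** 2 for t, y in pts)

    def total_sse(tau):
        left = [(t, y) for t, y in zip(ts, ys) if t <= tau]
        right = [(t, y) for t, y in zip(ts, ys) if t > tau]
        return segment_sse(left) + segment_sse(right)

    candidates = ts[2:-2]               # keep a few points on each side
    best_tau = min(candidates, key=total_sse)
    ```

    In the paper the candidate fits are compared via AIC within a continuous bilinear random-effects model, and the null distribution of the test statistic is obtained by parametric bootstrap rather than by this simple SSE comparison.
    
    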

  17. Evidence for adaptive radiation from a phylogenetic study of plant defenses

    PubMed Central

    Agrawal, Anurag A.; Fishbein, Mark; Halitschke, Rayko; Hastings, Amy P.; Rabosky, Daniel L.; Rasmann, Sergio

    2009-01-01

    One signature of adaptive radiation is a high level of trait change early during the diversification process and a plateau toward the end of the radiation. Although the study of the tempo of evolution has historically been the domain of paleontologists, recently developed phylogenetic tools allow for the rigorous examination of trait evolution in a tremendous diversity of organisms. Enemy-driven adaptive radiation was a key prediction of Ehrlich and Raven's coevolutionary hypothesis [Ehrlich PR, Raven PH (1964) Evolution 18:586–608], yet has remained largely untested. Here we examine patterns of trait evolution in 51 North American milkweed species (Asclepias), using maximum likelihood methods. We study 7 traits of the milkweeds, ranging from seed size and foliar physiological traits to defense traits (cardenolides, latex, and trichomes) previously shown to impact herbivores, including the monarch butterfly. We compare the fit of simple random-walk models of trait evolution to models that incorporate stabilizing selection (Ornstein-Uhlenbeck process), as well as time-varying rates of trait evolution. Early bursts of trait evolution were implicated for 2 traits, while stabilizing selection was implicated for several others. We further modeled the relationship between trait change and species diversification while allowing rates of trait evolution to vary during the radiation. Species-rich lineages underwent a proportionately greater decline in latex and cardenolides relative to species-poor lineages, and the rate of trait change was most rapid early in the radiation. An interpretation of this result is that reduced investment in defensive traits accelerated diversification, and disproportionately so, early in the adaptive radiation of milkweeds. PMID:19805160
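    The qualitative difference between the two trait-evolution models being compared is easy to demonstrate by simulation (parameter values below are illustrative, not estimates from the milkweed data): under Brownian motion the trait variance across lineages grows linearly with time, while under an Ornstein-Uhlenbeck process with stabilizing selection it saturates near sigma²/(2·alpha).

    ```python
    import random

    # Simulate many independent lineages under Brownian motion (BM) and under
    # an Ornstein-Uhlenbeck (OU) process pulling back toward an optimum at 0.
    random.seed(7)
    dt, steps = 0.01, 1000              # total time T = 10
    sigma, alpha = 1.0, 1.0             # illustrative rates
    bm_end, ou_end = [], []
    for _ in range(1000):               # replicate lineages
        b = o = 0.0
        for _ in range(steps):
            dw = random.gauss(0.0, dt ** 0.5)
            b += sigma * dw                     # BM: pure random walk
            o += -alpha * o * dt + sigma * dw   # OU: stabilizing pull
        bm_end.append(b)
        ou_end.append(o)

    var_bm = sum(x * x for x in bm_end) / len(bm_end)   # ~ sigma^2 * T = 10
    var_ou = sum(x * x for x in ou_end) / len(ou_end)   # ~ sigma^2/(2 alpha)
    ```

    Likelihood-based model comparison on a phylogeny exploits exactly this divergence-vs-plateau contrast in the expected trait covariances among species.
    
    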

  18. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  19. Bioeconomic and market models

    Treesearch

    Richard Haynes; Darius Adams; Peter Ince; John Mills; Ralph Alig

    2006-01-01

    The United States has a century of experience with the development of models that describe markets for forest products and trends in resource conditions. In the last four decades, increasing rigor in policy debates has stimulated the development of models to support policy analysis. Increasingly, research has evolved (often relying on computer-based models) to increase...

  20. A square-force cohesion model and its extraction from bulk measurements

    NASA Astrophysics Data System (ADS)

    Liu, Peiyuan; Lamarche, Casey; Kellogg, Kevin; Hrenya, Christine

    2017-11-01

    Cohesive particles remain poorly understood, with order of magnitude differences exhibited for prior, physical predictions of agglomerate size. A major obstacle lies in the absence of robust models of particle-particle cohesion, thereby precluding accurate prediction of the behavior of cohesive particles. Rigorous cohesion models commonly contain parameters related to surface roughness, to which cohesion shows extreme sensitivity. However, both roughness measurement and its distillation into these model parameters are challenging. Accordingly, we propose a "square-force" model, where cohesive force remains constant until a cut-off separation. Via DEM simulations, we demonstrate validity of the square-force model as surrogate of more rigorous models, when its two parameters are selected to match the two key quantities governing dense and dilute granular flows, namely maximum cohesive force and critical cohesive energy, respectively. Perhaps more importantly, we establish a method to extract the parameters in the square-force model via defluidization, due to its ability to isolate the effects of the two parameters. Thus, instead of relying on complicated scans of individual grains, determination of particle-particle cohesion from simple bulk measurements becomes feasible. Dow Corning Corporation.
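    The two-parameter matching can be made concrete with a small sketch. The "detailed" force law below is a hypothetical inverse-square attraction with made-up constants (the abstract does not specify one); the square-force surrogate keeps the same maximum force F_max and, by choosing the cut-off separation delta = E_crit / F_max, the same integrated cohesive energy.

    ```python
    # Hypothetical detailed cohesion law: F(s) = A / (s + r0)^2 for
    # separation s in [0, cutoff]; A, r0, cutoff are illustrative constants.
    A, r0, cutoff = 1e-19, 1e-9, 1e-7

    F_max = A / r0 ** 2                          # maximum force (at contact)
    # critical cohesive energy: integral of F(s) ds from 0 to cutoff
    E_crit = A * (1.0 / r0 - 1.0 / (cutoff + r0))

    # Square-force surrogate: constant force F_max out to delta, zero beyond.
    # Matching both F_max and E_crit fixes the cut-off separation:
    delta = E_crit / F_max
    ```

    By construction the surrogate reproduces both bulk-relevant quantities; the paper's contribution is showing via DEM that this is sufficient for flow predictions, and that delta and F_max can be backed out of defluidization measurements rather than grain-scale scans.
    
    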

  1. The alterations in adenosine nucleotides and lactic acid in striated muscles of rats during Rigor mortis following death with drowning or cervical dislocation.

    PubMed

    Pençe, Halime Hanim; Pençe, Sadrettin; Kurtul, Naciye; Yilmaz, Necat; Kocoglu, Hasan; Bakan, Ebubekir

    2003-01-01

    In this study, adenosine triphosphate (ATP), adenosine diphosphate (ADP), adenosine monophosphate (AMP) and lactic acid in the masseter, triceps, and quadriceps muscles obtained from the right and left sides of Sprague-Dawley rats following death were investigated. The samples were taken immediately and 120 minutes after death occurred. The rats were killed either by cervical dislocation or by drowning. ATP concentrations in the masseter, triceps, and quadriceps muscles were lower in samples obtained 120 minutes after death than in those obtained immediately after death. ADP, AMP, and lactic acid concentrations in these muscles were higher in samples obtained 120 minutes after death than in those obtained immediately after death. A positive linear correlation was determined between ATP and ADP concentrations in the quadriceps muscles of the rats killed by cervical dislocation and in the triceps muscles of the rats killed by drowning. When rats killed by cervical dislocation and by drowning were compared, ADP, AMP, and lactic acid concentrations were lower in the former than in the latter at both times (immediately and 120 minutes after death). In the case of drowning, ATP is consumed faster because of hard exercise or severe physical activity, resulting in a faster rigor mortis. Higher lactic acid levels were determined in the muscles of the rats killed by drowning than in the other group. In the control (cervical dislocation) group, ATP decreased to different extents in the three muscle types mentioned above, with the greatest decline in the masseter, followed by the quadriceps. This may be caused by the lower mass and smaller glycogen store of the masseter. No differing ATP levels were measured in the drowning group with respect to muscle type, possibly because of the severe activity of the triceps and quadriceps and because of the smaller mass of the masseter. One can conclude that the occurrence of rigor mortis is closely related to the mode of death.

  2. Fractional Stochastic Differential Equations Satisfying Fluctuation-Dissipation Theorem

    NASA Astrophysics Data System (ADS)

    Li, Lei; Liu, Jian-Guo; Lu, Jianfeng

    2017-10-01

    We propose in this work a fractional stochastic differential equation (FSDE) model consistent with the over-damped limit of the generalized Langevin equation model. As a consequence of the fluctuation-dissipation theorem, differential equations driven by fractional Brownian noise to model memory effects should be paired with Caputo derivatives, and this FSDE model should be understood in an integral form. We establish the existence of strong solutions for such equations and discuss ergodicity and convergence to the Gibbs measure. In the linear forcing regime, we show rigorously the algebraic convergence to the Gibbs measure when the fluctuation-dissipation theorem is satisfied, which verifies that satisfying the fluctuation-dissipation theorem indeed leads to the correct physical behavior. We further discuss possible approaches to analyzing ergodicity and convergence to the Gibbs measure in the nonlinear forcing regime, leaving the rigorous analysis to future work. The proposed FSDE model is suitable for systems in contact with a heat bath with a power-law memory kernel and subdiffusive behavior.
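
    The abstract's setting, over-damped dynamics driven by fractional Brownian noise, can be illustrated numerically. The sketch below is an assumption-laden illustration, not the paper's Caputo-derivative FSDE: it generates exact fractional Gaussian noise by Cholesky factorization of the fGn autocovariance and feeds it into a naive Euler discretization of a particle in a harmonic well. The function names and all parameter values are invented for illustration.

```python
import numpy as np

def fgn_cholesky(n, hurst, rng):
    """Exact fractional Gaussian noise via Cholesky factorization of the
    fGn autocovariance matrix (O(n^2) memory; fine for small n)."""
    k = np.arange(n)
    # Autocovariance of unit-step fGn: gamma(k) = 0.5*(|k+1|^2H - 2|k|^2H + |k-1|^2H)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst)
                   - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # small jitter for stability
    return L @ rng.standard_normal(n)

def overdamped_path(n_steps=500, dt=0.01, stiffness=1.0, hurst=0.7, seed=0):
    """Euler discretization of dx = -k*x dt + dB_H: a harmonic well driven
    by fractional Brownian increments (illustrative values only)."""
    rng = np.random.default_rng(seed)
    # fBm increments over a step of size dt have standard deviation dt**H
    noise = fgn_cholesky(n_steps, hurst, rng) * dt ** hurst
    x = np.empty(n_steps + 1)
    x[0] = 1.0
    for i in range(n_steps):
        x[i + 1] = x[i] - stiffness * x[i] * dt + noise[i]
    return x

path = overdamped_path()
print(path.shape, np.isfinite(path).all())
```

    For Hurst index H = 1/2 this reduces to ordinary Brownian noise; H > 1/2 gives the positively correlated increments associated with subdiffusive memory effects.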

  3. Rigorous description of holograms of particles illuminated by an astigmatic elliptical Gaussian beam

    NASA Astrophysics Data System (ADS)

    Yuan, Y. J.; Ren, K. F.; Coëtmellec, S.; Lebrun, D.

    2009-02-01

    Digital holography is a non-intrusive optical metrology technique well adapted to measuring the size and velocity fields of particles in a fluid spray. The simplified model of an opaque disk is often used in the processing of the holograms, so the refraction and the third-dimension diffraction of the particle are not taken into account. In this paper we present a rigorous description of the holographic patterns and evaluate the effects of refraction and third-dimension diffraction by comparison with the opaque disk model. The effects are found to be important when the real part of the refractive index is near unity or the imaginary part is nonzero but small.

  4. Engineering education as a complex system

    NASA Astrophysics Data System (ADS)

    Gattie, David K.; Kellam, Nadia N.; Schramski, John R.; Walther, Joachim

    2011-12-01

    This paper presents a theoretical basis for cultivating engineering education as a complex system that will prepare students to think critically and make decisions with regard to poorly understood, ill-structured issues. Integral to this theoretical basis is a solution space construct developed and presented as a benchmark for evaluating problem-solving orientations that emerge within students' thinking as they progress through an engineering curriculum. It is proposed that the traditional engineering education model, while analytically rigorous, is characterised by properties that, although necessary, are insufficient for preparing students to address complex issues of the twenty-first century. A Synthesis and Design Studio model for engineering education is proposed, which maintains the necessary rigor of analysis within a uniquely complex yet sufficiently structured learning environment.

  5. A Prospective Test of Cognitive Vulnerability Models of Depression with Adolescent Girls

    ERIC Educational Resources Information Center

    Bohon, Cara; Stice, Eric; Burton, Emily; Fudell, Molly; Nolen-Hoeksema, Susan

    2008-01-01

    This study sought to provide a more rigorous prospective test of two cognitive vulnerability models of depression with longitudinal data from 496 adolescent girls. Results supported the cognitive vulnerability model in that stressors predicted future increases in depressive symptoms and onset of clinically significant major depression for…

  6. Learning, Judgment, and the Rooted Particular

    ERIC Educational Resources Information Center

    McCabe, David

    2012-01-01

    This article begins by acknowledging the general worry that scholarship in the humanities lacks the rigor and objectivity of other scholarly fields. In considering the validity of that criticism, I distinguish two models of learning: the covering law model exemplified by the natural sciences, and the model of rooted particularity that…

  7. Kinetic Modeling of Human Hepatic Glucose Metabolism in Type 2 Diabetes Mellitus Predicts Higher Risk of Hypoglycemic Events in Rigorous Insulin Therapy*

    PubMed Central

    König, Matthias; Holzhütter, Hermann-Georg

    2012-01-01

    A major problem in the insulin therapy of patients with type 2 diabetes (T2DM) is the increased occurrence of hypoglycemic events which, if left untreated, may cause confusion or fainting and, in severe cases, seizures, coma, and even death. To elucidate the potential contribution of the liver to hypoglycemia in T2DM we applied a detailed kinetic model of human hepatic glucose metabolism to simulate changes in glycolysis, gluconeogenesis, and glycogen metabolism induced by deviations of the hormones insulin, glucagon, and epinephrine from their normal plasma profiles. In line with experimental and clinical data from a multitude of studies of T2DM, our simulations reveal (i) significant changes in the relative contribution of glycolysis, gluconeogenesis, and glycogen metabolism to hepatic glucose production and hepatic glucose utilization; (ii) decreased postprandial glycogen storage as well as increased glycogen depletion in overnight fasting and short-term fasting; and (iii) a shift of the set point defining the switch between hepatic glucose production and hepatic glucose utilization to elevated plasma glucose levels in T2DM relative to normal, healthy subjects. Intriguingly, our model simulations predict a restricted gluconeogenic response of the liver under the impaired hormonal signals observed in T2DM, resulting in an increased risk of hypoglycemia. The inability of hepatic glucose metabolism to effectively counterbalance a decline of the blood glucose level becomes even more pronounced in the case of tightly controlled insulin treatment. Given this Janus-faced mode of action of insulin, our model simulations underline the great potential that normalization of the plasma glucagon profile may have for the treatment of T2DM. PMID:22977253

  8. Longitudinal effects of religious involvement on religious coping and health behaviors in a national sample of African Americans.

    PubMed

    Holt, Cheryl L; Roth, David L; Huang, Jin; Park, Crystal L; Clark, Eddie M

    2017-08-01

    Many studies have examined associations between religious involvement and health, linking various dimensions of religion with a range of physical health outcomes and often hypothesizing influences on health behaviors. However, far fewer studies have examined explanatory mechanisms of the religion-health connection, and most have overwhelmingly relied on cross-sectional analyses. Given the relatively high levels of religious involvement among African Americans and the important role that religious coping styles may play in health, the present study tested a longitudinal model of religious coping as a potential mediator of a multidimensional religious involvement construct (beliefs; behaviors) on multiple health behaviors (e.g., diet, physical activity, alcohol use, cancer screening). A national probability sample of African Americans was enrolled in the RHIAA (Religion and Health In African Americans) study and three waves of telephone interviews were conducted over a 5-year period (N = 565). Measurement models were fit followed by longitudinal structural models. Positive religious coping decreased modestly over time in the sample, but these reductions were attenuated for participants with stronger religious beliefs and behaviors. Decreases in negative religious coping were negligible and were not associated with either religious beliefs or religious behaviors. Religious coping was not associated with change in any of the health behaviors over time, precluding the possibility of a longitudinal mediational effect. Thus, mediation observed in previous cross-sectional analyses was not confirmed in this more rigorous longitudinal model over a 5-year period. However, findings do point to the role that religious beliefs have in protecting against declines in positive religious coping over time, which may have implications for pastoral counseling and other faith-based interventions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Modelling Future Cardiovascular Disease Mortality in the United States: National Trends and Racial and Ethnic Disparities

    PubMed Central

    Pearson-Stuttard, Jonathan; Guzman-Castillo, Maria; Penalvo, Jose L.; Rehm, Colin D.; Afshin, Ashkan; Danaei, Goodarz; Kypridemos, Chris; Gaziano, Tom; Mozaffarian, Dariush; Capewell, Simon; O’Flaherty, Martin

    2016-01-01

    Background Accurate forecasting of cardiovascular disease (CVD) mortality is crucial to guide policy and programming efforts. Prior forecasts have often not incorporated past trends in rates of reduction in CVD mortality. This creates uncertainties about future trends in CVD mortality and disparities. Methods and Results To forecast US CVD mortality and disparities to 2030, we developed a hierarchical Bayesian model to determine and incorporate prior age, period, and cohort (APC) effects from 1979–2012, stratified by age, gender, and race, which we combined with expected demographic shifts to 2030. Data sources included the National Vital Statistics System, SEER single-year population estimates, and US Bureau of Statistics 2012 National Population projections. We projected coronary disease and stroke deaths to 2030, first based on constant APC effects at 2012 values, as is most commonly done (conventional), and then using more rigorous projections incorporating expected trends in APC effects (trend-based). We primarily evaluated absolute mortality. The conventional model projected that by 2030 total coronary and stroke deaths would increase by approximately 18% (67,000 additional coronary deaths/year) and 50% (64,000 additional stroke deaths/year), respectively. Conversely, the trend-based model projected that coronary mortality would fall by approximately 27% by 2030 (79,000 fewer deaths/year) and that stroke mortality would remain unchanged (200 fewer deaths/year). Health disparities will be improved in stroke deaths, but not coronary deaths. Conclusions After accounting for prior mortality trends and expected demographic shifts, total US coronary deaths are expected to decline, while stroke mortality will remain relatively constant. Health disparities in stroke, but not coronary, deaths will be improved but not eliminated. These APC approaches offer more plausible predictions than conventional estimates. PMID:26846769
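
    The contrast between the "conventional" projection (holding the last observed rate constant) and a "trend-based" projection (extrapolating the historical decline) can be sketched with synthetic numbers. Everything below, including the 3%/year decline, is invented for illustration; this is not the paper's hierarchical Bayesian APC model.

```python
import numpy as np

# Synthetic annual CVD mortality rates per 100,000 (illustrative, not real data)
years = np.arange(1979, 2013)
rates = 400.0 * np.exp(-0.03 * (years - 1979))  # ~3%/yr historical decline

# Conventional projection: hold the final observed (2012) rate constant to 2030
conventional_2030 = rates[-1]

# Trend-based projection: fit log(rate) ~ year and extrapolate to 2030
slope, intercept = np.polyfit(years, np.log(rates), 1)
trend_2030 = np.exp(intercept + slope * 2030)

print(round(conventional_2030, 1), round(trend_2030, 1))
```

    When past declines are real and persistent, the constant-rate assumption overstates future mortality, which is the qualitative pattern the study reports for coronary deaths.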

  10. Conservation committee report. Falconry: Effects on raptor populations and management in North America

    USGS Publications Warehouse

    Braun, C.E.; Enderson, J.H.; Henny, C.J.; Meng, H.; Nye, A.G.

    1977-01-01

    The art of falconry in North America, practiced by a few individuals for many years, attracted little attention until the 1960s. Presently about 2800 falconers are licensed in the United States with less than one half considered to be active. While interest in this art is expected to increase, we believe growth will be slow, probably 5 to 10% per year, due to rigorous demands on time and equipment required and restrictive regulations. ... Many different species of raptors have been used in falconry. Presently 6 species are commonly used, especially the Red-tailed Hawk and American Kestrel. Present evidence suggests that only 2 races of the Peregrine Falcon are threatened in North America, and declines may have occurred in local populations of other species. Declines in populations of Peregrines are attributed to pesticide contamination of food chains. Apparent declines in other populations of raptors are also attributed to pesticides and locally to changes in land use and possibly indiscriminate shooting. Removal of raptors from wild populations for falconry has not had documentable adverse effects except possibly at local nesting sites. Continuation of the art of falconry under the framework of the recent federal regulations is not expected to have measurable impacts on region-wide populations. Management of raptors is poorly developed and relatively unexplored. Captive breeding of raptors holds much promise for production of birds both for re-establishment and as a source of birds for falconry. Falconers have contributed much to the continued improvement of the Cornell University Peregrine program in terms of breeding stocks and technique development.

  11. Predicting Speech Intelligibility Decline in Amyotrophic Lateral Sclerosis Based on the Deterioration of Individual Speech Subsystems

    PubMed Central

    Yunusova, Yana; Wang, Jun; Zinman, Lorne; Pattee, Gary L.; Berry, James D.; Perry, Bridget; Green, Jordan R.

    2016-01-01

    Purpose To determine the mechanisms of speech intelligibility impairment due to neurologic impairments, intelligibility decline was modeled as a function of co-occurring changes in the articulatory, resonatory, phonatory, and respiratory subsystems. Method Sixty-six individuals diagnosed with amyotrophic lateral sclerosis (ALS) were studied longitudinally. The disease-related changes in articulatory, resonatory, phonatory, and respiratory subsystems were quantified using multiple instrumental measures, which were subjected to a principal component analysis and mixed effects models to derive a set of speech subsystem predictors. A stepwise approach was used to select the best set of subsystem predictors to model the overall decline in intelligibility. Results Intelligibility was modeled as a function of five predictors that corresponded to velocities of lip and jaw movements (articulatory), number of syllable repetitions in the alternating motion rate task (articulatory), nasal airflow (resonatory), maximum fundamental frequency (phonatory), and speech pauses (respiratory). The model accounted for 95.6% of the variance in intelligibility, among which the articulatory predictors showed the most substantial independent contribution (57.7%). Conclusion Articulatory impairments characterized by reduced velocities of lip and jaw movements and resonatory impairments characterized by increased nasal airflow served as the subsystem predictors of the longitudinal decline of speech intelligibility in ALS. Declines in maximum performance tasks such as the alternating motion rate preceded declines in intelligibility, thus serving as early predictors of bulbar dysfunction. Following the rapid decline in speech intelligibility, a precipitous decline in maximum performance tasks subsequently occurred. PMID:27148967
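
    The analysis pipeline described, instrumental measures reduced by principal component analysis and then regressed against intelligibility, can be sketched as follows. The data, loadings, noise levels, and the two-component choice are synthetic stand-ins, not the study's actual measures or its mixed-effects modeling.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the study design: 66 speakers, several instrumental
# measures per subsystem (values and structure are invented for illustration)
n = 66
articulatory = rng.normal(size=(n, 3))   # e.g. lip/jaw speeds, AMR syllable rate
resonatory = rng.normal(size=(n, 1))     # e.g. nasal airflow
X_raw = np.hstack([articulatory, resonatory])

# Principal components via SVD on standardized measures
Xs = (X_raw - X_raw.mean(0)) / X_raw.std(0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt.T[:, :2]                # keep two subsystem components

# Intelligibility generated from the first component plus noise (synthetic)
intelligibility = 90 - 8 * scores[:, 0] + rng.normal(scale=2, size=n)

# Ordinary least squares of intelligibility on the component scores
A = np.column_stack([np.ones(n), scores])
coef, *_ = np.linalg.lstsq(A, intelligibility, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((intelligibility - pred) ** 2) / np.sum(
    (intelligibility - intelligibility.mean()) ** 2)
print(round(r2, 2))
```

    The study's stepwise selection would additionally compare candidate predictor subsets; here a single fit suffices to show how component scores explain variance in the outcome.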

  12. Predicting Speech Intelligibility Decline in Amyotrophic Lateral Sclerosis Based on the Deterioration of Individual Speech Subsystems.

    PubMed

    Rong, Panying; Yunusova, Yana; Wang, Jun; Zinman, Lorne; Pattee, Gary L; Berry, James D; Perry, Bridget; Green, Jordan R

    2016-01-01

    To determine the mechanisms of speech intelligibility impairment due to neurologic impairments, intelligibility decline was modeled as a function of co-occurring changes in the articulatory, resonatory, phonatory, and respiratory subsystems. Sixty-six individuals diagnosed with amyotrophic lateral sclerosis (ALS) were studied longitudinally. The disease-related changes in articulatory, resonatory, phonatory, and respiratory subsystems were quantified using multiple instrumental measures, which were subjected to a principal component analysis and mixed effects models to derive a set of speech subsystem predictors. A stepwise approach was used to select the best set of subsystem predictors to model the overall decline in intelligibility. Intelligibility was modeled as a function of five predictors that corresponded to velocities of lip and jaw movements (articulatory), number of syllable repetitions in the alternating motion rate task (articulatory), nasal airflow (resonatory), maximum fundamental frequency (phonatory), and speech pauses (respiratory). The model accounted for 95.6% of the variance in intelligibility, among which the articulatory predictors showed the most substantial independent contribution (57.7%). Articulatory impairments characterized by reduced velocities of lip and jaw movements and resonatory impairments characterized by increased nasal airflow served as the subsystem predictors of the longitudinal decline of speech intelligibility in ALS. Declines in maximum performance tasks such as the alternating motion rate preceded declines in intelligibility, thus serving as early predictors of bulbar dysfunction. Following the rapid decline in speech intelligibility, a precipitous decline in maximum performance tasks subsequently occurred.

  13. Animal reintroductions: an innovative assessment of survival

    USGS Publications Warehouse

    Muths, Erin L.; Bailey, Larissa L.; Watry, Mary Kay

    2014-01-01

    Quantitative evaluations of reintroductions are infrequent and assessments of milestones reached before a project is completed, or abandoned due to lack of funding, are rare. However, such assessments, which are promoted in adaptive management frameworks, are critical. Quantification can provide defensible estimates of biological success, such as the number of survivors from a released cohort, with associated cost per animal. It is unlikely that the global issues of endangered wildlife and population declines will abate, therefore, assurance colonies and reintroductions are likely to become more common. If such endeavors are to be successful biologically or achieve adequate funding, implementation must be more rigorous and accountable. We use a novel application of a multistate, robust design capture-recapture model to estimate survival of reintroduced tadpoles through metamorphosis (i.e., the number of individuals emerging from the pond) and thereby provide a quantitative measure of effort and success for an "in progress" reintroduction of toads. Our data also suggest that tadpoles released at later developmental stages have an increased probability of survival and that eggs laid in the wild hatched at higher rates than eggs laid by captive toads. We illustrate how an interim assessment can identify problems, highlight successes, and provide information for use in adjusting the effort or implementing a Decision-Theoretic adaptive management strategy.

  14. Preserving pre-rigor meat functionality for beef patty production.

    PubMed

    Claus, J R; Sørheim, O

    2006-06-01

    Three methods were examined for preserving pre-rigor meat functionality in beef patties. Hot-boned semimembranosus muscles were processed as follows: (1) pre-rigor ground, salted, patties immediately cooked; (2) pre-rigor ground, salted and stored overnight; (3) pre-rigor injected with brine; and (4) post-rigor ground and salted. Raw patties contained 60% lean beef, 19.7% beef fat trim, 1.7% NaCl, 3.6% starch, and 15% water. Pre-rigor processing occurred at 3-3.5 h postmortem. Patties made from pre-rigor ground meat had higher pH values; greater protein solubility; firmer, more cohesive, and chewier texture; and substantially lower cooking losses than the other treatments. Addition of salt was sufficient to reduce the rate and extent of glycolysis. Brine injection of intact pre-rigor muscles resulted in some preservation of the functional properties but not as pronounced as with salt addition to pre-rigor ground meat.

  15. Estimation of the time since death--reconsidering the re-establishment of rigor mortis.

    PubMed

    Anders, Sven; Kunz, Michaela; Gehl, Axel; Sehner, Susanne; Raupach, Tobias; Beck-Bornholdt, Hans-Peter

    2013-01-01

    In forensic medicine, there is an undefined data background for the phenomenon of re-establishment of rigor mortis after mechanical loosening, a method used in establishing time since death in forensic casework that is thought to occur up to 8 h post-mortem. Nevertheless, the method is widely described in textbooks on forensic medicine. We examined 314 joints (elbow and knee) of 79 deceased at defined time points up to 21 h post-mortem (hpm). Data were analysed using a random intercept model. Here, we show that re-establishment occurred in 38.5% of joints at 7.5 to 19 hpm. Therefore, the maximum time span for the re-establishment of rigor mortis appears to be 2.5-fold longer than thought so far. These findings have major impact on the estimation of time since death in forensic casework.

  16. Reducing child global undernutrition at scale in Sofala Province, Mozambique, using Care Group Volunteers to communicate health messages to mothers.

    PubMed

    Davis, Thomas P; Wetzel, Carolyn; Hernandez Avilan, Emma; de Mendoza Lopes, Cecilia; Chase, Rachel P; Winch, Peter J; Perry, Henry B

    2013-03-01

    Undernutrition contributes to one-third of under-5 child mortality globally. Progress in achieving the Millennium Development Goal of reducing under-5 mortality is lagging in many countries, particularly in Africa. This paper shares evidence and insights from a low-cost behavior-change innovation in a rural area of Mozambique. About 50,000 households with pregnant women or children under 2 years old were organized into blocks of 12 households. One volunteer peer educator (Care Group Volunteer, or CGV) was selected for each block. Approximately 12 CGVs met together as a group every 2 weeks with a paid project promoter to learn a new child-survival health or nutrition message or skill. Then the CGVs shared the new message with mothers in their assigned blocks. Household surveys were conducted at baseline and endline to measure nutrition-related behaviors and childhood nutritional status. More than 90% of beneficiary mothers reported that they had been contacted by CGVs during the previous 2 weeks. In the early implementation project area, the percentage of children 0-23 months old with global undernutrition (weight-for-age with z-score of less than 2 standard deviations below the international standard mean) declined by 8.1 percentage points (P<0.001), from 25.9% (95% confidence interval [CI] = 22.2%-29.6%) at baseline to 17.8% at endline (95% CI = 14.6%-20.9%). In the delayed implementation area, global undernutrition declined by 11.5 percentage points (P<0.001), from 27.1% (95% CI = 23.6%-30.6%) to 15.6% (95% CI = 12.6%-18.6%). Total project costs were US$3.0 million, representing an average cost of US$0.55 per capita per year (among the entire population of 1.1 million people) and US$2.78 per beneficiary (mothers with young children) per year. Using the Care Group model can improve the level of global undernutrition in children at scale and at low cost. This model shows sufficient promise to merit further rigorous testing and broader application.
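
    The outcome definition used in this study (global undernutrition = weight-for-age more than 2 standard deviations below the international reference mean) is simple to state in code. The reference median and SD below are hypothetical placeholders; real WHO growth standards use age- and sex-specific LMS tables rather than a single mean and SD.

```python
def weight_for_age_z(weight_kg, median_kg, sd_kg):
    """z-score of a child's weight against a reference median and SD.
    (Simplified stand-in: WHO standards use age/sex-specific LMS tables.)"""
    return (weight_kg - median_kg) / sd_kg

def is_globally_undernourished(weight_kg, median_kg, sd_kg):
    """Global (weight-for-age) undernutrition: z-score below -2."""
    return weight_for_age_z(weight_kg, median_kg, sd_kg) < -2.0

# Hypothetical reference values for one age/sex group: median 10.2 kg, SD 1.1 kg
print(is_globally_undernourished(7.5, 10.2, 1.1))   # z ≈ -2.45 -> True
print(is_globally_undernourished(9.5, 10.2, 1.1))   # z ≈ -0.64 -> False
```

    The study's endline comparison then reduces to the share of surveyed children for whom this predicate is true, compared across survey waves.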

  17. Reducing child global undernutrition at scale in Sofala Province, Mozambique, using Care Group Volunteers to communicate health messages to mothers

    PubMed Central

    Davis, Thomas P; Wetzel, Carolyn; Hernandez Avilan, Emma; de Mendoza Lopes, Cecilia; Chase, Rachel P; Winch, Peter J; Perry, Henry B

    2013-01-01

    Background: Undernutrition contributes to one-third of under-5 child mortality globally. Progress in achieving the Millennium Development Goal of reducing under-5 mortality is lagging in many countries, particularly in Africa. This paper shares evidence and insights from a low-cost behavior-change innovation in a rural area of Mozambique. Intervention: About 50,000 households with pregnant women or children under 2 years old were organized into blocks of 12 households. One volunteer peer educator (Care Group Volunteer, or CGV) was selected for each block. Approximately 12 CGVs met together as a group every 2 weeks with a paid project promoter to learn a new child-survival health or nutrition message or skill. Then the CGVs shared the new message with mothers in their assigned blocks. Methods of evaluation: Household surveys were conducted at baseline and endline to measure nutrition-related behaviors and childhood nutritional status. Findings: More than 90% of beneficiary mothers reported that they had been contacted by CGVs during the previous 2 weeks. In the early implementation project area, the percentage of children 0–23 months old with global undernutrition (weight-for-age with z-score of less than 2 standard deviations below the international standard mean) declined by 8.1 percentage points (P<0.001), from 25.9% (95% confidence interval [CI] = 22.2%–29.6%) at baseline to 17.8% at endline (95% CI = 14.6%–20.9%). In the delayed implementation area, global undernutrition declined by 11.5 percentage points (P<0.001), from 27.1% (95% CI = 23.6%–30.6%) to 15.6% (95% CI = 12.6%–18.6%). Total project costs were US$3.0 million, representing an average cost of US$0.55 per capita per year (among the entire population of 1.1 million people) and US$2.78 per beneficiary (mothers with young children) per year. Conclusion: Using the Care Group model can improve the level of global undernutrition in children at scale and at low cost. 
This model shows sufficient promise to merit further rigorous testing and broader application. PMID:25276516

  18. CMS-Wave

    DTIC Science & Technology

    2014-10-27

    a phase-averaged spectral wind-wave generation and transformation model and its interface in the Surface-water Modeling System (SMS). Ambrose...applications of the Boussinesq (BOUSS-2D) wave model that provides more rigorous calculations for design and performance optimization of integrated...navigation systems. Together these wave models provide reliable predictions on regional and local spatial domains and cost-effective engineering solutions

  19. Complementary and Integrative Healthcare in a Long-term Care Facility: A Pilot Project.

    PubMed

    Evans, Roni; Vihstadt, Corrie; Westrom, Kristine; Baldwin, Lori

    2015-01-01

    The world's population is aging quickly, leading to increased challenges of how to care for individuals who can no longer independently care for themselves. With global social and economic pressures leading to declines in family support, increased reliance is being placed on community- and government-based facilities to provide long-term care (LTC) for many of society's older citizens. Complementary and integrative healthcare (CIH) is commonly used by older adults and may offer an opportunity to enhance LTC residents' wellbeing. Little work has been done, however, rigorously examining the safety and effectiveness of CIH for LTC residents. The goal of this work is to describe a pilot project to develop and evaluate one model of CIH in an LTC facility in the Midwestern United States. A prospective, mixed-methods pilot project was conducted in two main phases: (1) preparation and (2) implementation and evaluation. The preparation phase entailed assessment, CIH model design and development, and training. A CIH model including acupuncture, chiropractic, and massage therapy, guided by principles of collaborative integration, evidence informed practice, and sustainability, was applied in the implementation and evaluation phase. CIH services were provided for 16 months in the LTC facility. Quantitative data collection included pain, quality of life, and adverse events. Qualitative interviews of LTC residents, their family members, and LTC staff members queried perceptions of CIH services. A total of 46 LTC residents received CIH care, most commonly for musculoskeletal pain (61%). Participants were predominantly female (85%) and over the age of 80 years (67%). The median number of CIH treatments was 13, with a range of 1 to 92. Residents who were able to provide self-report data demonstrated, on average, a 15% decline in pain and a 4% improvement in quality of life. 
No serious adverse events related to treatment were documented; the most common mild and expected side effect was increased pain (63 reports over 859 treatments). Qualitative interviews revealed most residents, family members and LTC staff members felt CIH services were worthwhile due to perceived benefits including pain relief and enhanced psychological and social wellbeing. This project demonstrated that with extensive attention to preparation, one patient-centered model of CIH in LTC was feasible on several levels. Quantitative and qualitative data suggest that CIH can be safely implemented and might provide relief and enhanced wellbeing for residents. However, some aspects of model delivery and data collection were challenging, resulting in limitations, and should be addressed in future efforts.

  20. Tree decline and the future of Australian farmland biodiversity

    PubMed Central

    Fischer, Joern; Zerger, Andre; Gibbons, Phil; Stott, Jenny; Law, Bradley S.

    2010-01-01

    Farmland biodiversity is greatly enhanced by the presence of trees. However, farmland trees are declining worldwide, including in North America, Central America, and parts of southern Europe. We show that tree decline and its likely consequences are particularly severe in Australia's temperate agricultural zone, which is a threatened ecoregion. Using field data on trees, remotely sensed imagery, and a demographic model for trees, we predict that by 2100, the number of trees on an average farm will contract to two-thirds of its present level. Statistical habitat models suggest that this tree decline will negatively affect many currently common animal species, with predicted declines in birds and bats of up to 50% by 2100. Declines were predicted for 24 of 32 bird species modeled and for all of six bat species modeled. Widespread declines in trees, birds, and bats may lead to a reduction in economically important ecosystem services such as shade provision for livestock and pest control. Moreover, many other species for which we have no empirical data also depend on trees, suggesting that fundamental changes in ecosystem functioning are likely. We conclude that Australia's temperate agricultural zone has crossed a threshold and no longer functions as a self-sustaining woodland ecosystem. A regime shift is occurring, with a woodland system deteriorating into a treeless pasture system. Management options exist to reverse tree decline, but new policy settings are required to encourage their widespread adoption. PMID:20974946

  1. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative-free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid-sorbent-based carbon capture system.

  2. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
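The "simpler cases" route described above, GUM Supplement 1 style Monte Carlo propagation, can be sketched for a generic response-factor measurement equation c_sample = (A_sample / A_std) * c_std. The equation form and all numeric values below are hypothetical illustrations, not data or parameters from the study.

```python
import random

# Illustrative GUM Supplement 1-style Monte Carlo propagation through a
# simple response-factor measurement equation:
#   c_sample = (A_sample / A_std) * c_std
# All means and standard deviations are hypothetical placeholders.
random.seed(0)

N = 100_000
samples = [
    random.gauss(0.95, 0.01)      # sample peak area (arbitrary units)
    / random.gauss(1.00, 0.01)    # calibrant peak area
    * random.gauss(50.0, 0.25)    # calibrant concentration (e.g. mg/L)
    for _ in range(N)
]
mean = sum(samples) / N
sd = (sum((x - mean) ** 2 for x in samples) / (N - 1)) ** 0.5
print(f"c_sample ≈ {mean:.2f} ± {sd:.2f}")
```

A full Bayesian hierarchical treatment, as the abstract notes, would additionally model random effects across samples and calibrants rather than treating each input as an independent Gaussian.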

  3. Evaluation of Active Transition, a website-delivered physical activity intervention for university students: pilot study.

    PubMed

    Kwan, Matthew; Faulkner, Guy; Bray, Steven

    2013-04-29

    While physical activity in individuals tends to decline steadily with age, there are certain periods where this decline occurs more rapidly, such as during early adulthood. Interventions aimed at attenuating the declines in physical activity during this transition period appear warranted. The purpose of the study was to test the feasibility and efficacy of a theoretically informed, website-delivered physical activity intervention aimed at students entering university. Using a quasi-experimental design, 65 participants (44 females; mean age 18.51, SD 0.91) were assigned to either an intervention (receiving website access plus weekly prompts) or comparison condition (receiving unprompted website access only), completing questionnaires at baseline and follow-up 8 weeks later. The intervention website, "Active Transition", was specifically designed to target students' physical activity cognitions and self-regulatory skills. Intervention usage was low, with only 47% (18/38) of participants assigned to the intervention condition logging into the website 2 or more times. Among the broader student sample, there were significant declines in students' physical activity behaviors (F1,63=18.10, P<.001), attitudes (F1,62=55.19, P<.001), and perceived behavioral control (F1,62=17.56, P<.001). In comparisons between intervention users (29/65, individuals logging in 2 or more times) and non-users (36/65, individuals logging in once or not at all), there was a significant interaction effect for intervention usage and time on perceived behavioral control (F1,62=5.13, P=.03). Poor intervention usage suggests that future efforts need to incorporate innovative strategies to increase intervention uptake and better engage the student population. The findings, however, suggest that a website-delivered intervention aimed at this critical life stage may have positive impact on students' physical activity cognitions. Future studies with more rigorous sampling designs are required.

  4. Model-based assessment of estuary ecosystem health using the latent health factor index, with application to the richibucto estuary.

    PubMed

    Chiu, Grace S; Wu, Margaret A; Lu, Lin

    2013-01-01

    The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general.

  5. Modelling lamb carcase pH and temperature decline parameters: relationship to shear force and abattoir variation.

    PubMed

    Hopkins, David L; Holman, Benjamin W B; van de Ven, Remy J

    2015-02-01

    Carcase pH and temperature decline rates influence lamb tenderness; therefore pH decline parameters are beneficial when modelling tenderness. These include pH at temperature 18 °C (pH@Temp18), temperature when pH is 6 (Temp@pH6), and pH at 24 h post-mortem (pH24). This study aimed to establish a relationship between shear force (SF) as a proxy for tenderness and carcase pH decline parameters estimated using both linear and spline estimation models for the m. longissimus lumborum (LL). The study also compared abattoirs regarding their achievement of ideal pH decline, indicative of optimal tenderness. Based on SF measurements of LL and m. semimembranosus collected as part of the Information Nucleus slaughter programme (CRC for Sheep Industry Innovation) this study found significant relationships between tenderness and pH24LL, consistent across the meat cuts and ageing periods examined. Achievement of ideal pH decline was shown not to have significantly differed across abattoirs, although rates of pH decline varied significantly across years within abattoirs.
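Decline parameters like those above can be read off a fitted exponential curve of the general form pH(t) = pH24 + (pH0 - pH24) * exp(-k t), a common functional form for post-mortem pH decline. The parameter values below are illustrative only, not estimates from the paper.

```python
import math

# Sketch of an exponential pH-decline curve (a generic form, not the
# paper's fitted spline/linear models). Parameter values are illustrative.
pH0, pH24, k = 7.0, 5.7, 0.25  # initial pH, ultimate pH, decline rate (1/h)

def ph_at(t_hours):
    """pH at t hours post-mortem under the exponential decline model."""
    return pH24 + (pH0 - pH24) * math.exp(-k * t_hours)

def time_at_ph(target):
    """Invert the curve, e.g. to find the time when pH reaches 6."""
    return -math.log((target - pH24) / (pH0 - pH24)) / k

print(round(ph_at(0.0), 2))       # → 7.0, the pH at slaughter
print(round(time_at_ph(6.0), 2))  # hours post-mortem when pH reaches 6
```

Pairing such a pH curve with a temperature-decline curve is what yields the cross parameters in the abstract, such as pH@Temp18 and Temp@pH6.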

  6. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    ERIC Educational Resources Information Center

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  7. Evaluating habitat suitability models for nesting white-headed woodpeckers in unburned forest

    Treesearch

    Quresh S. Latif; Victoria A. Saab; Kim Mellen-Mclean; Jonathan G. Dudley

    2015-01-01

    Habitat suitability models can provide guidelines for species conservation by predicting where species of interest are likely to occur. Presence-only models are widely used but typically provide only relative indices of habitat suitability (HSIs), necessitating rigorous evaluation often using independently collected presence-absence data. We refined and evaluated...

  8. Conservatoire Students' Experiences and Perceptions of Instrument-Specific Master Classes

    ERIC Educational Resources Information Center

    Long, Marion; Creech, Andrea; Gaunt, Helena; Hallam, Susan

    2014-01-01

    Historically, in the professional training of musicians, the master-apprentice model has played a central role in instilling the methods and values of the discipline, contributing to the rigorous formation of talent. Expert professional musicians advocate that certain thinking skills can be modelled through the master-apprentice model, yet its…

  9. Increasing the reliability of ecological models using modern software engineering techniques

    Treesearch

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  10. Hysteresis in the trade cycle

    NASA Astrophysics Data System (ADS)

    Mc Namara, Hugh A.; Pokrovskii, Alexei V.

    2006-02-01

    The Kaldor model-one of the first nonlinear models of macroeconomics-is modified to incorporate a Preisach nonlinearity. The new dynamical system thus created shows highly complicated behaviour. This paper presents a rigorous (computer aided) proof of chaos in this new model, and of the existence of unstable periodic orbits of all minimal periods p>57.

  11. Designing an Educational Game with Ten Steps to Complex Learning

    ERIC Educational Resources Information Center

    Enfield, Jacob

    2012-01-01

    Few instructional design (ID) models exist which are specific for developing educational games. Moreover, those extant ID models have not been rigorously evaluated. No ID models were found which focus on educational games with complex learning objectives. "Ten Steps to Complex Learning" (TSCL) is based on the four component instructional…

  12. Vaporization and Zonal Mixing in Performance Modeling of Advanced LOX-Methane Rockets

    NASA Technical Reports Server (NTRS)

    Williams, George J., Jr.; Stiegemeier, Benjamin R.

    2013-01-01

    Initial modeling of LOX-Methane reaction control (RCE) 100 lbf thrusters and larger, 5500 lbf thrusters with the TDK/VIPER code has shown good agreement with sea-level and altitude test data. However, the vaporization and zonal mixing upstream of the compressible flow stage of the models leveraged empirical trends to match the sea-level data. This was necessary in part because the codes are designed primarily to handle the compressible part of the flow (i.e. contraction through expansion) and in part because there was limited data on the thrusters themselves on which to base a rigorous model. A more rigorous model has been developed which includes detailed vaporization trends based on element type and geometry, radial variations in mixture ratio within each of the "zones" associated with elements and not just between zones of different element types, and, to the extent possible, updated kinetic rates. The Spray Combustion Analysis Program (SCAP) was leveraged to support assumptions in the vaporization trends. Data of both thrusters is revisited and the model maintains a good predictive capability while addressing some of the major limitations of the previous version.

  13. Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in the Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean

    2016-04-01

    A framework is presented within which we provide rigorous estimations of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimations of models and their uncertainties based on data information. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence reliable estimation of model parameters and their uncertainties is possible, thus avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. By the combination of new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events is possible.

  14. Considerations in the Design of Clinical Trials for Cognitive Aging

    PubMed Central

    Brinton, Roberta Diaz; Katz, Russell; Petersen, Ronald C.; Negash, Selam; Mungas, Dan; Aisen, Paul S.

    2012-01-01

    What will it take to develop interventions for the treatment of age-related cognitive decline? Session V of the Summit provided perspectives on the design of clinical trials to evaluate promising but unproven interventions, and some of the steps needed to accelerate the discovery and evaluation of promising treatments. It considered strategies to further characterize the biological and cognitive changes associated with normal aging and their translation into the development of new treatments. It provided regulatory, scientific, and clinical perspectives about neurocognitive aging treatments, their potential benefits and risks, and the strategies and endpoints needed to evaluate them in the most rapid, rigorous, and clinically meaningful way. It considered lessons learned from the study of Alzheimer's disease, the promising roles of biomarkers in neurocognitive aging research, and ways to help galvanize the scientific study and treatment of neurocognitive aging. PMID:22573913

  15. Considerations in the design of clinical trials for cognitive aging.

    PubMed

    Reiman, Eric M; Brinton, Roberta Diaz; Katz, Russell; Petersen, Ronald C; Negash, Selam; Mungas, Dan; Aisen, Paul S

    2012-06-01

    What will it take to develop interventions for the treatment of age-related cognitive decline? Session V of the Summit provided perspectives on the design of clinical trials to evaluate promising but unproven interventions, and some of the steps needed to accelerate the discovery and evaluation of promising treatments. It considered strategies to further characterize the biological and cognitive changes associated with normal aging and their translation into the development of new treatments. It provided regulatory, scientific, and clinical perspectives about neurocognitive aging treatments, their potential benefits and risks, and the strategies and endpoints needed to evaluate them in the most rapid, rigorous, and clinically meaningful way. It considered lessons learned from the study of Alzheimer's disease, the promising roles of biomarkers in neurocognitive aging research, and ways to help galvanize the scientific study and treatment of neurocognitive aging.

  16. Survival of timber rattlesnakes (Crotalus horridus) estimated by capture-recapture models in relation to age, sex, color morph, time, and birthplace

    USGS Publications Warehouse

    Brown, W.S.; Kery, M.; Hines, J.E.

    2007-01-01

    Juvenile survival is one of the least known elements of the life history of many species, in particular snakes. We conducted a mark–recapture study of Crotalus horridus from 1978–2002 in northeastern New York near the northern limits of the species' range. We marked 588 neonates and estimated annual age-, sex-, and morph-specific recapture and survival rates using the Cormack-Jolly-Seber (CJS) model. Wild-caught neonates (field-born, n = 407) and neonates produced by captive-held gravid females (lab-born, n = 181) allowed comparison of the birthplace, or lab treatment effect, in estimated survival. Recapture rates declined from about 10–20% over time while increasing from young to older age classes. Estimated survival rates (S ± 1 SE) in the first year were significantly higher among field-born (black morph: S = 0.773 ± 0.203; yellow morph: S = 0.531 ± 0.104) than among lab-born snakes (black morph: S = 0.411 ± 0.131; yellow morph: S = 0.301 ± 0.081). Lower birth weights combined with a lack of field exposure until release apparently contributed to the lower survival rate of lab-born snakes. Subsequent survival estimates for 2–4-yr-old snakes were S = 0.845 ± 0.084 for the black morph and S = 0.999 (SE not available) for the yellow morph, and for ≥5-yr-old snakes S = 0.958 ± 0.039 (black morph) and S = 0.822 ± 0.034 (yellow morph). The most parsimonious model overall contained an independent time trend for survival of each age, morph, and lab-treatment group. For snakes of the first two age groups (ages 1 yr and 2–4 yr), survival tended to decline over the years for both morphs, while for adult snakes (5 yr and older), survival was constant or even slightly increased. Our data on survival and recapture are among the first rigorous estimates of these parameters in a rattlesnake and among the few yet available for any viperid snake. These data are useful for analyses of the life-history strategy, population dynamics, and conservation of this long-lived snake.
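The CJS model's likelihood building block, the probability of one capture history given survival (phi) and recapture (p) probabilities, can be sketched as follows. This constant-parameter version is a minimal illustration; the study's actual model allowed age-, sex-, morph-, and time-specific parameters.

```python
# Minimal Cormack-Jolly-Seber (CJS) sketch: the probability of a single
# capture history under constant survival (phi) and recapture (p) rates.

def cjs_history_prob(history, phi, p):
    """history: string of 0/1 per occasion; the first '1' is the release."""
    first = history.index("1")
    last = history.rindex("1")
    prob = 1.0
    for t in range(first + 1, last + 1):
        # survive each interval, then be seen (p) or missed (1 - p)
        prob *= phi * (p if history[t] == "1" else 1.0 - p)
    # chi: probability of never being detected again after the last sighting
    chi = 1.0
    for _ in range(len(history) - 1 - last):
        chi = (1.0 - phi) + phi * (1.0 - p) * chi
    return prob * chi

# Released at occasion 1, missed at 2, recaptured at 3:
print(round(cjs_history_prob("101", 0.8, 0.5), 4))  # phi*(1-p)*phi*p = 0.16
```

Maximizing the product of such probabilities over all marked individuals (here, the 588 neonates) is what yields the survival and recapture estimates reported in the abstract.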

  17. Predictive QSAR modeling workflow, model applicability domains, and virtual screening.

    PubMed

    Tropsha, Alexander; Golbraikh, Alexander

    2007-01-01

    Quantitative Structure Activity Relationship (QSAR) modeling has been traditionally applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of the modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as a predictive, as opposed to evaluative, modeling approach.
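One common way to define a model applicability domain in descriptor space is a nearest-neighbour distance cutoff: a query compound is "inside" the domain if its distance to the training set is typical of distances seen within the training set itself. The sketch below illustrates that general idea only; it is not necessarily the authors' exact procedure, and the 2-D points stand in for real molecular descriptors.

```python
# Sketch of a distance-based applicability domain (illustrative, not the
# authors' exact procedure). A query is inside the domain if its nearest-
# neighbour distance to the training set is below mean + z * sd of the
# training compounds' own nearest-neighbour distances.

def nn_distance(x, pool):
    """Euclidean distance from descriptor vector x to its nearest neighbour."""
    return min(sum((a - b) ** 2 for a, b in zip(x, y)) ** 0.5 for y in pool)

def applicability_domain(train, z=0.5):
    d = [nn_distance(x, [y for y in train if y is not x]) for x in train]
    mean = sum(d) / len(d)
    sd = (sum((v - mean) ** 2 for v in d) / len(d)) ** 0.5
    cutoff = mean + z * sd
    return lambda query: nn_distance(query, train) <= cutoff

train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # toy descriptors
inside = applicability_domain(train)
print(inside((0.5, 0.5)), inside((5.0, 5.0)))  # True False
```

In a virtual-screening workflow, predictions are reported only for database compounds that pass such a check, which is the safeguard the review emphasizes.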

  18. Progress in Modeling Nonlinear Dendritic Evolution in Two and Three Dimensions, and Its Mathematical Justification

    NASA Technical Reports Server (NTRS)

    Tanveer, S.; Foster, M. R.

    2002-01-01

    We report progress in three areas of investigation related to dendritic crystal growth: 1) selection of tip features in dendritic crystal growth; 2) investigation of nonlinear evolution for the two-sided model; and 3) rigorous mathematical justification.

  19. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: a) Bayesian adaptive sampling for solving biomass estimation; b) characterization of MISR Rahman model parameters conditioned upon MODIS landcover; c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and d) a unique U.S. asset for science product validation and verification.

  20. Resonant tunneling assisted propagation and amplification of plasmons in high electron mobility transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhardwaj, Shubhendu; Sensale-Rodriguez, Berardi; Xing, Huili Grace

    A rigorous theoretical and computational model is developed for the plasma-wave propagation in high electron mobility transistor structures with electron injection from a resonant tunneling diode at the gate. We discuss the conditions in which low-loss and sustainable plasmon modes can be supported in such structures. The developed analytical model is used to derive the dispersion relation for these plasmon modes. A non-linear full-wave-hydrodynamic numerical solver is also developed using a finite difference time domain algorithm. The developed analytical solutions are validated via the numerical solution. We also verify previous observations that were based on a simplified transmission line model. It is shown that at high levels of negative differential conductance, plasmon amplification is indeed possible. The proposed rigorous models can enable accurate design and optimization of practical resonant tunnel diode-based plasma-wave devices for terahertz sources, mixers, and detectors, by allowing a precise representation of their coupling when integrated with other electromagnetic structures.

  1. Including Magnetostriction in Micromagnetic Models

    NASA Astrophysics Data System (ADS)

    Conbhuí, Pádraig Ó.; Williams, Wyn; Fabian, Karl; Nagy, Lesleis

    2016-04-01

    The magnetic anomalies that identify crustal spreading are predominantly recorded by basalts formed at the mid-ocean ridges, whose magnetic signals are dominated by iron-titanium-oxides (Fe3-xTixO4), so called "titanomagnetites", of which the Fe2.4Ti0.6O4 (TM60) phase is the most common. With sufficient quantities of titanium present, these minerals exhibit strong magnetostriction. To date, models of these grains in the pseudo-single domain (PSD) range have failed to accurately account for this effect. In particular, a popular analytic treatment provided by Kittel (1949) for describing the magnetostrictive energy as an effective increase of the anisotropy constant can produce unphysical strains for non-uniform magnetizations. I will present a rigorous approach based on work by Brown (1966) and by Kroner (1958) for including magnetostriction in micromagnetic codes which is suitable for modelling hysteresis loops and finding remanent states in the PSD regime. Preliminary results suggest the more rigorously defined micromagnetic models exhibit higher coercivities and extended single domain ranges when compared to more simplistic approaches.

  2. A model for sex ratio decline in India.

    PubMed

    Thukral, A K

    1996-01-01

    "The sex ratio in India has declined from 972 females per 1,000 males in 1901 to 929 females per 1,000 males in 1991. A model [is] proposed for the quantitative analysis of the problem.... The study reveals that there has been a sex discriminated population growth in India in the twentieth century, although the rate of decline of the female has decreased. If the current trend of population growth continues, there will be a further decline in the [sex ratio]." excerpt
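A toy linear-trend extrapolation using only the two endpoints quoted in the abstract (972 females per 1,000 males in 1901, 929 in 1991) illustrates the "further decline" conclusion. This is not Thukral's actual model, which analyzed sex-discriminated population growth rates rather than a straight line through two points.

```python
# Toy linear extrapolation from the two sex-ratio endpoints in the abstract
# (not Thukral's model). Units: females per 1,000 males.
year0, ratio0 = 1901, 972.0
year1, ratio1 = 1991, 929.0

slope = (ratio1 - ratio0) / (year1 - year0)  # change per year (negative)

def project(year):
    """Extrapolate the sex ratio assuming the linear trend continues."""
    return ratio0 + slope * (year - year0)

print(round(project(2001), 1))  # → 924.2, a further decline past 1991
```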

  3. Staying Clear of the Dragons.

    PubMed

    Elf, Johan

    2016-04-27

    A new, game-changing approach makes it possible to rigorously disprove models without making assumptions about the unknown parts of the biological system. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. The association between cognitive decline and incident depressive symptoms in a sample of older Puerto Rican adults with diabetes.

    PubMed

    Bell, Tyler; Dávila, Ana Luisa; Clay, Olivio; Markides, Kyriakos S; Andel, Ross; Crowe, Michael

    2017-08-01

    Older Puerto Rican adults have particularly high risk of diabetes compared to the general US population. Diabetes is associated with both higher depressive symptoms and cognitive decline, but less is known about the longitudinal relationship between cognitive decline and incident depressive symptoms in those with diabetes. This study investigated the association between cognitive decline and incident depressive symptoms in older Puerto Rican adults with diabetes over a four-year period. Households across Puerto Rico were visited to identify a population-based sample of adults aged 60 years and over for the Puerto Rican Elderly: Health Conditions study (PREHCO); 680 participants with diabetes at baseline and no baseline cognitive impairment were included in analyses. Cognitive decline and depressive symptoms were measured using the Mini-Mental Cabán (MMC) and Geriatric Depression Scale (GDS), respectively. We examined predictors of incident depressive symptoms (GDS ≥ 5 at follow-up but not baseline) and cognitive decline using regression modeling. In a covariate-adjusted logistic regression model, cognitive decline, female gender, and greater diabetes-related complications were each significantly associated with increased odds of incident depressive symptoms (p < 0.05). In a multiple regression model adjusted for covariates, incident depressive symptoms and older age were associated with greater cognitive decline, and higher education was related to less cognitive decline (p < 0.05). Incident depressive symptoms were more common for older Puerto Ricans with diabetes who also experienced cognitive decline. Efforts are needed to optimize diabetes management and monitor for depression and cognitive decline in this population.

  5. Addressing the vaccine confidence gap.

    PubMed

    Larson, Heidi J; Cooper, Louis Z; Eskola, Juhani; Katz, Samuel L; Ratzan, Scott

    2011-08-06

    Vaccines--often lauded as one of the greatest public health interventions--are losing public confidence. Some vaccine experts have referred to this decline in confidence as a crisis. We discuss some of the characteristics of the changing global environment that are contributing to increased public questioning of vaccines, and outline some of the specific determinants of public trust. Public decision making related to vaccine acceptance is neither driven by scientific nor economic evidence alone, but is also driven by a mix of psychological, sociocultural, and political factors, all of which need to be understood and taken into account by policy and other decision makers. Public trust in vaccines is highly variable and building trust depends on understanding perceptions of vaccines and vaccine risks, historical experiences, religious or political affiliations, and socioeconomic status. Although provision of accurate, scientifically based evidence on the risk-benefit ratios of vaccines is crucial, it is not enough to redress the gap between current levels of public confidence in vaccines and levels of trust needed to ensure adequate and sustained vaccine coverage. We call for more research not just on individual determinants of public trust, but on what mix of factors are most likely to sustain public trust. The vaccine community demands rigorous evidence on vaccine efficacy and safety and technical and operational feasibility when introducing a new vaccine, but has been negligent in demanding equally rigorous research to understand the psychological, social, and political factors that affect public trust in vaccines. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Crossing the Line: Examination of Student Demographic Changes Concomitant with Declining Academic Performance in Elementary Schools

    ERIC Educational Resources Information Center

    Hochbein, Craig; Duke, Daniel

    2011-01-01

    The purpose of this study is to examine the relationship between school decline and changes in school demographics. Using a population of 981 (N = 981) elementary schools, the authors identified samples of declining schools: Relational Decline (n = 510), Absolute Decline (n = 217), and Crossing the Line (n = 165). Latent growth models assessed…

  7. Principles to Products: Toward Realizing MOS 2.0

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Delp, Christopher L.

    2012-01-01

    This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. 
Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.

  8. Rigorous Approach in Investigation of Seismic Structure and Source Characteristicsin Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    NASA Astrophysics Data System (ADS)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
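The hierarchical element of such an inversion treats the error statistics as unknowns to be estimated alongside the model rather than fixed a priori. The following minimal sketch is an assumption-laden illustration of that idea, not the authors' trans-dimensional code: it samples a Gaussian mean and its noise level jointly with random-walk Metropolis, under flat priors.

```python
import math
import random

def log_posterior(mu, log_sigma, data):
    # Gaussian likelihood with an unknown noise scale; flat priors assumed.
    sigma = math.exp(log_sigma)
    sse = sum((x - mu) ** 2 for x in data)
    return -len(data) * math.log(sigma) - sse / (2.0 * sigma ** 2)

def metropolis(data, steps=20000, step=0.1, seed=1):
    # Random-walk Metropolis over (mu, log_sigma); the noise level is a
    # free "hierarchical" parameter constrained by the data themselves.
    rng = random.Random(seed)
    mu, log_sigma = 0.0, 0.0
    lp = log_posterior(mu, log_sigma, data)
    samples = []
    for _ in range(steps):
        mu_p = mu + rng.gauss(0.0, step)
        ls_p = log_sigma + rng.gauss(0.0, step)
        lp_p = log_posterior(mu_p, ls_p, data)
        if math.log(rng.random()) < lp_p - lp:  # Metropolis accept rule
            mu, log_sigma, lp = mu_p, ls_p, lp_p
        samples.append((mu, math.exp(log_sigma)))
    return samples[steps // 2:]  # discard burn-in

# Synthetic observations: true mean 3.0, true noise sigma 0.5
rng = random.Random(0)
data = [3.0 + rng.gauss(0.0, 0.5) for _ in range(200)]
post = metropolis(data)
mu_hat = sum(s[0] for s in post) / len(post)
sigma_hat = sum(s[1] for s in post) / len(post)
```

A full trans-dimensional implementation would additionally propose birth/death moves that change the number of model cells (e.g., Voronoi partitions); the accept/reject logic remains the same.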

  9. Predictability of the geospace variations and measuring the capability to model the state of the system

    NASA Astrophysics Data System (ADS)

    Pulkkinen, A.

    2012-12-01

    Empirical modeling has been the workhorse of the past decades in predicting the state of the geospace. For example, numerous empirical studies have shown that global geoeffectiveness indices such as Kp and Dst are generally well predictable from the solar wind input. These successes have been facilitated partly by the strongly externally driven nature of the system. Although characterizing the general state of the system is valuable and empirical modeling will continue playing an important role, refined physics-based quantification of the state of the system has been the obvious next step in moving toward more mature science. Importantly, more refined and localized products are needed also for space weather purposes. Predictions of local physical quantities are necessary to make physics-based links to the impacts on specific systems. As we introduce more localized predictions of the geospace state, a central question is how predictable these local quantities are. This complex question can be addressed by rigorously measuring the model performance against the observed data. The space sciences community has made great advances on this topic over the past few years, and there are ongoing efforts in SHINE, CEDAR and GEM to carry out community-wide evaluations of the state-of-the-art solar and heliospheric, ionosphere-thermosphere and geospace models, respectively. These efforts will help establish benchmarks and thus provide means to measure progress in the field, analogous to the monitoring of improvements in lower-atmospheric weather prediction carried out rigorously since the 1980s. In this paper we will discuss some of the latest advancements in predicting local geospace parameters and give an overview of some of the community efforts to rigorously measure model performances. We will also briefly discuss some of the future opportunities for advancing the geospace modeling capability. 
These will include further development in data assimilation and ensemble modeling (e.g. taking into account uncertainty in the inflow boundary conditions).
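Measuring model performance against observed data reduces to computing skill scores on paired observed and predicted series. One common choice is the prediction efficiency; the sketch below is a generic illustration and is not tied to any specific geospace model or validation campaign.

```python
def prediction_efficiency(observed, predicted):
    # PE = 1 - var(forecast error) / var(observations).
    # PE = 1 means a perfect forecast; PE <= 0 means the model does
    # no better than always predicting the observed mean.
    n = len(observed)
    mean_obs = sum(observed) / n
    var_obs = sum((o - mean_obs) ** 2 for o in observed) / n
    errors = [o - p for o, p in zip(observed, predicted)]
    mean_err = sum(errors) / n
    var_err = sum((e - mean_err) ** 2 for e in errors) / n
    return 1.0 - var_err / var_obs

# Toy series: a forecast tracking the observed signal with small errors.
obs = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0, -2.0, -1.0]
pred = [0.1, 0.9, 1.8, 1.1, 0.2, -0.8, -1.9, -1.2]
pe = prediction_efficiency(obs, pred)  # close to 1
```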

  10. Urbanization may limit impacts of an invasive predator on native mammal diversity

    USGS Publications Warehouse

    Reichert, Brian E.; Sovie, Adia R.; Udell, Brad J.; Hart, Kristen M.; Borkhataria, Rena R.; Bonneau, Mathieu; Reed, Robert; McCleery, Robert A.

    2017-01-01

    Aim: Our understanding of the effects of invasive species on faunal diversity is limited in part because invasions often occur in modified landscapes where other drivers of community diversity can exacerbate or reduce the net impacts of an invader. Furthermore, rigorous assessments of the effects of invasive species on native communities that account for variation in sampling, species-specific detection and occurrence of rare species are lacking. Invasive Burmese pythons (Python molurus bivittatus) may be causing declines in medium- to large-sized mammals throughout the Greater Everglades Ecosystem (GEE); however, other factors such as urbanization, habitat changes and drastic alteration in water flow may also be influential in structuring mammal communities. The aim of this study was to gain an understanding of how mammal communities simultaneously facing invasive predators and intensively human-altered landscapes are influenced by these drivers and their interactions. Location: Florida, USA. Methods: We used data from trail cameras and scat searches with a hierarchical community model that accounts for undetected species to determine the relative influence of introduced Burmese pythons, urbanization, local hydrology, habitat types and interactive effects between pythons and urbanization on mammal species occurrence, site-level species richness, and turnover. Results: Python density had significant negative effects on all species except coyotes. Despite these negative effects, occurrence of some generalist species increased significantly near urban areas. At the community level, pythons had the greatest impact on species richness, while turnover was greatest along the urbanization gradient, where communities were increasingly similar as distance to urbanization decreased. Main conclusions: We found evidence for an antagonistic interaction between pythons and urbanization where the impacts of pythons were reduced near urban development. 
Python-induced changes to mammal communities may be mediated near urban development, but elsewhere in the GEE, pythons are likely causing a fundamental restructuring of the food web, declines in ecosystem function, and creating complex and unpredictable cascading effects.

  11. Investigating the association between HIV/AIDS and recent fertility patterns in Kenya.

    PubMed

    Magadi, Monica Akinyi; Agwanda, Alfred O

    2010-07-01

    Findings from previous studies linking the HIV/AIDS epidemic and fertility of populations have remained inconclusive. In sub-Saharan Africa, demographic patterns point to the epidemic resulting in fertility reduction. However, evidence from the 2003 Kenya Demographic and Health Survey (KDHS) has revealed interesting patterns, with regions most adversely affected by HIV/AIDS showing the clearest reversal trend in fertility decline. While there is suggestive evidence that fertility behaviour in some parts of sub-Saharan Africa has changed in relation to the HIV/AIDS epidemic, more rigorous empirical analysis is necessary to better understand this relationship. In this paper, we examine individual and contextual community HIV/AIDS factors associated with fertility patterns in Kenya, paying particular attention to possible mechanisms of the association. Multilevel models are applied to the 2003 KDHS, introducing various proximate fertility determinants in successive stages, to explore possible mechanisms through which HIV/AIDS may be associated with fertility. The results corroborate findings from earlier studies of the fertility inhibiting effect of HIV among infected women. HIV-infected women have 40 percent lower odds of having had a recent birth than their uninfected counterparts of similar background characteristics. Further analysis suggests an association between HIV/AIDS and fertility that exists through proximate fertility determinants relating to sexual exposure, breastfeeding duration, and foetal loss. While HIV/AIDS may have contributed to reduced fertility, mainly through reduced sexual exposure, there is evidence that it has contributed to increased fertility, through reduced breastfeeding and increased desire for more children resulting from increased infant/child mortality (i.e. a replacement phenomenon). 
In communities at advanced stages of the HIV/AIDS epidemic, it is possible that infant/child mortality has reached appreciably high levels where the impact of replacement and reduced breastfeeding duration is substantial enough to result in a reversal of fertility decline. This provides a plausible explanation for the patterns observed in regions with particularly high HIV prevalence in Kenya. Crown Copyright 2010. Published by Elsevier Ltd. All rights reserved.
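The reported effect is an odds ratio: "40 percent lower odds" corresponds to OR = 0.6. Converting an odds ratio back into a probability requires a baseline rate, which the abstract does not give; the 30% baseline in this sketch is purely hypothetical, chosen only to show the arithmetic.

```python
def odds(p):
    # Convert a probability to odds.
    return p / (1.0 - p)

def apply_odds_ratio(p_baseline, odds_ratio):
    # Probability implied by applying an odds ratio to a baseline rate.
    o = odds(p_baseline) * odds_ratio
    return o / (1.0 + o)

# "40 percent lower odds" of a recent birth = odds ratio of 0.6.
# Assume (hypothetically) 30% of uninfected women had a recent birth;
# the implied rate for HIV-infected women of similar background is:
p_infected = apply_odds_ratio(0.30, 0.6)  # about 0.20
```

Note that an odds ratio of 0.6 is not the same as a 40% lower probability: here the probability drops from 0.30 to roughly 0.20, a relative decline of about a third.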

  12. Selection of nest-site habitat by interior least terns in relation to sandbar construction

    USGS Publications Warehouse

    Sherfy, M.H.; Stucker, J.H.; Buhl, D.A.

    2012-01-01

    Federally endangered interior least terns (Sternula antillarum) nest on bare or sparsely vegetated sandbars on midcontinent river systems. Loss of nesting habitat has been implicated as a cause of population declines, and managing these habitats is a major initiative in population recovery. One such initiative involves construction of mid-channel sandbars on the Missouri River, where natural sandbar habitat has declined in quantity and quality since the late 1990s. We evaluated nest-site habitat selection by least terns on constructed and natural sandbars by comparing vegetation, substrate, and debris variables at nest sites (n = 798) and random points (n = 1,113) in bare or sparsely vegetated habitats. Our logistic regression models revealed that a broader suite of habitat features was important in nest-site selection on constructed than on natural sandbars. Odds ratios for habitat variables indicated that avoidance of habitat features was the dominant nest-site selection process on both sandbar types, with nesting terns being attracted to nest-site habitat features (gravel and debris) and avoiding vegetation only on constructed sandbars, and avoiding silt and leaf litter on both sandbar types. Despite the seemingly uniform nature of these habitats, our results suggest that a complex suite of habitat features influences nest-site choice by least terns. However, nest-site selection in this social, colonially nesting species may be influenced by other factors, including spatial arrangement of bare sand habitat, proximity to other least terns, and prior habitat occupancy by piping plovers (Charadrius melodus). We found that nest-site selection was sensitive to subtle variation in habitat features, suggesting that rigor in maintaining habitat condition will be necessary in managing sandbars for the benefit of least terns. 
Further, management strategies that reduce habitat features that are avoided by least terns may be the most beneficial to nesting least terns. © 2011 The Wildlife Society.

  13. Selection of nest-site habitat by interior least terns in relation to sandbar construction

    USGS Publications Warehouse

    Sherfy, Mark H.; Stucker, Jennifer H.; Buhl, Deborah A.

    2012-01-01

    Federally endangered interior least terns (Sternula antillarum) nest on bare or sparsely vegetated sandbars on midcontinent river systems. Loss of nesting habitat has been implicated as a cause of population declines, and managing these habitats is a major initiative in population recovery. One such initiative involves construction of mid-channel sandbars on the Missouri River, where natural sandbar habitat has declined in quantity and quality since the late 1990s. We evaluated nest-site habitat selection by least terns on constructed and natural sandbars by comparing vegetation, substrate, and debris variables at nest sites (n = 798) and random points (n = 1,113) in bare or sparsely vegetated habitats. Our logistic regression models revealed that a broader suite of habitat features was important in nest-site selection on constructed than on natural sandbars. Odds ratios for habitat variables indicated that avoidance of habitat features was the dominant nest-site selection process on both sandbar types, with nesting terns being attracted to nest-site habitat features (gravel and debris) and avoiding vegetation only on constructed sandbars, and avoiding silt and leaf litter on both sandbar types. Despite the seemingly uniform nature of these habitats, our results suggest that a complex suite of habitat features influences nest-site choice by least terns. However, nest-site selection in this social, colonially nesting species may be influenced by other factors, including spatial arrangement of bare sand habitat, proximity to other least terns, and prior habitat occupancy by piping plovers (Charadrius melodus). We found that nest-site selection was sensitive to subtle variation in habitat features, suggesting that rigor in maintaining habitat condition will be necessary in managing sandbars for the benefit of least terns. Further, management strategies that reduce habitat features that are avoided by least terns may be the most beneficial to nesting least terns.
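The odds ratios reported here come from use-versus-availability logistic regressions. For a single binary habitat feature, the estimated odds ratio reduces to the cross-product ratio of a 2x2 table of used and available sites, as this sketch with invented counts (not the study's data) shows:

```python
def odds_ratio_2x2(a, b, c, d):
    # a = used sites with the feature,      b = used sites without it
    # c = available sites with the feature, d = available sites without it
    # For one binary predictor this equals the odds ratio a logistic
    # regression of use vs. availability would estimate.
    return (a * d) / (b * c)

# Invented counts (illustrative only): debris present at 120 of 798
# nest sites versus 60 of 1,113 random points.
or_debris = odds_ratio_2x2(120, 798 - 120, 60, 1113 - 60)
# OR > 1 suggests attraction to the feature; OR < 1 suggests avoidance.
```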

  14. A spatial-dynamic value transfer model of economic losses from a biological invasion

    Treesearch

    Thomas P. Holmes; Andrew M. Liebhold; Kent F. Kovacs; Betsy Von Holle

    2010-01-01

    Rigorous assessments of the economic impacts of introduced species at broad spatial scales are required to provide credible information to policy makers. We propose that economic models of aggregate damages induced by biological invasions need to link microeconomic analyses of site-specific economic damages with spatial-dynamic models of value change associated with...

  15. Pedagogy and the Intuitive Appeal of Learning Styles in Post-Compulsory Education in England

    ERIC Educational Resources Information Center

    Nixon, Lawrence; Gregson, Maggie; Spedding, Trish

    2007-01-01

    Despite the rigorous and robust evaluation of learning styles theories, models and inventories, little objective evidence in support of their effectiveness has been found. The lack of unambiguous evidence in support of these models and practices leaves the continued popularity of these models and instruments as a puzzle. Two related accounts of…

  16. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    ERIC Educational Resources Information Center

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  17. Experimental evaluation of rigor mortis. V. Effect of various temperatures on the evolution of rigor mortis.

    PubMed

    Krompecher, T

    1981-01-01

    Objective measurements were carried out to study the evolution of rigor mortis on rats at various temperatures. Our experiments showed that: (1) at 6 degrees C rigor mortis reaches full development between 48 and 60 hours post mortem, and is resolved at 168 hours post mortem; (2) at 24 degrees C rigor mortis reaches full development at 5 hours post mortem, and is resolved at 16 hours post mortem; (3) at 37 degrees C rigor mortis reaches full development at 3 hours post mortem, and is resolved at 6 hours post mortem; (4) the intensity of rigor mortis grows with increase in temperature (difference between values obtained at 24 degrees C and 37 degrees C); and (5) at 6 degrees C a "cold rigidity" was found, in addition to and independent of rigor mortis.

  18. Training a Joint and Expeditionary Mindset

    DTIC Science & Technology

    2006-12-01

    associated with the JEM constructs and for using them to create effective computer-mediated training scenarios. The pedagogic model enables development of...ensure the instructional rigor of scenarios and provide a sound basis for determining performance indicators. The pedagogical model enables development...

  19. Wisconsin's Model Academic Standards for Music.

    ERIC Educational Resources Information Center

    Nikolay, Pauli; Grady, Susan; Stefonek, Thomas

    To assist parents and educators in preparing students for the 21st century, Wisconsin citizens have become involved in the development of challenging academic standards in 12 curricular areas. Having clear standards for students and teachers makes it possible to develop rigorous local curricula and valid, reliable assessments. This model of…

  20. 75 FR 2523 - Office of Innovation and Improvement; Overview Information; Arts in Education Model Development...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... that is based on rigorous scientifically based research methods to assess the effectiveness of a...) Relies on measurements or observational methods that provide reliable and valid data across evaluators... of innovative, cohesive models that are based on research and have demonstrated that they effectively...

  1. A simple model for indentation creep

    NASA Astrophysics Data System (ADS)

    Ginder, Ryan S.; Nix, William D.; Pharr, George M.

    2018-03-01

    A simple model for indentation creep is developed that allows one to directly convert creep parameters measured in indentation tests to those observed in uniaxial tests through simple closed-form relationships. The model is based on the expansion of a spherical cavity in a power law creeping material modified to account for indentation loading in a manner similar to that developed by Johnson for elastic-plastic indentation (Johnson, 1970). Although only approximate in nature, the simple mathematical form of the new model makes it useful for general estimation purposes or in the development of other deformation models in which a simple closed-form expression for the indentation creep rate is desirable. Comparison to a more rigorous analysis which uses finite element simulation for numerical evaluation shows that the new model predicts uniaxial creep rates within a factor of 2.5, and usually much better than this, for materials creeping with stress exponents in the range 1 ≤ n ≤ 7. The predictive capabilities of the model are evaluated by comparing it to the more rigorous analysis and several sets of experimental data in which both the indentation and uniaxial creep behavior have been measured independently.

  2. Rigor and reproducibility in research with transcranial electrical stimulation: An NIMH-sponsored workshop.

    PubMed

    Bikson, Marom; Brunoni, Andre R; Charvet, Leigh E; Clark, Vincent P; Cohen, Leonardo G; Deng, Zhi-De; Dmochowski, Jacek; Edwards, Dylan J; Frohlich, Flavio; Kappenman, Emily S; Lim, Kelvin O; Loo, Colleen; Mantovani, Antonio; McMullen, David P; Parra, Lucas C; Pearson, Michele; Richardson, Jessica D; Rumsey, Judith M; Sehatpour, Pejman; Sommers, David; Unal, Gozde; Wassermann, Eric M; Woods, Adam J; Lisanby, Sarah H

    Neuropsychiatric disorders are a leading source of disability and require novel treatments that target mechanisms of disease. As such disorders are thought to result from aberrant neuronal circuit activity, neuromodulation approaches are of increasing interest given their potential for manipulating circuits directly. Low intensity transcranial electrical stimulation (tES) with direct currents (transcranial direct current stimulation, tDCS) or alternating currents (transcranial alternating current stimulation, tACS) represent novel, safe, well-tolerated, and relatively inexpensive putative treatment modalities. This report seeks to promote the science, technology and effective clinical applications of these modalities, identify research challenges, and suggest approaches for addressing these needs in order to achieve rigorous, reproducible findings that can advance clinical treatment. The National Institute of Mental Health (NIMH) convened a workshop in September 2016 that brought together experts in basic and human neuroscience, electrical stimulation biophysics and devices, and clinical trial methods to examine the physiological mechanisms underlying tDCS/tACS, technologies and technical strategies for optimizing stimulation protocols, and the state of the science with respect to therapeutic applications and trial designs. Advances in understanding mechanisms, methodological and technological improvements (e.g., electronics, computational models to facilitate proper dosing), and improved clinical trial designs are poised to advance rigorous, reproducible therapeutic applications of these techniques. A number of challenges were identified and meeting participants made recommendations to address them. 
These recommendations align with requirements in NIMH funding opportunity announcements to, among other needs, define dosimetry, demonstrate dose/response relationships, implement rigorous blinded trial designs, employ computational modeling, and demonstrate target engagement when testing stimulation-based interventions for the treatment of mental disorders. Published by Elsevier Inc.

  3. Rigor and reproducibility in research with transcranial electrical stimulation: An NIMH-sponsored workshop

    PubMed Central

    Bikson, Marom; Brunoni, Andre R.; Charvet, Leigh E.; Clark, Vincent P.; Cohen, Leonardo G.; Deng, Zhi-De; Dmochowski, Jacek; Edwards, Dylan J.; Frohlich, Flavio; Kappenman, Emily S.; Lim, Kelvin O.; Loo, Colleen; Mantovani, Antonio; McMullen, David P.; Parra, Lucas C.; Pearson, Michele; Richardson, Jessica D.; Rumsey, Judith M.; Sehatpour, Pejman; Sommers, David; Unal, Gozde; Wassermann, Eric M.; Woods, Adam J.; Lisanby, Sarah H.

    2018-01-01

    Background Neuropsychiatric disorders are a leading source of disability and require novel treatments that target mechanisms of disease. As such disorders are thought to result from aberrant neuronal circuit activity, neuromodulation approaches are of increasing interest given their potential for manipulating circuits directly. Low intensity transcranial electrical stimulation (tES) with direct currents (transcranial direct current stimulation, tDCS) or alternating currents (transcranial alternating current stimulation, tACS) represent novel, safe, well-tolerated, and relatively inexpensive putative treatment modalities. Objective This report seeks to promote the science, technology and effective clinical applications of these modalities, identify research challenges, and suggest approaches for addressing these needs in order to achieve rigorous, reproducible findings that can advance clinical treatment. Methods The National Institute of Mental Health (NIMH) convened a workshop in September 2016 that brought together experts in basic and human neuroscience, electrical stimulation biophysics and devices, and clinical trial methods to examine the physiological mechanisms underlying tDCS/tACS, technologies and technical strategies for optimizing stimulation protocols, and the state of the science with respect to therapeutic applications and trial designs. Results Advances in understanding mechanisms, methodological and technological improvements (e.g., electronics, computational models to facilitate proper dosing), and improved clinical trial designs are poised to advance rigorous, reproducible therapeutic applications of these techniques. A number of challenges were identified and meeting participants made recommendations to address them. 
Conclusions These recommendations align with requirements in NIMH funding opportunity announcements to, among other needs, define dosimetry, demonstrate dose/response relationships, implement rigorous blinded trial designs, employ computational modeling, and demonstrate target engagement when testing stimulation-based interventions for the treatment of mental disorders. PMID:29398575

  4. Is hyperactivity ubiquitous in ADHD or dependent on environmental demands? Evidence from meta-analysis

    PubMed Central

    Kofler, Michael J.; Raiker, Joseph S.; Sarver, Dustin E.; Wells, Erica L.; Soto, Elia F.

    2016-01-01

    Hyperactivity, or excess gross motor activity, is considered a core and ubiquitous characteristic of ADHD. Alternate models question this premise, and propose that hyperactive behavior reflects, to a large extent, purposeful behavior to cope with environmental demands that interact with underlying neurobiological vulnerabilities. The present review critically evaluates the ubiquity and environmental modifiability of hyperactivity in ADHD through meta-analysis of 63 studies of mechanically measured activity level in children, adolescents, and adults with ADHD relative to typically developing (TD) groups. Random effects models corrected for publication bias confirmed elevated gross motor activity in ADHD (d = 0.86); surprisingly, neither participant age (child vs. adult) nor the proportion of each ADHD sample diagnosed with the Inattentive subtype/presentation moderated this effect. In contrast, activity level assessed during high cognitive load conditions in general (d = 1.14) and high executive functioning demands in particular (d = 1.39) revealed significantly higher effect sizes than activity level during low cognitive load (d = 0.36) and in-class schoolwork (d = 0.50) settings. Low stimulation environments, more rigorous diagnostic practices, actigraph measurement of movement frequency and intensity, and ADHD samples that included fewer females were also associated with larger effects. Overall, the results are inconsistent with DSM-5 and ADHD models that a) describe hyperactivity as ubiquitous behavior, b) predict a developmental decline in hyperactivity, or c) differentiate subtypes/presentations according to perceived differences in hyperactive behavior. Instead, results suggest that the presence and magnitude of hyperactive behavior in ADHD may be influenced to a considerable extent by environmental factors in general, and cognitive/executive functioning demands in particular. PMID:27131918
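Random-effects pooling of study effect sizes, with a correction for between-study heterogeneity, is commonly done with the DerSimonian-Laird estimator. The sketch below uses invented Cohen's d values and variances, not the review's 63 studies:

```python
def dersimonian_laird(effects, variances):
    # Random-effects pooling: study weights shrink as the estimated
    # between-study heterogeneity (tau^2) grows.
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se, tau2

# Invented Cohen's d values and sampling variances for five studies:
effects = [0.4, 1.4, 0.6, 1.2, 0.9]
variances = [0.04, 0.05, 0.03, 0.06, 0.04]
pooled, se, tau2 = dersimonian_laird(effects, variances)
```

When tau2 comes out positive, as here, the pooled estimate and its standard error reflect genuine between-study variation rather than sampling error alone.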

  5. Population growth rates of reef sharks with and without fishing on the great barrier reef: robust estimation with multiple models.

    PubMed

    Hisano, Mizue; Connolly, Sean R; Robbins, William D

    2011-01-01

    Overfishing of sharks is a global concern, with increasing numbers of species threatened by overfishing. For many sharks, both catch rates and underwater visual surveys have been criticized as indices of abundance. In this context, estimation of population trends using individual demographic rates provides an important alternative means of assessing population status. However, such estimates involve uncertainties that must be appropriately characterized to credibly and effectively inform conservation efforts and management. Incorporating uncertainties into population assessment is especially important when key demographic rates are obtained via indirect methods, as is often the case for mortality rates of marine organisms subject to fishing. Here, focusing on two reef shark species on the Great Barrier Reef, Australia, we estimated natural and total mortality rates using several indirect methods, and determined the population growth rates resulting from each. We used bootstrapping to quantify the uncertainty associated with each estimate, and to evaluate the extent of agreement between estimates. Multiple models produced highly concordant natural and total mortality rates, and associated population growth rates, once the uncertainties associated with the individual estimates were taken into account. Consensus estimates of natural and total population growth across multiple models support the hypothesis that these species are declining rapidly due to fishing, in contrast to conclusions previously drawn from catch rate trends. Moreover, quantitative projections of abundance differences on fished versus unfished reefs, based on the population growth rate estimates, are comparable to those found in previous studies using underwater visual surveys. These findings appear to justify management actions to substantially reduce the fishing mortality of reef sharks. 
They also highlight the potential utility of rigorously characterizing uncertainty, and applying multiple assessment methods, to obtain robust estimates of population trends in species threatened by overfishing.
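The bootstrap procedure described is generic: resample the data with replacement, recompute the statistic, and read confidence limits off the percentiles of the resampled values. A minimal sketch with hypothetical survival data and an invented fecundity term, not the study's estimates:

```python
import random

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05, seed=0):
    # Percentile bootstrap: resample with replacement, recompute the
    # statistic, take quantiles of the sorted resampled values.
    rng = random.Random(seed)
    stats = sorted(
        statistic([rng.choice(data) for _ in data]) for _ in range(n_boot)
    )
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1.0 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical annual fates of 100 tagged sharks: 1 = survived, 0 = died.
rng = random.Random(42)
fates = [1 if rng.random() < 0.85 else 0 for _ in range(100)]

FECUNDITY = 0.10  # assumed female recruits per female per year (invented)

def growth_rate(sample):
    survival = sum(sample) / len(sample)
    return survival + FECUNDITY  # toy scalar lambda = s + f

lo, hi = bootstrap_ci(fates, growth_rate)
# An interval lying entirely below 1.0 would robustly support decline.
```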

  6. Population Growth Rates of Reef Sharks with and without Fishing on the Great Barrier Reef: Robust Estimation with Multiple Models

    PubMed Central

    Hisano, Mizue; Connolly, Sean R.; Robbins, William D.

    2011-01-01

    Overfishing of sharks is a global concern, with increasing numbers of species threatened by overfishing. For many sharks, both catch rates and underwater visual surveys have been criticized as indices of abundance. In this context, estimation of population trends using individual demographic rates provides an important alternative means of assessing population status. However, such estimates involve uncertainties that must be appropriately characterized to credibly and effectively inform conservation efforts and management. Incorporating uncertainties into population assessment is especially important when key demographic rates are obtained via indirect methods, as is often the case for mortality rates of marine organisms subject to fishing. Here, focusing on two reef shark species on the Great Barrier Reef, Australia, we estimated natural and total mortality rates using several indirect methods, and determined the population growth rates resulting from each. We used bootstrapping to quantify the uncertainty associated with each estimate, and to evaluate the extent of agreement between estimates. Multiple models produced highly concordant natural and total mortality rates, and associated population growth rates, once the uncertainties associated with the individual estimates were taken into account. Consensus estimates of natural and total population growth across multiple models support the hypothesis that these species are declining rapidly due to fishing, in contrast to conclusions previously drawn from catch rate trends. Moreover, quantitative projections of abundance differences on fished versus unfished reefs, based on the population growth rate estimates, are comparable to those found in previous studies using underwater visual surveys. These findings appear to justify management actions to substantially reduce the fishing mortality of reef sharks. 
They also highlight the potential utility of rigorously characterizing uncertainty, and applying multiple assessment methods, to obtain robust estimates of population trends in species threatened by overfishing. PMID:21966402

  7. Academic Rigor in the College Classroom: Two Federal Commissions Strive to Define Rigor in the Past 70 Years

    ERIC Educational Resources Information Center

    Francis, Clay

    2018-01-01

    Historic notions of academic rigor usually follow from critiques of the system--we often define our goals for academically rigorous work through the lens of our shortcomings. This chapter discusses how the Truman Commission in 1947 and the Spellings Commission in 2006 shaped the way we think about academic rigor in today's context.

  8. Emergency cricothyrotomy for trismus caused by instantaneous rigor in cardiac arrest patients.

    PubMed

    Lee, Jae Hee; Jung, Koo Young

    2012-07-01

    Instantaneous rigor, muscle stiffening that occurs at the moment of death (or cardiac arrest), can be confused with rigor mortis. If trismus is caused by instantaneous rigor, orotracheal intubation is impossible and a surgical airway should be secured. Here, we report 2 patients who had emergency cricothyrotomy for trismus caused by instantaneous rigor. This case report aims to help physicians understand instantaneous rigor and to emphasize the importance of quickly securing a surgical airway when trismus occurs. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Lumped Model Generation and Evaluation: Sensitivity and Lie Algebraic Techniques with Applications to Combustion

    DTIC Science & Technology

    1989-03-03

    …address global parameter space mapping issues for first-order differential equations. The rigorous criteria for the existence of exact lumping by linear projective transformations were also established.

  10. Monitoring programs to assess reintroduction efforts: A critical component in recovery

    USGS Publications Warehouse

    Muths, E.; Dreitz, V.

    2008-01-01

    Reintroduction is a powerful tool in our conservation toolbox. However, the necessary follow-up, i.e. long-term monitoring, is not commonplace and if instituted may lack rigor. We contend that valid monitoring is possible, even with sparse data. We present a means to monitor based on demographic data and a projection model using the Wyoming toad (Bufo baxteri) as an example. Using an iterative process, existing data are built upon gradually such that demographic estimates and subsequent inferences increase in reliability. Reintroduction and defensible monitoring may become increasingly relevant as the outlook for amphibians, especially in tropical regions, continues to deteriorate and emergency collection, captive breeding, and reintroduction become necessary. Rigorous use of appropriate modeling and an adaptive approach can validate the use of reintroduction and substantially increase its value to recovery programs. © 2008 Museu de Ciències Naturals.

  11. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical exercise targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
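    The solution-verification step mentioned in this record admits a compact illustration: from a quantity computed on three grids with a fixed refinement ratio, estimate the observed order of accuracy and extrapolate toward the grid-converged value. The grid values below are synthetic results from an idealized second-order scheme, not GBS output.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    # p = log((f_coarse - f_medium) / (f_medium - f_fine)) / log(r)
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson(f_medium, f_fine, r, p):
    # Richardson-extrapolated estimate of the grid-converged value
    return f_fine + (f_fine - f_medium) / (r ** p - 1)

# Synthetic results from a second-order scheme: f(h) = 1 + 0.5 * h^2
r = 2.0
f1, f2, f3 = 1 + 0.5 * 0.04, 1 + 0.5 * 0.01, 1 + 0.5 * 0.0025  # h = 0.2, 0.1, 0.05

p = observed_order(f1, f2, f3, r)
print(round(p, 3))                         # ≈ 2.0, the scheme's formal order
print(round(richardson(f2, f3, r, p), 6))  # ≈ 1.0, the exact value
```

    Agreement between the observed order p and the scheme's formal order is the practical pass/fail signal in solution verification; a mismatch indicates an unresolved numerical error.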

  12. Component Design Report: International Transportation Energy Demand Determinants Model

    EIA Publications

    2017-01-01

    This Component Design Report discusses working design elements for a new model to replace the International Transportation Model (ITran) in the World Energy Projection System Plus (WEPS+) that is maintained by the U.S. Energy Information Administration. The key objective of the new International Transportation Energy Demand Determinants (ITEDD) model is to enable more rigorous, quantitative research related to energy consumption in the international transportation sectors.

  13. No-arbitrage, leverage and completeness in a fractional volatility model

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.; Oliveira, M. J.; Rodrigues, A. M.

    2015-02-01

    When the volatility process is driven by fractional noise one obtains a model which is consistent with the empirical market data. Depending on whether the stochasticity generators of log-price and volatility are independent or are the same, two versions of the model are obtained with different leverage behaviors. Here, the no-arbitrage and completeness properties of the models are rigorously studied.

  14. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    PubMed

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

    In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain optimal navigation solution for MEMS-based INS/GPS integration.
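    The AR-based Gauss-Markov modeling this record describes can be illustrated in miniature: fit a first-order AR model to a stationary noise record and convert the AR coefficient into a Gauss-Markov correlation time. The data, sample interval, and coefficient below are synthetic, not ADIS16364 measurements.

```python
import math
import random

random.seed(0)
dt = 0.01            # sample interval in seconds (hypothetical)
phi_true = 0.95      # true AR(1) coefficient of the synthetic noise

# Generate synthetic first-order Gauss-Markov (AR(1)) noise
x = [0.0]
for _ in range(20000):
    x.append(phi_true * x[-1] + random.gauss(0, 1))

# Yule-Walker estimate of phi: lag-1 autocovariance over variance
mean = sum(x) / len(x)
c0 = sum((v - mean) ** 2 for v in x)
c1 = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(len(x) - 1))
phi_hat = c1 / c0

# Continuous-time Gauss-Markov correlation time implied by phi
tau = -dt / math.log(phi_hat)
print(f"phi ≈ {phi_hat:.3f}, correlation time ≈ {tau:.3f} s")
```

    Repeating such a fit on data collected at each thermal-chamber set point is one way the temperature dependence of the correlation time, the paper's central observation, would show up.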

  15. Rigorous Results for the Distribution of Money on Connected Graphs

    NASA Astrophysics Data System (ADS)

    Lanchier, Nicolas; Reed, Stephanie

    2018-05-01

    This paper is concerned with general spatially explicit versions of three stochastic models for the dynamics of money that have been introduced and studied numerically by statistical physicists: the uniform reshuffling model, the immediate exchange model and the model with saving propensity. All three models consist of systems of economical agents that consecutively engage in pairwise monetary transactions. Computer simulations performed in the physics literature suggest that, when the number of agents and the average amount of money per agent are large, the limiting distribution of money as time goes to infinity approaches the exponential distribution for the first model, the gamma distribution with shape parameter two for the second model and a distribution similar but not exactly equal to a gamma distribution whose shape parameter depends on the saving propensity for the third model. The main objective of this paper is to give rigorous proofs of these conjectures and also extend these conjectures to generalizations of the first two models and a variant of the third model that include local rather than global interactions, i.e., instead of choosing the two interacting agents uniformly at random from the system, the agents are located on the vertex set of a general connected graph and can only interact with their neighbors.
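    The locally interacting uniform reshuffling dynamics described above can be simulated directly. This minimal sketch places agents on a cycle graph (one simple connected graph covered by the paper's setting): each transaction pools the money of two neighbours and splits it uniformly at random. The agent count and initial wealth are arbitrary choices for illustration.

```python
import random

random.seed(42)
n, m0 = 500, 10.0                 # number of agents, initial money per agent
money = [m0] * n
neighbours = lambda i: [(i - 1) % n, (i + 1) % n]  # cycle-graph adjacency

for _ in range(200000):
    i = random.randrange(n)
    j = random.choice(neighbours(i))
    pot = money[i] + money[j]
    u = random.random()            # uniform reshuffling of the joint pot
    money[i], money[j] = u * pot, (1 - u) * pot

# Total money is conserved; the conjectured (and, in the paper, proved)
# limiting distribution is exponential with mean m0, so the sample median
# should fall well below the mean (exponential median = mean * ln 2).
print(round(sum(money), 3), round(sorted(money)[n // 2], 3))
```

    Swapping `neighbours` for the adjacency of any other connected graph gives the general local-interaction version studied in the paper.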

  16. Effects of non-steroidal anti-inflammatory drug treatments on cognitive decline vary by phase of pre-clinical Alzheimer disease: findings from the randomized controlled Alzheimer's Disease Anti-inflammatory Prevention Trial.

    PubMed

    Leoutsakos, Jeannie-Marie S; Muthen, Bengt O; Breitner, John C S; Lyketsos, Constantine G

    2012-04-01

    We examined the effects of non-steroidal anti-inflammatory drugs on cognitive decline as a function of phase of pre-clinical Alzheimer disease. Given recent findings that cognitive decline accelerates as clinical diagnosis is approached, we used rate of decline as a proxy for phase of pre-clinical Alzheimer disease. We fit growth mixture models of Modified Mini-Mental State (3MS) Examination trajectories with data from 2388 participants in the Alzheimer's Disease Anti-inflammatory Prevention Trial and included class-specific effects of naproxen and celecoxib. We identified three classes: "no decline", "slow decline", and "fast decline", and examined the effects of celecoxib and naproxen on linear slope and rate of change by class. Inclusion of quadratic terms improved fit of the model (-2 log likelihood difference: 369.23; p < 0.001) but resulted in reversal of effects over time. Over 4 years, participants in the slow-decline class on placebo typically lost 6.6 3MS points, whereas those on naproxen lost 3.1 points (p-value for difference: 0.19). Participants in the fast-decline class on placebo typically lost 11.2 points, but those on celecoxib first declined and then gained points (p-value for difference from placebo: 0.04), whereas those on naproxen showed a typical decline of 24.9 points (p-value for difference from placebo: <0.0001). Our results appeared statistically robust but provided some unexpected contrasts in effects of different treatments at different times. Naproxen may attenuate cognitive decline in slow decliners while accelerating decline in fast decliners. Celecoxib appeared to have similar effects at first but then attenuated change in fast decliners. Copyright © 2011 John Wiley & Sons, Ltd.

  17. Study of the quality characteristics in cold-smoked salmon (Salmo salar) originating from pre- or post-rigor raw material.

    PubMed

    Birkeland, S; Akse, L

    2010-01-01

    Improved slaughtering procedures in the salmon industry have caused a delayed onset of rigor mortis and, thus, a potential for pre-rigor secondary processing. The aim of this study was to investigate the effect of rigor status at time of processing on quality traits (color, texture, sensory, and microbiological) in injection-salted, cold-smoked Atlantic salmon (Salmo salar). Injection of pre-rigor fillets caused a significant (P<0.001) contraction (-7.9% ± 0.9%) along the caudal-cranial axis. No significant differences in instrumental color (a*, b*, C*, or h*), texture (hardness), or sensory traits (aroma, color, taste, and texture) were observed between pre- and post-rigor processed fillets; however, post-rigor fillets (1477 ± 38 g) had significantly (P<0.05) higher fracturability than pre-rigor fillets (1369 ± 71 g). Pre-rigor fillets were significantly (P<0.01) lighter (L*: 39.7 ± 1.0 versus 37.8 ± 0.8) than post-rigor fillets, and had significantly (P<0.05) lower aerobic plate counts (APC; 1.4 ± 0.4 versus 2.6 ± 0.6 log CFU/g) and psychrotrophic counts (PC; 2.1 ± 0.2 versus 3.0 ± 0.5 log CFU/g) than post-rigor processed fillets. This study showed that similar quality characteristics can be obtained in cold-smoked products processed either pre- or post-rigor when using suitable injection salting protocols and smoking techniques. © 2010 Institute of Food Technologists®

  18. Arctic Sea Ice Decline: Observations, Projections, Mechanisms, and Implications

    NASA Astrophysics Data System (ADS)

    DeWeaver, Eric T.; Bitz, Cecilia M.; Tremblay, L.-Bruno

    This volume addresses the rapid decline of Arctic sea ice, placing recent sea ice decline in the context of past observations, climate model simulations and projections, and simple models of the climate sensitivity of sea ice. Highlights of the work presented here include:

    • An appraisal of the role played by wind forcing in driving the decline;
    • A reconstruction of Arctic sea ice conditions prior to human observations, based on proxy data from sediments;
    • A modeling approach for assessing the impact of sea ice decline on polar bears, used as input to the U.S. Fish and Wildlife Service's decision to list the polar bear as a threatened species under the Endangered Species Act;
    • Contrasting studies on the existence of a "tipping point," beyond which Arctic sea ice decline will become (or has already become) irreversible, including an examination of the role of the small ice cap instability in global warming simulations;
    • A significant summertime atmospheric response to sea ice reduction in an atmospheric general circulation model, suggesting a positive feedback and the potential for short-term climate prediction.

    The book will be of interest to researchers attempting to understand the recent behavior of Arctic sea ice, model projections of future sea ice loss, and the consequences of sea ice loss for the natural and human systems of the Arctic.

  19. Oak decline around the world

    Treesearch

    Kurt W. Gottschalk; Philip M. Wargo

    1997-01-01

    Oak (Quercus spp.) decline is a malady related to the consequences of stress and successful attack of stressed trees by opportunistic (secondary) organisms (Wargo et al. 1983). It is a progressive process where trees decline in health for several years before they die. Houston (1981) developed a model of declines that is presented in Figure 1. So...

  20. Multivariate Latent Change Modeling of Developmental Decline in Academic Intrinsic Math Motivation and Achievement: Childhood through Adolescence

    ERIC Educational Resources Information Center

    Gottfried, Adele Eskeles; Marcoulides, George A.; Gottfried, Allen W.; Oliver, Pamella H.; Guerin, Diana Wright

    2007-01-01

    Research has established that academic intrinsic motivation, enjoyment of school learning without receipt of external rewards, significantly declines across childhood through adolescence. Math intrinsic motivation evidences the most severe decline compared with other subject areas. This study addresses this developmental decline in math intrinsic…

  1. The teen brain: insights from neuroimaging.

    PubMed

    Giedd, Jay N

    2008-04-01

    Few parents of a teenager are surprised to hear that the brain of a 16-year-old is different from the brain of an 8-year-old. Yet to pin down these differences in a rigorous scientific way has been elusive. Magnetic resonance imaging, with the capacity to provide exquisitely accurate quantifications of brain anatomy and physiology without the use of ionizing radiation, has launched a new era of adolescent neuroscience. Longitudinal studies of subjects from ages 3-30 years demonstrate a general pattern of childhood peaks of gray matter followed by adolescent declines, functional and structural increases in connectivity and integrative processing, and a changing balance between limbic/subcortical and frontal lobe functions, extending well into young adulthood. Although overinterpretation and premature application of neuroimaging findings for diagnostic purposes remains a risk, converging data from multiple imaging modalities is beginning to elucidate the implications of these brain changes on cognition, emotion, and behavior.

  2. Monitoring low density avian populations: An example using Mountain Plovers

    USGS Publications Warehouse

    Dreitz, V.J.; Lukacs, P.M.; Knopf, F.L.

    2006-01-01

    Declines in avian populations highlight a need for rigorous, broad-scale monitoring programs to document trends in avian populations that occur in low densities across expansive landscapes. Accounting for the spatial variation and variation in detection probability inherent to monitoring programs is thought to be effort-intensive and time-consuming. We determined the feasibility of the analytical method developed by Royle and Nichols (2003), which uses presence-absence (detection-non-detection) field data, to estimate abundance of Mountain Plovers (Charadrius montanus) per sampling unit in agricultural fields, grassland, and prairie dog habitat in eastern Colorado. Field methods were easy to implement and results suggest that the analytical method provides valuable insight into population patterning among habitats. Mountain Plover abundance was highest in prairie dog habitat, slightly lower in agricultural fields, and substantially lower in grassland. These results provided valuable insight to focus future research into Mountain Plover ecology and conservation. ?? The Cooper Ornithological Society 2006.
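    The Royle and Nichols (2003) approach cited in this record extracts abundance information from repeated detection/non-detection visits because the per-visit detection probability at a site holding N individuals is 1 - (1 - r)^N. The sketch below simulates such data and recovers the mean abundance lambda by maximizing the marginal likelihood on a grid; for simplicity the individual detection probability r is held at its true value, whereas a real analysis estimates lambda and r jointly. All values are synthetic.

```python
import math
import random

random.seed(7)
lam_true, r, n_sites, n_visits = 2.0, 0.3, 300, 5

def pois(lam):
    # Knuth's Poisson sampler (adequate for small lambda)
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Simulate site abundances and per-visit detection histories
hist = []
for _ in range(n_sites):
    N = pois(lam_true)
    p_det = 1 - (1 - r) ** N
    hist.append([1 if random.random() < p_det else 0 for _ in range(n_visits)])

def neg_log_lik(lam):
    # Marginalize the latent abundance N out of each site's likelihood
    ll = 0.0
    for h in hist:
        site_l = sum(
            math.exp(-lam) * lam ** N / math.factorial(N)
            * math.prod((1 - (1 - r) ** N) if y else (1 - r) ** N for y in h)
            for N in range(30)
        )
        ll += math.log(site_l)
    return -ll

grid = [0.5 + 0.05 * k for k in range(60)]
lam_hat = min(grid, key=neg_log_lik)
print(f"lambda_hat ≈ {lam_hat:.2f} (true {lam_true})")
```

    This is the sense in which "easy to implement" presence-absence field methods can still yield per-unit abundance estimates across habitats, as the study reports for Mountain Plovers.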

  3. Sea level regulated tetrapod diversity dynamics through the Jurassic/Cretaceous interval

    PubMed Central

    Tennant, Jonathan P.; Mannion, Philip D.; Upchurch, Paul

    2016-01-01

    Reconstructing deep time trends in biodiversity remains a central goal for palaeobiologists, but our understanding of the magnitude and tempo of extinctions and radiations is confounded by uneven sampling of the fossil record. In particular, the Jurassic/Cretaceous (J/K) boundary, 145 million years ago, remains poorly understood, despite an apparent minor extinction and the radiation of numerous important clades. Here we apply a rigorous subsampling approach to a comprehensive tetrapod fossil occurrence data set to assess the group's macroevolutionary dynamics through the J/K transition. Although much of the signal is exclusively European, almost every higher tetrapod group was affected by a substantial decline across the boundary, culminating in the extinction of several important clades and the ecological release and radiation of numerous modern tetrapod groups. Variation in eustatic sea level was the primary driver of these patterns, controlling biodiversity through availability of shallow marine environments and via allopatric speciation on land. PMID:27587285

  4. Modeling the Cloud to Enhance Capabilities for Crises and Catastrophe Management

    DTIC Science & Technology

    2016-11-16

    …order for cloud computing infrastructures to be successfully deployed in real-world scenarios as tools for crisis and catastrophe management, where… Statement of the Problem Studied: As cloud computing becomes the dominant computational infrastructure [1] and cloud technologies make a transition to hosting… 1. Formulate rigorous mathematical models representing technological capabilities and resources in cloud computing for performance modeling and…

  5. Kirkpatrick and Beyond: A Review of Models of Training Evaluation. IES Report.

    ERIC Educational Resources Information Center

    Tamkin, P.; Yarnall, J.; Kerrin, M.

    Many organizations are not satisfied that their methods of evaluating training are rigorous or extensive enough to answer questions of value to them. Complaints about Kirkpatrick's popular four-step model (1959) of training evaluation are that each level is assumed to be associated with the previous and next levels and that the model is too simple…

  6. Accurate force field for molybdenum by machine learning large materials data

    NASA Astrophysics Data System (ADS)

    Chen, Chi; Deng, Zhi; Tran, Richard; Tang, Hanmei; Chu, Iek-Heng; Ong, Shyue Ping

    2017-09-01

    In this work, we present a highly accurate spectral neighbor analysis potential (SNAP) model for molybdenum (Mo) developed through the rigorous application of machine learning techniques on large materials data sets. Despite Mo's importance as a structural metal, existing force fields for Mo based on the embedded atom and modified embedded atom methods do not provide satisfactory accuracy on many properties. We will show that by fitting to the energies, forces, and stress tensors of a large density functional theory (DFT)-computed dataset on a diverse set of Mo structures, a Mo SNAP model can be developed that achieves close to DFT accuracy in the prediction of a broad range of properties, including elastic constants, melting point, phonon spectra, surface energies, grain boundary energies, etc. We will outline a systematic model development process, which includes a rigorous approach to structural selection based on principal component analysis, as well as a differential evolution algorithm for optimizing the hyperparameters in the model fitting so that both the model error and the property prediction error can be simultaneously lowered. We expect that this newly developed Mo SNAP model will find broad applications in large and long-time scale simulations.

  7. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles.

    PubMed

    Kim, Hyun-Wook; Hwang, Ko-Eun; Song, Dong-Heon; Kim, Yong-Jae; Ham, Youn-Kyung; Yeo, Eui-Joo; Jeong, Tae-Jun; Choi, Yun-Sang; Kim, Cheon-Jei

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salting chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% had no great differences in the results of physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle.

  8. Effect of Pre-rigor Salting Levels on Physicochemical and Textural Properties of Chicken Breast Muscles

    PubMed Central

    Choi, Yun-Sang

    2015-01-01

    This study was conducted to evaluate the effect of pre-rigor salting level (0-4% NaCl concentration) on physicochemical and textural properties of pre-rigor chicken breast muscles. The pre-rigor chicken breast muscles were de-boned 10 min post-mortem and salted within 25 min post-mortem. An increase in pre-rigor salting level led to the formation of high ultimate pH of chicken breast muscles at post-mortem 24 h. The addition of minimum of 2% NaCl significantly improved water holding capacity, cooking loss, protein solubility, and hardness when compared to the non-salting chicken breast muscle (p<0.05). On the other hand, the increase in pre-rigor salting level caused the inhibition of myofibrillar protein degradation and the acceleration of lipid oxidation. However, the difference in NaCl concentration between 3% and 4% had no great differences in the results of physicochemical and textural properties due to pre-rigor salting effects (p>0.05). Therefore, our study certified the pre-rigor salting effect of chicken breast muscle salted with 2% NaCl when compared to post-rigor muscle salted with equal NaCl concentration, and suggests that the 2% NaCl concentration is minimally required to ensure the definite pre-rigor salting effect on chicken breast muscle. PMID:26761884

  9. Use of software engineering techniques in the design of the ALEPH data acquisition system

    NASA Astrophysics Data System (ADS)

    Charity, T.; McClatchey, R.; Harvey, J.

    1987-08-01

    The SASD methodology is being used to provide a rigorous design framework for various components of the ALEPH data acquisition system. The Entity-Relationship data model is used to describe the layout and configuration of the control and acquisition systems and detector components. State Transition Diagrams are used to specify control applications such as run control and resource management and Data Flow Diagrams assist in decomposing software tasks and defining interfaces between processes. These techniques encourage rigorous software design leading to enhanced functionality and reliability. Improved documentation and communication ensures continuity over the system life-cycle and simplifies project management.

  10. A new algorithm for construction of coarse-grained sites of large biomolecules.

    PubMed

    Li, Min; Zhang, John Z H; Xia, Fei

    2016-04-05

    The development of coarse-grained (CG) models for large biomolecules remains a challenge in multiscale simulations, including a rigorous definition of CG representations for them. In this work, we proposed a new stepwise optimization imposed with the boundary-constraint (SOBC) algorithm to construct the CG sites of large biomolecules, based on the scheme of essential dynamics CG. By means of SOBC, we can rigorously derive the CG representations of biomolecules at lower computational cost. The SOBC is particularly efficient for the CG definition of large systems with thousands of residues. The resulting CG sites can be parameterized as a CG model using the normal mode analysis based fluctuation matching method. Through normal mode analysis, the obtained modes of the CG model can accurately reflect the functionally related slow motions of biomolecules. The SOBC algorithm can be used for the construction of CG sites of large biomolecules such as F-actin and for the study of mechanical properties of biomaterials. © 2015 Wiley Periodicals, Inc.

  11. Derivation of rigorous conditions for high cell-type diversity by algebraic approach.

    PubMed

    Yoshida, Hiroshi; Anai, Hirokazu; Horimoto, Katsuhisa

    2007-01-01

    The development of a multicellular organism is a dynamic process. Starting with one or a few cells, the organism develops into different types of cells with distinct functions. We have constructed a simple model by considering the cell number increase and the cell-type order conservation, and have assessed conditions for cell-type diversity. This model is based on a stochastic Lindenmayer system with cell-to-cell interactions for three types of cells. In the present model, we have successfully derived complex but rigorous algebraic relations between the proliferation and transition rates for cell-type diversity by using a symbolic method: quantifier elimination (QE). Surprisingly, three modes for the proliferation and transition rates have emerged for large ratios of the initial cells to the developed cells. The three modes have revealed that the equality between the development rates for the highest cell-type diversity is reduced during the development process of multicellular organisms. Furthermore, we have found that the highest cell-type diversity originates from order conservation.

  12. Shear-induced opening of the coronal magnetic field

    NASA Technical Reports Server (NTRS)

    Wolfson, Richard

    1995-01-01

    This work describes the evolution of a model solar corona in response to motions of the footpoints of its magnetic field. The mathematics involved is semianalytic, with the only numerical solution being that of an ordinary differential equation. This approach, while lacking the flexibility and physical details of full MHD simulations, allows for very rapid computation along with complete and rigorous exploration of the model's implications. We find that the model coronal field bulges upward, at first slowly and then more dramatically, in response to footpoint displacements. The energy in the field rises monotonically from that of the initial potential state, and the field configuration and energy asymptotically approach those of a fully open field. Concurrently, electric currents develop and concentrate into a current sheet as the limiting case of the open field is approached. Examination of the equations shows rigorously that in the asymptotic limit of the fully open field, the current layer becomes a true ideal MHD singularity.

  13. Rigorous numerical modeling of scattering-type scanning near-field optical microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Xinzhong; Lo, Chiu Fan Bowen; Zheng, William; Hu, Hai; Dai, Qing; Liu, Mengkun

    2017-11-01

    Over the last decade, scattering-type scanning near-field optical microscopy and spectroscopy have been widely used in nano-photonics and materials research due to their fine spatial resolution and broad spectral range. A number of simplified analytical models have been proposed to quantitatively understand the tip-scattered near-field signal. However, a rigorous interpretation of the experimental results is still lacking at this stage. Numerical modeling, on the other hand, is mostly done by simulating the local electric field slightly above the sample surface, which only qualitatively represents the near-field signal rendered by the tip-sample interaction. In this work, we performed a more comprehensive numerical simulation which is based on realistic experimental parameters and signal extraction procedures. By directly comparing to the experiments as well as other simulation efforts, our methods offer a more accurate quantitative description of the near-field signal, paving the way for future studies of complex systems at the nanoscale.

  14. Diffraction-based overlay measurement on dedicated mark using rigorous modeling method

    NASA Astrophysics Data System (ADS)

    Lu, Hailiang; Wang, Fan; Zhang, Qingyun; Chen, Yonghui; Zhou, Chang

    2012-03-01

    Diffraction Based Overlay (DBO) has been widely evaluated by numerous authors; results show that DBO can provide better performance than Imaging Based Overlay (IBO). However, DBO has its own problems. As is well known, modeling-based DBO (mDBO) faces challenges of low measurement sensitivity and crosstalk between various structure parameters, which may result in poor accuracy and precision. Meanwhile, the main obstacle encountered by empirical DBO (eDBO) is that a few pads must be employed to gain sufficient information on overlay-induced diffraction signature variations, which consumes more wafer space and measuring time. Also, eDBO may suffer from mark profile asymmetry caused by processes. In this paper, we propose an alternative DBO technology that employs a dedicated overlay mark and takes a rigorous modeling approach. This technology needs only two or three pads for each direction, which is economic and time saving. While reducing the overlay measurement error induced by mark profile asymmetry, this technology is expected to be as accurate and precise as scatterometry technologies.

  15. Dead in the water--are we killing the hospital autopsy with poor consent practices?

    PubMed

    Henry, Jaimie; Nicholas, Nick

    2012-07-01

    It is now a recognized fact that the practice of conducting a consent (or hospital) post-mortem examination is in decline. There have been many reasons put forth to explain this demise, but the quality of the consenting process is frequently cited as having a high impact. This article focuses on consent practices for post-mortem examinations in England and Wales, and considers if our consent techniques are adversely affecting post-mortem examination uptake. We examine the regulatory compliance of trusts with their statutory obligations by analyzing the Human Tissue Authority's compliance and inspection reports. We further analyze 21 publicly available NHS Trust policies on post-mortem examination consent procedures, and consider whether these are fit for the purpose of meeting the dual needs of clinicians and the bereaved. Despite more Human Tissue Authority inspections, there is a disproportionate rise in enforcement actions, with up to 48% of sampled trusts exhibiting shortcomings in their legal duties. Additionally, only 52.4% of sampled trusts follow the Human Tissue Authority best-practice model, with 23.8% having no documented procedures. Despite the well-founded evidence base for best-practice models, consent practices for post-mortem examinations remain poor and are likely to have a gross adverse effect on the rate of post-mortem examinations. We recommend that NHS Trusts rigorously review their protocols and introduce a team approach between clinicians and trained bereavement staff in core-consent teams, as the Human Tissue Authority suggests, whilst at the same time placing a strong emphasis on education for junior and senior colleagues alike.

  16. Testing the Role of Climate Change in Species Decline: Is the Eastern Quoll a Victim of a Change in the Weather?

    PubMed Central

    Fancourt, Bronwyn A.; Bateman, Brooke L.; VanDerWal, Jeremy; Nicol, Stewart C.; Hawkins, Clare E.; Jones, Menna E.; Johnson, Christopher N.

    2015-01-01

    To conserve a declining species we first need to diagnose the causes of decline. This is one of the most challenging tasks faced by conservation practitioners. In this study, we used temporally explicit species distribution models (SDMs) to test whether shifting weather can explain the recent decline of a marsupial carnivore, the eastern quoll (Dasyurus viverrinus). We developed an SDM using weather variables matched to occurrence records of the eastern quoll over the last 60 years, and used the model to reconstruct variation through time in the distribution of climatically suitable range for the species. The weather model produced a meaningful prediction of the known distribution of the species. Abundance of quolls, indexed by transect counts, was positively related to the modelled area of suitable habitat between 1990 and 2004. In particular, a sharp decline in abundance from 2001 to 2003 coincided with a sustained period of unsuitable weather over much of the species’ distribution. Since 2004, abundance has not recovered despite a return to suitable weather conditions, and abundance and area of suitable habitat have been uncorrelated. We suggest that fluctuations in weather account for the species’ recent decline, but other unrelated factors have suppressed recovery. PMID:26106887
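    The temporally explicit SDM workflow described above can be sketched in miniature: fit a presence/background suitability model on weather covariates, then score a grid of cells for each period to track the area of suitable range through time. This is an illustration only; the study's actual model, covariates, and data are not reproduced here, and the single covariate, toy grid, and 0.5 threshold below are invented for the sketch.

```python
import math, random

# Hedged sketch of a temporally explicit species distribution model (SDM):
# logistic suitability fitted to presence vs. background points, then applied
# year by year to estimate climatically suitable area. Illustrative only.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient logistic regression; w[0] is the intercept."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            g = p - yi
            w[0] -= lr * g
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * g * xj
    return w

def suitable_area(w, grid, threshold=0.5, cell_km2=1.0):
    """Total area of grid cells whose predicted suitability exceeds threshold."""
    score = lambda x: sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], x)))
    return sum(cell_km2 for x in grid if score(x) > threshold)

# Toy data: one weather covariate (e.g. a dryness anomaly); occurrences
# cluster at low values, background points at high values.
random.seed(0)
pres = [[random.gauss(-1, 0.5)] for _ in range(50)]
back = [[random.gauss(+1, 0.5)] for _ in range(50)]
w = fit_logistic(pres + back, [1] * 50 + [0] * 50)

# Score the same 4-cell grid under two hypothetical years of weather.
mild_year = [[-1.2], [-0.8], [0.9], [1.1]]
harsh_year = [[1.3], [1.0], [0.8], [1.5]]
print(suitable_area(w, mild_year), suitable_area(w, harsh_year))
```

    Regressing an independent abundance index against the modelled suitable area per year, as the study does for 1990-2004, is then an ordinary correlation exercise on the two time series.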

  17. Two takes on the ecosystem impacts of climate change and fishing: Comparing a size-based and a species-based ecosystem model in the central North Pacific

    NASA Astrophysics Data System (ADS)

    Woodworth-Jefcoats, Phoebe A.; Polovina, Jeffrey J.; Howell, Evan A.; Blanchard, Julia L.

    2015-11-01

    We compare two ecosystem model projections of 21st century climate change and fishing impacts in the central North Pacific. Both a species-based and a size-based ecosystem modeling approach are examined. While both models project a decline in biomass across all sizes in response to climate change and a decline in large fish biomass in response to increased fishing mortality, the models vary significantly in their handling of climate and fishing scenarios. For example, based on the same climate forcing the species-based model projects a 15% decline in catch by the end of the century while the size-based model projects a 30% decline. Disparities in the models' output highlight the limitations of each approach by showing the influence model structure can have on model output. The aspects of bottom-up change to which each model is most sensitive appear linked to model structure, as does the propagation of interannual variability through the food web and the relative impact of combined top-down and bottom-up change. Incorporating integrated size- and species-based ecosystem modeling approaches into future ensemble studies may help separate the influence of model structure from robust projections of ecosystem change.

  18. Longitudinal Modeling of Functional Decline Associated with Pathologic Alzheimer's Disease in Older Persons without Cognitive Impairment.

    PubMed

    Wang, Dai; Schultz, Tim; Novak, Gerald P; Baker, Susan; Bennett, David A; Narayan, Vaibhav A

    2018-01-01

    Therapeutic research on Alzheimer's disease (AD) has moved to intercepting the disease at the preclinical phase. Most drugs in late development have focused on the amyloid hypothesis. To understand the magnitude of amyloid-related functional decline and to identify the functional domains sensitive to decline in a preclinical AD population. Data were from the Religious Orders Study and the Rush Memory and Aging Project. Cognitive decline was measured by a modified version of the Alzheimer's Disease Cooperative Study Preclinical Alzheimer Cognitive Composite. The trajectories of functional decline, as measured by the instrumental and basic activities of daily living, were longitudinally modeled in 484 participants without cognitive impairment at baseline and having both a final clinical and a postmortem neuropathology assessment of AD. Individuals with different final clinical diagnoses had different trajectories of cognitive and functional decline: those with AD dementia, minor cognitive impairment, and no cognitive impairment showed the greatest, intermediate, and least decline, respectively. While individuals with pathologic AD had significantly more cognitive decline over time than those without, the magnitude of the difference in functional decline between these two groups was small. Functional domains such as handling finances and handling medications were more sensitive to decline. Demonstrating the functional benefit of an amyloid-targeting drug represents a significant challenge, as elderly people experience functional decline for a wide range of reasons, with limited manifestation attributable to AD neuropathology. More sensitive functional scales focusing on the functional domains sensitive to decline in preclinical AD are needed.

  19. Experimental evaluation of rigor mortis. VI. Effect of various causes of death on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C; Brandt-Casadevall, C; Gujer, H R

    1983-07-01

    The evolution of rigor mortis was studied in cases of nitrogen asphyxia, drowning and strangulation, as well as in fatal intoxications due to strychnine, carbon monoxide and curariform drugs, using a modified method of measurement. Our experiments demonstrated that: (1) Strychnine intoxication hastens the onset and passing of rigor mortis. (2) CO intoxication delays the resolution of rigor mortis. (3) The intensity of rigor may vary depending upon the cause of death. (4) If the stage of rigidity is to be used to estimate the time of death, it is necessary: (a) to perform a succession of objective measurements of rigor mortis intensity; and (b) to check for the possible presence of factors that could modify its development.

  20. A Computer Model for Teaching the Dynamic Behavior of AC Contactors

    ERIC Educational Resources Information Center

    Ruiz, J.-R. R.; Espinosa, A. G.; Romeral, L.

    2010-01-01

    Ac-powered contactors are extensively used in industry in applications such as automatic electrical devices, motor starters, and heaters. In this work, a practical session that allows students to model and simulate the dynamic behavior of ac-powered electromechanical contactors is presented. Simulation is carried out using a rigorous parametric…

  1. Testing Theoretical Models of Magnetic Damping Using an Air Track

    ERIC Educational Resources Information Center

    Vidaurre, Ana; Riera, Jaime; Monsoriu, Juan A.; Gimenez, Marcos H.

    2008-01-01

    Magnetic braking is a long-established application of Lenz's law. A rigorous analysis of the laws governing this problem involves solving Maxwell's equations in a time-dependent situation. Approximate models have been developed to describe different experimental results related to this phenomenon. In this paper we present a new method for the…

  2. The Cognitive Processes Associated with Occupational/Career Indecision: A Model for Gifted Adolescents

    ERIC Educational Resources Information Center

    Jung, Jae Yup

    2013-01-01

    This study developed and tested a new model of the cognitive processes associated with occupational/career indecision for gifted adolescents. A survey instrument with rigorous psychometric properties, developed from a number of existing instruments, was administered to a sample of 687 adolescents attending three academically selective high schools…

  3. Ocean Profile Measurements during the Seasonal Ice Zone Reconnaissance Surveys

    DTIC Science & Technology

    2012-09-30

    physical processes that occur within the BCSIZ that require data from all components of SIZRS, and improve predictive models of the SIZ through model ...the IABP (Ignatius Rigor) are approved by the USCG for operation from the ADA aircraft, but we anticipate being informed of any Safety of Flight Test

  4. Approximation Methods for Inverse Problems Governed by Nonlinear Parabolic Systems

    DTIC Science & Technology

    1999-12-17

    We present a rigorous theoretical framework for approximation of nonlinear parabolic systems with delays in the context of inverse least squares...numerical results demonstrating the convergence are given for a model of dioxin uptake and elimination in a distributed liver model that is a special case of the general theoretical framework .

  5. A Geometric Comparison of the Transformation Loci with Specific and Mobile Capital

    ERIC Educational Resources Information Center

    Colander, David; Gilbert, John; Oladi, Reza

    2008-01-01

    The authors show how the transformation loci in the specific factors model (capital specificity) and the Heckscher-Ohlin-Samuelson model (capital mobility) can be rigorously derived and easily compared by using geometric techniques on the basis of Savosnick geometry. The approach shows directly that the transformation locus with capital…

  6. Parental Maltreatment, Bullying, and Adolescent Depression: Evidence for the Mediating Role of Perceived Social Support

    ERIC Educational Resources Information Center

    Seeds, Pamela M.; Harkness, Kate L.; Quilty, Lena C.

    2010-01-01

    The support deterioration model of depression states that stress deteriorates the perceived availability and/or effectiveness of social support, which then leads to depression. The present study examined this model in adolescent depression following parent-perpetrated maltreatment and peer-perpetrated bullying, as assessed by a rigorous contextual…

  7. You've Shown the Program Model Is Effective. Now What?

    ERIC Educational Resources Information Center

    Ellickson, Phyllis L.

    2014-01-01

    Rigorous tests of theory-based programs require faithful implementation. Otherwise, lack of results might be attributable to faulty program delivery, faulty theory, or both. However, once the evidence indicates the model works and merits broader dissemination, implementation issues do not fade away. How can developers enhance the likelihood that…

  8. Benchmarking health system performance across regions in Uganda: a systematic analysis of levels and trends in key maternal and child health interventions, 1990-2011.

    PubMed

    Roberts, D Allen; Ng, Marie; Ikilezi, Gloria; Gasasira, Anne; Dwyer-Lindgren, Laura; Fullman, Nancy; Nalugwa, Talemwa; Kamya, Moses; Gakidou, Emmanuela

    2015-12-03

    Globally, countries are increasingly prioritizing the reduction of health inequalities and provision of universal health coverage. While national benchmarking has become more common, such work at subnational levels is rare. The timely and rigorous measurement of local levels and trends in key health interventions and outcomes is vital to identifying areas of progress and detecting early signs of stalled or declining health system performance. Previous studies have yet to provide a comprehensive assessment of Uganda's maternal and child health (MCH) landscape at the subnational level. By triangulating a number of different data sources - population censuses, household surveys, and administrative data - we generated regional estimates of 27 key MCH outcomes, interventions, and socioeconomic indicators from 1990 to 2011. After calculating source-specific estimates of intervention coverage, we used a two-step statistical model involving a mixed-effects linear model as an input to Gaussian process regression to produce regional-level trends. We also generated national-level estimates and constructed an indicator of overall intervention coverage based on the average of 11 high-priority interventions. National estimates often veiled large differences in coverage levels and trends across Uganda's regions. Under-5 mortality declined dramatically, from 163 deaths per 1,000 live births in 1990 to 85 deaths per 1,000 live births in 2011, but a large gap between Kampala and the rest of the country persisted. Uganda rapidly scaled up a subset of interventions across regions, including household ownership of insecticide-treated nets, receipt of artemisinin-based combination therapies among children under 5, and pentavalent immunization. 
Conversely, most regions saw minimal increases, if not actual declines, in the coverage of indicators that required multiple contacts with the health system, such as four or more antenatal care visits, three doses of oral polio vaccine, and two doses of intermittent preventive therapy during pregnancy. Some of the regions with the lowest levels of overall intervention coverage in 1990, such as North and West Nile, saw marked progress by 2011; nonetheless, sizeable disparities remained between Kampala and the rest of the country. Countrywide, overall coverage increased from 40% in 1990 to 64% in 2011, but coverage in 2011 ranged from 57% to 70% across regions. The MCH landscape in Uganda has, for the most part, improved between 1990 and 2011. Subnational benchmarking quantified the persistence of geographic health inequalities and identified regions in need of additional health systems strengthening. The tracking and analysis of subnational health trends should be conducted regularly to better guide policy decisions and strengthen responsiveness to local health needs.
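    The two-step estimation described above (a mixed-effects linear model feeding Gaussian process regression) can be sketched in simplified form. This is a rough illustration, not the study's model: the first stage is reduced to a fixed linear trend, and the RBF kernel, noise level, and coverage numbers below are assumptions made for the sketch.

```python
import math

# Hedged sketch of a two-step smoother: a linear trend (stand-in for the
# mixed-effects first stage) plus Gaussian process (GP) regression on its
# residuals. All numbers are illustrative, not the study's estimates.

def rbf(a, b, length=5.0):
    """Squared-exponential kernel over years."""
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

def solve(A, y):
    """Gaussian elimination with partial pivoting for small linear systems."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_new, noise=0.01):
    """GP posterior mean at x_new: k* . K^{-1} y, with a noise nugget."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(a * rbf(x, x_new) for a, x in zip(alpha, xs))

# Step 1: linear trend assumed fitted elsewhere (slope/intercept invented).
trend = lambda year: 40.0 + 1.1 * (year - 1990)
years = [1990, 1995, 2000, 2006, 2011]
coverage = [40.0, 47.0, 50.0, 59.0, 64.0]       # illustrative coverage (%)
resid = [c - trend(y) for c, y in zip(coverage, years)]

# Step 2: GP regression smooths the residuals for an interpolated estimate.
est_2003 = trend(2003) + gp_predict(years, resid, 2003)
print(round(est_2003, 1))
```

    The nugget term plays the role of sampling noise in the survey-derived inputs; in the actual analysis the first-stage model also carries region-level random effects, which this sketch omits.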

  9. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT.

    PubMed

    Meltzer, S J; Auer, J

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions-nearly equimolecular to "physiological" solutions of sodium chloride-are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle.

  10. RIGOR MORTIS AND THE INFLUENCE OF CALCIUM AND MAGNESIUM SALTS UPON ITS DEVELOPMENT

    PubMed Central

    Meltzer, S. J.; Auer, John

    1908-01-01

    Calcium salts hasten and magnesium salts retard the development of rigor mortis, that is, when these salts are administered subcutaneously or intravenously. When injected intra-arterially, concentrated solutions of both kinds of salts cause nearly an immediate onset of a strong stiffness of the muscles which is apparently a contraction, brought on by a stimulation caused by these salts and due to osmosis. This contraction, if strong, passes over without a relaxation into a real rigor. This form of rigor may be classed as work-rigor (Arbeitsstarre). In animals, at least in frogs, with intact cords, the early contraction and the following rigor are stronger than in animals with destroyed cord. If M/8 solutions—nearly equimolecular to "physiological" solutions of sodium chloride—are used, even when injected intra-arterially, calcium salts hasten and magnesium salts retard the onset of rigor. The hastening and retardation in this case as well as in the cases of subcutaneous and intravenous injections, are ion effects and essentially due to the cations, calcium and magnesium. In the rigor hastened by calcium the effects of the extensor muscles mostly prevail; in the rigor following magnesium injection, on the other hand, either the flexor muscles prevail or the muscles become stiff in the original position of the animal at death. There seems to be no difference in the degree of stiffness in the final rigor, only the onset and development of the rigor is hastened in the case of the one salt and retarded in the other. Calcium hastens also the development of heat rigor. No positive facts were obtained with regard to the effect of magnesium upon heat rigor. Calcium also hastens and magnesium retards the onset of rigor in the left ventricle of the heart. No definite data were gathered with regard to the effects of these salts upon the right ventricle. PMID:19867124

  11. The Functional Transitions Model: Maximizing Ability in the Context of Progressive Disability Associated with Alzheimer's Disease

    ERIC Educational Resources Information Center

    Slaughter, Susan; Bankes, Jane

    2007-01-01

    The Functional Transitions Model (FTM) integrates the theoretical notions of progressive functional decline associated with Alzheimer's disease (AD), excess disability, and transitions occurring intermittently along the trajectory of functional decline. Application of the Functional Transitions Model to clinical practice encompasses the paradox of…

  12. Learning optimal quantum models is NP-hard

    NASA Astrophysics Data System (ADS)

    Stark, Cyril J.

    2018-02-01

    Physical modeling translates measured data into a physical model. Physical modeling is a major objective in physics and is generally regarded as a creative process. How good are computers at solving this task? Here, we show that in the absence of physical heuristics, the inference of optimal quantum models cannot be computed efficiently (unless P=NP ). This result illuminates rigorous limits to the extent to which computers can be used to further our understanding of nature.

  13. Reflective properties of randomly rough surfaces under large incidence angles.

    PubMed

    Qiu, J; Zhang, W J; Liu, L H; Hsu, P-f; Liu, L J

    2014-06-01

    The reflective properties of randomly rough surfaces at large incidence angles have attracted attention because of their potential applications in several areas of radiative heat transfer research. The main purpose of this work is to investigate the formation mechanism of the specular reflection peak of rough surfaces at large incidence angles. The bidirectional reflectance distribution function (BRDF) of rough aluminum surfaces with different roughnesses at different incidence angles is measured by a three-axis automated scatterometer. A validated and accurate computational model, the rigorous coupled-wave analysis (RCWA) method, is used to compare against and analyze the measured BRDF results. The RCWA results show the same trend in the specular peak as the measurements. This paper focuses on relative roughness in the range 0.16 < σ/λ < 5.35. As the relative roughness decreases, the specular peak enhancement increases dramatically and the scattering region reduces significantly, especially at large incidence angles. Comparison of the RCWA and Rayleigh criterion results shows that the relative error of the total integrated scatter increases with surface roughness at large incidence angles. In addition, the zero-order diffracted power calculated by RCWA is compared with the reflectance calculated from the Fresnel equations; the relative error declines sharply when the incidence angle is large and the roughness is small.
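    The link between roughness, incidence angle, and the specular peak discussed above is often summarized by the classical total-integrated-scatter relation associated with the Rayleigh smooth-surface criterion. The sketch below uses that textbook approximation, not the paper's RCWA computation; the σ/λ value and the angles are arbitrary illustrations.

```python
import math

# Hedged sketch: the classical total-integrated-scatter (TIS) relation.
# For rms roughness sigma, wavelength lam, and incidence angle theta, the
# specular fraction of reflected power is approximately
#   exp(-(4*pi*(sigma/lam)*cos(theta))**2),
# and TIS is the complementary scattered fraction.

def specular_fraction(sigma_over_lambda, theta_deg):
    g = 4.0 * math.pi * sigma_over_lambda * math.cos(math.radians(theta_deg))
    return math.exp(-g * g)

def tis(sigma_over_lambda, theta_deg):
    return 1.0 - specular_fraction(sigma_over_lambda, theta_deg)

# Even a fairly rough surface (sigma/lambda = 0.5, an illustrative value)
# turns strongly specular near grazing incidence, consistent with the
# specular-peak behavior at large incidence angles described above.
for th in (0, 45, 80, 89):
    print(th, specular_fraction(0.5, th))
```

    The effective roughness scales with cos θ, so at grazing incidence the surface looks optically smooth even when σ/λ is not small at normal incidence.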

  14. Acoustic Scattering by Near-Surface Inhomogeneities in Porous Media

    DTIC Science & Technology

    1990-02-21

    surfaces [8]. Recently, this empirical model has been replaced by a more rigorous microstructural model [9]. Here, the acoustical characteristics of...boundaries. A discussion of how ground acoustic characteristics are modelled then follows, with the chapter being concluded by a brief summary. 3.1...of ground acoustic characteristics, with particular emphasis on the four-parameter model of Attenborough, that will be used extensively later.

  15. The relationship of rain-induced cross-polarization discrimination to attenuation for 10 to 30 GHz earth-space radio links

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Runyon, D. L.

    1984-01-01

    Rain depolarization is quantified through the cross-polarization discrimination (XPD) versus attenuation relationship. Such a relationship is derived by curve fitting to a rigorous theoretical model (the multiple scattering model) to determine the variation of the parameters involved. This simple isolation model (SIM) is compared to data from several earth-space link experiments and to three other models.

  16. Forecasting volatility with neural regression: a contribution to model adequacy.

    PubMed

    Refenes, A N; Holt, W T

    2001-01-01

    Neural nets' usefulness for forecasting is limited by problems of overfitting and the lack of rigorous procedures for model identification, selection and adequacy testing. This paper describes a methodology for neural model misspecification testing. We introduce a generalization of the Durbin-Watson statistic for neural regression and discuss the general issues of misspecification testing using residual analysis. We derive a generalized influence matrix for neural estimators which enables us to evaluate the distribution of the statistic. We deploy Monte Carlo simulation to compare the power of the test for neural and linear regressors. While residual testing is not a sufficient condition for model adequacy, it is nevertheless a necessary condition to demonstrate that the model is a good approximation to the data generating process, particularly as neural-network estimation procedures are susceptible to partial convergence. The work is also an important step toward developing rigorous procedures for neural model identification, selection and adequacy testing which have started to appear in the literature. We demonstrate its applicability in the nontrivial problem of forecasting implied volatility innovations using high-frequency stock index options. Each step of the model building process is validated using statistical tests to verify variable significance and model adequacy with the results confirming the presence of nonlinear relationships in implied volatility innovations.
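    As a point of reference for the residual analysis described above, the ordinary Durbin-Watson statistic can be computed directly from a residual sequence. This sketch shows only the classical statistic; the paper's generalization to neural regression additionally requires the influence matrix of the network estimator, which is not reproduced here.

```python
# Hedged sketch: the ordinary Durbin-Watson statistic on regression residuals.
# Values near 2 suggest no first-order autocorrelation; values toward 0
# suggest positive autocorrelation, and values toward 4 negative.

def durbin_watson(residuals):
    """DW = sum_t (e_t - e_{t-1})^2 / sum_t e_t^2."""
    num = sum((b - a) ** 2 for a, b in zip(residuals, residuals[1:]))
    den = sum(e * e for e in residuals)
    return num / den

# Alternating residuals (negative autocorrelation) push DW well above 2;
# a slowly drifting sequence (positive autocorrelation) pushes it toward 0.
print(durbin_watson([1, -1, 1, -1, 1, -1]))
print(durbin_watson([1, 1.1, 1.2, 1.3, 1.4]))
```

    Misspecification testing then amounts to checking whether DW computed on the fitted model's residuals falls outside the acceptance region implied by the statistic's distribution, which for a neural estimator depends on the influence matrix derived in the paper.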

  17. Whither the Pulmonary Ward Attending? Preserving Subspecialty Exposure in United States Internal Medicine Residency Training.

    PubMed

    Santhosh, Lekshmi; Babik, Jennifer; Looney, Mark R; Hollander, Harry

    2017-04-01

    Twenty years ago, the term "hospitalist" was coined at the University of California-San Francisco (San Francisco, CA), heralding a new specialty focused on the care of inpatients. There are now more than 50,000 hospitalists practicing in the United States. At many academic medical centers, hospitalists are largely replacing subspecialists as attendings on the inpatient medicine wards. At University of California-San Francisco, this has been accompanied by declining percentages of residency graduates who enter subspecialty training in internal medicine. The decline in subspecialty medicine interest can be attributed to many factors, including differences in compensation, decreased subspecialist exposure, and a changing research funding landscape. Although there has not been systematic documentation of this trend in pulmonary and critical care medicine, we have noted previously pulmonary and critical care-bound trainees switching to hospital medicine instead. With our broad, multiorgan system perspective, pulmonary and critical care faculty should embrace teaching general medicine. Residency programs have instituted creative solutions to encourage more internal medicine residents to pursue careers in subspecialty medicine. Some solutions include creating rotations that promote more contact with subspecialists and physician-scientists, creating clinician-educator tracks within fellowship programs, and appointing subspecialists to internal medicine residency leadership positions. We need more rigorous research to track the trends and implications of the generalist-specialist balance of inpatient ward teams on resident career choices, and learn what interventions affect those choices.

  18. The influence of natural short photoperiodic and temperature conditions on plasma thyroid hormones and cholesterol in male Syrian hamsters

    NASA Astrophysics Data System (ADS)

    Vaughan, M. K.; Brainard, G. C.; Reiter, R. J.

    1984-09-01

    Adult male Syrian hamsters were subjected to 1, 3, 5, 7 or 11 weeks of either natural winter conditions or rigorously controlled laboratory conditions (LD 10∶14; 22 ± 2 °C). Although both groups of hamsters gained weight over the course of the experiment, hamsters housed indoors were significantly heavier after 5 weeks of treatment compared to their outdoor counterparts. Animals housed under natural conditions exhibited a significant decrease in circulating levels of thyroxine (T4) and a rapid rise in triiodothyronine (T3) levels; the free T4 and free T3 indices (FT4I and FT3I) mirrored the changes in circulating levels of the respective hormones. Laboratory-housed animals had a slight rise in T4 and FT4I at 3 weeks followed by a slow steady decline in these values; T3 and FT3I values did not change remarkably in these animals. Plasma cholesterol declined steadily over the course of the experiment in laboratory-maintained animals but increased slightly during the first 5 weeks in animals under natural conditions. Since the photoperiodic conditions were of approximately the same duration in these two groups, it is concluded that the major differences in body weight, thyroid hormone values and plasma cholesterol are due to some component (possibly temperature) of the natural environment.

  19. Spatial and temporal variations in silver contamination and toxicity in San Francisco Bay

    USGS Publications Warehouse

    Flegal, A.R.; Brown, C.L.; Squire, S.; Ross, J.R.M.; Scelfo, G.M.; Hibdon, S.

    2007-01-01

    Although San Francisco Bay has a "Golden Gate", it may be argued that it is the "Silver Estuary". For at one time the Bay was reported to have the highest levels of silver in its sediments and biota, along with the only accurately measured values of silver in solution, of any estuarine system. Since then others have argued that silver contamination is higher elsewhere (e.g., New York Bight, Florida Bay, Galveston Bay) in a peculiar form of pollution machismo, while silver contamination has measurably declined in sediments, biota, and surface waters of the Bay over the past two to three decades. Documentation of those systemic temporal declines has been possible because of long-term, ongoing monitoring programs, using rigorous trace metal clean sampling and analytical techniques, of the United States Geological Survey and San Francisco Bay Regional Monitoring Program that are summarized in this report. However, recent toxicity studies with macro-invertebrates in the Bay have indicated that silver may still be adversely affecting the health of the estuarine system, and other studies have indicated that silver concentrations in the Bay may be increasing due to new industrial inputs and/or the diagenetic remobilization of silver from historically contaminated sediments being re-exposed to overlying surface waters and benthos. Consequently, the Bay may not be ready to relinquish its title as the "Silver Estuary". ?? 2007 Elsevier Inc. All rights reserved.

  20. Long persistence of rigor mortis at constant low temperature.

    PubMed

    Varetto, Lorenzo; Curto, Ombretta

    2005-01-06

    We studied the persistence of rigor mortis by using physical manipulation. We tested the mobility of the knee on 146 corpses kept under refrigeration at Torino's city mortuary at a constant temperature of +4 °C. We found a persistence of complete rigor lasting for 10 days in all the cadavers we kept under observation; in one case, rigor lasted for 16 days. Between the 11th and the 17th days, a progressively increasing number of corpses showed a change from complete into partial rigor (characterized by partial bending of the articulation). After the 17th day, all the remaining corpses showed partial rigor, and in the two cadavers that were kept under observation "à outrance" we found that the absolute resolution of rigor mortis occurred on the 28th day. Our results prove that it is possible to find a persistence of rigor mortis much longer than expected when environmental conditions resemble average outdoor winter temperatures in temperate zones. This must therefore be considered when a corpse is found in such conditions, so that in estimating the time of death we are not misled by the long persistence of rigor mortis.

  1. Mathematical models and photogrammetric exploitation of image sensing

    NASA Astrophysics Data System (ADS)

    Puatanachokchai, Chokchai

    Mathematical models of image sensing are generally categorized into physical/geometrical sensor models and replacement sensor models. While the former is determined from image sensing geometry, the latter is based on knowledge of the physical/geometric sensor models and on using such models for its implementation. The main thrust of this research is in replacement sensor models which have three important characteristics: (1) Highly accurate ground-to-image functions; (2) Rigorous error propagation that is essentially of the same accuracy as the physical model; and, (3) Adjustability, or the ability to upgrade the replacement sensor model parameters when additional control information becomes available after the replacement sensor model has replaced the physical model. In this research, such replacement sensor models are considered as True Replacement Models or TRMs. TRMs provide a significant advantage of universality, particularly for image exploitation functions. There have been several writings about replacement sensor models, and except for the so called RSM (Replacement Sensor Model as a product described in the Manual of Photogrammetry), almost all of them pay very little or no attention to errors and their propagation. This is because, it is suspected, the few physical sensor parameters are usually replaced by many more parameters, thus presenting a potential error estimation difficulty. The third characteristic, adjustability, is perhaps the most demanding. It provides an equivalent flexibility to that of triangulation using the physical model. Primary contributions of this thesis include not only "the eigen-approach", a novel means of replacing the original sensor parameter covariance matrices at the time of estimating the TRM, but also the implementation of the hybrid approach that combines the eigen-approach with the added parameters approach used in the RSM. 
Using either the eigen-approach or the hybrid approach, rigorous error propagation can be performed during image exploitation. Further, adjustability can be performed when additional control information becomes available after the TRM has been implemented. The TRM is shown to apply to imagery from sensors having different geometries, including an aerial frame camera, a spaceborne linear array sensor, an airborne pushbroom sensor, and an airborne whiskbroom sensor. TRM results show essentially negligible differences as compared to those from rigorous physical sensor models, both for geopositioning from single and overlapping images. Simulated as well as real image data are used to address all three characteristics of the TRM.
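
    The first TRM characteristic, an accurate ground-to-image function, can be illustrated in miniature: below, a hypothetical polynomial "physical" sensor projection (all coefficients invented) is replaced by a generic low-order polynomial model fitted by least squares. This sketches only the fitting step, not the eigen-approach to covariance replacement described in the abstract.

```python
import numpy as np

# Illustrative sketch: fit a polynomial "replacement" ground-to-image model
# to samples from a hypothetical physical projection (coefficients invented).
rng = np.random.default_rng(0)

def physical_model(X, Y, Z):
    # stand-in physical sensor model: a mildly nonlinear projection
    line = 100 + 0.5 * X + 0.1 * Y + 0.01 * X * Z
    samp = 200 + 0.2 * X + 0.6 * Y + 0.02 * Y * Z
    return line, samp

# sample ground points spanning the imaging volume
X, Y, Z = [rng.uniform(-1, 1, 500) for _ in range(3)]
line, samp = physical_model(X, Y, Z)

# design matrix of low-order polynomial terms (the replacement parameters)
A = np.column_stack([np.ones_like(X), X, Y, Z, X*Y, X*Z, Y*Z, X**2, Y**2, Z**2])
coef_line, *_ = np.linalg.lstsq(A, line, rcond=None)
coef_samp, *_ = np.linalg.lstsq(A, samp, rcond=None)

resid = np.hypot(A @ coef_line - line, A @ coef_samp - samp)
print(f"max ground-to-image residual: {resid.max():.2e} pixels")
```

Because the stand-in physical model's terms lie inside the replacement basis, the fit here is exact to machine precision; real sensor geometries leave a small but quantifiable residual.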

  2. Cross-bridge kinetics, cooperativity, and negatively strained cross-bridges in vertebrate smooth muscle. A laser-flash photolysis study

    PubMed Central

    1988-01-01

    The effects of laser-flash photolytic release of ATP from caged ATP [P3-1-(2-nitrophenyl)ethyladenosine-5'-triphosphate] on stiffness and tension transients were studied in permeabilized guinea pig portal vein smooth muscle. During rigor, induced by removing ATP from the relaxed or contracting muscles, stiffness was greater than in relaxed muscle, and electron microscopy showed cross-bridges attached to actin filaments at an approximately 45 degree angle. In the absence of Ca2+, liberation of ATP (0.1-1 mM) into muscles in rigor caused relaxation, with kinetics indicating cooperative reattachment of some cross-bridges. Inorganic phosphate (Pi; 20 mM) accelerated relaxation. A rapid phase of force development, accompanied by a decline in stiffness and unaffected by 20 mM Pi, was observed upon liberation of ATP in muscles that were released by 0.5-1.0% just before the laser pulse. This force increment observed upon detachment suggests that the cross-bridges can bear a negative tension. The second-order rate constant for detachment of rigor cross-bridges by ATP, in the absence of Ca2+, was estimated to be 0.1-2.5 × 10^5 M^-1 s^-1, which indicates that this reaction is too fast to limit the rate of ATP hydrolysis during physiological contractions. In the presence of Ca2+, force development occurred at a rate (0.4 s-1) similar to that of intact, electrically stimulated tissue. The rate of force development was an order of magnitude faster in muscles that had been thiophosphorylated with ATP gamma S before the photochemical liberation of ATP, which indicates that under physiological conditions, in non-thiophosphorylated muscles, light-chain phosphorylation, rather than intrinsic properties of the actomyosin cross-bridges, limits the rate of force development. The release of micromolar ATP or CTP from caged ATP or caged CTP caused force development of up to 40% of maximal active tension in the absence of Ca2+, consistent with cooperative attachment of cross-bridges. 
Cooperative reattachment of dephosphorylated cross-bridges may contribute to force maintenance at low energy cost and low cross-bridge cycling rates in smooth muscle. PMID:3373178

  3. Model-Based Assessment of Estuary Ecosystem Health Using the Latent Health Factor Index, with Application to the Richibucto Estuary

    PubMed Central

    Chiu, Grace S.; Wu, Margaret A.; Lu, Lin

    2013-01-01

    The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt–clay content (all regarded a priori as qualitatively important abiotic drivers) towards site health in the Richibucto ecosystem. 
This suggests the potential effectiveness of the LHFI approach for assessing not only freshwater systems but aquatic ecosystems in general. PMID:23785443

  4. Development and validation of a prediction model for functional decline in older medical inpatients.

    PubMed

    Takada, Toshihiko; Fukuma, Shingo; Yamamoto, Yosuke; Tsugihashi, Yukio; Nagano, Hiroyuki; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuhara, Shunichi

    2018-05-17

    To prevent functional decline in older inpatients, identification of high-risk patients is crucial. The aim of this study was to develop and validate a prediction model to assess the risk of functional decline in older medical inpatients. In this retrospective cohort study, patients ≥65 years admitted acutely to medical wards were included. The healthcare database of 246 acute care hospitals (n = 229,913) was used for derivation, and two acute care hospitals (n = 1767 and 5443, respectively) were used for validation. Data were collected using a national administrative claims and discharge database. Functional decline was defined as a decline of the Katz score at discharge compared with on admission. About 6% of patients in the derivation cohort and 9% and 2% in each validation cohort developed functional decline. A model with 7 items, age, body mass index, living in a nursing home, ambulance use, need for assistance in walking, dementia, and bedsore, was developed. On internal validation, it demonstrated a c-statistic of 0.77 (95% confidence interval (CI) = 0.767-0.771) and good fit on the calibration plot. On external validation, the c-statistics were 0.79 (95% CI = 0.77-0.81) and 0.75 (95% CI = 0.73-0.77) for each cohort, respectively. Calibration plots showed good fit in one cohort and overestimation in the other one. A prediction model for functional decline in older medical inpatients was derived and validated. It is expected that use of the model would lead to early identification of high-risk patients and introducing early intervention. Copyright © 2018 Elsevier B.V. All rights reserved.
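
    The c-statistic used to validate the model is the probability that a randomly chosen decliner receives a higher predicted risk than a randomly chosen non-decliner. A minimal sketch on invented outcomes and risk scores:

```python
import numpy as np

# Toy data: 1 = functional decline; risk scores are invented, not the
# published 7-item model.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)
risk = 0.6 * y + rng.normal(0, 0.5, 200)          # decliners score higher on average

def c_statistic(y, score):
    pos, neg = score[y == 1], score[y == 0]
    diff = pos[:, None] - neg[None, :]            # all decliner/non-decliner pairs
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

print(f"c-statistic: {c_statistic(y, risk):.2f}")
```

A value of 0.5 means the scores discriminate no better than chance; the published model's 0.75 to 0.79 indicates good discrimination.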

  5. Change in motor function and adverse health outcomes in older African-Americans.

    PubMed

    Buchman, Aron S; Wilson, Robert S; Leurgans, Sue E; Bennett, David A; Barnes, Lisa L

    2015-10-01

    We tested whether declining motor function accelerates with age in older African-Americans. Eleven motor performances were assessed annually in 513 older African-Americans. During follow-up of 5 years, linear mixed-effect models showed that motor function declined by about 0.03 units/year (Estimate, -0.026, p<0.001); about 4% more rapidly for each additional year of age at baseline. A proportional hazard model showed that both baseline motor function level and its rate of change were independent predictors of death and incident disability (all p's<0.001). These models showed that the additional annual amount of motor decline in 85 year old persons at baseline versus 65 year old persons was associated with a 1.5-fold higher rate of death and a 3-fold higher rate of developing Katz disability. The rate of declining motor function accelerates with increasing age and its rate of decline predicts adverse health outcomes in older African-Americans. Copyright © 2015 Elsevier Inc. All rights reserved.
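
    The longitudinal design behind these estimates can be sketched with simulated data. A simple two-stage fit (per-subject OLS slopes, then their mean) stands in for the paper's linear mixed-effects model; all values except the reported -0.026 units/year are invented.

```python
import numpy as np

# Simulate 513 subjects with annual visits: subject-specific intercepts,
# a common decline of -0.026 units/year, plus measurement noise.
rng = np.random.default_rng(2)
n_subj, n_visits = 513, 6
years = np.arange(n_visits)

true_slope = -0.026                               # units/year, as in the abstract
intercepts = rng.normal(1.0, 0.2, n_subj)
scores = (intercepts[:, None] + true_slope * years
          + rng.normal(0, 0.05, (n_subj, n_visits)))

# stage 1: OLS slope per subject; stage 2: average the slopes
slopes = np.polyfit(years, scores.T, 1)[0]
print(f"mean annual change: {slopes.mean():.3f} units/year")
```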

  6. Change in Motor Function and Adverse Health Outcomes in Older African Americans

    PubMed Central

    Buchman, Aron S.; Wilson, Robert S.; Leurgans, Sue E.; Bennett, David A.; Barnes, Lisa L.

    2015-01-01

    Objective We tested whether declining motor function accelerates with age in older African Americans. Methods Eleven motor performances were assessed annually in 513 older African Americans. Results During follow-up of 5 years, linear mixed-effect models showed that motor function declined by about 0.03 units/yr (Estimate, −0.026, p<0.001); about 4% more rapidly for each additional year of age at baseline. A proportional hazard model showed that both baseline motor function level and its rate of change were independent predictors of death and incident disability (all p’s <0.001). These models showed that the additional annual amount of motor decline in 85 year old persons at baseline versus 65 year old persons was associated with a 1.5-fold higher rate of death and a 3-fold higher rate of developing Katz disability. Conclusions The rate of declining motor function accelerates with increasing age and its rate of decline predicts adverse health outcomes in older African Americans. PMID:26209439

  7. Facilities Stewardship: Measuring the Return on Physical Assets.

    ERIC Educational Resources Information Center

    Kadamus, David A.

    2001-01-01

    Asserts that colleges and universities should apply the same analytical rigor to physical assets as they do financial assets. Presents a management tool, the Return on Physical Assets model, to help guide physical asset allocation decisions. (EV)

  8. Career Decision Making and Its Evaluation.

    ERIC Educational Resources Information Center

    Miller-Tiedeman, Anna

    1979-01-01

    The author discusses a career decision-making program which she designed and implemented using a pyramidal model of exploration, crystallization, choice, and classification. Her article outlines the value of rigorous evaluation techniques applied by the local practitioner. (MF)

  9. Computer vision-based evaluation of pre- and postrigor changes in size and shape of Atlantic cod (Gadus morhua) and Atlantic salmon (Salmo salar) fillets during rigor mortis and ice storage: effects of perimortem handling stress.

    PubMed

    Misimi, E; Erikson, U; Digre, H; Skavhaug, A; Mathiassen, J R

    2008-03-01

    The present study describes the possibilities for using computer vision-based methods for the detection and monitoring of transient 2D and 3D changes in the geometry of a given product. The rigor contractions of unstressed and stressed fillets of Atlantic salmon (Salmo salar) and Atlantic cod (Gadus morhua) were used as a model system. Gradual changes in fillet shape and size (area, length, width, and roundness) were recorded for 7 and 3 d, respectively. Also, changes in fillet area and height (cross-section profiles) were tracked using a laser beam and a 3D digital camera. Another goal was to compare rigor developments of the 2 species of farmed fish, and whether perimortem stress affected the appearance of the fillets. Some significant changes in fillet size and shape were found (length, width, area, roundness, height) between unstressed and stressed fish during the course of rigor mortis as well as after ice storage (postrigor). However, the observed irreversible stress-related changes were small and would hardly mean anything for postrigor fish processors or consumers. The cod were less stressed (as defined by muscle biochemistry) than the salmon after the 2 species had been subjected to similar stress bouts. Consequently, the difference between the rigor courses of unstressed and stressed fish was more extreme in the case of salmon. However, the maximal whole fish rigor strength was judged to be about the same for both species. Moreover, the reductions in fillet area and length, as well as the increases in width, were basically of similar magnitude for both species. In fact, the increases in fillet roundness and cross-section height were larger for the cod. We conclude that the computer vision method can be used effectively for automated monitoring of changes in 2D and 3D shape and size of fish fillets during rigor mortis and ice storage. 
In addition, it can be used for grading of fillets according to uniformity in size and shape, as well as measurement of fillet yield measured in thickness. The methods are accurate, rapid, nondestructive, and contact-free and can therefore be regarded as suitable for industrial purposes.
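
    The 2D descriptors tracked in the study (area, length, width, roundness) are straightforward to compute once a fillet outline has been segmented. A sketch on a synthetic elliptical contour standing in for a segmented camera image:

```python
import numpy as np

# Synthetic fillet outline: an ellipse with semi-axes 12 and 5 (arbitrary units).
theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
x, y = 12 * np.cos(theta), 5 * np.sin(theta)

# shoelace formula for area, summed segment lengths for perimeter
area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
perim = np.hypot(np.diff(np.r_[x, x[0]]), np.diff(np.r_[y, y[0]])).sum()

length = x.max() - x.min()                        # fillet length
width = y.max() - y.min()                         # fillet width
roundness = 4 * np.pi * area / perim**2           # 1.0 for a perfect circle

print(f"area={area:.1f}  length={length:.1f}  width={width:.1f}  roundness={roundness:.2f}")
```

Tracking these numbers frame by frame during rigor gives exactly the transient shape changes the study monitors.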

  10. Rigorous force field optimization principles based on statistical distance minimization

    DOE PAGES

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-12

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. Here we exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
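
    The core idea, choosing a model parameter to minimize the statistical distance (here the Bhattacharyya angle) to a target equilibrium distribution, can be sketched on a toy one-dimensional Boltzmann system. The harmonic "force field" and its target parameter are invented for illustration.

```python
import numpy as np

# Discretized 1D configuration space with Boltzmann weights.
x = np.linspace(-3, 3, 201)

def boltzmann(k):
    w = np.exp(-0.5 * k * x**2)                   # harmonic "force field"
    return w / w.sum()

target = boltzmann(2.0)                           # invented target system

def stat_distance(p, q):
    # Bhattacharyya angle between distributions (a statistical distance)
    return np.arccos(np.clip(np.sqrt(p * q).sum(), 0.0, 1.0))

# grid search over the model's force-field parameter
ks = np.linspace(0.5, 4.0, 351)
dists = [stat_distance(boltzmann(k), target) for k in ks]
k_best = ks[int(np.argmin(dists))]
print(f"optimal model parameter k = {k_best:.2f}")
```

Minimizing the distance recovers the target parameter, which is the convergence property the paper establishes in general.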

  11. Rigor Made Easy: Getting Started

    ERIC Educational Resources Information Center

    Blackburn, Barbara R.

    2012-01-01

    Bestselling author and noted rigor expert Barbara Blackburn shares the secrets to getting started, maintaining momentum, and reaching your goals. Learn what rigor looks like in the classroom, understand what it means for your students, and get the keys to successful implementation. Learn how to use rigor to raise expectations, provide appropriate…

  12. Close Early Learning Gaps with Rigorous DAP

    ERIC Educational Resources Information Center

    Brown, Christopher P.; Mowry, Brian

    2015-01-01

    Rigorous DAP (developmentally appropriate practices) is a set of 11 principles of instruction intended to help close early childhood learning gaps. Academically rigorous learning environments create the conditions for children to learn at high levels. While academic rigor focuses on one dimension of education--academic--DAP considers the whole…

  13. Neuropsychological tests for predicting cognitive decline in older adults

    PubMed Central

    Baerresen, Kimberly M; Miller, Karen J; Hanson, Eric R; Miller, Justin S; Dye, Richelin V; Hartman, Richard E; Vermeersch, David; Small, Gary W

    2015-01-01

    Summary Aim To determine neuropsychological tests likely to predict cognitive decline. Methods A sample of nonconverters (n = 106) was compared with those who declined in cognitive status (n = 24). Significant univariate logistic regression prediction models were used to create multivariate logistic regression models to predict decline based on initial neuropsychological testing. Results Rey–Osterrieth Complex Figure Test (RCFT) Retention predicted conversion to mild cognitive impairment (MCI) while baseline Buschke Delay predicted conversion to Alzheimer’s disease (AD). Due to group sample size differences, additional analyses were conducted using a subsample of demographically matched nonconverters. Analyses indicated RCFT Retention predicted conversion to MCI and AD, and Buschke Delay predicted conversion to AD. Conclusion Results suggest RCFT Retention and Buschke Delay may be useful in predicting cognitive decline. PMID:26107318

  14. Tests for senescent decline in annual survival probabilities of common pochards, Aythya ferina

    USGS Publications Warehouse

    Nichols, J.D.; Hines, J.E.; Blums, P.

    1997-01-01

    Senescent decline in survival probabilities of animals is a topic about which much has been written but little is known. Here, we present formal tests of senescence hypotheses, using 1373 recaptures from 8877 duckling (age 0) and 504 yearling Common Pochards (Aythya ferina) banded at a Latvian study site, 1975-1992. The tests are based on capture-recapture models that explicitly incorporate sampling probabilities that, themselves, may exhibit time- and age-specific variation. The tests provided no evidence of senescent decline in survival probabilities for this species. Power of the most useful test was low for gradual declines in annual survival probability with age, but good for steeper declines. We recommend use of this type of capture-recapture modeling and analysis for other investigations of senescence in animal survival rates.

  15. Contribution of H. pylori and smoking trends to US incidence of intestinal-type noncardia gastric adenocarcinoma: a microsimulation model.

    PubMed

    Yeh, Jennifer M; Hur, Chin; Schrag, Deb; Kuntz, Karen M; Ezzati, Majid; Stout, Natasha; Ward, Zachary; Goldie, Sue J

    2013-01-01

    Although gastric cancer has declined dramatically in the US, the disease remains the second leading cause of cancer mortality worldwide. A better understanding of reasons for the decline can provide important insights into effective preventive strategies. We sought to estimate the contribution of risk factor trends on past and future intestinal-type noncardia gastric adenocarcinoma (NCGA) incidence. We developed a population-based microsimulation model of intestinal-type NCGA and calibrated it to US epidemiologic data on precancerous lesions and cancer. The model explicitly incorporated the impact of Helicobacter pylori and smoking on disease natural history, for which birth cohort-specific trends were derived from the National Health and Nutrition Examination Survey (NHANES) and National Health Interview Survey (NHIS). Between 1978 and 2008, the model estimated that intestinal-type NCGA incidence declined 60% from 11.0 to 4.4 per 100,000 men, <3% discrepancy from national statistics. H. pylori and smoking trends combined accounted for 47% (range = 30%-58%) of the observed decline. With no tobacco control, incidence would have declined only 56%, suggesting that lower smoking initiation and higher cessation rates observed after the 1960s accelerated the relative decline in cancer incidence by 7% (range = 0%-21%). With continued risk factor trends, incidence is projected to decline an additional 47% between 2008 and 2040, the majority of which will be attributable to H. pylori and smoking (81%; range = 61%-100%). Limitations include assuming all other risk factors influenced gastric carcinogenesis as one factor and restricting the analysis to men. Trends in modifiable risk factors explain a significant proportion of the decline of intestinal-type NCGA incidence in the US, and are projected to continue. 
Although past tobacco control efforts have hastened the decline, full benefits will take decades to be realized, and further discouragement of smoking and reduction of H. pylori should be priorities for gastric cancer control efforts.
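
    A microsimulation of this kind can be sketched in miniature: each simulated individual carries risk factors that scale an invented progression hazard through a normal, precancerous, cancer sequence, and cumulative incidence is compared across risk-factor scenarios. All rates below are illustrative, not the calibrated values of the published model.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(n, years, p_hpylori, p_smoker):
    hp = rng.random(n) < p_hpylori                # H. pylori carriers
    sm = rng.random(n) < p_smoker                 # smokers
    rr = 1.0 + 2.0 * hp + 1.0 * sm                # invented relative risks
    base = 0.0005                                 # invented annual progression hazard
    state = np.zeros(n, dtype=int)                # 0 normal, 1 lesion, 2 cancer
    for _ in range(years):
        prog = rng.random(n) < base * rr
        state = np.minimum(state + prog, 2)       # advance one stage per event
    return (state == 2).mean()

inc_past = simulate(100_000, 30, p_hpylori=0.6, p_smoker=0.4)
inc_now = simulate(100_000, 30, p_hpylori=0.2, p_smoker=0.2)
print(f"cumulative cancer risk: past={inc_past:.4f}, now={inc_now:.4f}")
```

Declining risk-factor prevalence lowers simulated incidence, which is the mechanism the calibrated model quantifies against national statistics.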

  16. Design, development, and application of LANDIS-II, a spatial landscape simulation model with flexible temporal and spatial resolution

    Treesearch

    Robert M. Scheller; James B. Domingo; Brian R. Sturtevant; Jeremy S. Williams; Arnold Rudy; Eric J. Gustafson; David J. Mladenoff

    2007-01-01

    We introduce LANDIS-II, a landscape model designed to simulate forest succession and disturbances. LANDIS-II builds upon and preserves the functionality of previous LANDIS forest landscape simulation models. LANDIS-II is distinguished by the inclusion of variable time steps for different ecological processes and by our use of a rigorous development and testing process...

  17. Rigorous Science: a How-To Guide.

    PubMed

    Casadevall, Arturo; Fang, Ferric C

    2016-11-08

    Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. Copyright © 2016 Casadevall and Fang.

  18. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence

    PubMed Central

    Kelly, David; Majda, Andrew J.; Tong, Xin T.

    2015-01-01

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335

  19. Testing adaptive toolbox models: a Bayesian hierarchical approach.

    PubMed

    Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
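
    The flavor of such a test can be sketched with a two-strategy comparison on simulated binary choices: a guessing strategy (p fixed at 0.5) versus a cue-following strategy whose accuracy is integrated over a uniform prior. This is a minimal Bayes-factor sketch, not the hierarchical machinery of the paper.

```python
import numpy as np

# Invented data: 60 trials, 48 cue-consistent choices.
n, k = 60, 48

# marginal likelihood under guessing: p fixed at 0.5
log_m_guess = n * np.log(0.5)

# marginal likelihood under cue use: integrate Binomial likelihood over
# p ~ Uniform(0, 1) on a fine grid (binomial coefficient cancels in the ratio)
p = np.linspace(1e-4, 1 - 1e-4, 2001)
like = p**k * (1 - p)**(n - k)
log_m_cue = np.log(like.sum() * (p[1] - p[0]))

log_bf = log_m_cue - log_m_guess                  # log Bayes factor
print(f"log Bayes factor (cue use vs guessing): {log_bf:.1f}")
```

A strongly positive log Bayes factor favors the cue-following strategy; the hierarchical version of this computation is what lets toolbox models be tested at the group level.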

  20. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    PubMed

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
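
    The filter under study can be sketched for a scalar linear model. This shows the basic perturbed-observation EnKF forecast/analysis cycle, not the specific forecast model the authors construct to provoke catastrophic divergence; all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n_ens, steps = 50, 40
a, q, r = 0.9, 0.1, 0.5                           # dynamics, model noise, obs noise

truth = 0.0
ens = rng.normal(0, 1, n_ens)                     # initial ensemble
for _ in range(steps):
    truth = a * truth + rng.normal(0, np.sqrt(q))
    obs = truth + rng.normal(0, np.sqrt(r))
    ens = a * ens + rng.normal(0, np.sqrt(q), n_ens)   # forecast step
    P = ens.var(ddof=1)                           # ensemble forecast variance
    K = P / (P + r)                               # Kalman gain
    # perturbed observations keep the analysis spread statistically consistent
    ens = ens + K * (obs + rng.normal(0, np.sqrt(r), n_ens) - ens)

print(f"analysis mean error: {abs(ens.mean() - truth):.2f}")
```

For this stable linear system the ensemble tracks the truth; the paper's contribution is a nonlinear forecast model for which the same update provably explodes to machine infinity.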

  1. A rigorous multiple independent binding site model for determining cell-based equilibrium dissociation constants.

    PubMed

    Drake, Andrew W; Klakamp, Scott L

    2007-01-10

    A new 4-parameter nonlinear equation based on the standard multiple independent binding site model (MIBS) is presented for fitting cell-based ligand titration data in order to calculate the ligand/cell receptor equilibrium dissociation constant and the number of receptors/cell. The most commonly used linear (Scatchard Plot) or nonlinear 2-parameter model (a single binding site model found in commercial programs like Prism(R)) used for analysis of ligand/receptor binding data assumes only the K(D) influences the shape of the titration curve. We demonstrate using simulated data sets that, depending upon the cell surface receptor expression level, the number of cells titrated, and the magnitude of the K(D) being measured, this assumption of always being under K(D)-controlled conditions can be erroneous and can lead to unreliable estimates for the binding parameters. We also compare and contrast the fitting of simulated data sets to the commonly used cell-based binding equation versus our more rigorous 4-parameter nonlinear MIBS model. It is shown through these simulations that the new 4-parameter MIBS model, when used for cell-based titrations under optimal conditions, yields highly accurate estimates of all binding parameters and hence should be the preferred model to fit cell-based experimental nonlinear titration data.
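
    The distinction the authors draw can be sketched numerically: with ligand depletion, bound ligand is the root of a quadratic in total ligand, total receptor, and K(D), and it diverges from the 2-parameter hyperbola whenever receptor concentration rivals K(D). The 1:1 depletion equation below is a standard simplification, not the authors' full 4-parameter MIBS model, and all concentrations are invented.

```python
import numpy as np

def bound(L, R, Kd):
    # exact 1:1 binding with ligand depletion: root of a quadratic
    s = L + R + Kd
    return (s - np.sqrt(s**2 - 4 * L * R)) / 2

L = np.logspace(-2, 2, 50)                        # total ligand, nM
R_true, Kd_true = 5.0, 1.0                        # receptor and K_D, nM (invented)
data = bound(L, R_true, Kd_true)

# the naive hyperbola assumes free ligand ~ total ligand
naive = R_true * L / (L + Kd_true)
print(f"max depletion error vs hyperbola: {np.max(np.abs(data - naive)):.2f} nM")
```

With receptor at five times K(D), the hyperbolic fit is off by more than 1 nM of bound ligand near half-saturation, which is exactly the regime where the simpler model yields unreliable binding parameters.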

  2. Experimental evaluation of rigor mortis. VII. Effect of ante- and post-mortem electrocution on the evolution of rigor mortis.

    PubMed

    Krompecher, T; Bergerioux, C

    1988-01-01

    The influence of electrocution on the evolution of rigor mortis was studied in rats. Our experiments showed that: (1) Electrocution hastens the onset of rigor mortis. After an electrocution of 90 s, complete rigor develops as early as 1 h post-mortem (p.m.), compared with 5 h p.m. in the controls. (2) Electrocution hastens the passing of rigor mortis. After an electrocution of 90 s, the first significant decrease occurs at 3 h p.m. (8 h p.m. in the controls). (3) These modifications in rigor mortis evolution are less pronounced in the limbs not directly touched by the electric current. (4) In cases of post-mortem electrocution, the changes are slightly less pronounced, the resistance is higher, and the absorbed energy is lower compared with the ante-mortem electrocution cases. The results are complemented by two practical observations on human electrocution cases.

  3. Rigorous Schools and Classrooms: Leading the Way

    ERIC Educational Resources Information Center

    Williamson, Ronald; Blackburn, Barbara R.

    2010-01-01

    Turn your school into a student-centered learning environment, where rigor is at the heart of instruction in every classroom. From the bestselling author of "Rigor is Not a Four-Letter Word," Barbara Blackburn, and award-winning educator Ronald Williamson, this comprehensive guide to establishing a schoolwide culture of rigor is for principals and…

  4. Rigor Revisited: Scaffolding College Student Learning by Incorporating Their Lived Experiences

    ERIC Educational Resources Information Center

    Castillo-Montoya, Milagros

    2018-01-01

    This chapter explores how students' lived experiences contribute to the rigor of their thinking. Insights from research indicate faculty can enhance rigor by accounting for the many ways it may surface in the classroom. However, to see this type of rigor, we must revisit the way we conceptualize it for higher education.

  5. Predicting early cognitive decline in newly-diagnosed Parkinson's patients: A practical model.

    PubMed

    Hogue, Olivia; Fernandez, Hubert H; Floden, Darlene P

    2018-06-19

    To create a multivariable model to predict early cognitive decline among de novo patients with Parkinson's disease, using brief, inexpensive assessments that are easily incorporated into clinical flow. Data for 351 drug-naïve patients diagnosed with idiopathic Parkinson's disease were obtained from the Parkinson's Progression Markers Initiative. Baseline demographic, disease history, motor, and non-motor features were considered as candidate predictors. Best subsets selection was used to determine the multivariable baseline symptom profile that most accurately predicted individual cognitive decline within three years. Eleven per cent of the sample experienced cognitive decline. The final logistic regression model predicting decline included five baseline variables: verbal memory retention, right-sided bradykinesia, years of education, subjective report of cognitive impairment, and REM behavior disorder. Model discrimination was good (optimism-adjusted concordance index = .749). The associated nomogram provides a tool to determine individual patient risk of meaningful cognitive change in the early stages of the disease. Through the consideration of easily-implemented or routinely-gathered assessments, we have identified a multidimensional baseline profile and created a convenient, inexpensive tool to predict cognitive decline in the earliest stages of Parkinson's disease. The use of this tool would generate prediction at the individual level, allowing clinicians to tailor medical management for each patient and identify at-risk patients for clinical trials aimed at disease modifying therapies. Copyright © 2018. Published by Elsevier Ltd.
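
    Best subsets selection of the kind used here can be sketched by enumerating predictor subsets and scoring each by BIC. Ordinary least squares stands in for the paper's logistic regression, and the predictor names are placeholders echoing the abstract, not the actual study variables.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(6)
n = 300
names = ["memory", "bradykinesia", "education", "subj_cog", "rbd", "noise"]
X = rng.normal(size=(n, len(names)))
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0, 1, n)   # only two real effects

def bic(cols):
    A = np.column_stack([np.ones(n), X[:, list(cols)]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = ((y - A @ beta) ** 2).sum()
    return n * np.log(rss / n) + (len(cols) + 1) * np.log(n)

# enumerate all subsets of up to 3 predictors, keep the lowest BIC
best = min((c for k in range(1, 4) for c in combinations(range(6), k)), key=bic)
print("selected predictors:", [names[i] for i in best])
```

The penalty term keeps spurious predictors out, which is the property that makes the resulting profile compact enough for a clinical nomogram.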

  6. Effect of rigor temperature, ageing and display time on the meat quality and lipid oxidative stability of hot boned beef Semimembranosus muscle.

    PubMed

    Mungure, Tanyaradzwa E; Bekhit, Alaa El-Din A; Birch, E John; Stewart, Ian

    2016-04-01

    The effects of rigor temperature (5, 15, 20 and 25°C), ageing (3, 7, 14, and 21 days) and display time on meat quality and lipid oxidative stability of hot boned beef M. Semimembranosus (SM) muscle were investigated. Ultimate pH (pH(u)) was rapidly attained at higher rigor temperatures. Electrical conductivity increased with rigor temperature (p<0.001). Tenderness, purge and cooking losses were not affected by rigor temperature; however, purge loss and tenderness increased with ageing (p<0.01). Lightness (L*) and redness (a*) of the SM increased as rigor temperature increased (p<0.01). Lipid oxidation was assessed using ¹H NMR, where changes in aliphatic to olefinic (R(ao)) and diallylmethylene (R(ad)) proton ratios can be rapidly monitored. R(ad), R(ao), PUFA and TBARS were not affected by rigor temperature; however, ageing and display time increased lipid oxidation (p<0.05). This study shows that rigor temperature manipulation of hot boned beef SM muscle does not have adverse effects on lipid oxidation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. A Center of Excellence in the Mathematical Sciences - at Cornell University

    DTIC Science & Technology

    1992-03-01

of my recent efforts go in two directions. 1. Cellular Automata. The Greenberg-Hastings model is a simple system that models the behavior of an... Greenberg-Hastings Model. We also obtained results concerning the critical value for a threshold voter model. This resulted in the papers "Some Rigorous...Results for the Greenberg-Hastings Model" and "Fixation Results for Threshold Voter Systems." Together with Scot Adams, I wrote "An Application of the

  8. DSN system performance test Doppler noise models; noncoherent configuration

    NASA Technical Reports Server (NTRS)

    Bunce, R.

    1977-01-01

The newer model for variance, the Allan technique, now adopted for testing, is analyzed in the subject mode. A model is generated (including a considerable contribution from the station secondary frequency standard) and rationalized with existing data. The variance model is definitely sound; the Allan technique mates theory and measurement. The mean-frequency model is an estimate; this problem is yet to be rigorously resolved. The unaltered defining expressions are nonconvergent, and the observed mean is quite erratic.
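The Allan technique mentioned here is the standard two-sample variance of successive fractional-frequency averages. A minimal sketch of the defining estimator (not the DSN test software), assuming equally spaced averaging intervals:

```python
def allan_variance(y):
    """Two-sample (Allan) variance of a series of fractional-frequency
    averages y_k: 0.5 * mean((y_{k+1} - y_k)^2)."""
    diffs = [(y[k + 1] - y[k]) ** 2 for k in range(len(y) - 1)]
    return 0.5 * sum(diffs) / len(diffs)

# A steady linear frequency drift gives identical successive differences,
# so the Allan variance is half the squared step.
drifting = [0.0, 1.0, 2.0, 3.0, 4.0]
print(allan_variance(drifting))  # 0.5
```

Unlike the ordinary sample variance, this estimator converges for the drifting and flicker-type noise processes common in frequency standards, which is the property the abstract's "nonconvergent defining expressions" remark alludes to.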

  9. Analysis of pelagic species decline in the upper San Francisco Estuary using multivariate autoregressive modeling (MAR)

    USGS Publications Warehouse

    Mac Nally, Ralph; Thomson, James R.; Kimmerer, Wim J.; Feyrer, Frederick; Newman, Ken B.; Sih, Andy; Bennett, William A.; Brown, Larry; Fleishman, Erica; Culberson, Steven D.; Castillo, Gonzalo

    2010-01-01

    Four species of pelagic fish of particular management concern in the upper San Francisco Estuary, California, USA, have declined precipitously since ca. 2002: delta smelt (Hypomesus transpacificus), longfin smelt (Spirinchus thaleichthys), striped bass (Morone saxatilis), and threadfin shad (Dorosoma petenense). The estuary has been monitored since the late 1960s with extensive collection of data on the fishes, their pelagic prey, phytoplankton biomass, invasive species, and physical factors. We used multivariate autoregressive (MAR) modeling to discern the main factors responsible for the declines. An expert-elicited model was built to describe the system. Fifty-four relationships were built into the model, only one of which was of uncertain direction a priori. Twenty-eight of the proposed relationships were strongly supported by or consistent with the data, while 26 were close to zero (not supported by the data but not contrary to expectations). The position of the 2‰ isohaline (a measure of the physical response of the estuary to freshwater flow) and increased water clarity over the period of analyses were two factors affecting multiple declining taxa (including fishes and the fishes' main zooplankton prey). Our results were relatively robust with respect to the form of stock–recruitment model used and to inclusion of subsidiary covariates but may be enhanced by using detailed state–space models that describe more fully the life-history dynamics of the declining species.
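A MAR(1) model of the kind used in this analysis writes each taxon's (log-)abundance at time t as a linear combination of all taxa at time t-1 plus covariate effects and process noise, x_t = B x_{t-1} + C u_t + e_t. A toy two-taxon sketch with invented coefficients (not the values estimated for the estuary):

```python
import random

random.seed(1)

# Hypothetical 2-taxon interaction matrix B and covariate effect C --
# illustrative values only.
B = [[0.8, -0.2],   # taxon 1: self-regulation, predation by taxon 2
     [0.3,  0.7]]   # taxon 2: feeds on taxon 1, self-regulation
C = [-0.1, -0.05]   # effect of one covariate (e.g. water clarity)

def mar_step(x, u, sigma=0.05):
    """One MAR(1) step: x_t = B x_{t-1} + C u_t + e_t, e_t ~ N(0, sigma)."""
    return [
        sum(B[i][j] * x[j] for j in range(2)) + C[i] * u
        + random.gauss(0.0, sigma)
        for i in range(2)
    ]

x = [1.0, 1.0]        # initial log-abundances
for _ in range(50):
    x = mar_step(x, u=1.0)
print([round(v, 2) for v in x])
```

Fitting the B and C entries to monitoring data, and inspecting their signs and magnitudes, is how an analysis like this judges which of the proposed relationships are supported.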

  10. Analysis of pelagic species decline in the upper San Francisco Estuary using multivariate autoregressive modeling (MAR).

    PubMed

    Mac Nally, Ralph; Thomson, James R; Kimmerer, Wim J; Feyrer, Frederick; Newman, Ken B; Sih, Andy; Bennett, William A; Brown, Larry; Fleishman, Erica; Culberson, Steven D; Castillo, Gonzalo

    2010-07-01

Four species of pelagic fish of particular management concern in the upper San Francisco Estuary, California, USA, have declined precipitously since ca. 2002: delta smelt (Hypomesus transpacificus), longfin smelt (Spirinchus thaleichthys), striped bass (Morone saxatilis), and threadfin shad (Dorosoma petenense). The estuary has been monitored since the late 1960s with extensive collection of data on the fishes, their pelagic prey, phytoplankton biomass, invasive species, and physical factors. We used multivariate autoregressive (MAR) modeling to discern the main factors responsible for the declines. An expert-elicited model was built to describe the system. Fifty-four relationships were built into the model, only one of which was of uncertain direction a priori. Twenty-eight of the proposed relationships were strongly supported by or consistent with the data, while 26 were close to zero (not supported by the data but not contrary to expectations). The position of the 2‰ isohaline (a measure of the physical response of the estuary to freshwater flow) and increased water clarity over the period of analyses were two factors affecting multiple declining taxa (including fishes and the fishes' main zooplankton prey). Our results were relatively robust with respect to the form of stock-recruitment model used and to inclusion of subsidiary covariates but may be enhanced by using detailed state-space models that describe more fully the life-history dynamics of the declining species.

  11. Providing Home Visiting to High-Risk Pregnant and Postpartum Families: The Development and Evaluation of the MOMobile® Program

    ERIC Educational Resources Information Center

    Hadley, Barbara; Rudolph, Kara E.; Mogul, Marjie; Perry, Deborah F.

    2014-01-01

    Maternal, Infant, and Early Childhood Home Visiting legislation permits states to fund "promising practices"--with the understanding that these models will have a rigorous evaluation component. This article describes an innovative, low cost paraprofessional home visiting model developed in Pennsylvania by the Maternity Care Coalition. In…

  12. Large eddy simulation of forest canopy flow for wildland fire modeling

    Treesearch

    Eric Mueller; William Mell; Albert Simeoni

    2014-01-01

    Large eddy simulation (LES) based computational fluid dynamics (CFD) simulators have obtained increasing attention in the wildland fire research community, as these tools allow the inclusion of important driving physics. However, due to the complexity of the models, individual aspects must be isolated and tested rigorously to ensure meaningful results. As wind is a...

  13. From the Schoolhouse to the Statehouse: Building a Statewide Model for Technology Education

    ERIC Educational Resources Information Center

    Rhine, Luke

    2013-01-01

This article details the journey of Luke Rhine, a Program Specialist in Career and Technology Education at the Maryland State Department of Education, as he went about the difficult task of building consistency and establishing rigorous expectations for Technology education in Maryland. As a result, Maryland has developed a model for Technology…

  14. Texas M-E flexible pavement design system: literature review and proposed framework.

    DOT National Transportation Integrated Search

    2012-04-01

Recent developments over the last several decades have offered an opportunity for more rational and rigorous pavement design procedures. Substantial work has already been completed in Texas, nationally, and internationally, in all aspects of modeling, ma...

  15. An asymptotic model in acoustics: acoustic drift equations.

    PubMed

    Vladimirov, Vladimir A; Ilin, Konstantin

    2013-11-01

    A rigorous asymptotic procedure with the Mach number as a small parameter is used to derive the equations of mean flows which coexist and are affected by the background acoustic waves in the limit of very high Reynolds number.

  16. THE US ENVIRONMENTAL PROTECTION AGENCY'S MONITORING AND ASSESSMENT PROGRAM

    EPA Science Inventory

    A scientifically rigorous determination of the condition of an aquatic resource is fundamental to all subsequent research, modeling, protection, and restoration issues. Environmental risk characterization is predicated on knowledge of condition and the rate at which that conditio...

  17. High and low rigor temperature effects on sheep meat tenderness and ageing.

    PubMed

    Devine, Carrick E; Payne, Steven R; Peachey, Bridget M; Lowe, Timothy E; Ingram, John R; Cook, Christian J

    2002-02-01

Immediately after electrical stimulation, the paired m. longissimus thoracis et lumborum (LT) of 40 sheep were boned out and wrapped tightly with a polyethylene cling film. One of the paired LTs was chilled in 15°C air to reach a rigor mortis (rigor) temperature of 18°C and the other side was placed in a water bath at 35°C and achieved rigor at this temperature. Wrapping reduced rigor shortening and mimicked meat left on the carcass. After rigor, the meat was aged at 15°C for 0, 8, 26 and 72 h and then frozen. The frozen meat was cooked to 75°C in an 85°C water bath and shear force values were obtained from a 1×1 cm cross-section. The shear force values of meat for 18 and 35°C rigor were similar at zero ageing, but as ageing progressed, the 18°C rigor meat aged faster and became more tender than meat that went into rigor at 35°C (P<0.001). The mean sarcomere length values of meat samples for 18 and 35°C rigor at each ageing time were significantly different (P<0.001), the samples at 35°C being shorter. When the short sarcomere length values and corresponding shear force values were removed for further data analysis, the shear force values for the 35°C rigor were still significantly greater. Thus the toughness of 35°C meat was not a consequence of muscle shortening and appears to be due to both a faster rate of tenderisation and the meat tenderising to a greater extent at the lower temperature. The cook loss at 35°C rigor (30.5%) was greater than that at 18°C rigor (28.4%) (P<0.01) and the colour Hunter L values were higher at 35°C (P<0.01) compared with 18°C, but there were no significant differences in a or b values.

  18. Animal species endangerment: The role of environmental pollution

    USGS Publications Warehouse

    Pattee, Oliver H.; Fellows, Valerie L.; Bounds, Dixie L.; Hoffman, David J.; Rattner, Barnett A.; Burton, G. Allen; Cairns, John

    2003-01-01

Multiple factors contribute to the decline of species. Habitat destruction is the primary factor that threatens species, affecting 73% of endangered species. The second major factor causing species decline is the introduction of nonnative species, affecting 68% of endangered species. Pollution and overharvesting were identified as impacting, respectively, 38 and 15% of endangered species. Other factors affecting species decline include hybridization, competition, disease, and other interspecific interactions. Once a species is reduced to a remnant of its former population size and distribution, its vulnerability to catastrophic pollution events increases, frequently exceeding or replacing the factors responsible for the initial decline. Small, isolated populations are particularly vulnerable to catastrophic loss by an acute event, such as a chemical spill or pesticide application. However, when it comes to surviving a single disaster, widespread subpopulations of a species are far more resilient and ensure genetic survival. Hypothesizing theoretical concerns of potential factors that could affect an endangered species could predispose the scientific and political communities to jeopardizing threats. The user of recovery plans as a data source must be aware of the bias within the data set. These data should be used with the caveat that the source of information in recovery plans is not always based on scientific research and rigorous data collection. Over 58% of the information identifying species threats is based on estimates or personal communication, while only 42% is based on peer-reviewed literature, academic research, or government reports. Many recovery plans were written when a species was initially listed in the 1970s or 1980s. Politics, human disturbance, and habitat demand issues evolve over a 20- to 30-year period, leaving much of the information on threats facing endangered species outdated and inadequate. These data are most valuable when used to facilitate reviews of Section 7 consultations and environmental impact statements, review permit applications, conduct environmental risk assessments, prioritize research needs, and identify limiting factors affecting species health. These data are also useful in identifying potential threats to species' health. Without properly identifying threats to endangered species based on sound scientific research, there is little hope to successfully recover an endangered species.

  19. Augmented assessment as a means to augmented reality.

    PubMed

    Bergeron, Bryan

    2006-01-01

Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object-oriented programming.

  20. On analyticity of linear waves scattered by a layered medium

    NASA Astrophysics Data System (ADS)

    Nicholls, David P.

    2017-10-01

The scattering of linear waves by periodic structures is a crucial phenomenon in many branches of applied physics and engineering. In this paper we establish rigorous analytic results necessary for the proper numerical analysis of a class of High-Order Perturbation of Surfaces methods for simulating such waves. More specifically, we prove a theorem on the existence and uniqueness of solutions to a system of partial differential equations which model the interaction of linear waves with a multiply layered periodic structure in three dimensions. This result provides hypotheses under which a rigorous numerical analysis could be conducted for recent generalizations to the methods of Operator Expansions, Field Expansions, and Transformed Field Expansions.

  1. Dynamic patterns of overexploitation in fisheries.

    PubMed

    Perissi, Ilaria; Bardi, Ugo; El Asmar, Toufic; Lavacchi, Alessandro

    2017-09-10

Understanding overfishing and regulating fishing quotas is a major global challenge for the 21st century, both in terms of providing food for humankind and of preserving the oceans' ecosystems. However, fishing is a complex economic activity, affected not just by overfishing but also by such factors as pollution, technology, financial factors and more. For this reason, it is often difficult to state with complete certainty that overfishing is the cause of the decline of a fishery. In this study, we developed a simple dynamic model specifically designed to isolate and study the role of depletion on production. The model is based on the well-known Lotka-Volterra model, or prey-predator mechanism, assuming that the fish stock and the fishing industry are coupled variables that dynamically affect each other. In the model, the fishing industry acts as the "predator" and the fish stock as the "prey". If the model can fit historical data, in particular relative to the productive decline of specific fisheries, then we have a strong indication that the decline of the fish stock is driving the decline of the fishery production. The model doesn't pretend to be a general description of the fishing industry in all its varied forms; however, the data reported here show that the model can describe several historical cases of fisheries whose production decreased and collapsed, indicating that the overexploitation of the fish stocks is an important factor in the decline of fisheries.
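The prey-predator coupling described here can be illustrated with a toy discrete-time simulation; all parameter values below are invented for illustration, not the paper's fitted ones:

```python
# Toy prey-predator fishery: the fish stock is the "prey" and fishing
# capital the "predator". Parameter values are invented.
def simulate(steps=500, dt=0.1):
    stock, capital = 1.0, 0.1
    history = []
    for _ in range(steps):
        regrowth = 0.05 * stock * (1.0 - stock)   # slow natural regrowth
        catch = 0.8 * stock * capital             # harvest ~ stock x effort
        d_stock = regrowth - catch
        d_capital = 0.4 * catch - 0.1 * capital   # reinvest profit, depreciate
        stock += dt * d_stock
        capital += dt * d_capital
        history.append((stock, capital))
    return history

history = simulate()
peak_capital = max(c for _, c in history)
final_stock, final_capital = history[-1]
print(round(peak_capital, 3), round(final_capital, 3), round(final_stock, 3))
```

With slow regrowth, fishing capital booms while the stock is drawn down and then declines, giving the bell-shaped production history that the authors compare against historical fisheries.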

  2. Declining fertility and economic well-being: do education and health ride to the rescue?

    PubMed Central

    Prettner, Klaus; Bloom, David E.; Strulik, Holger

    2015-01-01

    It is widely argued that declining fertility slows the pace of economic growth in industrialized countries through its negative effect on labor supply. There are, however, theoretical arguments suggesting that the effect of falling fertility on effective labor supply can be offset by associated behavioral changes. We formalize these arguments by setting forth a dynamic consumer optimization model that incorporates endogenous fertility as well as endogenous education and health investments. The model shows that a fertility decline induces higher education and health investments that are able to compensate for declining fertility under certain circumstances. We assess the theoretical implications by investigating panel data for 118 countries over the period 1980 to 2005 and show that behavioral changes partly mitigate the negative impact of declining fertility on effective labor supply. PMID:26388677

  3. Openness as a buffer against cognitive decline: The Openness-Fluid-Crystallized-Intelligence (OFCI) model applied to late adulthood.

    PubMed

    Ziegler, Matthias; Cengia, Anja; Mussel, Patrick; Gerstorf, Denis

    2015-09-01

Explaining cognitive decline in late adulthood is a major research area. Models using personality traits as possible influential variables are rare. This study tested assumptions based on an adapted version of the Openness-Fluid-Crystallized-Intelligence (OFCI) model. The OFCI model adapted to late adulthood predicts that openness is related to the decline in fluid reasoning (Gf) through environmental enrichment. Gf should be related to the development of comprehension knowledge (Gc; investment theory). It was also assumed that Gf predicts changes in openness as suggested by the environmental success hypothesis. Finally, the OFCI model proposes that openness has an indirect influence on the decline in Gc through its effect on Gf (mediation hypothesis). Using data from the Berlin Aging Study (N = 516, 70-103 years at T1), these predictions were tested using latent change score and latent growth curve models with indicators of each trait. The current findings and prior research support environmental enrichment and success, investment theory, and partially the mediation hypotheses. Based on a summary of all findings, the OFCI model for late adulthood is suggested. (c) 2015 APA, all rights reserved.

  4. Exploring Student Perceptions of Rigor Online: Toward a Definition of Rigorous Learning

    ERIC Educational Resources Information Center

    Duncan, Heather E.; Range, Bret; Hvidston, David

    2013-01-01

    Technological advances in the last decade have impacted delivery methods of university courses. More and more courses are offered in a variety of formats. While academic rigor is a term often used, its definition is less clear. This mixed-methods study explored graduate student conceptions of rigor in the online learning environment embedded…

  5. Coincident patterns of waste water suspended solids reduction, water transparency increase and chlorophyll decline in Narragansett Bay.

    PubMed

    Borkman, David G; Smayda, Theodore J

    2016-06-15

Dramatic changes occurred in Narragansett Bay during the 1980s: water clarity increased, while phytoplankton abundance and chlorophyll concentration decreased. We examine how changes in total suspended solids (TSS) loading from wastewater treatment plants may have influenced this decline in phytoplankton chlorophyll. TSS loading, light and phytoplankton observations were compiled and a light- and temperature-dependent Skeletonema-based phytoplankton growth model was applied to evaluate chlorophyll supported by TSS nitrogen during 1983-1995. TSS loading declined 75% from ~0.60×10(6) kg month(-1) to ~0.15×10(6) kg month(-1) during 1983-1995. Model results indicate that nitrogen reduction related to TSS reduction was minor and explained a small fraction (~15%) of the long-term chlorophyll decline. The decline in Narragansett Bay TSS loading appears to have increased water clarity and in situ irradiance and contributed to the long-term chlorophyll decline by inducing a physiological response of a ~20% reduction in chlorophyll per cell. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Methodological rigor and citation frequency in patient compliance literature.

    PubMed Central

    Bruer, J T

    1982-01-01

    An exhaustive bibliography which assesses the methodological rigor of the patient compliance literature, and citation data from the Science Citation Index (SCI) are combined to determine if methodologically rigorous papers are used with greater frequency than substandard articles by compliance investigators. There are low, but statistically significant, correlations between methodological rigor and citation indicators for 138 patient compliance papers published in SCI source journals during 1975 and 1976. The correlation is not strong enough to warrant use of citation measures as indicators of rigor on a paper-by-paper basis. The data do suggest that citation measures might be developed as crude indicators of methodological rigor. There is no evidence that randomized trials are cited more frequently than studies that employ other experimental designs. PMID:7114334

  7. Endogenous technological and demographic change under increasing water scarcity

    NASA Astrophysics Data System (ADS)

    Pande, Saket; Ertsen, Maurits; Sivapalan, Murugesu

    2014-05-01

The ancient civilization in the Indus Valley dispersed under extreme dry conditions; there are indications that the same holds for many other ancient societies. Even contemporary societies, such as the one in the Murrumbidgee river basin in Australia, have started to witness a decline in overall population under increasing water scarcity. Hydroclimatic change may not be the sole predictor of the fate of contemporary societies in water-scarce regions, and many critics of such (perceived) hydroclimatic determinism have suggested that technological change may ameliorate the effects of increasing water scarcity and as such counter the effects of hydroclimatic changes. To study the role of technological change in the dynamics of coupled human-water systems, we develop a simple overlapping-generations model of endogenous technological and demographic change. We model technological change as an endogenous process that depends on factors such as the investments that are (endogenously) made in a society, the (endogenous) diversification of a society into skilled and unskilled workers, a society's patience in terms of its present vs. future consumption, production technology and the (endogenous) interaction of all of these factors. In the model the population growth rate is programmed to decline once consumption per capita crosses a "survival" threshold. This means we do not treat technology as an exogenous random sequence of events, but instead assume that it results (endogenously) from societal actions. The model demonstrates that technological change may indeed ameliorate the effects of increasing water scarcity, but typically it does so only to a certain extent. It is possible that technological change may allow a society to escape the effect of increasing water scarcity, leading to a (super-)exponential rise in technology and population. However, such cases require the rate of success of investment in technological advancement to be high.
In other more realistic cases of technological success, we find that endogenous technology change only helps to delay the peak of population size before it inevitably starts to decline. While the model is a rather simple model of societal development, it is shown to be capable of replicating patterns of technological and population changes. It is capable of replicating the pattern of declining consumption per capita in the presence of growth in aggregate production. It is also capable of replicating an exponential population rise, even under increasing water scarcity. The results of the model suggest that societies that declined or are declining in the face of extreme water scarcity may have done so due to a slower rate of success of investment in technological advancement. The model suggests that the population decline occurs after a prolonged decline in consumption per capita, which in turn is due to the joint effect of initially increasing population and increasing water scarcity. This is despite technological advancement and an increase in aggregate production. We suggest that declining consumption per capita despite technological advancement and an increase in aggregate production may serve as a useful predictor of upcoming decline in contemporary societies in water-scarce basins.
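The survival-threshold mechanism described above can be caricatured in a few lines. This is a minimal sketch with invented parameters and a crude threshold rule, not the paper's overlapping-generations model:

```python
# Minimal threshold sketch: population growth turns negative once
# consumption per capita falls below a "survival" level of 1.0.
# All parameter values are invented for illustration.
def simulate(years=200):
    pop, tech, water = 100.0, 1.0, 1000.0
    trajectory = []
    for _ in range(years):
        tech *= 1.004                    # slow technological advance
        water *= 0.995                   # increasing water scarcity
        output = tech * min(pop, water)  # production limited by water
        consumption = output / pop       # consumption per capita
        growth = 0.02 if consumption > 1.0 else -0.02
        pop *= 1.0 + growth
        trajectory.append((pop, consumption))
    return trajectory

traj = simulate()
peak_pop = max(p for p, _ in traj)
print(round(peak_pop, 1), round(traj[-1][0], 1))
```

Even this caricature reproduces the qualitative pattern the abstract describes: population rises while water is abundant, consumption per capita erodes as scarcity bites despite continuing technological advance, and population peaks and then declines.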

  8. Endogenous technological and population change under increasing water scarcity

    NASA Astrophysics Data System (ADS)

    Pande, S.; Ertsen, M.; Sivapalan, M.

    2013-11-01

The ancient civilization in the Indus Valley dispersed under extreme dry conditions; there are indications that the same holds for many other ancient societies. Even contemporary societies, such as the one in the Murrumbidgee river basin in Australia, have started to witness a decline in overall population under increasing water scarcity. Hydroclimatic change may not be the sole predictor of the fate of contemporary societies in water-scarce regions, and many critics of such (perceived) hydroclimatic determinism have suggested that technological change may ameliorate the effects of increasing water scarcity and as such counter the effects of hydroclimatic changes. To study the role of technological change in the dynamics of coupled human-water systems, we develop a simple overlapping-generations model of endogenous technological and demographic change. We model technological change as an endogenous process that depends on factors such as the investments that are (endogenously) made in a society, the (endogenous) diversification of a society into skilled and unskilled workers, a society's patience in terms of its present vs. future consumption, production technology and the (endogenous) interaction of all of these factors. In the model the population growth rate is programmed to decline once consumption per capita crosses a "survival" threshold. This means we do not treat technology as an exogenous random sequence of events, but instead assume that it results (endogenously) from societal actions. The model demonstrates that technological change may indeed ameliorate the effects of increasing water scarcity, but typically it does so only to a certain extent. It is possible that technological change may allow a society to escape the effect of increasing water scarcity, leading to a (super-)exponential rise in technology and population. However, such cases require the rate of success of investment in technological advancement to be high.
In other more realistic cases of technological success, we find that endogenous technology change only helps to delay the peak of population size before it inevitably starts to decline. While the model is a rather simple model of societal development, it is shown to be capable of replicating patterns of technological and population changes. It is capable of replicating the pattern of declining consumption per capita in presence of growth in aggregate production. It is also capable of replicating an exponential population rise, even under increasing water scarcity. The results of the model suggest that societies that declined or are declining in the face of extreme water scarcity may have done so due to slower rate of success of investment in technological advancement. The model suggests that the population decline occurs after a prolonged decline in consumption per capita, which in turn is due to the joint effect of initially increasing population and increasing water scarcity. This is despite technological advancement and increase in aggregate production. We suggest that declining consumption per capita despite technological advancement and increase in aggregate production may serve as a useful predictor of upcoming decline in contemporary societies in water scarce basins.

  9. Comparative Town Meetings: A Search for Causative Models of Feminine Involvement in Politics with New Operational Definitions of a Well Calloused Dependent Variable.

    ERIC Educational Resources Information Center

    Bryan, Frank M.

    Variations in the level of female political participation were examined in the context of the "standard" model of political participation (higher socioeconomic status, urbanism, living at society's center, increased participation) and the "decline of community" model (decreased group membership, increased mobility, decline of…

  10. Can Policy Alone Stop Decline of Children and Youth Fitness?

    ERIC Educational Resources Information Center

    Zhang, Chunhua; Yang, Yang

    2017-01-01

    Various models and methods have been proposed to address the worldwide decline in children's and youth's physical fitness, and the social-ecological model has shown some promise. Yet, the impact of the policy intervention, one component of that model, has not been evaluated carefully. Using limited data from policy documents, the impact of policy…

  11. Forecasting production in Liquid Rich Shale plays

    NASA Astrophysics Data System (ADS)

    Nikfarman, Hanieh

Production from Liquid Rich Shale (LRS) reservoirs is taking center stage in the exploration and production of unconventional reservoirs. Production from the low and ultra-low permeability LRS plays is possible only through multi-fractured horizontal wells (MFHWs). There is no existing workflow applicable to forecasting multi-phase production from MFHWs in LRS plays. This project presents a practical and rigorous workflow for forecasting multiphase production from MFHWs in LRS reservoirs. There has been much effort in developing workflows and methodology for forecasting in tight/shale plays in recent years. The existing workflows, however, are applicable only to single-phase flow, and are primarily used in shale gas plays. These methodologies do not apply to the multi-phase flow that is inevitable in LRS plays. To account for the complexities of multiphase flow in MFHWs, the only available technique is dynamic modeling in compositional numerical simulators. These are time consuming and not practical when it comes to forecasting production and estimating reserves for a large number of producers. A workflow was developed, and validated by compositional numerical simulation. The workflow honors the physics of flow, and is sufficiently accurate yet practical so that an analyst can readily apply it to forecast production and estimate reserves for a large number of producers in a short period of time. To simplify the complex multiphase flow in MFHWs, the workflow divides production into an initial period, where large production and pressure declines are expected, and a subsequent period, where production decline may converge into a common trend for a number of producers across an area of interest in the field. The initial period assumes that production is dominated by single-phase flow of oil and uses the tri-linear flow model of Erdal Ozkan to estimate the production history. Commercial software readily available can simulate flow and forecast production in this period. In the subsequent period, dimensionless rate and dimensionless time functions are introduced that help identify the transition from the initial period into the subsequent period. The production trends in terms of the dimensionless parameters converge for a range of rock permeability and stimulation intensity. This helps forecast production beyond the transition to the end of life of the well. This workflow is applicable to a single fluid system.

  12. Seismic waves and earthquakes in a global monolithic model

    NASA Astrophysics Data System (ADS)

    Roubíček, Tomáš

    2018-03-01

    The philosophy that a single "monolithic" model can "asymptotically" replace and couple, in a simple and elegant way, several specialized models relevant on various Earth layers is presented and, in special situations, also rigorously justified. In particular, global seismicity and tectonics are coupled to capture, e.g. (here by a simplified model), ruptures of lithospheric faults generating seismic waves which then propagate through the solid-like mantle and inner core as both shear (S) and pressure (P) waves, while S-waves are suppressed in the fluidic outer core and also in the oceans. The "monolithic-type" models have the capacity to describe all the mentioned features globally in a unified way, together with the corresponding interfacial conditions implicitly involved, provided their parameters are scaled appropriately in the different Earth layers. Coupling of seismic waves with seismic sources due to tectonic events is thus an automatic side effect. The global ansatz is here based, rather for illustration, only on a relatively simple Jeffreys viscoelastic damageable material at small strains, whose various scalings (limits) can lead to Boger's viscoelastic fluid or even to a purely elastic (inviscid) fluid. The self-induced gravity field and Coriolis, centrifugal, and tidal forces are included in our global model as well. A rigorous mathematical analysis addressing the existence of solutions, the convergence of the mentioned scalings, and energy conservation is briefly presented.

  13. Monte Carlo simulation of radiation transport in human skin with rigorous treatment of curved tissue boundaries

    NASA Astrophysics Data System (ADS)

    Majaron, Boris; Milanič, Matija; Premru, Jan

    2015-01-01

    In three-dimensional (3-D) modeling of light transport in heterogeneous biological structures using the Monte Carlo (MC) approach, space is commonly discretized into optically homogeneous voxels by a rectangular spatial grid. Any round or oblique boundaries between neighboring tissues thus become serrated, which raises legitimate concerns about the realism of modeling results with regard to reflection and refraction of light on such boundaries. We analyze the related effects by systematic comparison with an augmented 3-D MC code, in which analytically defined tissue boundaries are treated in a rigorous manner. At specific locations within our test geometries, energy deposition predicted by the two models can vary by 10%. Even highly relevant integral quantities, such as linear density of the energy absorbed by modeled blood vessels, differ by up to 30%. Most notably, the values predicted by the customary model vary strongly and quite erratically with the spatial discretization step and upon minor repositioning of the computational grid. Meanwhile, the augmented model shows no such unphysical behavior. Artifacts of the former approach do not converge toward zero with ever finer spatial discretization, confirming that it suffers from inherent deficiencies due to inaccurate treatment of reflection and refraction at round tissue boundaries.
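The rigorous boundary treatment described above amounts to finding the exact intersection of a photon path with the analytically defined surface and applying Fresnel reflection/refraction there, instead of testing voxel faces. A minimal sketch of those two ingredients for a spherical tissue boundary (helper names are illustrative, not the authors' code):

```python
import numpy as np

def fresnel_unpolarized(n1, n2, cos_i):
    """Unpolarized Fresnel reflectance for a ray crossing from refractive
    index n1 into n2 with incidence-angle cosine cos_i."""
    sin_t2 = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)
    if sin_t2 >= 1.0:                     # total internal reflection
        return 1.0
    cos_t = np.sqrt(1.0 - sin_t2)
    rs = ((n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)) ** 2
    rp = ((n1 * cos_t - n2 * cos_i) / (n1 * cos_t + n2 * cos_i)) ** 2
    return 0.5 * (rs + rp)

def sphere_intersection(p, d, center, radius):
    """Smallest positive distance along unit direction d from point p to an
    analytically defined spherical boundary, or None if the path misses it."""
    oc = np.asarray(p, float) - np.asarray(center, float)
    d = np.asarray(d, float)
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - radius ** 2)
    if disc < 0.0:
        return None
    root = np.sqrt(disc)
    for s in (-b - root, -b + root):
        if s > 1e-12:
            return s
    return None
```

In an augmented MC step, a photon whose free path exceeds `sphere_intersection(...)` would be stopped at the exact surface point and reflected with probability `fresnel_unpolarized(...)`; the voxelized approach instead applies this test on serrated voxel faces, which is the source of the artifacts reported above.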

  14. Microbicide safety/efficacy studies in animals: macaques and small animal models.

    PubMed

    Veazey, Ronald S

    2008-09-01

    A number of microbicide candidates have failed to prevent HIV transmission in human clinical trials, and there is uncertainty as to how many additional trials can be supported by the field. Regardless, there are far too many microbicide candidates in development, and a logical and consistent method for screening and selecting candidates for human clinical trials is desperately needed. The unique host and cell specificity of HIV, however, provides challenges for microbicide safety and efficacy screening that can only be addressed by rigorous testing in relevant laboratory animal models. A number of laboratory animal model systems, ranging from rodents to nonhuman primates and from single to multiple dose challenges, have recently been developed to test microbicide candidates. These models have shed light on both the safety and efficacy of candidate microbicides as well as the early mechanisms involved in transmission. This article summarizes the major advantages and disadvantages of the relevant animal models for microbicide safety and efficacy testing. Currently, nonhuman primates are the only relevant and effective laboratory model for screening microbicide candidates. Given the consistent failures of prior strategies, it is now clear that rigorous safety and efficacy testing in nonhuman primates should be a prerequisite for advancing additional microbicide candidates to human clinical trials.

  15. Microbicide Safety/Efficacy studies in animals -macaques and small animal models

    PubMed Central

    Veazey, Ronald S.

    2009-01-01

    Purpose of review A number of microbicide candidates have failed to prevent HIV transmission in human clinical trials, and there is uncertainty as to how many additional trials can be supported by the field. Regardless, there are far too many microbicide candidates in development, and a logical and consistent method for screening and selecting candidates for human clinical trials is desperately needed. However, the unique host and cell specificity of HIV provides challenges for microbicide safety and efficacy screening that can only be addressed by rigorous testing in relevant laboratory animal models. Recent findings A number of laboratory animal model systems, ranging from rodents to nonhuman primates and from single to multiple dose challenges, have recently been developed to test microbicide candidates. These models have shed light on both the safety and efficacy of candidate microbicides as well as the early mechanisms involved in transmission. This article summarizes the major advantages and disadvantages of the relevant animal models for microbicide safety and efficacy testing. Summary Currently, nonhuman primates are the only relevant and effective laboratory model for screening microbicide candidates. Given the consistent failures of prior strategies, it is now clear that rigorous safety and efficacy testing in nonhuman primates should be a prerequisite for advancing additional microbicide candidates to human clinical trials. PMID:19373023

  16. Random Predictor Models for Rigorous Uncertainty Quantification: Part 2

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This and a companion paper propose techniques for constructing parametric mathematical models describing key features of the distribution of an output variable given input-output data. By contrast to standard models, which yield a single output value at each value of the input, Random Predictor Models (RPMs) yield a random variable at each value of the input. Optimization-based strategies for calculating RPMs having a polynomial dependency on the input and a linear dependency on the parameters are proposed. These formulations yield RPMs having various levels of fidelity in which the mean, the variance, and the range of the model's parameters, and thus of the output, are prescribed. As such they encompass all RPMs conforming to these prescriptions. The RPMs are optimal in the sense that they yield the tightest predictions for which all (or, depending on the formulation, most) of the observations are less than a fixed number of standard deviations from the mean prediction. When the data satisfy mild stochastic assumptions, and the optimization problem(s) used to calculate the RPM is convex (or its solution coincides with the solution to an auxiliary convex problem), the model's reliability, which is the probability that a future observation would be within the predicted ranges, is bounded rigorously.

  17. Random Predictor Models for Rigorous Uncertainty Quantification: Part 1

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2015-01-01

    This and a companion paper propose techniques for constructing parametric mathematical models describing key features of the distribution of an output variable given input-output data. By contrast to standard models, which yield a single output value at each value of the input, Random Predictor Models (RPMs) yield a random variable at each value of the input. Optimization-based strategies for calculating RPMs having a polynomial dependency on the input and a linear dependency on the parameters are proposed. These formulations yield RPMs having various levels of fidelity in which the mean and the variance of the model's parameters, and thus of the predicted output, are prescribed. As such they encompass all RPMs conforming to these prescriptions. The RPMs are optimal in the sense that they yield the tightest predictions for which all (or, depending on the formulation, most) of the observations are less than a fixed number of standard deviations from the mean prediction. When the data satisfy mild stochastic assumptions, and the optimization problem(s) used to calculate the RPM is convex (or its solution coincides with the solution to an auxiliary convex problem), the model's reliability, which is the probability that a future observation would be within the predicted ranges, can be bounded tightly and rigorously.
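A heavily simplified sketch of the optimization-based idea shared by both parts: fit a polynomial predictor whose constant-width band is the tightest one containing every observation, posed as a linear (hence convex) program. This is an illustrative stand-in under assumed simplifications, not the authors' RPM formulation:

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_interval_model(x, y, degree=2):
    """Tightest constant-width polynomial band containing all observations.

    Solve the linear program  min w  s.t. |y_i - p(x_i)| <= w for all i,
    so every observation lies inside the predicted band p(x) +/- w.
    """
    V = np.vander(np.asarray(x, float), degree + 1)   # columns [x^d, ..., x, 1]
    n, m = V.shape
    # decision variables: m polynomial coefficients, then the half-width w
    c = np.zeros(m + 1)
    c[-1] = 1.0                                       # minimize w
    A = np.block([[V, -np.ones((n, 1))],              #  p(x_i) - y_i <= w
                  [-V, -np.ones((n, 1))]])            #  y_i - p(x_i) <= w
    b = np.concatenate([np.asarray(y, float), -np.asarray(y, float)])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(None, None)] * (m + 1))
    return res.x[:-1], res.x[-1]                      # coefficients, half-width
```

Because the program is convex, the optimum is global, which is what lets reliability statements of the kind described above be made rigorously rather than heuristically.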

  18. Cognitive impairment, decline and fluctuations in older community-dwelling subjects with Lewy bodies

    PubMed Central

    Arvanitakis, Z.; Yu, L.; Boyle, P. A.; Leurgans, S. E.; Bennett, D. A.

    2012-01-01

    Lewy bodies are common in the ageing brain and often co-occur with Alzheimer’s disease pathology. Little is known regarding the independent role of Lewy body pathology in cognitive impairment, decline and fluctuations in community-dwelling older persons. We examined the contribution of Lewy body pathology to dementia, global cognition, cognitive domains, cognitive decline and fluctuations in 872 autopsied subjects (mean age = 87.9 years) from the Rush Religious Order Study (n = 491) and Memory and Aging Project (n = 381) longitudinal community-based clinical–pathological studies. Dementia was based on a clinical evaluation; annual cognitive performance tests were used to create a measure of global cognition and five cognitive domains. Lewy body type was determined by using α-synuclein immunostained sections of substantia nigra, limbic and neocortical regions. Statistical models included multiple regression models for dementia and cognition and mixed effects models for decline. Cognitive fluctuations were estimated by comparing standard deviations of individual residuals from mean trajectories of decline in those with and without Lewy bodies. All models controlled for age, sex, education, Alzheimer’s disease pathology and infarcts. One hundred and fifty-seven subjects (18%) exhibited Lewy body pathology (76 neocortical-type, 54 limbic-type and 27 nigra-predominant). One hundred and three (66%) subjects with Lewy body pathology had a pathologic diagnosis of Alzheimer’s disease. Neocortical-type, but not nigra-predominant or limbic-type, Lewy body pathology was related to an increased odds of dementia (odds ratio = 3.21; 95% confidence interval = 1.78–5.81) and lower cognition (P < 0.001), including episodic memory function (P < 0.001), proximate to death.
Neocortical-type Lewy body pathology was also related to a faster decline in global cognition (P < 0.001), decline in all five specific cognitive domains (all P-values < 0.001), and to fluctuations in decline of working and semantic memory (P-values < 0.001). Limbic-type Lewy body pathology was related to lower and faster decline in visuospatial skills (P = 0.042). The relationship of Lewy body pathology to cognition and dementia was not modified by Alzheimer’s disease pathology. Neocortical-type Lewy body pathology is associated with increased odds of dementia; lower and more rapid decline in all cognitive domains including episodic memory and fluctuations in decline in semantic and working memory. Limbic-type Lewy body pathology is specifically associated with lower and more rapid decline in visuospatial skills. The effect of Lewy body pathology on cognition appears to be independent of Alzheimer’s disease pathology. PMID:23065790
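The fluctuation measure described above (standard deviations of residuals around a mean trajectory of decline) can be sketched as follows. This toy version fits a plain per-group linear trajectory instead of the study's covariate-adjusted mixed effects models, and compares residual spread at the group level:

```python
import numpy as np

def fluctuation_sds(times, scores, groups):
    """Estimate cognitive fluctuation per group as the standard deviation of
    residuals around that group's mean linear trajectory of decline.

    Simplified illustration: the study used mixed effects models adjusted
    for age, sex, education, pathology and infarcts; here an ordinary
    least-squares line per group stands in for the mean trajectory.
    """
    times = np.asarray(times, float)
    scores = np.asarray(scores, float)
    groups = np.asarray(groups)
    out = {}
    for g in np.unique(groups):
        m = groups == g
        slope, intercept = np.polyfit(times[m], scores[m], 1)
        resid = scores[m] - (slope * times[m] + intercept)
        out[g] = resid.std(ddof=1)
    return out
```

A larger residual SD in the Lewy body group than in the comparison group would, in this simplified setting, correspond to the greater fluctuation in decline reported above.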

  19. Revised Planning Methodology For Signalized Intersections And Operational Analysis Of Exclusive Left-Turn Lanes, Part-II: Models And Procedures (Final Report)

    DOT National Transportation Integrated Search

    1996-04-01

    This report also describes the procedures for direct estimation of intersection capacity with simulation, including a set of rigorous statistical tests for simulation parameter calibration from field data.

  20. The Rigor Mortis of Education: Rigor Is Required in a Dying Educational System

    ERIC Educational Resources Information Center

    Mixon, Jason; Stuart, Jerry

    2009-01-01

    In an effort to answer the "Educational Call to Arms", our national public schools have turned to Advanced Placement (AP) courses as the predominant vehicle used to address the lack of academic rigor in our public high schools. Advanced Placement is believed by many to provide students with the rigor and work ethic necessary to…

  1. Technological characteristics of pre- and post-rigor deboned beef mixtures from Holstein steers and quality attributes of cooked beef sausage.

    PubMed

    Sukumaran, Anuraj T; Holtcamp, Alexander J; Campbell, Yan L; Burnett, Derris; Schilling, Mark W; Dinh, Thu T N

    2018-06-07

    The objective of this study was to determine the effects of deboning time (pre- and post-rigor), processing steps (grinding - GB; salting - SB; batter formulation - BB), and storage time on the quality of raw beef mixtures and vacuum-packaged cooked sausage, produced using a commercial formulation with 0.25% phosphate. The pH was greater in pre-rigor GB and SB than in post-rigor GB and SB (P < .001). However, deboning time had no effect on metmyoglobin reducing activity, cooking loss, and color of raw beef mixtures. Protein solubility of pre-rigor beef mixtures (124.26 mg/kg) was greater than that of post-rigor beef (113.93 mg/kg; P = .071). TBARS increased in BB but decreased during vacuum storage of cooked sausage (P ≤ .018). Except for chewiness and saltiness being 52.9 N-mm and 0.3 points greater in post-rigor sausage (P = .040 and P = .054, respectively), texture profile analysis and trained panelists detected no difference in texture between pre- and post-rigor sausage. Published by Elsevier Ltd.

  2. A cellular automata model of Ebola virus dynamics

    NASA Astrophysics Data System (ADS)

    Burkhead, Emily; Hawkins, Jane

    2015-11-01

    We construct a stochastic cellular automaton (SCA) model for the spread of the Ebola virus (EBOV). We make substantial modifications to an existing SCA model used for HIV, introduced by others and studied by the authors. We give a rigorous analysis of the similarities between models due to the spread of virus and the typical immune response to it, and the differences which reflect the drastically different timing of the course of EBOV. We demonstrate output from the model and compare it with clinical data.
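A toy stochastic cellular automaton in the same spirit can make the setup concrete. The states, neighborhood rule, and parameters below are hypothetical illustrations, not the published model's; the short `lifespan` stands in for the drastically faster course of EBOV relative to HIV:

```python
import numpy as np

# states of the toy SCA (the published model's states and rules differ in detail)
HEALTHY, INFECTED, DEAD = 0, 1, 2

def step(grid, age, p_infect=0.3, lifespan=4, rng=None):
    """One synchronous update: a healthy cell with k infected Moore neighbors
    becomes infected with probability 1 - (1 - p_infect)**k; an infected cell
    dies after `lifespan` steps. Periodic boundaries."""
    rng = np.random.default_rng() if rng is None else rng
    inf = (grid == INFECTED).astype(int)
    k = sum(np.roll(np.roll(inf, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    newly = (grid == HEALTHY) & (rng.random(grid.shape) < 1 - (1 - p_infect) ** k)
    age = np.where(grid == INFECTED, age + 1, age)    # infected cells grow older
    dead = (grid == INFECTED) & (age >= lifespan)
    grid = grid.copy()
    grid[newly] = INFECTED
    grid[dead] = DEAD
    age = np.where(newly, 0, age)                     # fresh infections start at age 0
    return grid, age
```

Iterating `step` from a single infected seed produces an expanding ring of infection that burns out locally as cells die, which is the qualitative behavior such lattice models are used to compare against clinical data.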

  3. Mathematical and Numerical Analysis of Model Equations on Interactions of the HIV/AIDS Virus and the Immune System

    NASA Astrophysics Data System (ADS)

    Parumasur, N.; Willie, R.

    2008-09-01

    We consider a simple finite-dimensional HIV/AIDS mathematical model of the interactions among blood cells, the HIV/AIDS virus and the immune system, and examine the consistency of the equations with the real biomedical situation that they model. A better understanding of a cure solution to the illness modeled by the finite-dimensional equations is given. This is accomplished through rigorous mathematical analysis and is reinforced by numerical analysis of models developed for real-life cases.

  4. Deconstructing spatiotemporal chaos using local symbolic dynamics.

    PubMed

    Pethel, Shawn D; Corron, Ned J; Bollt, Erik

    2007-11-23

    We find that the global symbolic dynamics of a diffusively coupled map lattice is well approximated by a very small local model for weak to moderate coupling strengths. A local symbolic model is a truncation of the full symbolic model to one that considers only a single element and a few neighbors. Using interval analysis, we give rigorous results for a range of coupling strengths and different local model widths. Examples are presented of extracting a local symbolic model from data and of controlling spatiotemporal chaos.
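The construction can be illustrated empirically: generate a diffusively coupled map lattice, symbolize each site, and learn a local symbolic model over a small neighborhood. The majority-vote predictor below is a data-driven stand-in for the paper's interval-analysis machinery, shown purely to make the "single element plus a few neighbors" idea concrete:

```python
import numpy as np

def coupled_map_lattice(n=64, steps=400, eps=0.1, seed=0):
    """Diffusively coupled logistic lattice
    x'_i = (1-eps)*f(x_i) + (eps/2)*(f(x_{i-1}) + f(x_{i+1})), f(x) = 4x(1-x),
    symbolized site-by-site with the partition x > 1/2."""
    rng = np.random.default_rng(seed)
    f = lambda x: 4.0 * x * (1.0 - x)
    x = rng.random(n)
    frames = []
    for _ in range(steps):
        fx = f(x)
        x = (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
        frames.append(x > 0.5)
    return np.array(frames)

def local_model_accuracy(sym, width=1):
    """Learn, from the symbolic data itself, the most likely next symbol of a
    site given its current (2*width+1)-site neighborhood; return the
    in-sample prediction accuracy of that local model."""
    T, n = sym.shape
    table = {}
    for t in range(T - 1):
        for i in range(n):
            key = tuple(bool(sym[t, (i + d) % n]) for d in range(-width, width + 1))
            table.setdefault(key, [0, 0])[int(sym[t + 1, i])] += 1
    correct = 0
    for t in range(T - 1):
        for i in range(n):
            key = tuple(bool(sym[t, (i + d) % n]) for d in range(-width, width + 1))
            c0, c1 = table[key]
            correct += int(c1 > c0) == int(sym[t + 1, i])
    return correct / ((T - 1) * n)
```

Widening the neighborhood can only refine the conditioning information, so the in-sample accuracy is non-decreasing in `width`, mirroring the paper's finding that small local models already capture much of the global symbolic dynamics at weak coupling.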

  5. Solar energy market penetration models - Science or number mysticism

    NASA Technical Reports Server (NTRS)

    Warren, E. H., Jr.

    1980-01-01

    The forecast market potential of a solar technology is an important factor determining its R&D funding. Since solar energy market penetration models are the method used to forecast market potential, they have a pivotal role in a solar technology's development. This paper critiques the applicability of the most common solar energy market penetration models. It is argued that the assumptions underlying the foundations of rigorously developed models, or the absence of a reasonable foundation for the remaining models, restrict their applicability.

  6. Predicting functional decline and survival in amyotrophic lateral sclerosis.

    PubMed

    Ong, Mei-Lyn; Tan, Pei Fang; Holbrook, Joanna D

    2017-01-01

    Better predictors of amyotrophic lateral sclerosis disease course could enable smaller and more targeted clinical trials. Partially to address this aim, the Prize for Life foundation collected de-identified records from amyotrophic lateral sclerosis sufferers who participated in clinical trials of investigational drugs and made them available to researchers in the PRO-ACT database. In this study, time series data from PRO-ACT subjects were fitted to exponential models. Binary classes for decline in the total score of the amyotrophic lateral sclerosis functional rating scale revised (ALSFRS-R) (fast/slow progression) and survival (high/low death risk) were derived. Data were segregated into training and test sets via cross-validation. Learning algorithms were applied to the demographic, clinical and laboratory parameters in the training set to predict ALSFRS-R decline and the derived fast/slow progression and high/low death risk categories. The performance of predictive models was assessed by cross-validation in the test set using Receiver Operating Characteristic curves and root mean squared errors. A model created using a boosting algorithm, containing the decline in four parameters (weight, alkaline phosphatase, albumin and creatine kinase) post baseline, was able to predict functional decline class (fast or slow) with fair accuracy (AUC = 0.82). However, similar approaches to build a predictive model for decline class from baseline subject characteristics were not successful. In contrast, baseline values of total bilirubin, gamma glutamyltransferase, urine specific gravity and the ALSFRS-R item score for climbing stairs were sufficient to predict survival class. Using combinations of small numbers of variables, it was possible to predict classes of functional decline and survival across the 1-2 year timeframe available in PRO-ACT. These findings may have utility for the design of future ALS clinical trials.
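Fitting a time series to an exponential model and deriving a binary progression class can be sketched as below. The model form matches the exponential fits described above, but the -5%/month cut-off is purely illustrative; the study derived its fast/slow classes from the fitted PRO-ACT trajectories themselves:

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_decline(months, alsfrs_r, fast_threshold=-0.05):
    """Fit an ALSFRS-R time series to score(t) = a * exp(b * t) and derive a
    binary fast/slow progression class from the fitted decay rate b.
    The threshold value is an assumption for illustration."""
    months = np.asarray(months, float)
    alsfrs_r = np.asarray(alsfrs_r, float)
    (a, b), _ = curve_fit(lambda t, a, b: a * np.exp(b * t),
                          months, alsfrs_r, p0=(alsfrs_r[0], -0.01))
    return {"a": a, "b": b, "fast": bool(b < fast_threshold)}
```

The derived class labels would then serve as the targets for the boosting classifiers mentioned in the abstract.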

  7. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2015-08-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
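A Morris-style elementary-effects screen conveys the flavor of a method whose cost is roughly 10 times the number of parameters in model runs. It is a simplified stand-in for the published sequential screening algorithm, which proceeds iteratively rather than in one pass:

```python
import numpy as np

def sequential_screen(model, n_params, n_traj=10, threshold=0.01, rng=None):
    """Rank parameters of `model` (a function of a vector in [0,1]^n_params)
    by mean absolute elementary effect over n_traj one-at-a-time
    trajectories, then keep those above a fraction of the largest effect.
    Cost: n_traj * (n_params + 1) model runs, i.e. about 10x n_params."""
    rng = np.random.default_rng(0) if rng is None else rng
    delta = 0.25
    effects = np.zeros((n_traj, n_params))
    for t in range(n_traj):
        x = rng.random(n_params) * (1.0 - delta)   # leave room for +delta steps
        y0 = model(x)
        for j in rng.permutation(n_params):
            x2 = x.copy()
            x2[j] += delta                         # perturb one parameter at a time
            y1 = model(x2)
            effects[t, j] = abs(y1 - y0) / delta
            x, y0 = x2, y1
    mu_star = effects.mean(axis=0)
    informative = np.where(mu_star > threshold * mu_star.max())[0]
    return informative, mu_star
```

Parameters screened out this way would be fixed at default values before the expensive Sobol analysis or calibration, which is where the 40% and two-thirds savings quoted above come from.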

  8. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2016-04-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.

  9. Alarms about structural alerts.

    PubMed

    Alves, Vinicius; Muratov, Eugene; Capuzzi, Stephen; Politi, Regina; Low, Yen; Braga, Rodolpho; Zakharov, Alexey V; Sedykh, Alexander; Mokshyna, Elena; Farag, Sherif; Andrade, Carolina; Kuz'min, Victor; Fourches, Denis; Tropsha, Alexander

    2016-08-21

    Structural alerts are widely accepted in chemical toxicology and regulatory decision support as a simple and transparent means to flag potential chemical hazards or to group compounds into categories for read-across. However, there has been growing concern that alerts disproportionately flag too many chemicals as toxic, which calls their reliability as toxicity markers into question. Conversely, rigorously developed and properly validated statistical QSAR models can accurately and reliably predict the toxicity of a chemical; however, their use in regulatory toxicology has been hampered by the lack of transparency and interpretability. We demonstrate that, contrary to the common perception of QSAR models as "black boxes", they can be used to identify statistically significant chemical substructures (QSAR-based alerts) that influence toxicity. We show through several case studies, however, that the mere presence of structural alerts in a chemical, irrespective of the derivation method (expert-based or QSAR-based), should be perceived only as a hypothesis of a possible toxicological effect. We propose a new approach that synergistically integrates structural alerts and rigorously validated QSAR models for a more transparent and accurate safety assessment of new chemicals.
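Treating an alert as a hypothesis rather than a verdict can be made concrete with a simple association test on a 2x2 alert-by-outcome table. This is illustrative only; the authors' QSAR-based alert derivation involves full model building and validation, not a single contingency test:

```python
from scipy.stats import fisher_exact

def alert_significance(n_tox_with, n_tox_without, n_safe_with, n_safe_without):
    """Test whether a substructure (alert) is statistically associated with
    toxicity in a data set via Fisher's exact test on the 2x2 table of
    alert presence vs. toxicity outcome."""
    odds, p = fisher_exact([[n_tox_with, n_safe_with],
                            [n_tox_without, n_safe_without]])
    return {"odds_ratio": odds, "p_value": p, "significant": p < 0.05}
```

An alert that fires on as many non-toxic as toxic compounds yields an odds ratio near 1 and a large p-value, which is exactly the over-flagging failure mode described above.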

  10. Surgery results in exaggerated and persistent cognitive decline in a rat model of the Metabolic Syndrome.

    PubMed

    Feng, Xiaomei; Degos, Vincent; Koch, Lauren G; Britton, Steven L; Zhu, Yinggang; Vacas, Susana; Terrando, Niccolò; Nelson, Jeffrey; Su, Xiao; Maze, Mervyn

    2013-05-01

    Postoperative cognitive decline can be reproduced in animal models. In a well-validated rat model of the Metabolic Syndrome, we sought to investigate whether surgery induced a more severe and persistent form of cognitive decline similar to that noted in preliminary clinical studies. In rats that had been selectively bred for low and high exercise endurance, the low capacity runners (LCR) exhibited features of Metabolic Syndrome (obesity, dyslipidemia, insulin resistance, and hypertension). Tibial fracture surgery was performed under isoflurane anesthesia in LCR and high capacity runner (HCR) rats and cognitive function was assessed postoperatively in a trace-fear conditioning paradigm and Morris Water Maze; non-operated rats were exposed to anesthesia and analgesia (sham). Group sizes were n = 6. On postoperative D7, LCR rats had shorter freezing times than postoperative HCR rats. Five months postoperatively, LCR rats had a flatter learning trajectory and took longer to locate the submerged platform than postoperative HCR rats; dwell-time in the target quadrant in a probe trial was shorter in the postoperative LCR compared to HCR rats. LCR and HCR sham rats did not differ in any test. Postoperatively, LCR rats diverged from HCR rats exhibiting a greater decline in memory, acutely, with persistent learning and memory decline, remotely; this could not be attributed to changes in locomotor or swimming performance. This Metabolic Syndrome animal model of surgery-induced cognitive decline corroborates, with high fidelity, preliminary findings of postoperative cognitive dysfunction in Metabolic Syndrome patients.

  11. Validating a Model of Motivational Factors Influencing Involvement for Parents of Transition-Age Youth with Disabilities

    ERIC Educational Resources Information Center

    Hirano, Kara A.; Shanley, Lina; Garbacz, S. Andrew; Rowe, Dawn A.; Lindstrom, Lauren; Leve, Leslie D.

    2018-01-01

    Parent involvement is a predictor of postsecondary education and employment outcomes, but rigorous measures of parent involvement for youth with disabilities are lacking. Hirano, Garbacz, Shanley, and Rowe adapted scales based on the Hoover-Dempsey and Sandler model of parent involvement for use with parents of youth with disabilities aged 14 to 23.…

  12. Analysis of the Impacts of City Year's Whole School Whole Child Model on Partner Schools' Performance

    ERIC Educational Resources Information Center

    Meredith, Julie; Anderson, Leslie M.

    2015-01-01

    City Year is a learning organization committed to the rigorous evaluation of its "Whole School Whole Child" model, which trains and deploys teams of AmeriCorps members to low-performing, urban schools to empower more students to reach their full potential. A third-party study by Policy Studies Associates (PSA) examined the impact of…

  13. Closing the Loop: Automated Data-Driven Cognitive Model Discoveries Lead to Improved Instruction and Learning Gains

    ERIC Educational Resources Information Center

    Liu, Ran; Koedinger, Kenneth R.

    2017-01-01

    As the use of educational technology becomes more ubiquitous, an enormous amount of learning process data is being produced. Educational data mining seeks to analyze and model these data, with the ultimate goal of improving learning outcomes. The most firmly grounded and rigorous evaluation of an educational data mining discovery is whether it…

  14. Combined Homogeneous Surface Diffusion Model - Design of experiments approach to optimize dye adsorption considering both equilibrium and kinetic aspects.

    PubMed

    Muthukkumaran, A; Aravamudan, K

    2017-12-15

    Adsorption, a popular technique for removing azo dyes from aqueous streams, is influenced by several factors such as pH, initial dye concentration, temperature and adsorbent dosage. Any strategy that seeks to identify optimal conditions involving these factors should take into account both kinetic and equilibrium aspects, since they influence the rate and extent of removal by adsorption. Hence rigorous kinetic and accurate equilibrium models are required. In this work, experimental investigations pertaining to the adsorption of acid orange 10 dye (AO10) on activated carbon were carried out using a Central Composite Design (CCD) strategy. The significant factors that affected adsorption were identified to be solution temperature, solution pH, adsorbent dosage and initial solution concentration. Thermodynamic analysis showed the endothermic nature of the dye adsorption process. The kinetics of adsorption were rigorously modeled using the Homogeneous Surface Diffusion Model (HSDM) after incorporating the non-linear Freundlich adsorption isotherm. Optimization was performed for the kinetic parameters (color removal time and surface diffusion coefficient) as well as the equilibrium-affected response, viz. percentage removal. Finally, the optimum conditions predicted were experimentally validated. Copyright © 2017 Elsevier Ltd. All rights reserved.
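The HSDM with a non-linear Freundlich surface boundary condition can be sketched with an explicit finite-difference scheme. All parameter values and the discretization below are illustrative assumptions, not the paper's solver or data:

```python
import numpy as np

def hsdm_batch(Ds=1e-11, R=1e-3, kF=5.0, n=2.0, C0=100.0, V=1.0, m=1.0,
               t_end=3600.0, nr=30, nt=2000):
    """Homogeneous Surface Diffusion Model for a batch adsorber:
    surface diffusion dq/dt = Ds*(q_rr + (2/r)*q_r) inside a spherical
    particle, Freundlich equilibrium q = kF*C**(1/n) at the particle
    surface, and a liquid-phase mass balance V*dC = -m*d(q_avg).
    Explicit time stepping; dt must satisfy the diffusion stability limit."""
    r = np.linspace(0.0, R, nr)
    dr, dt = r[1] - r[0], t_end / nt
    q = np.zeros(nr)        # solid-phase loading profile
    C = C0                  # bulk liquid concentration

    def q_avg():
        w = q * r ** 2      # volume-average loading of the sphere
        return 3.0 * np.sum(0.5 * (w[1:] + w[:-1]) * dr) / R ** 3

    for _ in range(nt):
        q[-1] = kF * max(C, 0.0) ** (1.0 / n)   # surface in equilibrium with bulk
        old = q_avg()
        lap = np.zeros(nr)
        lap[1:-1] = ((q[2:] - 2.0 * q[1:-1] + q[:-2]) / dr ** 2
                     + (2.0 / r[1:-1]) * (q[2:] - q[:-2]) / (2.0 * dr))
        lap[0] = 6.0 * (q[1] - q[0]) / dr ** 2  # symmetry condition at r = 0
        q[:-1] += dt * Ds * lap[:-1]
        C -= (m / V) * (q_avg() - old)          # solute removed from the liquid
    return C, q
```

Running the solver over a factor grid (pH, dosage, temperature entering through `kF`, `n`, and `Ds`) is the kind of coupled kinetic-plus-equilibrium evaluation the CCD optimization described above requires.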

  15. Electromagnetic Wave Propagation in Body Area Networks Using the Finite-Difference-Time-Domain Method

    PubMed Central

    Bringuier, Jonathan N.; Mittra, Raj

    2012-01-01

    A rigorous full-wave solution, via the Finite-Difference-Time-Domain (FDTD) method, is performed in an attempt to obtain realistic communication channel models for on-body wireless transmission in Body-Area-Networks (BANs), which are local data networks using the human body as a propagation medium. The problem of modeling the coupling between body mounted antennas is often not amenable to attack by hybrid techniques owing to the complex nature of the human body. For instance, the time-domain Green's function approach becomes more involved when the antennas are not conformal. Furthermore, the human body is irregular in shape and has dispersion properties that are unique. One consequence of this is that we must resort to modeling the antenna network mounted on the body in its entirety, and the number of degrees of freedom (DoFs) can be on the order of billions. Even so, this type of problem can still be modeled by employing a parallel version of the FDTD algorithm running on a cluster. Lastly, we note that the results of rigorous simulation of BANs can serve as benchmarks for comparison with the abundance of measurement data. PMID:23012575
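The production simulations described above are 3-D, dispersive, and parallelized; a normalized 1-D Yee update in free space nevertheless conveys the core FDTD leapfrog idea. Everything below (grid size, source, Courant number) is a toy illustration:

```python
import numpy as np

def fdtd_1d(nz=200, steps=300, src=50):
    """Minimal normalized 1-D FDTD (Yee) scheme: E and H live on staggered
    grids and are updated in leapfrog fashion with Courant number 0.5;
    a soft Gaussian source injects a pulse that propagates both ways."""
    Ex = np.zeros(nz)
    Hy = np.zeros(nz)
    for t in range(steps):
        Hy[:-1] += 0.5 * (Ex[1:] - Ex[:-1])          # H update (half step)
        Ex[1:] += 0.5 * (Hy[1:] - Hy[:-1])           # E update (half step)
        Ex[src] += np.exp(-((t - 30) / 10.0) ** 2)   # soft Gaussian source
    return Ex
```

The full-body BAN problem replaces this scalar line with a 3-D vector grid, dispersive tissue parameters, and domain decomposition across a cluster, but the per-cell update structure is the same.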

  16. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
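The requirement stated above, that a kernel applied to all pairs of subjects must yield a positive semidefinite matrix, is easy to check numerically. A minimal sketch with a hypothetical genotype matrix and a linear (genomic-relationship-style) kernel; the coding and dimensions are illustrative, not the paper's software:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical genotype matrix: 6 subjects x 10 SNPs, coded 0/1/2 minor-allele counts
G = rng.integers(0, 3, size=(6, 10)).astype(float)

# Centre columns and form a linear kernel K = Z Z^T / m over all pairs of subjects
Z = G - G.mean(axis=0)
K = Z @ Z.T / G.shape[1]

# Any kernel built as Z Z^T is positive semidefinite: eigenvalues >= 0 up to rounding
eigvals = np.linalg.eigvalsh(K)
print(eigvals.min())
```

Larger entries of K mean more similar subjects; because K is positive semidefinite by construction, it can be plugged directly into the mixed-model and score-statistic machinery the review surveys.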

  17. Rigorous Proof of the Boltzmann-Gibbs Distribution of Money on Connected Graphs

    NASA Astrophysics Data System (ADS)

    Lanchier, Nicolas

    2017-04-01

    Models in econophysics, i.e., the emerging field of statistical physics that applies the main concepts of traditional physics to economics, typically consist of large systems of economic agents who are characterized by the amount of money they have. In the simplest model, at each time step, one agent gives one dollar to another agent, with both agents being chosen independently and uniformly at random from the system. Numerical simulations of this model suggest that, at least when the number of agents and the average amount of money per agent are large, the distribution of money converges to an exponential distribution reminiscent of the Boltzmann-Gibbs distribution of energy in physics. The main objective of this paper is to give a rigorous proof of this result and show that the convergence to the exponential distribution holds more generally when the economic agents are located on the vertices of a connected graph and interact locally with their neighbors rather than globally with all the other agents. We also study a closely related model where, at each time step, agents buy with a probability proportional to the amount of money they have, and prove that in this case the limiting distribution of money is Poissonian.

  18. A surface spherical harmonic expansion of gravity anomalies on the ellipsoid

    NASA Astrophysics Data System (ADS)

    Claessens, S. J.; Hirt, C.

    2015-10-01

    A surface spherical harmonic expansion of gravity anomalies with respect to a geodetic reference ellipsoid can be used to model the global gravity field and reveal its spectral properties. In this paper, a direct and rigorous transformation between solid spherical harmonic coefficients of the Earth's disturbing potential and surface spherical harmonic coefficients of gravity anomalies in ellipsoidal approximation with respect to a reference ellipsoid is derived. This transformation cannot rigorously be achieved by the Hotine-Jekeli transformation between spherical and ellipsoidal harmonic coefficients. The method derived here is used to create a surface spherical harmonic model of gravity anomalies with respect to the GRS80 ellipsoid from the EGM2008 global gravity model. Internal validation of the model shows a global RMS precision of 1 nGal. This is significantly more precise than previous solutions based on spherical approximation or low-order approximations, which are shown to be insufficient for the generation of surface spherical harmonic coefficients with respect to a geodetic reference ellipsoid. Numerical results of two applications of the new method (the computation of ellipsoidal corrections to gravimetric geoid computation, and area means of gravity anomalies in ellipsoidal approximation) are provided.

  19. The fate of haloacetic acids and trihalomethanes in an aquifer storage and recovery program, Las Vegas, Nevada

    USGS Publications Warehouse

    Thomas, J.M.; McKay, W.A.; Colec, E.; Landmeyer, J.E.; Bradley, P.M.

    2000-01-01

    The fate of disinfection byproducts during aquifer storage and recovery (ASR) is evaluated for aquifers in Southern Nevada. Rapid declines of haloacetic acid (HAA) concentrations during ASR, with associated little change in Cl concentration, indicate that HAAs decline primarily by in situ microbial oxidation. Dilution is only a minor contributor to HAA concentration declines during ASR. Trihalomethane (THM) concentrations generally increased during storage of artificial recharge (AR) water and then declined during recovery. The decline of THM concentrations during recovery was primarily from dilution of current season AR water with residual AR water remaining in the aquifer from previous ASR seasons and native ground water. In more recent ASR seasons, for wells with the longest history of ASR, brominated THMs declined during storage and recovery by processes in addition to dilution. These conclusions about THMs are indicated by THM/Cl values and supported by a comparison of measured and model predicted THM concentrations. Geochemical mixing models were constructed using major-ion chemistry of the three end-member waters to calculate predicted THM concentrations. The decline in brominated THM concentrations in addition to that from dilution may result from biotransformation processes.
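A geochemical mixing calculation of the kind described, three end-member waters constrained by conservative tracers plus a mass balance, reduces to a small linear solve. The tracer concentrations below are hypothetical placeholders, not values from the study:

```python
import numpy as np

# Hypothetical end-member concentrations (mg/L) for two conservative tracers;
# the study used full major-ion chemistry, and these numbers are placeholders.
ar_water = np.array([60.0, 220.0])   # current-season artificial recharge (Cl, SO4)
residual = np.array([75.0, 260.0])   # residual AR from previous ASR seasons
native   = np.array([30.0, 400.0])   # native ground water

sample = np.array([58.0, 280.0])     # recovered water

# Solve f1*AR + f2*residual + f3*native = sample, subject to f1 + f2 + f3 = 1
A = np.vstack([np.column_stack([ar_water, residual, native]), np.ones(3)])
b = np.append(sample, 1.0)
fractions = np.linalg.lstsq(A, b, rcond=None)[0]
print(fractions.round(3))
```

With the mixing fractions in hand, a predicted THM concentration from dilution alone is just the fraction-weighted average of the end-member THM concentrations; measured values falling below that prediction point to removal processes such as biotransformation.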

  20. Differential rate in decline in ovarian reserve markers in women with polycystic ovary syndrome compared with control subjects: results of a longitudinal study.

    PubMed

    Ahmad, Asima K; Kao, Chia-Ning; Quinn, Molly; Lenhart, Nikolaus; Rosen, Mitchell; Cedars, Marcelle I; Huddleston, Heather

    2018-03-01

    To estimate rates of ovarian aging in polycystic ovary syndrome (PCOS) subjects versus a community control population. Longitudinal. Tertiary academic center. PCOS subjects diagnosed according to the 2004 Rotterdam criteria were systematically enrolled in a PCOS cohort study. The comparison control subjects were from the Ovarian Aging study, a prospective longitudinal study of ovarian aging in healthy women with regular menstrual cycles. Clinical data collection over two study visits. Antral follicle count (AFC), ovarian volume (OV), and antimüllerian hormone level (AMH). PCOS subjects were found to have higher baseline values for all ovarian reserve markers compared with control subjects. Univariate models indicated that, compared with control subjects, PCOS patients experienced significantly faster rates of decline for both AFC and AMH. Change in OV did not differ significantly. To account for potential confounder effects, multiple analysis of covariance models were evaluated for the best fit, considering age, body mass index, and baseline ovarian reserve markers. Adjusted models demonstrated that PCOS patients do not experience a significant difference in AFC decline compared with control subjects, but they do experience a faster rate of decline in AMH (P<.01) and slower rate of decline in OV (P<.01). Ovarian aging in PCOS is characterized by a more rapid decline in AMH and a slower decline in OV compared with control subjects. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Endogenous technological and demographic change under increasing water scarcity

    NASA Astrophysics Data System (ADS)

    Pande, S.; Ertsen, M.; Sivapalan, M.

    2013-12-01

    Many ancient civilizations such as the Indus Valley civilization dispersed under extreme dry conditions. Even contemporary societies such as the one in the Murrumbidgee river basin, Australia, have started to witness a decline in overall population under increasing water scarcity. Skeptics of hydroclimatic determinism have often cautioned against the use of hydroclimatic change as the sole predictor of the fate of contemporary societies in water-scarce regions by suggesting that technological change may ameliorate the effects of increasing water scarcity. Here we develop a simple overlapping generations model of endogenous technological and demographic change. It models technological change not as an exogenous random sequence of events but as an endogenous process (as is widely accepted in contemporary literature) that depends on factors such as the investments that are (endogenously) made in a society, the endogenous diversification of a society into skilled and unskilled workers, individuals' patience regarding present versus future consumption, the production technology, and the (endogenous) interaction of these factors. The population growth rate is modeled to decline once consumption per capita crosses a 'survival' threshold. The model demonstrates that technological change may ameliorate the effects of increasing water scarcity, but only to a certain extent in many cases. It is possible that technological change may allow a society to escape the effects of increasing water scarcity, leading to an exponential rise in technology and population. However, such cases require that the rate of success of investment in technological advancement is high. In other, more realistic cases of technological success, we find that endogenous technological change has the effect of delaying the peak of population before it starts to decline.
While the model is a rather simple model of societal growth, it is capable of replicating (not to scale) patterns of technological change (proxies of which in ancient societies include irrigation canals, metal tools, and the use of horses for labor, while in contemporary societies proxies may be the advent of drip irrigation, increasing reservoir storage capacity, etc.) and population change. It is capable of replicating the pattern of declining consumption per capita in the presence of growth in aggregate production. It is also capable of modeling the exponential population rise even under increasing water scarcity. The results of the model suggest, as one of many possible explanations, that ancient societies that declined in the face of extreme water scarcity may have done so due to a slower rate of success of investment in technological advancement. The model suggests that the population decline occurs after a prolonged decline in consumption per capita, which in turn is due to the joint effect of initially increasing population and increasing water scarcity. This is despite technological advancement and increase in aggregate production. Thus, declining consumption per capita despite technological advancement and increase in aggregate production may serve as a useful predictor of upcoming decline in contemporary societies in water-scarce basins.

  2. A Unified Theory of Non-Ideal Gas Lattice Boltzmann Models

    NASA Technical Reports Server (NTRS)

    Luo, Li-Shi

    1998-01-01

    A non-ideal gas lattice Boltzmann model is directly derived, in an a priori fashion, from the Enskog equation for dense gases. The model is rigorously obtained by a systematic procedure to discretize the Enskog equation (in the presence of an external force) in both phase space and time. The lattice Boltzmann model derived here is thermodynamically consistent and is free of the defects which exist in previous lattice Boltzmann models for non-ideal gases. The existing lattice Boltzmann models for non-ideal gases are analyzed and compared with the model derived here.

  3. Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-1994, with projections to 2020

    USGS Publications Warehouse

    Kernodle, J.M.; McAda, D.P.; Thorn, C.R.

    1995-01-01

    This report describes a three-dimensional finite-difference ground-water-flow model of the Santa Fe Group aquifer system in the Albuquerque Basin, which comprises the Santa Fe Group (late Oligocene to middle Pleistocene age) and overlying valley and basin-fill deposits (Pleistocene to Holocene age). The model is designed to be flexible and adaptive to new geologic and hydrologic information as it becomes available by using a geographic information system as a data-base manager to interface with the model. The aquifer system was defined and quantified in the model consistent with the current (July 1994) understanding of the structural and geohydrologic framework of the basin. Rather than putting the model through a rigorous calibration process, discrepancies between simulated and measured responses in hydraulic head were taken to indicate that the understanding of a local part of the aquifer system was incomplete or incorrect. The model simulates ground-water flow over an area of about 2,400 square miles to a depth of 1,730 to about 2,020 feet below the water table with 244 rows, 178 columns, and 11 layers. Of the 477,752 cells in the model, 310,376 are active. The top four model layers approximate the 80-foot thickness of alluvium in the incised and refilled valley of the Rio Grande to provide detail of the effect of ground-water withdrawals on the surface-water system. Away from the valley these four layers represent the interval within the Santa Fe Group aquifer system between the computed predevelopment water table and a level 80 feet below the grade of the Rio Grande. The simulations include initial conditions (steady-state), the 1901-1994 historical period, and four possible ground-water withdrawal scenarios from 1994 to 2020. The model indicates that for the year ending in March 1994, net surface-water loss in the basin resulting from the City of Albuquerque's ground-water withdrawal totaled about 53,000 acre-feet.
The balance of the approximately 123,000 acre-feet of withdrawal came from aquifer storage depletion (about 67,800 acre-feet) and captured or salvaged evapotranspiration (about 2,500 acre-feet). In the four scenarios projected from 1994 to 2020, City of Albuquerque annual withdrawals ranged from about 98,700 to about 177,000 acre-feet by the year 2020. The range of resulting surface-water loss was from about 62,000 to about 77,000 acre-feet. The range of aquifer storage depletion was from about 33,400 to about 95,900 acre-feet. Captured evapotranspiration and drain-return flow remained nearly constant for all scenarios. From 1994 to 2020, maximum projected declines in hydraulic head in the primary water-production zone of the aquifer (model layer 9) for the four scenarios ranged from 55 to 164 feet east of the Rio Grande, and from 91 to 258 feet west of the river. Average declines in a 383.7-square-mile area around Albuquerque ranged from 28 to 65 feet in the production zone for the same period.

  4. Landscape metrics, scales of resolution

    Treesearch

    Samuel A. Cushman; Kevin McGarigal

    2008-01-01

    Effective implementation of the "multiple path" approach to managing green landscapes depends fundamentally on rigorous quantification of the composition and structure of the landscapes of concern at present, modelling landscape structure trajectories under alternative management paths, and monitoring landscape structure into the future to confirm...

  5. Reed Warbler Hosts Fine-Tune their Defenses to Track Three Decades of Cuckoo Decline

    PubMed Central

    Thorogood, Rose; Davies, Nicholas B

    2013-01-01

    Interactions between avian hosts and brood parasites can provide a model for how animals adapt to a changing world. Reed warbler (Acrocephalus scirpaceus) hosts employ costly defenses to combat parasitism by common cuckoos (Cuculus canorus). During the past three decades cuckoos have declined markedly across England, reducing parasitism at our study site (Wicken Fen) from 24% of reed warbler nests in 1985 to 1% in 2012. Here we show with experiments that host mobbing and egg rejection defenses have tracked this decline in local parasitism risk: the proportion of reed warbler pairs mobbing adult cuckoos (assessed by responses to cuckoo mounts and models) has declined from 90% to 38%, and the proportion rejecting nonmimetic cuckoo eggs (assessed by responses to model eggs) has declined from 61% to 11%. This is despite no change in response to other nest enemies or mimetic model eggs. Individual variation in both defenses is predicted by parasitism risk during the host’s egg-laying period. Furthermore, the response of our study population to temporal variation in parasitism risk can also explain spatial variation in egg rejection behavior in other populations across Europe. We suggest that spatial and temporal variation in parasitism risk has led to the evolution of plasticity in reed warbler defenses. PMID:24299407

  6. The Dynamics of Son Preference, Technology Diffusion, and Fertility Decline Underlying Distorted Sex Ratios at Birth: A Simulation Approach.

    PubMed

    Kashyap, Ridhi; Villavicencio, Francisco

    2016-10-01

    We present a micro-founded simulation model that formalizes the "ready, willing, and able" framework, originally used to explain historical fertility decline, to the practice of prenatal sex selection. The model generates sex ratio at birth (SRB) distortions from the bottom up and attempts to quantify plausible levels, trends, and interactions of son preference, technology diffusion, and fertility decline that underpin SRB trajectories at the macro level. Calibrating our model for South Korea, we show how even as the proportion with a preference for sons was declining, SRB distortions emerged due to rapid diffusion of prenatal sex determination technology combined with small but growing propensities to abort at low birth parities. Simulations reveal that relatively low levels of son preference (about 20 % to 30 % wanting one son) can result in skewed SRB levels if technology diffuses early and steadily, and if fertility falls rapidly to encourage sex-selective abortion at low parities. Model sensitivity analysis highlights how the shape of sex ratio trajectories is particularly sensitive to the timing and speed of prenatal sex-determination technology diffusion. The maximum SRB levels reached in a population are influenced by how the readiness to abort rises as a function of the fertility decline.
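The "ready, willing, and able" mechanism can be caricatured in a few lines: a birth is sex-selected only when a mother has son preference (willing), access to prenatal sex determination (able), and a propensity to abort a female fetus (ready). All probabilities below are hypothetical illustrations, not the calibrated South Korean values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
P_MALE = 0.512       # natural probability of a male birth (SRB of about 105)
p_willing = 0.25     # proportion preferring a son (hypothetical)
p_able = 0.60        # access to prenatal sex determination (hypothetical)
p_ready = 0.40       # propensity to abort a female fetus (hypothetical)

males = females = 0
for _ in range(200_000):
    # "willing and able" is decided per mother; "ready" is applied per female fetus
    selects = rng.random() < p_willing and rng.random() < p_able
    while True:
        male = rng.random() < P_MALE
        if male or not selects or rng.random() >= p_ready:
            break          # male birth, or female fetus carried to term
    males += male
    females += not male

srb = 100 * males / females
print(round(srb, 1))
```

Even these modest propensities push the simulated SRB well above the natural level of about 105, mirroring the paper's point that small but growing propensities to abort, combined with technology diffusion, can skew population-level sex ratios.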

  7. Dual-Retrieval Models and Neurocognitive Impairment

    ERIC Educational Resources Information Center

    Brainerd, C. J.; Reyna, V. F.; Gomes, C. F. A.; Kenney, A. E.; Gross, C. J.; Taub, E. S.; Spreng, R. N.

    2014-01-01

    Advances in dual-retrieval models of recall make it possible to use clinical data to test theoretical hypotheses about mild cognitive impairment (MCI) and Alzheimer's dementia (AD), the most common forms of neurocognitive impairment. Hypotheses about the nature of the episodic memory declines in these diseases, about decline versus sparing of…

  8. Changes in net ecosystem productivity of boreal black spruce stands in response to changes in temperature at diurnal and seasonal time scales.

    PubMed

    Grant, R F; Margolis, H A; Barr, A G; Black, T A; Dunn, A L; Bernier, P Y; Bergeron, O

    2009-01-01

    Net ecosystem productivity (NEP) of boreal coniferous forests is believed to rise with climate warming, thereby offsetting some of the rise in atmospheric CO(2) concentration (C(a)) by which warming is caused. However, the response of conifer NEP to warming may vary seasonally, with rises in spring and declines in summer. To gain more insight into this response, we compared changes in CO(2) exchange measured by eddy covariance and simulated by the ecosystem process model ecosys under rising mean annual air temperatures (T(a)) during 2004-2006 at black spruce stands in Saskatchewan, Manitoba and Quebec. Hourly net CO(2) uptake was found to rise with warming at T(a) < 15 degrees C and to decline with warming at T(a) > 20 degrees C. As mean annual T(a) rose from 2004 to 2006, increases in net CO(2) uptake with warming at lower T(a) were greater than declines with warming at higher T(a) so that annual gross primary productivity and hence NEP increased. Increases in net CO(2) uptake measured at lower T(a) were explained in the model by earlier recovery of photosynthetic capacity in spring, and by increases in carboxylation activity, using parameters for the Arrhenius temperature functions of key carboxylation processes derived from independent experiments. Declines in net CO(2) uptake measured at higher T(a) were explained in the model by sharp declines in mid-afternoon canopy stomatal conductance (g(c)) under higher vapor pressure deficits (D). These declines were modeled from a hydraulic constraint to water uptake imposed by low axial conductivity of conifer roots and boles that forced declines in canopy water potential (psi(c)), and hence in g(c) under higher D when equilibrating water uptake with transpiration. 
In a model sensitivity study, the contrasting responses of net CO(2) uptake to specified rises in T(a) caused annual NEP of black spruce in the model to rise with increases in T(a) of up to 6 degrees C, but to decline with further increases at mid-continental sites with lower precipitation. However, these contrasting responses to warming also indicate that rises in NEP with climate warming would depend on the seasonality (spring versus summer) as well as the magnitude of rises in T(a).

  9. Experimental evaluation of rigor mortis. III. Comparative study of the evolution of rigor mortis in different sized muscle groups in rats.

    PubMed

    Krompecher, T; Fryc, O

    1978-01-01

    The use of new methods and an appropriate apparatus has allowed us to make successive measurements of rigor mortis and a study of its evolution in the rat. By a comparative examination on the front and hind limbs, we have determined the following: (1) The muscular mass of the hind limbs is 2.89 times greater than that of the front limbs. (2) In the initial phase rigor mortis is more pronounced in the front limbs. (3) The front and hind limbs reach maximum rigor mortis at the same time and this state is maintained for 2 hours. (4) Resolution of rigor mortis is accelerated in the front limbs during the initial phase, but both front and hind limbs reach complete resolution at the same time.

  10. Onset of rigor mortis is earlier in red muscle than in white muscle.

    PubMed

    Kobayashi, M; Takatori, T; Nakajima, M; Sakurada, K; Hatanaka, K; Ikegaya, H; Matsuda, Y; Iwase, H

    2000-01-01

    Rigor mortis is thought to be related to falling ATP levels in muscles postmortem. We measured rigor mortis as tension determined isometrically in three rat leg muscles in liquid paraffin kept at 37 degrees C or 25 degrees C: two red muscles, the red gastrocnemius (RG) and soleus (SO), and one white muscle, the white gastrocnemius (WG). Onset, half and full rigor mortis occurred earlier in RG and SO than in WG both at 37 degrees C and at 25 degrees C, even though RG and WG were portions of the same muscle. This suggests that rigor mortis directly reflects the postmortem intramuscular ATP level, which decreases more rapidly in red muscle than in white muscle after death. Rigor mortis was more retarded at 25 degrees C than at 37 degrees C in each type of muscle.

  11. The thermo-elastic instability model of melting of alkali halides in the Debye approximation

    NASA Astrophysics Data System (ADS)

    Owens, Frank J.

    2018-05-01

    The Debye model of lattice vibrations of alkali halides is used to show that there is a temperature below the melting temperature where the vibrational pressure exceeds the electrostatic pressure. The onset temperature of this thermo-elastic instability scales as the melting temperature of NaCl, KCl, and KBr, suggesting its role in the melting of the alkali halides in agreement with a previous more rigorous model.

  12. Higher brain BDNF gene expression is associated with slower cognitive decline in older adults.

    PubMed

    Buchman, Aron S; Yu, Lei; Boyle, Patricia A; Schneider, Julie A; De Jager, Philip L; Bennett, David A

    2016-02-23

    We tested whether brain-derived neurotrophic factor (BDNF) gene expression levels are associated with cognitive decline in older adults. Five hundred thirty-five older participants underwent annual cognitive assessments and brain autopsy at death. BDNF gene expression was measured in the dorsolateral prefrontal cortex. Linear mixed models were used to examine whether BDNF expression was associated with cognitive decline adjusting for age, sex, and education. An interaction term was added to determine whether this association varied with clinical diagnosis proximate to death (no cognitive impairment, mild cognitive impairment, or dementia). Finally, we examined the extent to which the association of Alzheimer disease (AD) pathology with cognitive decline varied by BDNF expression. Higher brain BDNF expression was associated with slower cognitive decline (p < 0.001); cognitive decline was about 50% slower with the 90th percentile BDNF expression vs 10th. This association was strongest in individuals with dementia. The level of BDNF expression was lower in individuals with pathologic AD (p = 0.006), but was not associated with macroscopic infarcts, Lewy body disease, or hippocampal sclerosis. BDNF expression remained associated with cognitive decline in a model adjusting for age, sex, education, and neuropathologies (p < 0.001). Furthermore, the effect of AD pathology on cognitive decline varied by BDNF expression such that the effect was strongest for high levels of AD pathology (p = 0.015); thus, in individuals with high AD pathology (90th percentile), cognitive decline was about 40% slower with the 90th percentile BDNF expression vs 10th. Higher brain BDNF expression is associated with slower cognitive decline and may also reduce the deleterious effects of AD pathology on cognitive decline. © 2016 American Academy of Neurology.
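The linear mixed models described, repeated cognitive scores nested within subjects with an expression-by-time interaction testing whether BDNF modifies the rate of decline, can be sketched on synthetic data. The data-generating numbers and the statsmodels formula below are illustrative, not the study's specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 120, 5

subject = np.repeat(np.arange(n_subj), n_visits)
time = np.tile(np.arange(n_visits, dtype=float), n_subj)
bdnf_subj = rng.normal(size=n_subj)        # standardized expression, one per subject
bdnf = np.repeat(bdnf_subj, n_visits)

# Synthetic truth: everyone declines, but higher BDNF expression slows the decline
subj_slope = -0.5 + 0.2 * bdnf_subj + rng.normal(0, 0.1, n_subj)
subj_int = rng.normal(0, 0.5, n_subj)
cognition = (np.repeat(subj_int, n_visits)
             + np.repeat(subj_slope, n_visits) * time
             + rng.normal(0, 0.2, n_subj * n_visits))

df = pd.DataFrame({"subject": subject, "time": time,
                   "bdnf": bdnf, "cognition": cognition})

# Random intercept and slope per subject; the time:bdnf term asks whether
# expression modifies the rate of decline
model = smf.mixedlm("cognition ~ time * bdnf", df,
                    groups=df["subject"], re_formula="~time")
result = model.fit()
print(result.params["time:bdnf"])   # recovers a value near the simulated 0.2
```

A positive time-by-expression coefficient corresponds to the study's finding of slower decline at higher BDNF expression; the study's models additionally adjusted for age, sex, education, and neuropathologies.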

  13. MOE vs. M&E: considering the difference between measuring strategic effectiveness and monitoring tactical evaluation.

    PubMed

    Diehl, Glen; Major, Solomon

    2015-01-01

    Measuring the effectiveness of military Global Health Engagements (GHEs) has become an area of increasing interest to the military medical field. As a result, there have been efforts to more logically and rigorously evaluate GHE projects and programs; many of these have been based on the Logic and Results Frameworks. However, while these Frameworks are apt and appropriate planning tools, they are not ideally suited to measuring programs' effectiveness. This article introduces military medicine professionals to the Measures of Effectiveness for Defense Engagement and Learning (MODEL) program, which implements a new method of assessment, one that seeks to rigorously use Measures of Effectiveness (vs. Measures of Performance) to gauge programs' and projects' success and fidelity to Theater Campaign goals. While the MODEL method draws on the Logic and Results Frameworks where appropriate, it goes beyond their planning focus by using the latest social scientific and econometric evaluation methodologies to link on-the-ground GHE "lines of effort" to the realization of national and strategic goals and end-states. It is hoped these methods will find use beyond the MODEL project itself, and will catalyze a new body of rigorous, empirically based work, which measures the effectiveness of a broad spectrum of GHE and security cooperation activities. We based our strategies on the principle that it is much more cost-effective to prevent conflicts than it is to stop one once it's started. I cannot overstate the importance of our theater security cooperation programs as the centerpiece to securing our Homeland from the irregular and catastrophic threats of the 21st Century.-GEN James L. Jones, USMC (Ret.). Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.

  14. Standards and Methodological Rigor in Pulmonary Arterial Hypertension Preclinical and Translational Research.

    PubMed

    Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien

    2018-03-30

    Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research.
The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from academic institutions, funding agencies, and animal ethics authorities. © 2018 American Heart Association, Inc.

  15. Modes of Arctic Ocean Change from GRACE, ICESat and the PIOMAS and ECCO2 Models of the Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Peralta Ferriz, C.; Morison, J. H.; Bonin, J. A.; Chambers, D. P.; Kwok, R.; Zhang, J.

    2012-12-01

    EOF analysis of month-to-month variations in GRACE derived Arctic Ocean bottom pressure (OBP) with trend and seasonal variation removed yields three dominant modes. The first mode is a basin wide variation in mass associated with high atmospheric pressure (SLP) over Scandinavia mainly in winter. The second mode is a shift of mass from the central Arctic Ocean to the Siberian shelves due to low pressure over the basins, associated with the Arctic Oscillation. The third mode is a shift in mass between the Eastern and Western Siberian shelves, related to strength of the Beaufort High mainly in summer, and to eastward alongshore winds on the Barents Sea in winter. The PIOMAS and ECCO2 modeled OBP show fair agreement with the form of these modes and provide context in terms of variations in sea surface height (SSH). Comparing GRACE OBP from 2007 to 2011 with GRACE OBP from 2002 to 2006 reveals a rising trend over most of the Arctic Ocean but declines in the Kara Sea region and summer East Siberian Sea. ECCO2 bears a faint resemblance to the observed OBP change but appears to be biased negatively. In contrast, PIOMAS and especially ECCO2 SSH show changes between the two periods that are muted but similar to the ICESat dynamic ocean topography (DOT) and GRACE-ICESat freshwater trends from 2005 through 2008 [Morison et al., 2012], with a rising DOT and freshening in the Beaufort Sea and a trough with decreased freshwater on the Russian side of the Arctic Ocean. Morison, J., R. Kwok, C. Peralta-Ferriz, M. Alkire, I. Rigor, R. Andersen, and M. Steele (2012), Changing Arctic Ocean freshwater pathways, Nature, 481(7379), 66-70.

  16. Habitat fragmentation, vole population fluctuations, and the ROMPA hypothesis: An experimental test using model landscapes.

    PubMed

    Batzli, George O

    2016-11-01

    Increased habitat fragmentation leads to smaller habitat patches and to greater distance between patches. The ROMPA hypothesis (ratio of optimal to marginal patch area) uniquely links vole population fluctuations to the composition of the landscape. It states that as ROMPA decreases (fragmentation increases), vole population fluctuations will increase (including the tendency to display multi-annual cycles in abundance) because decreased proportions of optimal habitat result in greater population declines and longer recovery time after a harsh season. To date, only comparative observations in the field have supported the hypothesis. This paper reports the results of the first experimental test. I used prairie voles, Microtus ochrogaster, and mowed grassland to create model landscapes with 3 levels of ROMPA (high, 25% mowed; medium, 50% mowed; low, 75% mowed). As ROMPA decreased, distances between patches of favorable habitat (high cover) increased owing to a greater proportion of unfavorable (mowed) habitat. Results from the first year with intensive live trapping indicated that the preconditions for operation of the hypothesis existed (inversely density-dependent emigration and, as ROMPA decreased, increased per capita mortality and decreased per capita movement between optimal patches). Nevertheless, contrary to the prediction of the hypothesis that populations in landscapes with high ROMPA should have the lowest variability, 5 years of trapping indicated that variability was lowest with medium ROMPA. The design of field experiments may never be perfect, but these results indicate that the ROMPA hypothesis needs further rigorous testing. © 2016 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.

  17. Stroke trends in an aging population. The Technology Assessment Methods Project Team.

    PubMed

    Niessen, L W; Barendregt, J J; Bonneux, L; Koudstaal, P J

    1993-07-01

    Trends in stroke incidence and survival determine changes in stroke morbidity and mortality. This study examines the extent of the incidence decline and survival improvement in the Netherlands from 1979 to 1989. In addition, it projects future changes in stroke morbidity during the period 1985 to 2005, when the country's population will be aging. A state-event transition model is used, which combines Dutch population projections and existing data on stroke epidemiology. Based on the clinical course of stroke, the model describes historical national age- and sex-specific hospital admission and mortality rates for stroke. It extrapolates observed trends and projects future changes in stroke morbidity rates. There is evidence of a continuing incidence decline. The most plausible rate of change is an annual decline of -1.9% (range, -1.7% to -2.1%) for men and -2.4% (range, -2.3% to -2.8%) for women. Projecting a constant mortality decline, the model shows a 35% decrease of the stroke incidence rate for a period of 20 years. Prevalence rates for major stroke will decline among the younger age groups but increase among the oldest because of increased survival in the latter. In absolute numbers this results in an 18% decrease of acute stroke episodes and an 11% increase of major stroke cases. The increase in survival cannot fully explain the observed mortality decline and, therefore, a concomitant incidence decline has to be assumed. Aging of the population partially outweighs the effect of an incidence decline on the total burden of stroke. Increase in cardiovascular survival leads to a further increase in major stroke prevalence among the oldest age groups.
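As a back-of-envelope arithmetic check (not the state-event transition model itself), compounding the quoted annual incidence declines over the 20-year projection horizon lands near the 35% figure:

```python
# Compounding a constant annual incidence decline over a projection horizon.
# Rates from the abstract: -1.9%/yr (men) and -2.4%/yr (women), over 20 years.
def total_decline(annual_rate_pct: float, years: int) -> float:
    """Fractional decrease after `years` of a constant relative annual change."""
    return 1.0 - (1.0 + annual_rate_pct / 100.0) ** years

men = total_decline(-1.9, 20)    # about 0.32
women = total_decline(-2.4, 20)  # about 0.38
print(f"men: {men:.0%}, women: {women:.0%}")  # averages to roughly the 35% quoted
```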

  18. On the Modeling of Shells in Multibody Dynamics

    NASA Technical Reports Server (NTRS)

    Bauchau, Olivier A.; Choi, Jou-Young; Bottasso, Carlo L.

    2000-01-01

    Energy preserving/decaying schemes are presented for the simulation of nonlinear multibody systems involving shell components. The proposed schemes are designed to meet four specific requirements: unconditional nonlinear stability of the scheme, a rigorous treatment of both geometric and material nonlinearities, exact satisfaction of the constraints, and the presence of high-frequency numerical dissipation. The kinematic nonlinearities associated with arbitrarily large displacements and rotations of shells are treated in a rigorous manner, and the material nonlinearities can be handled when the constitutive laws stem from the existence of a strain energy density function. The efficiency and robustness of the proposed approach are illustrated with specific numerical examples that also demonstrate the need for integration schemes possessing high-frequency numerical dissipation.

  19. Origin of the spike-timing-dependent plasticity rule

    NASA Astrophysics Data System (ADS)

    Cho, Myoung Won; Choi, M. Y.

    2016-08-01

    A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.

  20. Ice-sheet response to oceanic forcing.

    PubMed

    Joughin, Ian; Alley, Richard B; Holland, David M

    2012-11-30

    The ice sheets of Greenland and Antarctica are losing ice at accelerating rates, much of which is a response to oceanic forcing, especially of the floating ice shelves. Recent observations establish a clear correspondence between the increased delivery of oceanic heat to the ice-sheet margin and increased ice loss. In Antarctica, most of these processes are reasonably well understood but have not been rigorously quantified. In Greenland, an understanding of the processes by which warmer ocean temperatures drive the observed retreat remains elusive. Experiments designed to identify the relevant processes are confounded by the logistical difficulties of instrumenting ice-choked fjords with actively calving glaciers. For both ice sheets, multiple challenges remain before the fully coupled ice-ocean-atmosphere models needed for rigorous sea-level projection are available.

  1. Properties of Coulomb crystals: rigorous results.

    PubMed

    Cioslowski, Jerzy

    2008-04-28

    Rigorous equalities and bounds for several properties of Coulomb crystals are presented. The energy e(N) per particle pair is shown to be a nondecreasing function of the particle number N for all clusters described by double-power-law pairwise-additive potentials ε(r) that are unbound at both r → 0 and r → ∞. A lower bound for the ratio of the mean reciprocal crystal radius and e(N) is derived. The leading term in the asymptotic expression for the shell capacity that appears in the recently introduced approximate model of Coulomb crystals is obtained, providing in turn explicit large-N asymptotics for e(N) and the mean crystal radius. In addition, properties of the harmonic vibrational spectra are investigated, producing an upper bound for the zero-point energy.

  2. Human milk banking in the volunteer sector: policy development and actuality in 1970s Australia.

    PubMed

    Thorley, Virginia

    2012-04-01

    Objective: to describe the development of rigorous milk banking policies in the voluntary sector in Australia, 1975-1979, by the non-government organisation, the Nursing Mothers' Association of Australia (now the Australian Breastfeeding Association), and the eventual abandonment of milk banking by the organisation. Design: historical article. Setting: Australia in the years 1975-1979. Findings: during the period in which the policy development described here took place, conducting a milk bank to the rigorous standards set by the organisation required too heavy an investment of hours by unpaid volunteer coordinators to be sustainable. Key conclusion: in establishing and continuing a successful milk bank, models which depend less on volunteer hours may be more sustainable. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. A projection of lesser prairie chicken (Tympanuchus pallidicinctus) populations range-wide

    USGS Publications Warehouse

    Cummings, Jonathan W.; Converse, Sarah J.; Moore, Clinton T.; Smith, David R.; Nichols, Clay T.; Allan, Nathan L.; O'Meilia, Chris M.

    2017-08-09

    We built a population viability analysis (PVA) model to predict future population status of the lesser prairie-chicken (Tympanuchus pallidicinctus, LEPC) in four ecoregions across the species’ range. The model results will be used in the U.S. Fish and Wildlife Service's (FWS) Species Status Assessment (SSA) for the LEPC. Our stochastic projection model combined demographic rate estimates from previously published literature with demographic rate estimates that integrate the influence of climate conditions. This LEPC PVA projects declining populations with estimated population growth rates well below 1 in each ecoregion regardless of habitat or climate change. These results are consistent with estimates of LEPC population growth rates derived from other demographic process models. Although the absolute magnitude of the decline is unlikely to be as low as modeling tools indicate, several different lines of evidence suggest LEPC populations are declining.
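A minimal sketch of the stochastic-projection logic behind a PVA (illustrative numbers only, not the LEPC demographic rates): when the mean annual growth rate lambda sits below 1, projected abundance declines regardless of environmental noise.

```python
import numpy as np

# Toy stochastic population projection: multiplicative growth with
# lognormal environmental variation around a mean annual rate (lambda).
rng = np.random.default_rng(42)

def project(n0: float, mean_lambda: float, sd: float, years: int, reps: int):
    """Project `reps` independent trajectories forward `years` steps."""
    n = np.full(reps, float(n0))
    for _ in range(years):
        n *= rng.lognormal(np.log(mean_lambda), sd, reps)
    return n

# With lambda = 0.90 (hypothetical), the median trajectory shrinks to
# roughly n0 * 0.9**30 of its starting size after 30 years.
final = project(1000, 0.90, sd=0.15, years=30, reps=5000)
print(np.median(final))
```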

  4. Steady-state and dynamic models for particle engulfment during solidification

    NASA Astrophysics Data System (ADS)

    Tao, Yutao; Yeckel, Andrew; Derby, Jeffrey J.

    2016-06-01

    Steady-state and dynamic models are developed to study the physical mechanisms that determine the pushing or engulfment of a solid particle at a moving solid-liquid interface. The mathematical model formulation rigorously accounts for energy and momentum conservation, while faithfully representing the interfacial phenomena affecting solidification phase change and particle motion. A numerical solution approach is developed using the Galerkin finite element method and elliptic mesh generation in an arbitrary Lagrangian-Eulerian implementation, thus allowing for a rigorous representation of forces and dynamics previously inaccessible to approaches using analytical approximations. We demonstrate that this model accurately computes the solidification interface shape while simultaneously resolving thin fluid layers around the particle that arise from premelting during particle engulfment. We reinterpret the significance of premelting via the definition of an unambiguous critical velocity for engulfment from steady-state analysis and bifurcation theory. We also explore the complicated transient behaviors that underlie the steady states of this system and posit the significance of dynamical behavior for engulfment events in many systems. We critically examine the onset of engulfment by comparing our computational predictions to those obtained using the analytical model of Rempel and Worster [29]. We assert that, while the accurate calculation of van der Waals repulsive forces remains an open issue, the computational model developed here provides a clear benefit over prior models for computing particle drag forces and other phenomena needed for the faithful simulation of particle engulfment.

  5. Measurements of the degree of development of rigor mortis as an indicator of stress in slaughtered pigs.

    PubMed

    Warriss, P D; Brown, S N; Knowles, T G

    2003-12-13

    The degree of development of rigor mortis in the carcases of slaughter pigs was assessed subjectively on a three-point scale 35 minutes after they were exsanguinated, and related to the levels of cortisol, lactate and creatine kinase in blood collected at exsanguination. Earlier rigor development was associated with higher concentrations of these stress indicators in the blood. This relationship suggests that the mean rigor score, and the frequency distribution of carcases that had or had not entered rigor, could be used as an index of the degree of stress to which the pigs had been subjected.

  6. Numerical Modeling of Sub-Wavelength Anti-Reflective Structures for Solar Module Applications

    PubMed Central

    Han, Katherine; Chang, Chih-Hung

    2014-01-01

    This paper reviews the current progress in mathematical modeling of anti-reflective subwavelength structures. Methods covered include effective medium theory (EMT), finite-difference time-domain (FDTD), transfer matrix method (TMM), the Fourier modal method (FMM)/rigorous coupled-wave analysis (RCWA) and the finite element method (FEM). Time-based solutions to Maxwell’s equations, such as FDTD, have the benefits of calculating reflectance for multiple wavelengths of light per simulation, but are computationally intensive. Space-discretized methods such as FDTD and FEM output field strength results over the whole geometry and are capable of modeling arbitrary shapes. Frequency-based solutions such as RCWA/FMM and FEM model one wavelength per simulation and are thus able to handle dispersion for regular geometries. Analytical approaches such as TMM are appropriate for very simple thin films. Initial disadvantages such as neglect of dispersion (FDTD), inaccuracy in TM polarization (RCWA), inability to model aperiodic gratings (RCWA), and inaccuracy with metallic materials (FDTD) have been overcome by most modern software. All rigorous numerical methods have accurately predicted the broadband reflection of ideal, graded-index anti-reflective subwavelength structures; ideal structures are tapered nanostructures with periods smaller than the wavelengths of light of interest and lengths that are at least a large portion of the wavelengths considered. PMID:28348287
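Of the methods reviewed, TMM is the simplest to sketch. Below is a minimal normal-incidence, non-absorbing, single-layer example using the standard characteristic-matrix formulation; the quarter-wave MgF2-on-glass numbers are textbook values, not results from the review:

```python
import numpy as np

def single_layer_R(n0: float, n1: float, n_sub: float, d: float, lam: float) -> float:
    """Normal-incidence reflectance of one dielectric layer on a substrate
    via the characteristic (transfer) matrix of thin-film optics."""
    delta = 2.0 * np.pi * n1 * d / lam          # phase thickness of the layer
    M = np.array([[np.cos(delta), 1j * np.sin(delta) / n1],
                  [1j * n1 * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])           # effective admittance vector
    r = (n0 * B - C) / (n0 * B + C)             # amplitude reflection coefficient
    return float(np.abs(r) ** 2)

# Quarter-wave MgF2 (n = 1.38) on glass (n = 1.52) at 550 nm:
lam = 550e-9
R_coated = single_layer_R(1.0, 1.38, 1.52, d=lam / (4 * 1.38), lam=lam)
R_bare = ((1.0 - 1.52) / (1.0 + 1.52)) ** 2
print(R_coated, R_bare)  # roughly 1.3% coated vs 4.3% bare glass
```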

  7. Open Pit Mine 3d Mapping by Tls and Digital Photogrammetry: 3d Model Update Thanks to a Slam Based Approach

    NASA Astrophysics Data System (ADS)

    Vassena, G.; Clerici, A.

    2018-05-01

    The state of the art of 3D surveying technologies, if correctly applied, allows to obtain 3D coloured models of large open pit mines using different technologies as terrestrial laser scanner (TLS), with images, combined with UAV based digital photogrammetry. GNSS and/or total station are also currently used to geo reference the model. The University of Brescia has been realised a project to map in 3D an open pit mine located in Botticino, a famous location of marble extraction close to Brescia in North Italy. Terrestrial Laser Scanner 3D point clouds combined with RGB images and digital photogrammetry from UAV have been used to map a large part of the cave. By rigorous and well know procedures a 3D point cloud and mesh model have been obtained using an easy and rigorous approach. After the description of the combined mapping process, the paper describes the innovative process proposed for the daily/weekly update of the model itself. To realize this task a SLAM technology approach is described, using an innovative approach based on an innovative instrument capable to run an automatic localization process and real time on the field change detection analysis.

  8. Delay Discounting: I'm a "K", You're a "K"

    ERIC Educational Resources Information Center

    Odum, Amy L.

    2011-01-01

    Delay discounting is the decline in the present value of a reward with delay to its receipt. Across a variety of species, populations, and reward types, value declines hyperbolically with delay. Value declines steeply with shorter delays, but more shallowly with longer delays. Quantitative modeling provides precise measures to characterize the…
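The hyperbolic form the abstract refers to is Mazur's model, V = A / (1 + kD), where A is the reward amount, D the delay, and k the individual discounting parameter (the "K" of the title). A minimal sketch:

```python
# Mazur's hyperbolic delay-discounting model: V = A / (1 + k*D).
def discounted_value(amount: float, delay: float, k: float) -> float:
    """Present value of `amount` received after `delay`, with discounting rate k."""
    return amount / (1.0 + k * delay)

# Steep early decline, shallower later: equal increments of delay cost
# progressively less value, which is the hallmark of the hyperbola.
v0 = discounted_value(100, 0, k=0.1)    # 100.0
v10 = discounted_value(100, 10, k=0.1)  # 50.0
v20 = discounted_value(100, 20, k=0.1)  # ~33.3
print(v0 - v10, v10 - v20)  # the first 10 units of delay cost more than the next 10
```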

  9. Association of Crossword Puzzle Participation with Memory Decline in Persons Who Develop Dementia

    PubMed Central

    Pillai, Jagan A.; Hall, Charles B.; Dickson, Dennis W.; Buschke, Herman; Lipton, Richard B.; Verghese, Joe

    2013-01-01

    Participation in cognitively stimulating leisure activities such as crossword puzzles may delay onset of the memory decline in the preclinical stages of dementia, possibly via its effect on improving cognitive reserve. We followed 488 initially cognitively intact community residing individuals with clinical and cognitive assessments every 12–18 months in the Bronx Aging Study. We assessed the influence of crossword puzzle participation on the onset of accelerated memory decline as measured by the Buschke Selective Reminding Test in 101 individuals who developed incident dementia using a change point model. Crossword puzzle participation at baseline delayed onset of accelerated memory decline by 2.54 years. Inclusion of education or participation in other cognitively stimulating activities did not significantly add to the fit of the model beyond the effect of puzzles. Our findings show that late life crossword puzzle participation, independent of education, was associated with delayed onset of memory decline in persons who developed dementia. Given the wide availability and accessibility of crossword puzzles, their role in preventing cognitive decline should be validated in future clinical trials. PMID:22040899

  10. BioAge: Toward A Multi-Determined, Mechanistic Account of Cognitive Aging

    PubMed Central

    DeCarlo, Correne A.; Tuokko, Holly A.; Williams, Dorothy; Dixon, Roger A.; MacDonald, Stuart W.S.

    2014-01-01

    The search for reliable early indicators of age-related cognitive decline represents a critical avenue for progress in aging research. Chronological age is a commonly used developmental index; however, it offers little insight into the mechanisms underlying cognitive decline. In contrast, biological age (BioAge), reflecting the vitality of essential biological systems, represents a promising operationalization of developmental time. Current BioAge models have successfully predicted age-related cognitive deficits. Research on aging-related cognitive function indicates that the interaction of multiple risk and protective factors across the human lifespan confers individual risk for late-life cognitive decline, implicating a multi-causal explanation. In this review, we explore current BioAge models, describe three broad yet pathologically relevant biological processes linked to cognitive decline, and propose a novel operationalization of BioAge accounting for both moderating and causal mechanisms of cognitive decline and dementia. We argue that a multivariate and mechanistic BioAge approach will lead to a greater understanding of disease pathology as well as more accurate prediction and early identification of late-life cognitive decline. PMID:25278166

  11. BioAge: toward a multi-determined, mechanistic account of cognitive aging.

    PubMed

    DeCarlo, Correne A; Tuokko, Holly A; Williams, Dorothy; Dixon, Roger A; MacDonald, Stuart W S

    2014-11-01

    The search for reliable early indicators of age-related cognitive decline represents a critical avenue for progress in aging research. Chronological age is a commonly used developmental index; however, it offers little insight into the mechanisms underlying cognitive decline. In contrast, biological age (BioAge), reflecting the vitality of essential biological systems, represents a promising operationalization of developmental time. Current BioAge models have successfully predicted age-related cognitive deficits. Research on aging-related cognitive function indicates that the interaction of multiple risk and protective factors across the human lifespan confers individual risk for late-life cognitive decline, implicating a multi-causal explanation. In this review, we explore current BioAge models, describe three broad yet pathologically relevant biological processes linked to cognitive decline, and propose a novel operationalization of BioAge accounting for both moderating and causal mechanisms of cognitive decline and dementia. We argue that a multivariate and mechanistic BioAge approach will lead to a greater understanding of disease pathology as well as more accurate prediction and early identification of late-life cognitive decline. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Decline in Antarctic Ozone Depletion and Lower Stratospheric Chlorine Determined From Aura Microwave Limb Sounder Observations

    NASA Astrophysics Data System (ADS)

    Strahan, Susan E.; Douglass, Anne R.

    2018-01-01

    Attribution of Antarctic ozone recovery to the Montreal Protocol requires evidence that (1) Antarctic chlorine levels are declining and (2) there is a reduction in ozone depletion in response to a chlorine decline. We use Aura Microwave Limb Sounder measurements of O3, HCl, and N2O to demonstrate that inorganic chlorine (Cly) from 2013 to 2016 was 223 ± 93 parts per trillion lower in the Antarctic lower stratosphere than from 2004 to 2007 and that column ozone depletion declined in response. The mean Cly decline rate, 0.8%/yr, agrees with the expected rate based on chlorofluorocarbon lifetimes. N2O measurements are crucial for identifying changes in stratospheric Cly loading independent of dynamical variability. From 2005 to 2016, the ozone depletion and Cly time series show matching periods of decline, stability, and increase. The observed sensitivity of O3 depletion to changing Cly agrees with the sensitivity simulated by the Global Modeling Initiative chemistry transport model integrated with Modern Era Retrospective Analysis for Research and Applications 2 meteorology.

  13. The declining influence of family background on educational attainment in Australia: The role of measured and unmeasured influences.

    PubMed

    Marks, Gary N; Mooi-Reci, Irma

    2016-01-01

    The paper examines changes in the influence of family background, including socioeconomic and social background variables on educational attainment in Australia for cohorts born between 1890 and 1982. We test hypotheses from modernization theory on sibling data using random effects models and find: (i) substantial declines in the influence of family background on educational attainment (indicated by the sibling intraclass correlations); (ii) declines in the effects of both economic and cultural socioeconomic background variables; (iii) changes in the effects of some social background variables (e.g., family size); (iv) and declines in the extent that socioeconomic and social background factors account for variation in educational attainment. Unmeasured family background factors are more important, and proportionally increasingly so, for educational attainment than the measured socioeconomic and social background factors analyzed. Fixed effects models showed steeper declines in the effects of socioeconomic background variables than in standard analyses suggesting that unmeasured family factors associated with socioeconomic background obscure the full extent of the decline. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Coarse-grained stochastic processes and kinetic Monte Carlo simulators for the diffusion of interacting particles

    NASA Astrophysics Data System (ADS)

    Katsoulakis, Markos A.; Vlachos, Dionisios G.

    2003-11-01

    We derive a hierarchy of successively coarse-grained stochastic processes and associated coarse-grained Monte Carlo (CGMC) algorithms directly from the microscopic processes, as approximations at larger length scales, for the case of diffusion of interacting particles on a lattice. This hierarchy of models spans length scales between microscopic and mesoscopic, satisfies detailed balance, and gives self-consistent fluctuation mechanisms whose noise is asymptotically identical to the microscopic MC. Rigorous, detailed asymptotics justify and clarify these connections. Gradient continuous-time microscopic MC and CGMC simulations are compared under far-from-equilibrium conditions to illustrate the validity of our theory and delineate the errors obtained by rigorous asymptotics. Information theory estimates are employed for the first time to provide rigorous error estimates between the solutions of microscopic MC and CGMC, describing the loss of information during the coarse-graining process. Simulations under periodic boundary conditions are used to verify the information theory error estimates. It is shown that coarse-graining in space leads also to coarse-graining in time by q², where q is the level of coarse-graining, and overcomes in part the hydrodynamic slowdown. Operation counting and CGMC simulations demonstrate significant CPU savings in continuous-time MC simulations that vary from q³ for short potentials to q⁴ for long potentials. Finally, connections of the new coarse-grained stochastic processes to stochastic mesoscopic and Cahn-Hilliard-Cook models are made.

  15. Terminal decline and practice effects in older adults without dementia: the MoVIES project.

    PubMed

    Dodge, Hiroko H; Wang, Chia-Ning; Chang, Chung-Chou H; Ganguli, Mary

    2011-08-23

    To track cognitive change over time in dementia-free older adults and to examine terminal cognitive decline. A total of 1,230 subjects who remained free from dementia over 14 years of follow-up were included in a population-based epidemiologic cohort study. First, we compared survivors and decedents on their trajectories of 5 cognitive functions (learning, memory, language, psychomotor speed, executive functions), dissociating practice effects which can mask clinically significant decline from age-associated cognitive decline. We used longitudinal mixed-effects models with penalized linear spline. Second, limiting the sample to 613 subjects who died during follow-up, we identified the inflection points at which the rate of cognitive decline accelerated, in relation to time of death, controlling for practice effects. We used mixed-effects model with a change point. Age-associated cognitive trajectories were similar between decedents and survivors without dementia. However, substantial differences were observed between the trajectories of practice effects of survivors and decedents, resembling those usually observed between normal and mildly cognitively impaired elderly. Executive and language functions showed the earliest terminal declines, more than 9 years prior to death, independent of practice effects. Terminal cognitive decline in older adults without dementia may reflect presymptomatic disease which does not cross the clinical threshold during life. Alternatively, cognitive decline attributed to normal aging may itself represent underlying neurodegenerative or vascular pathology. Although we cannot conclude definitively from this study, the separation of practice effects from age-associated decline could help identify preclinical dementia.
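The change-point idea can be sketched with a toy broken-stick fit, using synthetic data and grid-search least squares as a stand-in for the study's mixed-effects change-point model:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(-14.0, 0.0, 57)      # years relative to death (hypothetical data)
true_cp = -9.0                        # onset of accelerated (terminal) decline
y = 30.0 + 0.05 * t - 0.8 * np.maximum(0.0, t - true_cp)
y += rng.normal(0.0, 0.15, t.size)    # measurement noise

def fit_changepoint(t, y, grid):
    """Grid-search least squares for a broken-stick (change-point) model:
    intercept + slope * t + extra_slope * max(0, t - cp)."""
    best_sse, best_cp = np.inf, None
    for cp in grid:
        X = np.column_stack([np.ones_like(t), t, np.maximum(0.0, t - cp)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if sse < best_sse:
            best_sse, best_cp = sse, cp
    return best_cp

cp_hat = fit_changepoint(t, y, np.linspace(-12.0, -3.0, 91))
print(round(cp_hat, 1))  # estimated onset year (simulated truth: -9.0)
```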

  16. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    NASA Astrophysics Data System (ADS)

    Ngampitipan, Tritos; Boonserm, Petarpa; Chatrabhuti, Auttakit; Visser, Matt

    2016-06-01

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can be generated, and the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy the bounds decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. The rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound proves to be useful.

  17. The rigorous bound on the transmission probability for massless scalar field of non-negative-angular-momentum mode emitted from a Myers-Perry black hole

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ngampitipan, Tritos, E-mail: tritos.ngampitipan@gmail.com; Particle Physics Research Laboratory, Department of Physics, Faculty of Science, Chulalongkorn University, Phayathai Road, Patumwan, Bangkok 10330; Boonserm, Petarpa, E-mail: petarpa.boonserm@gmail.com

    Hawking radiation is evidence for the existence of black holes. What an observer can measure through Hawking radiation is the transmission probability. In the laboratory, miniature black holes can be generated, and the generated black holes are, most commonly, Myers-Perry black holes. In this paper, we derive rigorous bounds on the transmission probabilities for massless scalar fields of non-negative-angular-momentum modes emitted from a generated Myers-Perry black hole in six, seven, and eight dimensions. The results show that for low energy, the rigorous bounds increase with the energy of the emitted particles, whereas for high energy the bounds decrease with increasing energy. When the black holes spin faster, the rigorous bounds decrease. The rigorous bounds also decrease with the number of extra dimensions. Furthermore, in comparison with the approximate transmission probability, the rigorous bound proves to be useful.

  18. Do commonly used frailty models predict mortality, loss of autonomy and mental decline in older adults in northwestern Russia? A prospective cohort study.

    PubMed

    Turusheva, Anna; Frolova, Elena; Korystina, Elena; Zelenukha, Dmitry; Tadjibaev, Pulodjon; Gurina, Natalia; Turkeshi, Eralda; Degryse, Jean-Marie

    2016-05-09

    Frailty prevalence differs across countries depending on the models used to assess it, which are based on various conceptual and operational definitions. This study aims to assess the clinical validity of three frailty models among community-dwelling older adults in north-western Russia, where there is a higher incidence of cardiovascular disease and lower life expectancy than in European countries. The Crystal study is a population-based prospective cohort study in Kolpino, St. Petersburg, Russia. A random sample of the population living in the district was stratified into two age groups, 65-75 (n = 305) and 75+ (n = 306), and had a baseline comprehensive health assessment followed by a second one after 33.4 ± 3 months. The total observation time was 47 ± 14.6 months. Frailty was assessed according to the models of Fried, Puts and Steverink-Slaets. Its association with mortality at 5 years of follow-up, as well as dependency and mental and physical decline at around 2.5 years of follow-up, was explored by multivariable and time-to-event analyses. Mortality was predicted independently of age, sex and comorbidities only by frail status in the Fried model in those over 75 years old [HR (95% CI) = 2.50 (1.20-5.20)]. Mental decline was independently predicted only by pre-frail [OR (95% CI) = 0.24 (0.10-0.55)] and frail [OR (95% CI) = 0.196 (0.06-0.67)] status in the Fried model in those 65-75 years old. The prediction of dependency and physical decline by pre-frail and frail status of any of the three frailty models was not statistically significant in this cohort of older adults. None of the three frailty models was valid for predicting 5-year mortality or disability, mental and physical decline at 2.5 years in a cohort of older adults in north-west Russia. Frailty by the Fried model had only limited value for mortality in those over 75 years old and mental decline in those 65-75 years old. Further research is needed to identify valid frailty markers for older adults in this population.

  19. Understanding the temporal dimension of the red-edge spectral region for forest decline detection using high-resolution hyperspectral and Sentinel-2a imagery

    NASA Astrophysics Data System (ADS)

    Zarco-Tejada, P. J.; Hornero, A.; Hernández-Clemente, R.; Beck, P. S. A.

    2018-03-01

    The operational monitoring of forest decline requires the development of remote sensing methods that are sensitive to the spatiotemporal variations of pigment degradation and canopy defoliation. In this context, the red-edge spectral region (RESR) was proposed in the past due to its combined sensitivity to chlorophyll content and leaf area variation. In this study, the temporal dimension of the RESR was evaluated as a function of forest decline using a radiative transfer method with the PROSPECT and 3D FLIGHT models. These models were used to generate synthetic pine stands simulating decline and recovery processes over time and to explore the temporal rate of change of the red-edge chlorophyll index (CI) as compared to the trajectories obtained for the structure-related Normalized Difference Vegetation Index (NDVI). The temporal trend method proposed here consisted of using synthetic spectra to calculate the theoretical boundaries of the subspace for healthy and declining pine trees in the temporal domain, defined by CI(t=n)/CI(t=n+1) vs. NDVI(t=n)/NDVI(t=n+1). Within these boundaries, trees undergoing decline and recovery processes showed different trajectories through this subspace. The method was then validated using three high-resolution airborne hyperspectral images acquired at 40 cm resolution with 260 spectral bands of 6.5 nm full-width half-maximum (FWHM) over a forest with widespread tree decline, along with field-based monitoring of chlorosis and defoliation (i.e., 'decline' status) in 663 trees between the years 2015 and 2016. The temporal rate of change of chlorophyll vs. structural indices, based on reflectance spectra extracted from the hyperspectral images, was different for trees undergoing decline, and aligned towards the decline baseline established using the radiative transfer models. By contrast, healthy trees over time aligned towards the theoretically obtained healthy baseline.
The applicability of this temporal trend method to the red-edge bands of the MultiSpectral Imager (MSI) instrument on board Sentinel-2a for operational forest status monitoring was also explored by comparing the temporal rate of change of the Sentinel-2-derived CI over areas with declining and healthy trees. Results demonstrated that the Sentinel-2a red-edge region was sensitive to the temporal dimension of forest condition, as the relationships obtained for pixels in healthy condition deviated from those of pixels undergoing decline.
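    The temporal trend described above can be sketched numerically. A minimal illustration, assuming one common formulation of the red-edge chlorophyll index (CI = R_NIR / R_red-edge - 1) and the standard NDVI; the reflectance values below are invented for illustration, not taken from the study:

    ```python
    def ndvi(nir, red):
        # Normalized Difference Vegetation Index: sensitive to canopy structure/leaf area.
        return (nir - red) / (nir + red)

    def ci_red_edge(nir, red_edge):
        # Red-edge chlorophyll index, one common formulation: R_NIR / R_red-edge - 1.
        return nir / red_edge - 1.0

    # Hypothetical mean canopy reflectances (red, red-edge, NIR) at two dates t=n, t=n+1.
    # A declining tree loses chlorophyll faster than leaf area, so CI falls more than NDVI.
    healthy = {"red": (0.04, 0.04), "red_edge": (0.12, 0.12), "nir": (0.45, 0.45)}
    decline = {"red": (0.05, 0.08), "red_edge": (0.13, 0.20), "nir": (0.44, 0.40)}

    def temporal_ratios(bands):
        ci = [ci_red_edge(bands["nir"][t], bands["red_edge"][t]) for t in (0, 1)]
        nd = [ndvi(bands["nir"][t], bands["red"][t]) for t in (0, 1)]
        # Coordinates in the CI(t=n)/CI(t=n+1) vs. NDVI(t=n)/NDVI(t=n+1) subspace.
        return ci[0] / ci[1], nd[0] / nd[1]

    print(temporal_ratios(healthy))   # stable trajectory stays near (1, 1)
    print(temporal_ratios(decline))   # CI ratio rises well above the NDVI ratio
    ```

    The point of the subspace is visible even in this toy case: a stable canopy sits near (1, 1), while pigment loss pushes the CI ratio away from the NDVI ratio.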

  20. Mathematical modeling of liquid/liquid hollow fiber membrane contactor accounting for interfacial transport phenomena: Extraction of lanthanides as a surrogate for actinides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, J.D.

    1994-08-04

    This report is divided into two parts. The first part covers: membrane and membrane process (a concept); metal extraction; kinetics of metal extraction; modeling the membrane contactor; and interfacial phenomena (boundary conditions) applied to membrane transport. The second part covers: experimental protocol; modeling the hollow fiber extractor using film theory; the Graetz model of the hollow fiber membrane process; a fundamental diffusive-kinetic model; and the diffusive liquid membrane device (a rigorous model).

  1. Noncommunicable Diseases: Three Decades Of Global Data Show A Mixture Of Increases And Decreases In Mortality Rates.

    PubMed

    Ali, Mohammed K; Jaacks, Lindsay M; Kowalski, Alysse J; Siegel, Karen R; Ezzati, Majid

    2015-09-01

    Noncommunicable diseases are the leading health concerns of the modern era, accounting for two-thirds of global deaths, half of all disability, and rapidly growing costs. To provide a contemporary overview of the burdens caused by noncommunicable diseases, we compiled mortality data reported by authorities in forty-nine countries for atherosclerotic cardiovascular diseases; diabetes; chronic respiratory diseases; and lung, colon, breast, cervical, liver, and stomach cancers. From 1980 to 2012, on average across all countries, mortality from cardiovascular disease, stomach cancer, and cervical cancer declined, while mortality from diabetes and liver cancer, and from chronic respiratory disease and lung cancer among women, increased. In contrast to the relatively steep cardiovascular and cancer mortality declines observed in high-income countries, mortality from cardiovascular disease and chronic respiratory disease was flat in most low- and middle-income countries, which also experienced increasing breast and colon cancer mortality. These divergent mortality patterns likely reflect differences in the timing and magnitude of risk exposures, in health care, and in policies to counteract the diseases. Improving both the coverage and the accuracy of mortality documentation in populous low- and middle-income countries is a priority, as is the need to rigorously evaluate societal-level interventions. Furthermore, given the complex, chronic, and progressive nature of noncommunicable diseases, policies and programs to prevent and control them need to be multifaceted and long-term, as returns on investment accrue with time. Project HOPE—The People-to-People Health Foundation, Inc.

  2. Using constraints and their value for optimization of large ODE systems

    PubMed Central

    Domijan, Mirela; Rand, David A.

    2015-01-01

    We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use this to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300

  3. Holistic Competence: Putting Judgements First

    ERIC Educational Resources Information Center

    Beckett, David

    2008-01-01

    Professional practice can be conceptualised holistically, and in fact during the 1990s the "Australian model" of integrated or holistic competence emerged empirically. This piece outlines that story, and then develops a more rigorous conceptual analysis of what it is to make competent practical judgements, through inferences, in…

  4. Memory sparing, fast scattering formalism for rigorous diffraction modeling

    NASA Astrophysics Data System (ADS)

    Iff, W.; Kämpfe, T.; Jourlin, Y.; Tishchenko, A. V.

    2017-07-01

    The basics and algorithmic steps of a novel scattering formalism suited for memory sparing and fast electromagnetic calculations are presented. The formalism, called ‘S-vector algorithm’ (by analogy with the known scattering-matrix algorithm), allows the calculation of the collective scattering spectra of individual layered micro-structured scattering objects. A rigorous method of linear complexity is applied to model the scattering at individual layers; here the generalized source method (GSM) resorting to Fourier harmonics as basis functions is used as one possible method of linear complexity. The concatenation of the individual scattering events can be achieved sequentially or in parallel, both having pros and cons. The present development will largely concentrate on a consecutive approach based on the multiple reflection series. The latter will be reformulated into an implicit formalism which will be associated with an iterative solver, resulting in improved convergence. The examples will first refer to 1D grating diffraction for the sake of simplicity and intelligibility, with a final 2D application example.
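    The core algebraic idea, reformulating an explicit multiple reflection series into an implicit equation solved iteratively, can be illustrated on a toy linear system. This is a hedged sketch under invented operators, not the GSM or the S-vector algorithm itself: writing the total scattered field as S = A + B·S, the geometric reflection series Σ Bᵏ A is recovered by fixed-point iteration, which converges whenever the spectral radius of B is below one.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy operators standing in for interface scattering: A is the single-pass
    # scattered field, B the round-trip (multiple reflection) operator, rescaled
    # so the reflection series converges (spectral radius of B set to 0.5).
    A = rng.normal(size=4)
    B = rng.normal(size=(4, 4))
    B *= 0.5 / np.abs(np.linalg.eigvals(B)).max()

    # Explicit multiple reflection series: S = A + B A + B^2 A + ...
    S_series = np.zeros(4)
    term = A.copy()
    for _ in range(60):
        S_series += term
        term = B @ term

    # Implicit reformulation S = A + B S, solved by fixed-point iteration.
    S_fixed = np.zeros(4)
    for _ in range(60):
        S_fixed = A + B @ S_fixed

    # Both agree with the closed-form solution (I - B)^{-1} A.
    S_exact = np.linalg.solve(np.eye(4) - B, A)
    ```

    In practice the implicit form is paired with a preconditioned iterative solver rather than plain fixed-point iteration, which is where the improved convergence mentioned in the abstract comes from.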

  5. Robust source and mask optimization compensating for mask topography effects in computational lithography.

    PubMed

    Li, Jia; Lam, Edmund Y

    2014-04-21

    Mask topography effects need to be taken into consideration for a more accurate solution of source mask optimization (SMO) in advanced optical lithography. However, rigorous 3D mask models generally involve intensive computation and conventional SMO fails to manipulate the mask-induced undesired phase errors that degrade the usable depth of focus (uDOF) and process yield. In this work, an optimization approach incorporating pupil wavefront aberrations into SMO procedure is developed as an alternative to maximize the uDOF. We first design the pupil wavefront function by adding primary and secondary spherical aberrations through the coefficients of the Zernike polynomials, and then apply the conjugate gradient method to achieve an optimal source-mask pair under the condition of aberrated pupil. We also use a statistical model to determine the Zernike coefficients for the phase control and adjustment. Rigorous simulations of thick masks show that this approach provides compensation for mask topography effects by improving the pattern fidelity and increasing uDOF.

  6. The Role of Forest Tent Caterpillar Defoliations and Partial Harvest in the Decline and Death of Sugar Maple

    PubMed Central

    Hartmann, Henrik; Messier, Christian

    2008-01-01

    Background and Aims Natural and anthropogenic disturbances can act as stresses on tree vigour. According to Manion's conceptual model of tree disease, the initial vigour of trees decreases as a result of predisposing factors that render these trees more vulnerable to severe inciting stresses, stresses that can then cause final vigour decline and subsequent tree death. This tree disease model was tested in sugar maple (Acer saccharum) by assessing the roles of natural and anthropogenic disturbances in tree decline and death. Methods Radial growth data from 377 sugar maple trees that had undergone both defoliation by insects and partial harvest were used to estimate longitudinal survival probabilities as a proxy for tree vigour. Radial growth rates and survival probabilities were compared among trees subjected to different levels of above- and below-ground disturbance, between periods of defoliation and harvest, and between live and dead trees. Key Results Manion's tree disease model correctly accounts for vigour decline and tree death in sugar maple; tree growth and vigour were negatively affected by a first defoliation, predisposing these trees to death later during the study period when a second insect outbreak initiated a final vigour decline. This decline was accelerated by the partial harvest disturbance in 1993. Unlike insect defoliation, even the most severe anthropogenic disturbances from partial harvest did not cause any growth or vigour decline in live sugar maple. Conclusions Natural disturbances acted as predisposing and inciting stresses in sugar maple decline and death. Anthropogenic disturbances from a partial harvest at worst accelerated a decline in trees that were already weakened by predisposing and inciting stresses (i.e. repeated insect defoliations). Favourable climatic conditions just before and after the partial harvest may have alleviated possible negative effects of harvesting on growth. PMID:18660493

  7. Reframing Rigor: A Modern Look at Challenge and Support in Higher Education

    ERIC Educational Resources Information Center

    Campbell, Corbin M.; Dortch, Deniece; Burt, Brian A.

    2018-01-01

    This chapter describes the limitations of the traditional notions of academic rigor in higher education, and brings forth a new form of rigor that has the potential to support student success and equity.

  8. The link between rapid enigmatic amphibian decline and the globally emerging chytrid fungus.

    PubMed

    Lötters, Stefan; Kielgast, Jos; Bielby, Jon; Schmidtlein, Sebastian; Bosch, Jaime; Veith, Michael; Walker, Susan F; Fisher, Matthew C; Rödder, Dennis

    2009-09-01

    Amphibians are globally declining and approximately one-third of all species are threatened with extinction. Some of the most severe declines have occurred suddenly, and for unknown reasons, in apparently pristine habitats. It has been hypothesized that these "rapid enigmatic declines" are the result of a panzootic of the disease chytridiomycosis, caused by the globally emerging amphibian chytrid fungus. Using a species distribution model, we identified the potential distribution of this pathogen. Areas and species from which rapid enigmatic declines are known significantly overlap with those of highest environmental suitability for the chytrid fungus. We confirm the plausibility of a link between rapid enigmatic declines in amphibian species worldwide and epizootic chytridiomycosis.

  9. Modeling Longitudinal Changes in Older Adults’ Memory for Spoken Discourse: Findings from the ACTIVE Cohort

    PubMed Central

    Payne, Brennan R.; Gross, Alden L.; Parisi, Jeanine M.; Sisco, Shannon M.; Stine-Morrow, Elizabeth A. L.; Marsiske, Michael; Rebok, George W.

    2014-01-01

    Episodic memory shows substantial declines with advancing age, but research on longitudinal trajectories of spoken discourse memory (SDM) in older adulthood is limited. Using parallel process latent growth curve models, we examined 10 years of longitudinal data from the no-contact control group (N = 698) of the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) randomized controlled trial in order to test (a) the degree to which SDM declines with advancing age, (b) predictors of these age-related declines, and (c) the within-person relationship between longitudinal changes in SDM and longitudinal changes in fluid reasoning and verbal ability over 10 years, independent of age. Individuals who were younger, White, had more years of formal education, were male, and had better global cognitive function and episodic memory performance at baseline demonstrated greater levels of SDM on average. However, only age at baseline uniquely predicted longitudinal changes in SDM, such that declines accelerated with greater age. Independent of age, within-person decline in reasoning ability over the 10-year study period was substantially correlated with decline in SDM (r = .87). An analogous association with SDM did not hold for verbal ability. The findings suggest that longitudinal declines in fluid cognition are associated with reduced spoken language comprehension. Unlike findings from memory for written prose, preserved verbal ability may not protect against developmental declines in memory for speech. PMID:24304364

  10. Arresting Decline in Shared Governance: Towards a Flexible Model for Academic Participation

    ERIC Educational Resources Information Center

    Lapworth, Susan

    2004-01-01

    This paper considers tensions between corporate models of governance focused on the governing body and more traditional, consensual academic approaches. It argues that despite these tensions, a decline in the role of the academic community in matters of institutional governance (shared governance) is neither desirable nor inevitable, and that…

  11. Demand for resident hunting in the southeastern United States

    Treesearch

    Neelam Poudyal; Seong Hoon Cho; J. Michael Bowker

    2008-01-01

    We modeled hunting demand among resident hunters in the Southeastern United States. Our model revealed that future hunting demand will likely decline in this region. Population growth in the region will increase demand, but structural change in the region's demography (e.g., "browning" and "aging"), along with declining forestland access, will...

  12. Assessing Rigor in Experiential Education: A Working Model from Partners in the Parks

    ERIC Educational Resources Information Center

    MacLean, John S.; White, Brain J.

    2013-01-01

    Assessment has become a popular buzzword on academic campuses over the last few decades. Most assessment models are designed to evaluate traditional learning structures. If we were to state simply the process of assessment, it might read like this: (1) what you want the students to learn; (2) how you want to teach the material; and (3) how you…

  13. Predictions on the Development Dimensions of Provincial Tourism Discipline Based on the Artificial Neural Network BP Model

    ERIC Educational Resources Information Center

    Yang, Yang; Hu, Jun; Lv, Yingchun; Zhang, Mu

    2013-01-01

    As the tourism industry has gradually become the strategic mainstay industry of the national economy, the scope of the tourism discipline has developed rigorously. This paper makes a predictive study on the development of the scope of Guangdong provincial tourism discipline based on the artificial neural network BP model in order to find out how…

  14. Advanced statistical methods to study the effects of gastric tube and non-invasive ventilation on functional decline and survival in amyotrophic lateral sclerosis.

    PubMed

    Atassi, Nazem; Cudkowicz, Merit E; Schoenfeld, David A

    2011-07-01

    A few studies suggest that non-invasive ventilation (NIV) and gastric tube (G-tube) placement may have a positive impact on survival, but the effect on functional decline is unclear. Confounding by indication may have produced biased estimates of the benefit seen in some of these retrospective studies. The objective of this study was to evaluate the effects of G-tube and NIV on survival and functional decline using advanced statistical models that adjust for confounding by indication. A database of 331 subjects enrolled in previous clinical trials in ALS was available for analysis. Marginal structural models (MSM) were used to compare the mortality hazards and ALSFRS-R slopes between treatment and non-treatment groups, after adjusting for confounding by indication. Results showed that the placement of a G-tube was associated with an additional 1.42 units/month decline in the ALSFRS-R slope (p < 0.0001) and an increased mortality hazard of 0.28 (p = 0.02). The use of NIV had no significant effect on ALSFRS-R decline or mortality. In conclusion, marginal structural models can be used to adjust for confounding by indication in retrospective ALS studies. G-tube placement could be followed by a faster rate of functional decline and increased mortality. Our results may suffer from some of the limitations of retrospective analyses.
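    Confounding by indication means sicker patients are more likely to receive the intervention, so a naive group comparison is biased. A minimal synthetic sketch of inverse-probability-of-treatment weighting, the estimation strategy underlying marginal structural models; the data, propensity model, and effect sizes below are invented for illustration and are not the study's:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000

    # Synthetic data: a severity confounder drives both the decision to treat
    # and the outcome, mimicking confounding by indication.
    severity = rng.normal(size=n)
    p_true = 1.0 / (1.0 + np.exp(-0.8 * severity))   # sicker patients treated more often
    treated = rng.binomial(1, p_true)
    outcome = 2.0 * severity + rng.normal(size=n)    # true treatment effect is zero

    # Naive group comparison is biased by indication.
    naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()

    # Step 1: propensity model (logistic regression fit by Newton's method).
    X = np.column_stack([np.ones(n), severity])
    beta = np.zeros(2)
    for _ in range(25):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (treated - p)
        hess = (X * (p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, grad)
    p_hat = 1.0 / (1.0 + np.exp(-X @ beta))

    # Step 2: stabilized inverse-probability-of-treatment weights.
    w = np.where(treated == 1, treated.mean() / p_hat,
                 (1 - treated.mean()) / (1 - p_hat))

    # Step 3: weighted difference in means estimates the marginal effect.
    def wmean(y, wt):
        return float((wt * y).sum() / wt.sum())

    adjusted = wmean(outcome[treated == 1], w[treated == 1]) \
             - wmean(outcome[treated == 0], w[treated == 0])
    ```

    Here the naive estimate is strongly positive despite a true effect of zero, while the weighted estimate is close to zero; full marginal structural models extend this weighting to time-varying treatments and confounders.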

  15. Projected response of an endangered marine turtle population to climate change

    NASA Astrophysics Data System (ADS)

    Saba, Vincent S.; Stock, Charles A.; Spotila, James R.; Paladino, Frank V.; Tomillo, Pilar Santidrián

    2012-11-01

    Assessing the potential impacts of climate change on individual species and populations is essential for the stewardship of ecosystems and biodiversity. Critically endangered leatherback turtles in the eastern Pacific Ocean are excellent candidates for such an assessment because their sensitivity to contemporary climate variability has been substantially studied. If incidental fisheries mortality is eliminated, this population still faces the challenge of recovery in a rapidly changing climate. Here we combined an Earth system model, climate model projections assessed by the Intergovernmental Panel on Climate Change and a population dynamics model to estimate a 7% per decade decline in the Costa Rica nesting population over the twenty-first century. Whereas changes in ocean conditions had a small effect on the population, the ~2.5°C warming of the nesting beach was the primary driver of the decline through reduced hatching success and hatchling emergence rate. Hatchling sex ratio did not substantially change. Adjusting nesting phenology or changing nesting sites may not entirely prevent the decline, but could offset the decline rate. However, if future observations show a long-term decline in hatching success and emergence rate, anthropogenic climate mitigation of nests (for example, shading, irrigation) may be able to preserve the nesting population.
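    The projected 7% per-decade decline compounds over the century. A quick sketch of the implied trajectory, assuming simple geometric decline (the starting population size is hypothetical):

    ```python
    def project_population(n0, decline_per_decade, years):
        # Compound decline: each decade retains (1 - decline_per_decade) of the population.
        return n0 * (1.0 - decline_per_decade) ** (years / 10.0)

    # A hypothetical nesting population of 1000 females declining 7% per decade
    # over the twenty-first century:
    for year in (0, 25, 50, 75, 100):
        print(year, round(project_population(1000, 0.07, year)))
    ```

    Over a full century, ten decades of 7% losses leave roughly half the population, which is why the abstract treats even a modest per-decade rate as a serious long-term threat.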

  16. Declining atmospheric deposition of heavy metals over the last three decades is reflected in soil and foliage of 97 beech (Fagus sylvatica) stands in the Vienna Woods

    PubMed Central

    Türtscher, Selina; Berger, Pétra; Lindebner, Leopold; Berger, Torsten W.

    2017-01-01

    Rigorous studies on long-term changes of heavy metal distribution in forest soils since the implementation of emission controls are rare. Hence, we resampled 97 old-growth beech stands in the Vienna Woods. This study exploits an extensive data set of soil (infiltration zone of stemflow and between trees area) and foliar chemistry from three decades ago. It was hypothesized that declining deposition of heavy metals is reflected in soil and foliar total contents of Pb, Cu, Zn, Ni, Mn and Fe. Mean soil contents of Pb in the stemflow area declined at the highest rate from 223 to 50 mg kg−1 within the last three decades. Soil contents of Pb and Ni decreased significantly both in the stemflow area and the between trees area down to 80–90 cm soil depth from 1984 to 2012. Top soil (0–5 cm) accumulation and simultaneous loss in the lower soil over time for the plant micro nutrients Cu and Zn are suggested to be caused by plant uptake from deep horizons. Reduced soil leaching, due to a mean soil pH (H2O) increase from 4.3 to 4.9, and increased plant cycling are put forward to explain the significant increase of total Mn contents in the infiltration zone of beech stemflow. Top soil Pb contents in the stemflow area presently exceed the critical value at which toxicity symptoms may occur at numerous sites. Mean foliar contents of all six studied heavy metals decreased within the last three decades, but plant supply with the micro nutrients Cu, Zn, Mn and Fe is still in the optimum range for beech trees. It is concluded that heavy metal pollution is not critical for the studied beech stands any longer. PMID:28709055

  17. A slight recovery of soils from acid rain over the last three decades is not reflected in the macro nutrition of beech (Fagus sylvatica) at 97 forest stands of the Vienna Woods

    PubMed Central

    Berger, Pétra; Lindebner, Leopold

    2016-01-01

    Rigorous studies of recovery from soil acidification are rare. Hence, we resampled 97 old-growth beech stands in the Vienna Woods. This study exploits an extensive data set of soil (infiltration zone of stemflow and between trees area at different soil depths) and foliar chemistry from three decades ago. It was hypothesized that declining acidic deposition is reflected in soil and foliar chemistry. Top soil pH within the stemflow area increased significantly by 0.6 units in both H2O and KCl extracts from 1984 to 2012. Exchangeable Ca and Mg increased markedly in the stemflow area and to a lower extent in the top soil of the between trees area. Trends of declining base cations in the lower top soil were probably caused by mobilization of organic S and associated leaching with high amounts of sulfate. Contents of C, N and S decreased markedly in the stemflow area from 1984 to 2012, suggesting that mineralization rates of organic matter increased due to more favorable soil conditions. It is concluded that the top soil will continue to recover from acidic deposition. However, in the between trees areas and especially in deeper soil horizons recovery may be highly delayed. The beech trees of the Vienna Woods showed no sign of recovery from acidification although S deposition levels decreased. Release of historic S even increased foliar S contents. Base cation levels in the foliage declined but are still adequate for beech trees. Increasing N/nutrient ratios over time were considered not the result of marginally higher N foliar contents in 2012 but of diminishing nutrient uptake due to the decrease in ion concentration in soil solution. The mean foliar N/P ratio already increased to the alarming value of 31. Further nutritional imbalances will predispose trees to vitality loss. PMID:27344089

  18. Declining atmospheric deposition of heavy metals over the last three decades is reflected in soil and foliage of 97 beech (Fagus sylvatica) stands in the Vienna Woods.

    PubMed

    Türtscher, Selina; Berger, Pétra; Lindebner, Leopold; Berger, Torsten W

    2017-11-01

    Rigorous studies on long-term changes of heavy metal distribution in forest soils since the implementation of emission controls are rare. Hence, we resampled 97 old-growth beech stands in the Vienna Woods. This study exploits an extensive data set of soil (infiltration zone of stemflow and between trees area) and foliar chemistry from three decades ago. It was hypothesized that declining deposition of heavy metals is reflected in soil and foliar total contents of Pb, Cu, Zn, Ni, Mn and Fe. Mean soil contents of Pb in the stemflow area declined at the highest rate from 223 to 50 mg kg-1 within the last three decades. Soil contents of Pb and Ni decreased significantly both in the stemflow area and the between trees area down to 80-90 cm soil depth from 1984 to 2012. Top soil (0-5 cm) accumulation and simultaneous loss in the lower soil over time for the plant micro nutrients Cu and Zn are suggested to be caused by plant uptake from deep horizons. Reduced soil leaching, due to a mean soil pH (H2O) increase from 4.3 to 4.9, and increased plant cycling are put forward to explain the significant increase of total Mn contents in the infiltration zone of beech stemflow. Top soil Pb contents in the stemflow area presently exceed the critical value at which toxicity symptoms may occur at numerous sites. Mean foliar contents of all six studied heavy metals decreased within the last three decades, but plant supply with the micro nutrients Cu, Zn, Mn and Fe is still in the optimum range for beech trees. It is concluded that heavy metal pollution is not critical for the studied beech stands any longer. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Kindling of Life Stress in Bipolar Disorder: Effects of Early Adversity.

    PubMed

    Shapero, Benjamin G; Weiss, Rachel B; Burke, Taylor A; Boland, Elaine M; Abramson, Lyn Y; Alloy, Lauren B

    2017-05-01

    Most theoretical frameworks regarding the role of life stress in bipolar disorders (BD) do not incorporate the possibility of a changing relationship between psychosocial context and episode initiation across the course of the disorder. The kindling hypothesis theorizes that over the longitudinal course of recurrent affective disorders, the relationship between major life stressors and episode initiation declines (Post, 1992). The present study aimed to test an extension of the kindling hypothesis in BD by examining the effect of early life adversity on the relationship between proximal life events and prospectively assessed mood episodes. Data from 145 bipolar participants (59.3% female; 75.2% Caucasian; mean age 20.19 years, SD = 1.75) were collected as part of the Temple-Wisconsin Longitudinal Investigation of Bipolar Spectrum Project (112 bipolar II disorder; 33 cyclothymic disorder). Participants completed a self-report measure of early adversity at baseline, and mood episodes and life events were assessed by interview at regular 4-month follow-ups. Results indicate that early childhood adversity sensitized bipolar participants to the effects of recent stressors for depressive episodes only, not for hypomanic episodes. This was particularly the case with minor negative events. The current study extends prior research examining the kindling model in BD using a methodologically rigorous assessment of life stressors and mood episode occurrence. Clinicians should assess experiences of early adversity in individuals with BD, as early adversity may heighten the risk of depressive episodes in response to future stressors. Copyright © 2017. Published by Elsevier Ltd.

  20. Repeated cognitive stimulation alleviates memory impairments in an Alzheimer's disease mouse model.

    PubMed

    Martinez-Coria, Hilda; Yeung, Stephen T; Ager, Rahasson R; Rodriguez-Ortiz, Carlos J; Baglietto-Vargas, David; LaFerla, Frank M

    2015-08-01

    Alzheimer's disease is a neurodegenerative disease associated with progressive memory and cognitive decline. Previous studies have identified the benefits of cognitive enrichment on reducing disease pathology. Additionally, epidemiological and clinical data suggest that repeated exercise and cognitive and social enrichment can improve and/or delay the cognitive deficiencies associated with aging and neurodegenerative diseases. In the present study, 3xTg-AD mice were exposed to a rigorous training routine beginning at 3 months of age, which consisted of repeated training in the Morris water maze spatial recognition task every 3 months, ending at 18 months of age. At the conclusion of the final Morris water maze training session, animals subsequently underwent testing in another hippocampus-dependent spatial task, the Barnes maze task, and in the more cortex-dependent novel object recognition memory task. Our data show that periodic cognitive enrichment throughout aging, via multiple learning episodes in the Morris water maze task, can improve the memory performance of aged 3xTg-AD mice in a separate spatial recognition task and in a preference memory task, compared to naïve age-matched 3xTg-AD mice. Furthermore, we observed that the cognitive enrichment effect of Morris water maze exposure was detectable in repeatedly trained animals as early as 6 months of age. These findings suggest that early repeated cognitive enrichment can mitigate the diverse cognitive deficits observed in Alzheimer's disease. Published by Elsevier Inc.

  1. Overcoming recruitment challenges in palliative care clinical trials.

    PubMed

    LeBlanc, Thomas W; Lodato, Jordan E; Currow, David C; Abernethy, Amy P

    2013-11-01

    Palliative care is increasingly viewed as a necessary component of cancer care, especially for patients with advanced disease. Rigorous clinical trials are thus needed to build the palliative care evidence base, but clinical research, and especially participant recruitment, is difficult. Major barriers include (1) patient factors, (2) "gatekeeping," and (3) ethical concerns. Here we discuss an approach to overcoming these barriers, using the Palliative Care Trial (PCT) as a case study. The PCT was a 2 × 2 × 2 factorial randomized controlled trial (RCT) of different service delivery models to improve pain control in the palliative setting. It used a recruitment protocol that fused evidence-based strategies with principles of "social marketing," an approach involving the systematic application of marketing techniques. Main components included (1) an inclusive triage algorithm, (2) information booklets targeting particular stakeholders, (3) a specialized recruitment nurse, and (4) standardized wording across all study communications. From an eligible pool of 607 patients, the PCT enrolled 461 patients over 26 months. Twenty percent of patients referred to the palliative care service were enrolled (76% of those eligible after screening). Several common barriers were minimized; among those who declined participation, family disinterest was uncommon (5%), as was the perception of burden imposed (4%). Challenges to clinical trial recruitment in palliative care are significant but not insurmountable. A carefully crafted recruitment and retention protocol can be effective. Our experience with designing and deploying a social-marketing-based protocol shows the benefits of such an approach.

  2. A Randomized Study of How Physicians Interpret Research Funding Disclosures

    PubMed Central

    Kesselheim, Aaron S.; Robertson, Christopher T.; Myers, Jessica A.; Rose, Susannah L.; Gillet, Victoria; Ross, Kathryn M.; Glynn, Robert J.; Joffe, Steven; Avorn, Jerry

    2012-01-01

    BACKGROUND The effects of clinical-trial funding on the interpretation of trial results are poorly understood. We examined how such support affects physicians’ reactions to trials with a high, medium, or low level of methodologic rigor. METHODS We presented 503 board-certified internists with abstracts that we designed describing clinical trials of three hypothetical drugs. The trials had high, medium, or low methodologic rigor, and each report included one of three support disclosures: funding from a pharmaceutical company, NIH funding, or none. For both factors studied (rigor and funding), one of the three possible variations was randomly selected for inclusion in the abstracts. Follow-up questions assessed the physicians’ impressions of the trials’ rigor, their confidence in the results, and their willingness to prescribe the drugs. RESULTS The 269 respondents (53.5% response rate) perceived the level of study rigor accurately. Physicians reported that they would be less willing to prescribe drugs tested in low-rigor trials than those tested in medium-rigor trials (odds ratio, 0.64; 95% confidence interval [CI], 0.46 to 0.89; P = 0.008) and would be more willing to prescribe drugs tested in high-rigor trials than those tested in medium-rigor trials (odds ratio, 3.07; 95% CI, 2.18 to 4.32; P<0.001). Disclosure of industry funding, as compared with no disclosure of funding, led physicians to downgrade the rigor of a trial (odds ratio, 0.63; 95% CI, 0.46 to 0.87; P = 0.006), their confidence in the results (odds ratio, 0.71; 95% CI, 0.51 to 0.98; P = 0.04), and their willingness to prescribe the hypothetical drugs (odds ratio, 0.68; 95% CI, 0.49 to 0.94; P = 0.02). Physicians were half as willing to prescribe drugs studied in industry-funded trials as they were to prescribe drugs studied in NIH-funded trials (odds ratio, 0.52; 95% CI, 0.37 to 0.71; P<0.001). These effects were consistent across all levels of methodologic rigor. CONCLUSIONS Physicians discriminate among trials of varying degrees of rigor, but industry sponsorship negatively influences their perception of methodologic quality and reduces their willingness to believe and act on trial findings, independently of the trial’s quality. These effects may influence the translation of clinical research into practice. PMID:22992075

  3. Twenty-Year Alcohol-Consumption and Drinking-Problem Trajectories of Older Men and Women*

    PubMed Central

    Brennan, Penny L.; Schutte, Kathleen K.; Moos, Bernice S.; Moos, Rudolf H.

    2011-01-01

    Objective: The aim of this study was to describe older adults' 20-year alcohol-consumption and drinking-problem trajectories, identify baseline predictors of them, and determine whether older men and women differ on late-life drinking trajectory characteristics and predictors. Method: Two-group simultaneous latent growth modeling was used to describe the characteristics and baseline predictors of older community-residing men's (n = 399) and women's (n = 320) 20-year drinking trajectories. Chi-square difference tests of increment in fit of latent growth models with and without gender invariance constraints were used to determine gender differences in drinking trajectory characteristics and predictors. Results: Unconditional quadratic growth models best described older individuals' within-individual, 20-year drinking trajectories, with alcohol consumption following an average pattern of delayed decline, and drinking problems an average pattern of decline followed by leveling off. On average, older men declined in alcohol consumption somewhat later than did older women. The best baseline predictors of more rapid decline in alcohol consumption and drinking problems were drinking variables indicative of heavier, more problematic alcohol use at late middle age. Conclusions: The course of alcohol consumption and drinking problems from late middle age onward is one of net decline, but this decline is neither swift nor invariable. Gender differences in the timing of decline in drinking suggest that ongoing monitoring of alcohol consumption may be especially important for older men. Further research is needed to identify factors known at late middle age that prospectively explain long-term change in late-life use of alcohol. PMID:21388604

  4. [Poliomyelitis--why we must continue to vaccinate!].

    PubMed

    Windorfer, A; Beyrer, K

    2005-02-24

    The eradication of polio--that is, the worldwide elimination of the wild poliovirus--is now within reach. The current success of this international project is due largely to the rigorous immunization of the general population. Both the live oral polio vaccine (OPV) and the inactivated vaccine (IPV) administered by injection are used, the pros and cons of each having to be weighed up. Since 1998, only the inactivated IPV vaccine has been recommended in Germany. It is essential that acceptance of the need for immunization should not decline, and that the inoculation rate in countries in which polio has apparently been eliminated should not fall below the critical threshold of about 80-85%. If this figure is not reached in the future, the population would be put at risk by the re-introduction of the poliovirus into the country. Even when global elimination has been achieved, vaccination must be continued for several years. The recommended immunization schedule covers three vaccinations for basic immunization plus a booster vaccination in adolescence.

  5. Snippets from the past: the evolution of Wade Hampton Frost's epidemiology as viewed from the American Journal of Hygiene/Epidemiology.

    PubMed

    Morabia, Alfredo

    2013-10-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene /American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging "early" epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light.

  6. Snippets From the Past: The Evolution of Wade Hampton Frost's Epidemiology as Viewed From the American Journal of Hygiene/Epidemiology

    PubMed Central

    Morabia, Alfredo

    2013-01-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene /American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging “early” epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light. PMID:24022889

  7. Energy Density, Portion Size, and Eating Occasions: Contributions to Increased Energy Intake in the United States, 1977–2006

    PubMed Central

    Duffey, Kiyah J.; Popkin, Barry M.

    2011-01-01

    Background Competing theories attempt to explain changes in total energy (TE) intake; however, a rigorous, comprehensive examination of these explanations has not been undertaken. Our objective was to examine the relative contribution of energy density (ED), portion size (PS), and the number of eating/drinking occasions (EOs) to changes in daily TE. Methods and Findings Using cross-sectional nationally representative data from the Nationwide Food Consumption Survey (1977–78), Continuing Survey of Food Intakes of Individuals (1989–91), and National Health and Nutrition Examination Surveys (1994–98 and 2003–06) for adults (aged ≥19 y), we mathematically decompose TE (kcal/d) to understand the relative contributions of each component—PS (grams/EO), ED (kcal/g/EO) and EO (number)—to changes in TE over time. There was an increase in TE intake (+570 kcal/d) and the number of daily EOs (+1.1) between 1977–78 and 2003–06. The average PS increased between 1977–78 and 1994–98, then dropped slightly between 1994–98 and 2003–06, while the average ED remained steady between 1977–78 and 1989–91, then declined slightly between 1989–91 and 1994–98. Estimates from the decomposition statistical models suggest that between 1977–78 and 1989–91, annualized changes in PS contributed nearly 15 kcal/d/y to increases in TE, while changes in EO accounted for just 4 kcal/d/y. Between 1994–98 and 2003–06 changes in EO accounted for 39 kcal/d/y of increase and changes in PS accounted for 1 kcal/d/y of decline in the annualized change in TE. Conclusions While all three components have contributed to some extent to 30-y changes in TE, changes in EO and PS have accounted for most of the change. These findings suggest a new focus for efforts to reduce energy imbalances in US adults. Please see later in the article for the Editors' Summary PMID:21738451
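
    The decomposition described in this abstract treats daily total energy as the product of three components: TE (kcal/d) = EO (occasions/d) × PS (g/occasion) × ED (kcal/g). A minimal sketch of that arithmetic, using hypothetical numbers rather than the survey estimates:

```python
# Illustrative sketch of the decomposition described above:
# TE (kcal/d) = EO (occasions/d) * PS (g/occasion) * ED (kcal/g).
# All input values below are hypothetical, not the survey estimates.

def total_energy(eo, ps, ed):
    """Daily total energy intake (kcal/d) from its three components."""
    return eo * ps * ed

baseline = total_energy(eo=4.0, ps=300.0, ed=1.5)  # 1800.0 kcal/d
later = total_energy(eo=5.1, ps=310.0, ed=1.5)     # ~2371.5 kcal/d

# Holding PS and ED fixed isolates the contribution of the extra
# eating occasions, which is the idea behind the decomposition models.
eo_only = total_energy(eo=5.1, ps=300.0, ed=1.5) - baseline
print(round(later - baseline, 1), round(eo_only, 1))
```

    Varying one component while holding the other two fixed is how such a decomposition attributes a share of the total change to each component.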

  8. Rigor or mortis: best practices for preclinical research in neuroscience.

    PubMed

    Steward, Oswald; Balice-Gordon, Rita

    2014-11-05

    Numerous recent reports document a lack of reproducibility of preclinical studies, raising concerns about potential lack of rigor. Examples of lack of rigor have been extensively documented and proposals for practices to improve rigor are appearing. Here, we discuss some of the details and implications of previously proposed best practices and consider some new ones, focusing on preclinical studies relevant to human neurological and psychiatric disorders. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. [A new formula for the measurement of rigor mortis: the determination of the FRR-index (author's transl)].

    PubMed

    Forster, B; Ropohl, D; Raule, P

    1977-07-05

    The manual examination of rigor mortis as currently practised, with its often subjective evaluation, frequently produces highly incorrect deductions. It is therefore desirable that such inaccuracies be replaced by objective measurement of rigor mortis at the extremities. To that end, a method is described which can also be applied in on-the-spot investigations, and a new formula for the determination of rigor mortis indices (FRR) is introduced.

  10. Use of the Rigor Mortis Process as a Tool for Better Understanding of Skeletal Muscle Physiology: Effect of the Ante-Mortem Stress on the Progression of Rigor Mortis in Brook Charr (Salvelinus fontinalis).

    ERIC Educational Resources Information Center

    Diouf, Boucar; Rioux, Pierre

    1999-01-01

    Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…

  11. Pharmacokinetics of Cannabis in Cancer Cachexia-Anorexia Syndrome.

    PubMed

    Reuter, Stephanie E; Martin, Jennifer H

    2016-07-01

    Anorexia can affect up to 90% of people with advanced cancer. It is a complex symptom associated with changes in taste, lack of hunger at mealtimes and lack of food enjoyment. Associated weight loss is part of the physical decline that occurs as cancer worsens. Weight loss can also occur from cachexia, the increased metabolism of energy due to raised inflammatory cytokines, liver metastases and other factors seen in several advanced cancers. Although cachexia is frequently associated with anorexia (when it is referred to as the cachexia-anorexia syndrome), it independently accounts for a significant amount of morbidity and death in people with cancer. In particular, quality of life for the patient and the family is significantly affected by this syndrome as it causes anxiety and distress. Therefore, it is important that research into therapies is undertaken, particularly focusing on an understanding of the pharmacokinetic properties of compounds in this cachexic population. Cannabinoids are one such group of therapies that have received a large amount of media focus recently. However, there appears to be a lack of rigorous pharmacokinetic data on these complex and varied compounds in the cachexic population. Similarly, there is a lack of pharmacokinetic data in any population group for cannabinoids other than tetrahydrocannabinol (THC) and cannabidiol (CBD) (often due to the lack of analytical standards for quantification). This review will thus examine the pharmacokinetics of the major cannabinoids, i.e. THC and CBD, in a cancer population. Overall, based on the current literature, evidence for the use of cannabinoids for the treatment of cancer-related cachexia-anorexia syndrome remains equivocal. A high-quality, rigorous, phase I/II study to elicit pharmacokinetic dose-concentration and concentration-response data, with a clinically acceptable mode of delivery to reduce intrapatient variability and enable more consistent bioavailability, is needed in this population.

  12. Use it or lose it: engaged lifestyle as a buffer of cognitive decline in aging?

    PubMed

    Hultsch, D F; Hertzog, C; Small, B J; Dixon, R A

    1999-06-01

    Data from the Victoria Longitudinal Study were used to examine the hypothesis that maintaining intellectual engagement through participation in everyday activities buffers individuals against cognitive decline in later life. The sample consisted of 250 middle-aged and older adults tested 3 times over 6 years. Structural equation modeling techniques were used to examine the relationships among changes in lifestyle variables and an array of cognitive variables. There was a relationship between changes in intellectually related activities and changes in cognitive functioning. These results are consistent with the hypothesis that intellectually engaging activities serve to buffer individuals against decline. However, an alternative model suggested the findings were also consistent with the hypothesis that high-ability individuals lead intellectually active lives until cognitive decline in old age limits their activities.

  13. Upgrading geometry conceptual understanding and strategic competence through implementing rigorous mathematical thinking (RMT)

    NASA Astrophysics Data System (ADS)

    Nugraheni, Z.; Budiyono, B.; Slamet, I.

    2018-03-01

    Reaching higher-order thinking skills (HOTS) requires mastery of conceptual understanding and strategic competence, two basic components of HOTS. RMT is a unique realization of the cognitive conceptual construction approach based on Feuerstein's theory of Mediated Learning Experience (MLE) and Vygotsky's sociocultural theory. This was a quasi-experimental study comparing an experimental class taught with Rigorous Mathematical Thinking (RMT) as the learning method and a control class taught with Direct Learning (DL) as the conventional learning activity. The study examined whether the two learning models had different effects on the conceptual understanding and strategic competence of junior high school students. The data were analyzed using Multivariate Analysis of Variance (MANOVA), which showed a significant difference between the experimental and control classes on mathematics conceptual understanding and strategic competence considered jointly (Wilks' Λ = 0.84). Further, independent t-tests showed significant differences between the two classes on both mathematical conceptual understanding and strategic competence. These results indicate that Rigorous Mathematical Thinking (RMT) had a positive impact on mathematics conceptual understanding and strategic competence.
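
    The MANOVA statistic quoted above, Wilks' Λ, can be computed directly as det(E)/det(E + H), where E and H are the within-group and between-group sum-of-cross-products matrices. A minimal sketch with made-up two-group, two-outcome data (not the study's measurements):

```python
# Illustrative sketch (made-up data, not the study's measurements):
# one-way MANOVA Wilks' lambda = det(E) / det(E + H), where E is the
# within-group and H the between-group sum-of-cross-products matrix.
import numpy as np

def wilks_lambda(X, groups):
    """Wilks' lambda for a one-way MANOVA with multivariate outcomes X."""
    X = np.asarray(X, dtype=float)
    groups = np.asarray(groups)
    grand = X.mean(axis=0)
    p = X.shape[1]
    E = np.zeros((p, p))  # within-group cross-products
    H = np.zeros((p, p))  # between-group cross-products
    for g in np.unique(groups):
        Xg = X[groups == g]
        mg = Xg.mean(axis=0)
        E += (Xg - mg).T @ (Xg - mg)
        d = (mg - grand).reshape(-1, 1)
        H += len(Xg) * (d @ d.T)
    return np.linalg.det(E) / np.linalg.det(E + H)

# Two tiny groups measured on two outcomes (hypothetical numbers):
lam = wilks_lambda([[1, 0], [0, 1], [2, 2], [4, 2]], [0, 0, 1, 1])
print(round(lam, 4))  # lambda near 0 => strong joint group separation
```

    Values of Λ near 1 indicate little joint separation between the groups; smaller values indicate greater separation on the outcomes considered jointly.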

  14. Investigating outliers to improve conceptual models of bedrock aquifers

    NASA Astrophysics Data System (ADS)

    Worthington, Stephen R. H.

    2018-06-01

    Numerical models play a prominent role in hydrogeology, with simplifying assumptions being inevitable when implementing these models. However, there is a risk of oversimplification, where important processes become neglected. Such processes may be associated with outliers, and consideration of outliers can lead to an improved scientific understanding of bedrock aquifers. Using rigorous logic to investigate outliers can help to explain fundamental scientific questions such as why there are large variations in permeability between different bedrock lithologies.

  15. Improving students’ mathematical critical thinking through rigorous teaching and learning model with informal argument

    NASA Astrophysics Data System (ADS)

    Hamid, H.

    2018-01-01

    The purpose of this study is to analyze the improvement of students' mathematical critical thinking (CT) ability in a Real Analysis course taught using the Rigorous Teaching and Learning (RTL) model with informal argument. In addition, this research also examined students' CT with respect to their initial mathematical ability (IMA). The study was conducted at a private university in the 2015/2016 academic year. It employed the quasi-experimental method with a pretest-posttest control group design. The participants were 83 students: 43 in the experimental group and 40 in the control group. The findings showed that students in the experimental group outperformed students in the control group on mathematical CT ability across IMA levels (high, medium, low) in learning Real Analysis. In addition, among students with medium IMA, the improvement in mathematical CT ability of those exposed to the RTL model with informal argument was greater than that of those exposed to conventional instruction (CI). There was no interaction effect between learning model (RTL vs. CI) and IMA level (high, medium, low) on the improvement of mathematical CT ability. Finally, at all IMA levels, students exposed to the RTL model with informal argument showed significantly greater improvement on all indicators of mathematical CT ability than students exposed to CI.

  16. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    NASA Astrophysics Data System (ADS)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  17. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors.

    PubMed

    Cenek, Martin; Dahl, Spencer K

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  18. Unperturbed Schelling Segregation in Two or Three Dimensions

    NASA Astrophysics Data System (ADS)

    Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew

    2016-09-01

    Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC 2012); Barmpalias et al., 55th Annual IEEE Symposium on Foundations of Computer Science, Philadelphia, 2014; J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
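
    For readers unfamiliar with this model family, the classic noisy relocation dynamics are easy to simulate. The toy sketch below is illustrative only and is deliberately not the unperturbed variant analysed in this paper; all parameter values (grid size, tolerance threshold, empty fraction) are arbitrary choices:

```python
# Toy 2-D Schelling segregation sketch on an n-by-n torus (illustrative
# only: this is the simple noisy-relocation dynamics, NOT the unperturbed
# variant analysed in the paper; thresholds and sizes are arbitrary).
import random

def make_grid(n, empty_frac=0.1, seed=1):
    """Random grid of 'A'/'B' agents with some empty (None) cells."""
    rng = random.Random(seed)
    grid = {}
    for i in range(n):
        for j in range(n):
            if rng.random() < empty_frac:
                grid[(i, j)] = None
            else:
                grid[(i, j)] = 'A' if rng.random() < 0.5 else 'B'
    return grid

def unhappy(grid, n, cell, tau):
    """Unhappy if under a fraction tau of occupied Moore neighbours match."""
    agent = grid[cell]
    i, j = cell
    nbrs = [grid[((i + di) % n, (j + dj) % n)]
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    occ = [a for a in nbrs if a is not None]
    return bool(occ) and sum(a == agent for a in occ) / len(occ) < tau

def step(grid, n, tau=0.5, rng=None):
    """Move each unhappy agent to a random empty cell; return move count."""
    rng = rng or random.Random(2)
    empties = [c for c, a in grid.items() if a is None]
    moves = 0
    for cell, agent in list(grid.items()):
        if agent is None or not empties or not unhappy(grid, n, cell, tau):
            continue
        dest = rng.choice(empties)
        grid[dest], grid[cell] = agent, None
        empties.remove(dest)
        empties.append(cell)
        moves += 1
    return moves

n = 20
grid = make_grid(n)
for _ in range(30):
    step(grid, n)
```

    Even at a modest tolerance threshold such dynamics typically drift toward clustered configurations, which is the qualitative phenomenon the rigorous analyses above pin down precisely.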

  19. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    USGS Publications Warehouse

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  20. Integration of Technology into the Classroom: Case Studies.

    ERIC Educational Resources Information Center

    Johnson, D. LaMont, Ed.; Maddux, Cleborne D., Ed.; Liu, Leping, Ed.

    This book contains the following case studies on the integration of technology in education: (1) "First Steps toward a Statistically Generated Information Technology Integration Model" (D. LaMont Johnson and Leping Liu); (2) "Case Studies: Are We Rejecting Rigor or Rediscovering Richness?" (Cleborne D. Maddux); (3)…

  1. Cellular Automata and the Humanities.

    ERIC Educational Resources Information Center

    Gallo, Ernest

    1994-01-01

    The use of cellular automata to analyze several pre-Socratic hypotheses about the evolution of the physical world is discussed. These hypotheses combine characteristics of both rigorous and metaphoric language. Since the computer demands explicit instructions for each step in the evolution of the automaton, such models can reveal conceptual…

  2. Retrospective Analysis of a Classical Biological Control Programme

    USDA-ARS?s Scientific Manuscript database

    1. Classical biological control has been a key technology in the management of invasive arthropod pests globally for over 120 years, yet rigorous quantitative evaluations of programme success or failure are rare. Here, I used life table and matrix model analyses, and life table response experiments ...

  3. Association between Soluble Klotho and Change in Kidney Function: The Health Aging and Body Composition Study.

    PubMed

    Drew, David A; Katz, Ronit; Kritchevsky, Stephen; Ix, Joachim; Shlipak, Michael; Gutiérrez, Orlando M; Newman, Anne; Hoofnagle, Andy; Fried, Linda; Semba, Richard D; Sarnak, Mark

    2017-06-01

    CKD appears to be a condition of soluble klotho deficiency. Despite known associations between low soluble klotho levels and conditions that promote kidney damage, such as oxidative stress and fibrosis, little information exists regarding the longitudinal association between soluble klotho levels and change in kidney function. We assayed serum soluble α-klotho in 2496 participants within the Health Aging and Body Composition study, a cohort of older adults. The associations between soluble klotho levels and decline in kidney function (relative decline: eGFR decline ≥30%; absolute decline: eGFR decline >3 ml/min per year) and incident CKD (incident eGFR <60 ml/min per 1.73 m2 and >1 ml/min per year decline) were evaluated. We adjusted models for demographics, baseline eGFR, urine albumin-to-creatinine ratio, comorbidity, and measures of mineral metabolism. Among participants, the mean (SD) age was 75 (3) years, 52% were women, and 38% were black. The median (25th, 75th percentile) klotho level was 630 (477, 817) pg/ml. In fully adjusted models, each two-fold higher level of klotho was associated with lower odds of decline in kidney function (odds ratio, 0.78 [95% confidence interval, 0.66 to 0.93] for 30% decline in eGFR, and 0.85 [95% confidence interval, 0.73 to 0.98] for >3 ml/min per year decline in eGFR), but not with incident CKD (incidence rate ratio, 0.90 [95% confidence interval, 0.78 to 1.04]). Overall, a higher soluble klotho level was independently associated with a lower risk of decline in kidney function. Future studies should attempt to replicate these results in other cohorts and evaluate the underlying mechanism. Copyright © 2017 by the American Society of Nephrology.
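
    The "per two-fold higher" odds ratios reported above typically come from a logistic regression on log2-transformed klotho, in which case the implied OR for any k-fold difference follows by simple exponent arithmetic. A small illustrative sketch (the 0.78 figure is quoted from the abstract; the 4-fold comparison is a hypothetical example of our own):

```python
# Illustrative arithmetic only (the 0.78 odds ratio is quoted from the
# abstract; the 4-fold comparison below is a hypothetical example).
# If a logistic regression uses log2(klotho), its coefficient b gives
# OR_per_doubling = exp(b), so a k-fold difference implies
# OR = OR_per_doubling ** log2(k).
import math

def or_for_fold_change(or_per_doubling, fold):
    """Implied odds ratio for a `fold`-times higher exposure level."""
    return or_per_doubling ** math.log2(fold)

print(round(or_for_fold_change(0.78, 4), 3))  # 0.78**2 = 0.608
```

    This conversion is only valid under the model's assumption that the association is log-linear in log2(klotho) across the range considered.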

  4. Reed warbler hosts fine-tune their defenses to track three decades of cuckoo decline.

    PubMed

    Thorogood, Rose; Davies, Nicholas B

    2013-12-01

    Interactions between avian hosts and brood parasites can provide a model for how animals adapt to a changing world. Reed warbler (Acrocephalus scirpaceus) hosts employ costly defenses to combat parasitism by common cuckoos (Cuculus canorus). During the past three decades cuckoos have declined markedly across England, reducing parasitism at our study site (Wicken Fen) from 24% of reed warbler nests in 1985 to 1% in 2012. Here we show with experiments that host mobbing and egg rejection defenses have tracked this decline in local parasitism risk: the proportion of reed warbler pairs mobbing adult cuckoos (assessed by responses to cuckoo mounts and models) has declined from 90% to 38%, and the proportion rejecting nonmimetic cuckoo eggs (assessed by responses to model eggs) has declined from 61% to 11%. This is despite no change in response to other nest enemies or mimetic model eggs. Individual variation in both defenses is predicted by parasitism risk during the host's egg-laying period. Furthermore, the response of our study population to temporal variation in parasitism risk can also explain spatial variation in egg rejection behavior in other populations across Europe. We suggest that spatial and temporal variation in parasitism risk has led to the evolution of plasticity in reed warbler defenses. © 2013 The Authors. Evolution published by Wiley Periodicals, Inc. on behalf of The Society for the Study of Evolution.

  5. Hippocampal activation is associated with longitudinal amyloid accumulation and cognitive decline

    DOE PAGES

    Leal, Stephanie L.; Landau, Susan M.; Bell, Rachel K.; ...

    2017-02-08

    The amyloid hypothesis suggests that beta-amyloid (Aβ) deposition leads to alterations in neural function and ultimately to cognitive decline in Alzheimer’s disease. However, factors that underlie Aβ deposition are incompletely understood. One proposed model suggests that synaptic activity leads to increased Aβ deposition. More specifically, hyperactivity in the hippocampus may be detrimental and could be one factor that drives Aβ deposition. To test this model, we examined the relationship between hippocampal activity during a memory task using fMRI and subsequent longitudinal change in Aβ using PIB-PET imaging in cognitively normal older adults. We found that greater hippocampal activation at baseline was associated with increased Aβ accumulation. Furthermore, increasing Aβ accumulation mediated the influence of hippocampal activation on declining memory performance, demonstrating a crucial role of Aβ in linking hippocampal activation and memory. These findings support a model linking increased hippocampal activation to subsequent Aβ deposition and cognitive decline.
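
    The mediation claim, that Aβ accumulation carries the effect of hippocampal activation on memory decline, follows product-of-coefficients logic. A hypothetical numpy sketch on simulated data (X = activation, M = Aβ accumulation, Y = memory decline; the effect sizes are invented, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)                       # hippocampal activation (X)
m = 0.5 * x + rng.normal(scale=0.5, size=n)  # path a: activation -> amyloid
# Path b: amyloid -> decline; direct effect of x set to zero (full mediation).
y = 0.6 * m + 0.0 * x + rng.normal(scale=0.5, size=n)

def slope(pred, resp):
    # Least-squares slope of resp on pred (with intercept).
    X = np.column_stack([np.ones(len(pred)), pred])
    return np.linalg.lstsq(X, resp, rcond=None)[0][1]

a = slope(x, m)            # effect of activation on amyloid accumulation
total = slope(x, y)        # total effect of activation on decline
# Direct effect and path b from the two-predictor model of y.
X2 = np.column_stack([np.ones(n), x, m])
coef = np.linalg.lstsq(X2, y, rcond=None)[0]
direct, b = coef[1], coef[2]
indirect = a * b           # mediated (indirect) effect through amyloid

print(indirect, direct)
```

    For ordinary least squares the decomposition total = direct + indirect holds exactly in-sample; "mediation" corresponds to the direct effect shrinking toward zero once M is included.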

  7. On the discrepancy between observed and CMIP5 multi-model simulated Barents Sea winter sea ice decline

    NASA Astrophysics Data System (ADS)

    Li, Dawei; Zhang, Rong; Knutson, Thomas R.

    2017-04-01

    This study aims to understand the relative roles of external forcing versus internal climate variability in causing the observed Barents Sea winter sea ice extent (SIE) decline since 1979. We identify major discrepancies in the spatial patterns of winter Northern Hemisphere sea ice concentration trends over the satellite period between observations and CMIP5 multi-model mean externally forced response. The CMIP5 externally forced decline in Barents Sea winter SIE is much weaker than that observed. Across CMIP5 ensemble members, March Barents Sea SIE trends have little correlation with global mean surface air temperature trends, but are strongly anti-correlated with trends in Atlantic heat transport across the Barents Sea Opening (BSO). Further comparison with control simulations from coupled climate models suggests that enhanced Atlantic heat transport across the BSO associated with regional internal variability may have played a leading role in the observed decline in winter Barents Sea SIE since 1979.

  8. A hybrid of monopoly and perfect competition model for hi-tech products

    NASA Astrophysics Data System (ADS)

    Yang, P. C.; Wee, H. M.; Pai, S.; Yang, H. J.; Wee, P. K. P.

    2010-11-01

    For hi-tech products, the demand rate, the component cost and the selling price usually decline significantly with time. Under perfect competition, shortages usually result in lost sales, while under a monopoly, shortages are completely backordered. In practice, however, neither pure perfect competition nor pure monopoly exists. There is therefore a need for a replenishment model that considers a hybrid of perfect competition and monopoly when the cost, price and demand decrease simultaneously. A numerical example and sensitivity analysis are carried out to illustrate this model. The results show that a higher decline rate in the component cost leads to a smaller service level and a larger replenishment interval. When the component cost decline rate increases and the selling price decline rate decreases simultaneously, the replenishment interval decreases. Under perfect competition it is better to keep a high service level, while under a monopoly a low service level is better because shortages are completely backordered.

  9. [Experimental study of restiffening of the rigor mortis].

    PubMed

    Wang, X; Li, M; Liao, Z G; Yi, X F; Peng, X M

    2001-11-01

    To observe changes in sarcomere length in the rat during restiffening, we measured the sarcomere length of the quadriceps in 40 rats under different conditions by scanning electron microscopy. The sarcomere length in undisturbed rigor mortis is markedly shorter than that in restiffening, and sarcomere length is negatively correlated with the intensity of rigor mortis. Measuring sarcomere length can therefore indicate the intensity of rigor mortis and provide evidence for estimating the time since death.

  10. Rigorous Science: a How-To Guide

    PubMed Central

    Fang, Ferric C.

    2016-01-01

    ABSTRACT Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word “rigor” is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education. PMID:27834205

  11. Complex dynamics of an SEIR epidemic model with saturated incidence rate and treatment

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Altaf; Khan, Yasir; Islam, Saeed

    2018-03-01

    In this paper, we describe the dynamics of an SEIR epidemic model with saturated incidence, a treatment function, and optimal control. Rigorous mathematical results are established for the model. Stability analysis shows that the model is locally asymptotically stable when R0 < 1, and locally as well as globally asymptotically stable at the endemic equilibrium when R0 > 1. The proposed model may possess a backward bifurcation. The optimal control problem is formulated and its necessary conditions are derived. Numerical results are presented to support the theoretical results.
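
    The qualitative behavior described above fits in a few lines of numerics. A hedged sketch of an SEIR system with a common saturated-incidence form βSI/(1 + αI) and a saturating treatment term (the paper's exact treatment function and parameter values are not given here, so both the forms and all values are illustrative assumptions):

```python
import numpy as np

# Illustrative parameters (assumptions, not the paper's values).
beta, alpha = 0.5, 0.1     # transmission rate, incidence saturation
sigma, gamma = 0.2, 0.1    # incubation and recovery rates
mu = 0.01                  # birth/death rate
r, k = 0.2, 0.5            # treatment rate and its saturation

def deriv(state):
    S, E, I, R = state
    inc = beta * S * I / (1 + alpha * I)   # saturated incidence
    treat = r * I / (1 + k * I)            # saturating treatment term
    dS = mu - mu * S - inc
    dE = inc - (sigma + mu) * E
    dI = sigma * E - (gamma + mu) * I - treat
    dR = gamma * I + treat - mu * R
    return np.array([dS, dE, dI, dR])

# Forward-Euler integration from a small seed of infection.
state = np.array([0.99, 0.0, 0.01, 0.0])
dt = 0.1
for _ in range(20000):
    state = state + dt * deriv(state)

S, E, I, R = state
print(S, E, I, R)
```

    With these values the linearized reproduction number exceeds one, so the infection settles at a positive endemic level rather than dying out; the total population is conserved by construction.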

  12. On Large Time Behavior and Selection Principle for a Diffusive Carr-Penrose Model

    NASA Astrophysics Data System (ADS)

    Conlon, Joseph G.; Dabkowski, Michael; Wu, Jingchen

    2016-04-01

    This paper is concerned with the study of a diffusive perturbation of the linear LSW model introduced by Carr and Penrose. A main subject of interest is to understand how the presence of diffusion acts as a selection principle, which singles out a particular self-similar solution of the linear LSW model as determining the large time behavior of the diffusive model. A selection principle is rigorously proven for a model which is a semiclassical approximation to the diffusive model. Upper bounds on the rate of coarsening are also obtained for the full diffusive model.

  13. Putrefactive rigor: apparent rigor mortis due to gas distension.

    PubMed

    Gill, James R; Landi, Kristen

    2011-09-01

    Artifacts due to decomposition may cause confusion for the initial death investigator, leading to an incorrect suspicion of foul play. Putrefaction is a microorganism-driven process that results in foul odor, skin discoloration, purge, and bloating. Various decompositional gases including methane, hydrogen sulfide, carbon dioxide, and hydrogen will cause the body to bloat. We describe 3 instances of putrefactive gas distension (bloating) that produced the appearance of inappropriate rigor, so-called putrefactive rigor. These gases may distend the body to an extent that the extremities extend and lose contact with their underlying support surface. The medicolegal investigator must recognize that this is not true rigor mortis and the body was not necessarily moved after death for this gravity-defying position to occur.

  14. Studies on the estimation of the postmortem interval. 3. Rigor mortis (author's transl).

    PubMed

    Suzutani, T; Ishibashi, H; Takatori, T

    1978-11-01

    The authors have devised a method for classifying rigor mortis into 10 types based on its appearance and strength in various parts of a cadaver. By applying the method to the findings of 436 cadavers which were subjected to medico-legal autopsies in our laboratory during the last 10 years, it has been demonstrated that the classifying method is effective for analyzing the phenomenon of onset, persistence and disappearance of rigor mortis statistically. The investigation of the relationship between each type of rigor mortis and the postmortem interval has demonstrated that rigor mortis may be utilized as a basis for estimating the postmortem interval but the values have greater deviation than those described in current textbooks.

  15. Apolipoprotein E genotype does not moderate the associations of depressive symptoms, neuroticism and allostatic load with cognitive ability and cognitive aging in the Lothian Birth Cohort 1936.

    PubMed

    Crook, Zander; Booth, Tom; Cox, Simon R; Corley, Janie; Dykiert, Dominika; Redmond, Paul; Pattie, Alison; Taylor, Adele M; Harris, Sarah E; Starr, John M; Deary, Ian J

    2018-01-01

    In this replication-and-extension study, we tested whether depressive symptoms, neuroticism, and allostatic load (multisystem physiological dysregulation) were related to lower baseline cognitive ability and greater subsequent cognitive decline in older adults, and whether these relationships were moderated by the E4 allele of the apolipoprotein E (APOE) gene. We also tested whether allostatic load mediated the relationships between neuroticism and cognitive outcomes. We used data from the Lothian Birth Cohort 1936 (n at Waves 1-3: 1,028 [M age = 69.5 y]; 820 [M duration since Wave 1 = 2.98 y]; 659 [M duration since Wave 1 = 6.74 y]). We fitted latent growth curve models of general cognitive ability (modeled using five cognitive tests) with groups of APOE E4 non-carriers and carriers. In separate models, depressive symptoms, neuroticism, and allostatic load predicted baseline cognitive ability and subsequent cognitive decline. In addition, models tested whether allostatic load mediated relationships between neuroticism and cognitive outcomes. Baseline cognitive ability had small-to-moderate negative associations with depressive symptoms (β range = -0.20 to -0.17), neuroticism (β range = -0.27 to -0.23), and allostatic load (β range = -0.11 to -0.09). Greater cognitive decline was linked to baseline allostatic load (β range = -0.98 to -0.83) and depressive symptoms (β range = -1.00 to -0.88). However, APOE E4 allele possession did not moderate the relationships of depressive symptoms, neuroticism and allostatic load with cognitive ability and cognitive decline. Additionally, the associations of neuroticism with cognitive ability and cognitive decline were not mediated through allostatic load. Our results suggest that APOE E4 status does not moderate the relationships of depressive symptoms, neuroticism, and allostatic load with cognitive ability and cognitive decline in healthy older adults.
The most notable positive finding in the current research was the strong association between allostatic load and cognitive decline.

  16. Science to Manage a Very Rare Fish in a Very Large River - Pallid Sturgeon in the Missouri River, U.S.A.

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Colvin, M. E.; Marmorek, D.; Randall, M.

    2017-12-01

    The Missouri River Recovery Program (MRRP) seeks to revise river-management strategies to avoid jeopardizing the existence of three species: pallid sturgeon (Scaphirhynchus albus), interior least tern (Sterna antillarum), and piping plover (Charadrius melodus). Managing the river to maintain populations of the two birds (terns and plovers) is relatively straightforward: reproductive success can be modeled with some certainty as a direct, increasing function of exposed sandbar area. In contrast, the pallid sturgeon inhabits the benthic zone of a deep, turbid river and many parts of its complex life history are not directly observable. Hence, pervasive uncertainties exist about what factors are limiting population growth and what management actions may reverse population declines. These uncertainties are being addressed by the MRRP through a multi-step process. The first step was an Effects Analysis (EA), which: documented what is known and unknown about the river and the species; documented quality and quantity of existing information; used an expert-driven process to develop conceptual ecological models and to prioritize management hypotheses; and developed quantitative models linking management actions (flows, channel reconfigurations, and stocking) to population responses. The EA led to development of a science and adaptive-management plan with prioritized allocation of investment among 4 levels of effort ranging from fundamental research to full implementation. The plan includes learning from robust, hypothesis-driven effectiveness monitoring for all actions, with statistically sound experimental designs, multiple metrics, and explicit decision criteria to guide management. Finally, the science plan has been fully integrated with a new adaptive-management structure that links science to decision makers. 
The reinvigorated investment in science stems from the understanding that costly river-management decisions are not socially or politically supportable without better understanding of how this endangered fish will respond. While some hypotheses can be evaluated without actually implementing management actions in the river, assessing the effectiveness of other forms of habitat restoration requires in-river implementation within a rigorous experimental design.

  17. Rigorous proof for the nonlocal correlation function in the transverse Ising model with ring frustration.

    PubMed

    Dong, Jian-Jun; Zheng, Zhen-Yu; Li, Peng

    2018-01-01

    An unusual correlation function was conjectured by Campostrini et al. [Phys. Rev. E 91, 042123 (2015), doi:10.1103/PhysRevE.91.042123] for the ground state of a transverse Ising chain with geometrical frustration. Later, we provided a rigorous proof for it and demonstrated its nonlocal nature based on an evaluation of a Toeplitz determinant in the thermodynamic limit [J. Stat. Mech. (2016) 113102, doi:10.1088/1742-5468/2016/11/113102]. In this paper, we further prove that all the low excited energy states forming the gapless kink phase share the same asymptotic correlation function with the ground state. As a consequence, the thermal correlation function almost remains constant at low temperatures if one assumes a canonical ensemble.

  18. Research into Theory into Practice: An Overview of Family Based Interventions for Child Antisocial Behavior Developed at the Oregon Social Learning Center

    PubMed Central

    Fisher, Philip A.; Gilliam, Kathryn S.

    2017-01-01

    Although many psychotherapeutic approaches exist for treating troubled children and their families, not all have been shown to be effective through research. Moreover, among those that have been determined to be “evidence-based,” few have followed as coherent and rigorous a path of scientific investigation as the interventions that have been developed at the Oregon Social Learning Center. As such, these interventions serve as a model of “research to theory to practice” that may not only be employed to support families with children in need of treatment, but may also guide other programs of treatment development. This is the story of how this work has unfolded over the past four decades. PMID:29225459

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bodvarsson, G.S.; Pruess, K.; Stefansson, V.

    A detailed three-dimensional well-by-well model of the East Olkaria geothermal field in Kenya has been developed. The model matches reasonably well the flow rate and enthalpy data from all wells, as well as the overall pressure decline in the reservoir. The model is used to predict the generating capacity of the field, well decline, enthalpy behavior, the number of make-up wells needed and the effects of injection on well performance and overall reservoir depletion. 26 refs., 10 figs.

  20. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Effects of uncertainty and variability on population declines and IUCN Red List classifications.

    PubMed

    Rueda-Cediel, Pamela; Anderson, Kurt E; Regan, Tracey J; Regan, Helen M

    2018-01-22

    The International Union for Conservation of Nature (IUCN) Red List Categories and Criteria is a quantitative framework for classifying species according to extinction risk. Population models may be used to estimate extinction risk or population declines. Uncertainty and variability arise in threat classifications through measurement and process error in empirical data and uncertainty in the models used to estimate extinction risk and population declines. Furthermore, species traits are known to affect extinction risk. We investigated the effects of measurement and process error, model type, population growth rate, and age at first reproduction on the reliability of IUCN Red List classifications based on projected population declines. We used an age-structured population model to simulate true population trajectories with different growth rates, reproductive ages and levels of variation, and subjected them to measurement error. We evaluated the ability of scalar and matrix models parameterized with these simulated time series to accurately capture the IUCN Red List classification generated with true population declines. Under all levels of measurement error tested and low process error, classifications were reasonably accurate; scalar and matrix models yielded roughly the same rate of misclassifications, but the distribution of errors differed; matrix models overestimated extinction risk more often than they underestimated it; process error tended to contribute to misclassifications to a greater extent than measurement error; and more misclassifications occurred for fast, rather than slow, life histories. These results indicate that classifications of highly threatened taxa (i.e., taxa with low growth rates) under criterion A are more likely to be reliable than those of less threatened taxa when assessed with population models. 
Greater scrutiny needs to be placed on data used to parameterize population models for species with high growth rates, particularly when available evidence indicates a potential transition to higher risk categories. © 2018 Society for Conservation Biology.
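
    Under criterion A the category follows from the estimated decline by fixed thresholds, which is why error that pushes an estimate across a threshold produces a misclassification. A sketch using the criterion A2-A4 reduction thresholds (≥80% Critically Endangered, ≥50% Endangered, ≥30% Vulnerable); the study's age-structured simulations and error models are not reproduced here:

```python
# Map a projected population decline (percent) to an IUCN Red List
# category under the criterion A2-A4 reduction thresholds.
def redlist_criterion_a(decline_pct: float) -> str:
    if decline_pct >= 80:
        return "Critically Endangered"
    if decline_pct >= 50:
        return "Endangered"
    if decline_pct >= 30:
        return "Vulnerable"
    return "Not Threatened (under criterion A)"

# A misclassification occurs when the decline estimated from a fitted
# model crosses a threshold that the true decline does not.
true_decline, estimated_decline = 28.0, 33.0
print(redlist_criterion_a(true_decline))       # below the 30% threshold
print(redlist_criterion_a(estimated_decline))  # estimation error crosses it
```

    Because the thresholds are hard cut points, even modest process or measurement error near a boundary changes the category, which is the mechanism behind the misclassification rates discussed above.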

  2. Hydrology and digital simulation of the regional aquifer system, eastern Snake River Plain, Idaho

    USGS Publications Warehouse

    Garabedian, S.P.

    1992-01-01

    The transient model was used to simulate aquifer changes from 1981 to 2010 in response to three hypothetical development alternatives: (1) Continuation of 1980 hydrologic conditions, (2) increased pumpage, and (3) increased recharge. Simulation of continued 1980 hydrologic conditions for 30 years indicated that head declines of 2 to 8 feet might be expected in the central part of the plain. The magnitude of simulated head declines was consistent with head declines measured during the 1980 water year. Larger declines were calculated along model boundaries, but these changes may have resulted from underestimation of tributary drainage-basin underflow and inadequate aquifer definition. Simulation of increased ground-water pumpage (an additional 2,400 cubic feet per second) for 30 years indicated head declines of 10 to 50 feet in the central part of the plain. These relatively large head declines were accompanied by increased simulated river leakage of 50 percent and decreased spring discharge of 20 percent. The effect of increased recharge (800 cubic feet per second) for 30 years was a rise in simulated heads of 0 to 5 feet in the central part of the plain.

  3. A Study of the Behavior and Micromechanical Modelling of Granular Soil. Volume 3. A Numerical Investigation of the Behavior of Granular Media Using Nonlinear Discrete Element Simulation

    DTIC Science & Technology

    1991-05-22

    plasticity, including those of DiMaggio and Sandler (1971), Baladi and Rohani (1979), Lade (1977), Prevost (1978, 1985), Dafalias and Herrmann (1982). In...distribution can be achieved only if the behavior at the contact is fully understood and rigorously modelled.

  4. Terrestrial solar spectral modeling. [SOLTRAN, BRITE, and FLASH codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, R.E.

    The utility of accurate computer codes for calculating the solar spectral irradiance under various atmospheric conditions was recognized. New absorption and extraterrestrial spectral data are introduced. Progress is made in radiative transfer modeling outside of the solar community, especially for space and military applications. Three rigorous radiative transfer codes SOLTRAN, BRITE, and FLASH are employed. The SOLTRAN and BRITE codes are described and results from their use are presented.

  5. A Ninth Grade Student Transition Model: A Study of Student Perceptions Related to Rigor, Relevancy, and Relationships within a Ninth Grade Transitional Program

    ERIC Educational Resources Information Center

    Shimp, Timothy M.

    2017-01-01

    This descriptive case study is a qualitative investigation into the perceptions of tenth grade students who experienced a ninth-grade transitional model high school academy within a large PreK-12 suburban school district. Specifically, this study provided the opportunity to examine the success of one Ninth Grade Academy, identify areas of concern…

  6. Two Novel Methods and Multi-Mode Periodic Solutions for the Fermi-Pasta-Ulam Model

    NASA Astrophysics Data System (ADS)

    Arioli, Gianni; Koch, Hans; Terracini, Susanna

    2005-04-01

    We introduce two novel methods for studying periodic solutions of the FPU β-model, both numerically and rigorously. One is a variational approach, based on the dual formulation of the problem, and the other involves computer-assisted proofs. These methods are used e.g. to construct a new type of solutions, whose energy is spread among several modes, associated with closely spaced resonances.

  7. Free-Energy Fluctuations and Chaos in the Sherrington-Kirkpatrick Model

    NASA Astrophysics Data System (ADS)

    Aspelmeier, T.

    2008-03-01

    The sample-to-sample fluctuations ΔF_N of the free energy in the Sherrington-Kirkpatrick model are shown rigorously to be related to bond chaos. Via this connection, the fluctuations become analytically accessible by replica methods. The replica calculation for bond chaos shows that the exponent μ governing the growth of the fluctuations with system size N, ΔF_N ∼ N^μ, is bounded by μ ≤ 1/4.

  8. Coarse-Grained Lattice Model Simulations of Sequence-Structure Fitness of a Ribosome-Inactivating Protein

    DTIC Science & Technology

    2007-11-05

    limits of what is considered practical when applying all-atom molecular-dynamics simulation methods. Lattice models provide computationally robust...of expectation values from the density of states. All-atom molecular-dynamics simulations provide the most rigorous sampling method to generate con... molecular-dynamics simulations of protein folding [6-9], reported studies of computing a heat capacity or other calorimetric observables have been limited to

  9. A Developmental Test of Mertonian Anomie Theory.

    ERIC Educational Resources Information Center

    Menard, Scott

    1995-01-01

    Carefully reviewed Merton's writings on anomie theory to construct a more complete and rigorous test of the theory for respondents in early, middle, and late adolescence. Concluded that misspecified models of strain theory have underestimated the predictive power of strain theory in general and of anomie theory in particular. (JBJ)

  10. Child Forensic Interviewing in Children's Advocacy Centers: Empirical Data on a Practice Model

    ERIC Educational Resources Information Center

    Cross, Theodore P.; Jones, Lisa M.; Walsh, Wendy A.; Simone, Monique; Kolko, David

    2007-01-01

    Objective: Children's Advocacy Centers (CACs) aim to improve child forensic interviewing following allegations of child abuse by coordinating multiple investigations, providing child-friendly interviewing locations, and limiting redundant interviewing. This analysis presents one of the first rigorous evaluations of CACs' implementation of these…

  11. The Menu for Every Young Mathematician's Appetite

    ERIC Educational Resources Information Center

    Legnard, Danielle S.; Austin, Susan L.

    2012-01-01

    Math Workshop offers differentiated instruction to foster a deep understanding of rich, rigorous mathematics that is attainable by all learners. The inquiry-based model provides a menu of multilevel math tasks, within the daily math block, that focus on similar mathematical content. Math Workshop promotes a culture of engagement and…

  12. Tomorrow's Research Library: Vigor or Rigor Mortis?

    ERIC Educational Resources Information Center

    Hacken, Richard D.

    1988-01-01

    Compares, contrasts, and critiques predictions that have been made about the future of research libraries, focusing on the impact of technology on the library's role and users' needs. The discussion includes models for the adaptation of new technologies that may assist in library planning and change. (38 references) (CLB)

  13. Rigorous Mathematical Modeling of the Adsorption System with Electrothermal Regeneration of the Used Adsorbent

    DTIC Science & Technology

    2003-09-29

  14. Anticipating and Incorporating Stakeholder Feedback When Developing Value-Added Models

    ERIC Educational Resources Information Center

    Balch, Ryan; Koedel, Cory

    2014-01-01

    State and local education agencies across the United States are increasingly adopting rigorous teacher evaluation systems. Most systems formally incorporate teacher performance as measured by student test-score growth, sometimes by state mandate. An important consideration that will influence the long-term persistence and efficacy of these systems…

  15. A Transformative Model for Undergraduate Quantitative Biology Education

    ERIC Educational Resources Information Center

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…

  16. Bouncing Back: Erikson, Maslow and Recovery from Divorce.

    ERIC Educational Resources Information Center

    Charnofsky, Stan

    Counseling for recovery from divorce may be significantly enhanced if a general model of emotional health/deficiency can be applied. This article introduces an amalgam of Erik Erikson's developmental stages and Abraham Maslow's motivational hierarchy as a means of understanding the rigors of marital dissolution. The paradigm promotes client…

  17. Thermodynamic theory of intrinsic finite-size effects in PbTiO3 nanocrystals. I. Nanoparticle size-dependent tetragonal phase stability

    NASA Astrophysics Data System (ADS)

    Akdogan, E. K.; Safari, A.

    2007-03-01

    We propose a phenomenological intrinsic finite-size effect model for single-domain, mechanically free, and surface-charge-compensated PbTiO3 nanocrystals in ΔG-P⃗s-ξ space, which rigorously describes the decrease in tetragonal phase stability with decreasing particle size ξ.

  18. Bayesian Decision Theory Guiding Educational Decision-Making: Theories, Models and Application

    ERIC Educational Resources Information Center

    Pan, Yilin

    2016-01-01

    Given the importance of education and the growing public demand for improving education quality under tight budget constraints, there has been an emerging movement to call for research-informed decisions in educational resource allocation. Despite the abundance of rigorous studies on the effectiveness, cost, and implementation of educational…

  19. Accumulating Knowledge: When Are Reading Intervention Results Meaningful?

    ERIC Educational Resources Information Center

    Fletcher, Jack M.; Wagner, Richard K.

    2014-01-01

    The three target articles provide examples of intervention studies that are excellent models for the field. They rely on rigorous and elegant designs, the interventions are motivated by attention to underlying theoretical mechanisms, and longitudinal designs are used to examine the duration of effects of interventions that occur. When studies are…

  20. Exploration of the Maximum Entropy/Optimal Projection Approach to Control Design Synthesis for Large Space Structures.

    DTIC Science & Technology

    1985-02-01

Energy Analysis, a branch of dynamic modal analysis developed for analyzing acoustic vibration problems, its present stage of development embodies a...Maximum Entropy Stochastic Modelling and Reduced-Order Design Synthesis is a rigorous new approach to this class of problems. Inspired by Statistical...

  1. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    ERIC Educational Resources Information Center

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  2. Probabilistic Space Weather Forecasting: a Bayesian Perspective

    NASA Astrophysics Data System (ADS)

    Camporeale, E.; Chandorkar, M.; Borovsky, J.; Care', A.

    2017-12-01

Most space weather forecasts, at both the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven machine learning algorithms to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to finite-precision implementations of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is to embrace a Bayesian probabilistic approach. Among the simplest and most robust machine learning techniques in the Bayesian framework are Gaussian Process (GP) regression and classification. Here, we present the application of Gaussian Processes to the problems of forecasting the Dst geomagnetic index, classifying the solar wind type, and estimating diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we achieve excellent results on all of the standard metrics used to evaluate our models, at very modest computational cost.
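The abstract names Gaussian Process regression as its core tool. As a minimal sketch (not the authors' implementation, and using an invented toy series rather than Dst data), a GP regressor with a squared-exponential kernel returns exactly the kind of predictive distribution described above: a mean plus an uncertainty that widens away from the observations.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Posterior predictive mean and standard deviation of a zero-mean GP."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_test, x_train)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Toy "index" series: the GP interpolates the observations and widens
# its predictive interval far from them (std -> prior std of 1).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, -0.5, -1.2, -0.8])
mean, std = gp_predict(x, y, np.array([1.0, 10.0]))
```

The predictive distribution, not just the point forecast, is what feeds the ensemble simulations mentioned in the abstract.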

  3. Protective effect of HOE642, a selective blocker of Na+-H+ exchange, against the development of rigor contracture in rat ventricular myocytes.

    PubMed

    Ruiz-Meana, M; Garcia-Dorado, D; Juliá, M; Inserte, J; Siegmund, B; Ladilov, Y; Piper, M; Tritto, F P; González, M A; Soler-Soler, J

    2000-01-01

The objective of this study was to investigate the effect of Na+-H+ exchange (NHE) and HCO3--Na+ symport inhibition on the development of rigor contracture. Freshly isolated adult rat cardiomyocytes were subjected to 60 min of metabolic inhibition (MI) and 5 min of re-energization (Rx). The effects of perfusion with HCO3--containing or HCO3--free buffer, with or without the NHE inhibitor HOE642 (7 microM), were investigated during MI and Rx. In HCO3--free conditions, HOE642 reduced the percentage of cells developing rigor during MI from 79 +/- 1% to 40 +/- 4% (P < 0.001) without modifying the time at which rigor appeared. This resulted in a 30% reduction of hypercontracture during Rx (P < 0.01). The presence of HCO3- abolished the protective effect of HOE642 against rigor. Cells that had developed rigor underwent hypercontracture during Rx independently of treatment allocation. Ratiometric fluorescence measurements demonstrated that the rise in cytosolic Ca2+ (fura-2) occurred only after the onset of rigor and was not influenced by HOE642. NHE inhibition did not modify the Na+ rise (SBFI) during MI, but exaggerated the initial fall of intracellular pH (BCECF). In conclusion, HOE642 has a protective effect against rigor during energy deprivation, but only when HCO3--dependent transporters are inhibited. This effect is independent of changes in cytosolic Na+ or Ca2+ concentrations.

  4. Simulation of groundwater flow in the Edwards-Trinity and related aquifers in the Pecos County region, Texas

    USGS Publications Warehouse

    Clark, Brian R.; Bumgarner, Johnathan R.; Houston, Natalie A.; Foster, Adam L.

    2014-01-01

The model was used to simulate groundwater-level altitudes resulting from prolonged pumping to evaluate the sustainability of current and projected water-use demands. Each of the three scenarios was run as a continuation of the calibrated model. Scenario 1 extended recent (2008) irrigation and nonirrigation pumping values for a 30-year period from 2010 to 2040. Projected groundwater levels in and around the Fort Stockton area changed little under scenario 1 from current conditions, indicating that the groundwater system is near equilibrium with respect to recent (2008) pumping stress. Projected groundwater-level declines in the eastern part of the model area ranging from 5.0 to 15.0 feet are likely the result of nonequilibrium conditions associated with recent increases in pumping after a prolonged water-level recovery period of little or no pumping. Projected groundwater-level declines (from 15.0 to 31.0 feet) occurred in localized areas by the end of scenario 1 in the Leon-Belding area. Scenario 2 evaluated the effects of extending the recent (2008) pumping rates assigned in scenario 1 combined with year-round maximum permitted pumping rates in the Belding area. Results of scenario 2 are similar in water-level decline and extent to those of scenario 1. The extent of the projected groundwater-level decline in the range from 5.0 to 15.0 feet in the Leon-Belding irrigation area expanded slightly (about a 2-percent increase) from that of scenario 1. Maximum projected groundwater-level declines in the Leon-Belding irrigation area were approximately 31.3 feet in small isolated areas. Scenario 3 evaluated the effects of periodic increases in pumping rates over the 30-year extended period.
Results of scenario 3 are similar to those of scenario 2 in terms of the areas of groundwater-level decline; however, the maximum projected groundwater-level decline increased to approximately 34.5 feet in the Leon-Belding area, and the extent of the decline was larger in area (about a 17-percent increase) than that of scenario 2. Additionally, the area of projected groundwater-level declines in the eastern part of the model area increased from that of scenario 2: two individual areas of decline coalesced into one larger area. The localized nature of the projected groundwater-level declines reflects the high degree of fracture control on storage and hydraulic conductivity in the Edwards-Trinity aquifer. Additionally, the finding that simulated spring flow is highly dependent on the transient nature of hydraulic heads in the underlying aquifer indicates the importance of adequately understanding and characterizing the entire groundwater system.

  5. Interface Pattern Selection in Directional Solidification

    NASA Technical Reports Server (NTRS)

    Trivedi, Rohit; Tewari, Surendra N.

    2001-01-01

The central focus of this research is to establish the key scientific concepts that govern the selection of cellular and dendritic patterns during the directional solidification of alloys. Ground-based studies have established that the conditions under which cellular and dendritic microstructures form are precisely those where convection effects are dominant in bulk samples. Thus, experimental data cannot be obtained terrestrially under a purely diffusive regime. Furthermore, reliable theoretical models that quantitatively incorporate fluid flow into the pattern selection criterion are not yet possible. Consequently, microgravity experiments on cellular and dendritic growth are designed to obtain benchmark data under diffusive growth conditions that can be quantitatively analyzed and compared with a rigorous theoretical model to establish the fundamental principles that govern the selection of a specific microstructure and its length scales. In a cellular structure, different cells in an array are strongly coupled, so that cellular pattern evolution is controlled by complex interactions between thermal diffusion, solute diffusion, and interface effects. These interactions admit an infinity of solutions, from which the system selects only a narrow band. The aim of this investigation is to obtain benchmark data and develop a rigorous theoretical model that will allow us to quantitatively establish the physics of this selection process.

  6. Effects of illegal harvest of eggs on the population decline of leatherback turtles in Las Baulas Marine National Park, Costa Rica.

    PubMed

    Tomillo, Pilar Santidrián; Saba, Vincent S; Piedra, Rotney; Paladino, Frank V; Spotila, James R

    2008-10-01

    Within 19 years the nesting population of leatherback turtles (Dermochelys coriacea) at Parque Nacional Marino Las Baulas declined from 1500 turtles nesting per year to about 100. We analyzed the effects of fishery bycatch and illegal harvesting (poaching) of eggs on this population. We modeled the population response to different levels of egg harvest (90, 75, 50, and 25%) and the effect of eradicating poaching at different times during the population decline. We compared effects of 90% poaching with those of 20% adult mortality because both of these processes were present in the population at Las Baulas. There was a stepwise decline in number of nesting turtles at all levels of egg harvest. Extirpation times for different levels of poaching ranged from 45 to 282 years. The nesting population declined more slowly and survived longer with 20% adult mortality (146 years) than it did with 90% poaching (45 years). Time that elapsed until poaching stopped determined the average population size at which the population stabilized, ranging from 90 to 420 nesting turtles. Our model predicted that saving clutches lost naturally would restore the population when adult mortality rates were low and would contribute more to population recovery when there were short remigration intervals between nesting seasons and a large proportion of natural loss of clutches. Because the model indicated that poaching was the most important cause of the leatherback decline at Las Baulas, protecting nests on the beach and protecting the beach from development are critical for survival of this population. Nevertheless, the model predicted that current high mortality rates of adults will prevent population recovery. Therefore, protection of the beach habitat and nests must be continued and fishery bycatch must be reduced to save this population.
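The published model itself is not reproduced in the abstract; the sketch below is a deliberately simplified annual projection with hypothetical vital rates (the survival and recruitment numbers are assumptions, not the study's estimates). It reproduces only the qualitative point that heavier egg harvest accelerates the decline of the nesting population.

```python
def project_nesters(n0, adult_survival, recruit_rate, egg_harvest, years):
    """Toy annual projection of nesting females.

    Recruitment is proportional to the fraction of eggs escaping harvest.
    All rates here are illustrative assumptions, not published vital rates.
    """
    n = float(n0)
    trajectory = [n]
    for _ in range(years):
        n = n * adult_survival + n * recruit_rate * (1.0 - egg_harvest)
        trajectory.append(n)
    return trajectory

# 19-year projections from 1500 nesters under light vs heavy egg harvest.
light = project_nesters(1500, adult_survival=0.9, recruit_rate=0.08,
                        egg_harvest=0.25, years=19)
heavy = project_nesters(1500, adult_survival=0.9, recruit_rate=0.08,
                        egg_harvest=0.90, years=19)
```

With these assumed rates, both populations decline, but the heavily harvested one falls much faster, mirroring the stepwise declines the authors report across harvest levels.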

  7. Relationship of Having Hobbies and a Purpose in Life With Mortality, Activities of Daily Living, and Instrumental Activities of Daily Living Among Community-Dwelling Elderly Adults

    PubMed Central

    Tomioka, Kimiko; Kurumatani, Norio; Hosoi, Hiroshi

    2016-01-01

Background: This study’s aim was to clarify the relationship of having hobbies and a purpose in life (PIL; in Japanese, ikigai) with mortality and a decline in the activities of daily living (ADL) and instrumental ADL (IADL) among the community-dwelling elderly. Methods: Prospective observational data from residents aged ≥65 years who were at increased risk for death (n = 1853) and developing a decline in ADL (n = 1254) and IADL (n = 1162) were analyzed. Cox proportional hazard models were used for mortality analysis of data from February 2011 to November 2014. ADL and IADL were evaluated using the Barthel Index and the Tokyo Metropolitan Institute of Gerontology Index of Competence, respectively. ADL and IADL were assessed at baseline and follow-up and were evaluated using logistic regression models. Fully adjusted models included terms for age, gender, BMI, income, alcohol intake, smoking history, number of chronic diseases, cognitive function, and depression. Results: During the follow-up of eligible participants, 248 had died, 119 saw a decline in ADL, and 178 saw a decline in IADL. In fully adjusted models, having neither hobbies nor PIL was significantly associated with an increased risk of mortality (hazard ratio 2.08; 95% confidence interval [CI], 1.47–2.94), decline in ADL (odds ratio 2.74; 95% CI, 1.44–5.21), and decline in IADL (odds ratio 1.89; 95% CI, 1.01–3.55) compared to having both hobbies and PIL. Conclusions: Although effect modifications by cognitive functioning and depression cannot be ruled out, our findings suggest that having hobbies and PIL may extend not only longevity, but also healthy life expectancy among community-dwelling older adults. PMID:26947954

  8. Relationship of Having Hobbies and a Purpose in Life With Mortality, Activities of Daily Living, and Instrumental Activities of Daily Living Among Community-Dwelling Elderly Adults.

    PubMed

    Tomioka, Kimiko; Kurumatani, Norio; Hosoi, Hiroshi

    2016-07-05

    This study's aim was to clarify the relationship of having hobbies and a purpose in life (PIL; in Japanese, ikigai) with mortality and a decline in the activities of daily living (ADL) and instrumental ADL (IADL) among the community-dwelling elderly. Prospective observational data from residents aged ≥65 years who were at increased risk for death (n = 1853) and developing a decline in ADL (n = 1254) and IADL (n = 1162) were analyzed. Cox proportional hazard models were used for mortality analysis of data from February 2011 to November 2014. ADL and IADL were evaluated using the Barthel Index and the Tokyo Metropolitan Institute of Gerontology Index of Competence, respectively. ADL and IADL were assessed at baseline and follow-up and were evaluated using logistic regression models. Fully adjusted models included terms for age, gender, BMI, income, alcohol intake, smoking history, number of chronic diseases, cognitive function, and depression. During the follow-up of eligible participants, 248 had died, 119 saw a decline in ADL, and 178 saw a decline in IADL. In fully adjusted models, having neither hobbies nor PIL was significantly associated with an increased risk of mortality (hazard ratio 2.08; 95% confidence interval [CI], 1.47-2.94), decline in ADL (odds ratio 2.74; 95% CI, 1.44-5.21), and decline in IADL (odds ratio 1.89; 95% CI, 1.01-3.55) compared to having both hobbies and PIL. Although effect modifications by cognitive functioning and depression cannot be ruled out, our findings suggest that having hobbies and PIL may extend not only longevity, but also healthy life expectancy among community-dwelling older adults.
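The odds ratios with 95% confidence intervals quoted above come from adjusted logistic regression models. As a cruder illustration of how such an interval is formed, the sketch below computes an odds ratio and its Woolf-type (log-odds) 95% CI from an unadjusted 2×2 table; the counts are invented for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only: 30/70 declined among the "neither hobbies
# nor PIL" group vs 15/85 among the "both" group.
or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
```

An interval whose lower bound stays above 1 (as here, and as in the ADL and IADL results above) is what "significantly associated" refers to at the 5% level.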

  9. Understanding the temporal dimension of the red-edge spectral region for forest decline detection using high-resolution hyperspectral and Sentinel-2a imagery.

    PubMed

    Zarco-Tejada, P J; Hornero, A; Hernández-Clemente, R; Beck, P S A

    2018-03-01

The operational monitoring of forest decline requires the development of remote sensing methods that are sensitive to the spatiotemporal variations of pigment degradation and canopy defoliation. In this context, the red-edge spectral region (RESR) was proposed in the past due to its combined sensitivity to chlorophyll content and leaf area variation. In this study, the temporal dimension of the RESR was evaluated as a function of forest decline using a radiative transfer method with the PROSPECT and 3D FLIGHT models. These models were used to generate synthetic pine stands simulating decline and recovery processes over time and to explore the temporal rate of change of the red-edge chlorophyll index (CI) as compared to the trajectories obtained for the structure-related Normalized Difference Vegetation Index (NDVI). The temporal trend method proposed here consisted of using synthetic spectra to calculate the theoretical boundaries of the subspace for healthy and declining pine trees in the temporal domain, defined by CI(t=n)/CI(t=n+1) vs. NDVI(t=n)/NDVI(t=n+1). Within these boundaries, trees undergoing decline and recovery processes showed different trajectories through this subspace. The method was then validated using three high-resolution airborne hyperspectral images acquired at 40 cm resolution with 260 spectral bands of 6.5 nm full-width at half-maximum (FWHM) over a forest with widespread tree decline, along with field-based monitoring of chlorosis and defoliation (i.e., 'decline' status) in 663 trees between the years 2015 and 2016. The temporal rate of change of chlorophyll vs. structural indices, based on reflectance spectra extracted from the hyperspectral images, was different for trees undergoing decline, and aligned towards the decline baseline established using the radiative transfer models. By contrast, healthy trees over time aligned towards the theoretically obtained healthy baseline.
The applicability of this temporal trend method to the red-edge bands of the MultiSpectral Imager (MSI) instrument on board Sentinel-2a for operational forest status monitoring was also explored by comparing the temporal rate of change of the Sentinel-2-derived CI over areas with declining and healthy trees. Results demonstrated that the Sentinel-2a red-edge region was sensitive to the temporal dimension of forest condition, as the relationships obtained for pixels in healthy condition deviated from those of pixels undergoing decline.
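A minimal sketch of the temporal-ratio idea follows, assuming one common formulation of the red-edge chlorophyll index (CI = NIR/red-edge − 1) and purely illustrative reflectances; the actual band definitions and thresholds in the study are not reproduced here.

```python
def chlorophyll_index(nir, red_edge):
    # Red-edge chlorophyll index, CI = NIR / red-edge - 1
    # (one common formulation; the exact bands used are an assumption here).
    return nir / red_edge - 1.0

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def temporal_ratios(t0, t1):
    """Point in the CI(t=n)/CI(t=n+1) vs NDVI(t=n)/NDVI(t=n+1) subspace
    for one tree, given band reflectances at two consecutive dates."""
    ci_ratio = (chlorophyll_index(t0["nir"], t0["re"])
                / chlorophyll_index(t1["nir"], t1["re"]))
    ndvi_ratio = ndvi(t0["nir"], t0["red"]) / ndvi(t1["nir"], t1["red"])
    return ci_ratio, ndvi_ratio

# A tree losing chlorophyll (red-edge reflectance rising) while largely
# keeping its canopy structure moves faster along the CI axis than the
# NDVI axis, which is the signature of decline in this subspace.
year_n = {"nir": 0.45, "re": 0.30, "red": 0.05}
year_n1 = {"nir": 0.44, "re": 0.36, "red": 0.07}
ci_r, ndvi_r = temporal_ratios(year_n, year_n1)
```

For a healthy, stable tree both ratios stay near 1; declining trees drift toward large CI ratios, which is how trajectories separate toward the decline vs healthy baselines.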

  10. To Your Health: NLM update transcript - Improving medical research rigor?

    MedlinePlus

    ... be a well-tailored solution to enhance the quantitative rigor of medical research, suggests a viewpoint recently published in the Journal ... about 96 percent of medical and public health research articles (that report ... more quantitative rigor would attract widespread attention — if not high ...

  11. Evaluating Rigor in Qualitative Methodology and Research Dissemination

    ERIC Educational Resources Information Center

    Trainor, Audrey A.; Graue, Elizabeth

    2014-01-01

    Despite previous and successful attempts to outline general criteria for rigor, researchers in special education have debated the application of rigor criteria, the significance or importance of small n research, the purpose of interpretivist approaches, and the generalizability of qualitative empirical results. Adding to these complications, the…

  12. Looking for age-related growth decline in natural forests: unexpected biomass patterns from tree rings and simulated mortality

    USGS Publications Warehouse

    Foster, Jane R.; D'Amato, Anthony W.; Bradford, John B.

    2014-01-01

Forest biomass growth is almost universally assumed to peak early in stand development, near canopy closure, after which it will plateau or decline. The chronosequence and plot-remeasurement approaches used to establish the decline pattern suffer from limitations and coarse temporal detail. We combined annual tree-ring measurements and mortality models to address two questions: First, how do assumptions about tree growth and mortality influence reconstructions of biomass growth? Second, under what circumstances does biomass production follow the model that peaks early, then declines? We integrated three stochastic mortality models with a census tree-ring data set from eight temperate forest types in Minnesota, USA, to reconstruct stand-level biomass increments. We compared growth patterns among mortality models, forest types, and stands. The timing of peak biomass growth varied significantly among mortality models, peaking 20–30 years earlier when mortality was random with respect to tree growth and size than when mortality favored slow-growing individuals. Random or U-shaped mortality (highest in small or large trees) produced peak growth 25–30% higher than the surviving-tree sample alone. Growth trends for even-aged, monospecific Pinus banksiana or Acer saccharum forests were similar to the early-peak-and-decline expectation. However, we observed continually increasing biomass growth in older, low-productivity forests of Quercus rubra, Fraxinus nigra, and Thuja occidentalis. Tree-ring reconstructions estimated annual changes in live biomass growth and identified more diverse development patterns than previous methods. These detailed, long-term patterns of biomass development are crucial for detecting recent growth responses to global change and modeling future forest dynamics.
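The contrast between mortality assumptions can be sketched with a toy stand: when mortality targets slow growers, the surviving-tree sample overstates mean growth relative to the full stand, which is why reconstructions built only from survivors diverge from those that add simulated deaths back in. The growth values and the "slow" rule below are illustrative assumptions, not the study's mortality models.

```python
import random

def pick_deaths(growth, n_die, mode, rng):
    """Indices of trees that die under two stochastic mortality models:
    'random' ignores growth; 'slow' removes the slowest growers
    (a crude stand-in for growth-dependent mortality)."""
    if mode == "slow":
        return sorted(range(len(growth)), key=lambda i: growth[i])[:n_die]
    return rng.sample(range(len(growth)), n_die)

def survivor_mean(growth, dead):
    """Mean growth of the trees that remain after mortality."""
    alive = [g for i, g in enumerate(growth) if i not in set(dead)]
    return sum(alive) / len(alive)

rng = random.Random(42)
growth = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6]  # ring-width increments
full_mean = sum(growth) / len(growth)              # whole-stand mean
slow_bias = survivor_mean(growth, pick_deaths(growth, 3, "slow", rng))
random_mean = survivor_mean(growth, pick_deaths(growth, 3, "random", rng))
```

Under growth-biased mortality the survivor mean sits well above the stand mean; under random mortality it is unbiased on average, matching the direction of the 25–30% difference the abstract reports.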

  13. Age-dependent cognitive impairment in a Drosophila fragile X model and its pharmacological rescue.

    PubMed

    Choi, Catherine H; McBride, Sean M J; Schoenfeld, Brian P; Liebelt, David A; Ferreiro, David; Ferrick, Neal J; Hinchey, Paul; Kollaros, Maria; Rudominer, Rebecca L; Terlizzi, Allison M; Koenigsberg, Eric; Wang, Yan; Sumida, Ai; Nguyen, Hanh T; Bell, Aaron J; McDonald, Thomas V; Jongens, Thomas A

    2010-06-01

    Fragile X syndrome afflicts 1 in 2,500 individuals and is the leading heritable cause of mental retardation worldwide. The overriding clinical manifestation of this disease is mild to severe cognitive impairment. Age-dependent cognitive decline has been identified in Fragile X patients, although it has not been fully characterized nor examined in animal models. A Drosophila model of this disease has been shown to display phenotypes bearing similarity to Fragile X symptoms. Most notably, we previously identified naive courtship and memory deficits in young adults with this model that appear to be due to enhanced metabotropic glutamate receptor (mGluR) signaling. Herein we have examined age-related cognitive decline in the Drosophila Fragile X model and found an age-dependent loss of learning during training. We demonstrate that treatment with mGluR antagonists or lithium can prevent this age-dependent cognitive impairment. We also show that treatment with mGluR antagonists or lithium during development alone displays differential efficacy in its ability to rescue naive courtship, learning during training and memory in aged flies. Furthermore, we show that continuous treatment during aging effectively rescues all of these phenotypes. These results indicate that the Drosophila model recapitulates the age-dependent cognitive decline observed in humans. This places Fragile X in a category with several other diseases that result in age-dependent cognitive decline. This demonstrates a role for the Drosophila Fragile X Mental Retardation Protein (dFMR1) in neuronal physiology with regard to cognition during the aging process. Our results indicate that misregulation of mGluR activity may be causative of this age onset decline and strengthens the possibility that mGluR antagonists and lithium may be potential pharmacologic compounds for counteracting several Fragile X symptoms.

  14. Using Functional Data Analysis Models to Estimate Future Time Trends in Age-Specific Breast Cancer Mortality for the United States and England–Wales

    PubMed Central

    Erbas, Bircan; Akram, Muhammed; Gertig, Dorota M; English, Dallas; Hopper, John L.; Kavanagh, Anne M; Hyndman, Rob

    2010-01-01

Background: Mortality/incidence predictions are used for allocating public health resources and should accurately reflect age-related changes through time. We present a new forecasting model for estimating future trends in age-related breast cancer mortality for the United States and England–Wales. Methods: We used functional data analysis techniques both to model breast cancer mortality-age relationships in the United States from 1950 through 2001 and England–Wales from 1950 through 2003 and to estimate 20-year predictions using a new forecasting method. Results: In the United States, trends for women aged 45 to 54 years have continued to decline since 1980. In contrast, trends in women aged 60 to 84 years increased in the 1980s and declined in the 1990s. For England–Wales, trends for women aged 45 to 74 years slightly increased before 1980, but declined thereafter. The greatest age-related changes for both regions were during the 1990s. For both the United States and England–Wales, trends are expected to decline and then stabilize, with the greatest decline in women aged 60 to 70 years. Forecasts suggest relatively stable trends for women older than 75 years. Conclusions: Prediction of age-related changes in mortality/incidence can be used for planning and targeting programs for specific age groups. Currently, these models are being extended to incorporate other variables that may influence age-related changes in mortality/incidence trends. In their current form, these models will be most useful for modeling and projecting future trends of diseases for which there has been very little advancement in treatment and minimal cohort effects (e.g., lethal cancers). PMID:20139657

  15. Predicting chemically-induced skin reactions. Part I: QSAR models of skin sensitization and their application to identify potentially hazardous compounds

    PubMed Central

    Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander

    2015-01-01

Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use these data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented in our predictive QSAR workflow, using the random forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79%, respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in the publicly available VEGA software, our models showed significantly higher prediction accuracy for the same sets of external compounds, as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the ScoreCard database of possible skin or sense organ toxicants as primary candidates for experimental validation. PMID:25560674
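The Correct Classification Rate used to score these models is the mean of sensitivity and specificity (i.e., balanced accuracy), which is robust to the class imbalance typical of sensitizer datasets. A minimal sketch with illustrative labels (not the study's predictions):

```python
def correct_classification_rate(y_true, y_pred):
    """CCR (balanced accuracy): mean of sensitivity and specificity."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    pos = sum(1 for t in y_true if t == 1)
    neg = len(y_true) - pos
    return 0.5 * (tp / pos + tn / neg)

# 1 = sensitizer, 0 = non-sensitizer (illustrative labels only).
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
ccr = correct_classification_rate(y_true, y_pred)
```

Because each class contributes equally, a model that simply predicts the majority class scores only 0.5, unlike plain accuracy.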

  16. Lyman-α Models for LRO LAMP from MESSENGER MASCS and SOHO SWAN Data

    NASA Astrophysics Data System (ADS)

    Pryor, Wayne R.; Holsclaw, Gregory M.; McClintock, William E.; Snow, Martin; Vervack, Ronald J.; Gladstone, G. Randall; Stern, S. Alan; Retherford, Kurt D.; Miles, Paul F.

From models of the interplanetary Lyman-α glow derived from Mercury Atmospheric and Surface Composition Spectrometer (MASCS) interplanetary Lyman-α data obtained in 2009-2011 on the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft mission, daily all-sky Lyman-α maps were generated for use by the Lyman Alpha Mapping Project (LAMP) experiment on the Lunar Reconnaissance Orbiter (LRO). These models were then compared with Solar and Heliospheric Observatory (SOHO) Solar Wind ANisotropies (SWAN) Lyman-α maps when available. Although the empirical agreement across the sky between the scaled model and the SWAN maps is adequate for LAMP mapping purposes, the model brightness values agree best with the SWAN values in 2008 and 2009. SWAN's observations show a systematic decline in 2010 and 2011 relative to the model. It is not clear whether the decline represents a failure of the model or a decline in SWAN's sensitivity in 2010 and 2011. The MESSENGER MASCS and SOHO SWAN Lyman-α calibrations differ systematically in comparison with the model, with MASCS reporting Lyman-α values some 30% lower than SWAN.

  17. Programmable lithography engine (ProLE) grid-type supercomputer and its applications

    NASA Astrophysics Data System (ADS)

    Petersen, John S.; Maslow, Mark J.; Gerold, David J.; Greenway, Robert T.

    2003-06-01

There are many variables that can affect lithography-dependent device yield. Because of this, it is not enough to make optical proximity corrections (OPC) based on the mask type, wavelength, lens, illumination type, and coherence. Resist chemistry and physics, along with substrate, exposure, and all post-exposure processing, must be considered too. Only a holistic approach to finding imaging solutions will accelerate yield and maximize performance. Since experiments are too costly in both time and money, accomplishing this takes massive amounts of accurate simulation capability. Our solution is to create a workbench with a set of advanced user applications that utilize best-in-class simulator engines for solving litho-related DFM problems using distributive computing. Our product, ProLE (Programmable Lithography Engine), is an integrated system that combines Petersen Advanced Lithography, Inc.'s (PAL's) proprietary applications and cluster-management software wrapped around commercial software engines, along with optional commercial hardware and software. It uses the most rigorous lithography simulation engines to solve deep sub-wavelength imaging problems accurately and at speeds that are several orders of magnitude faster than current methods. Specifically, ProLE uses full-vector thin-mask aerial image models or, when needed, full across-source 3D electromagnetic field simulation to make accurate aerial image predictions, along with calibrated resist models. The ProLE workstation from Petersen Advanced Lithography, Inc., is the first commercial product that makes it possible to do these intensive calculations in a fraction of the time previously required, thus significantly reducing time to market for advanced-technology devices.
In this work, ProLE is introduced through model comparisons that show why vector imaging and rigorous resist models work better than less rigorous models; some applications that use our distributive computing solution are then shown. Topics covered include why ProLE solutions are needed from economic and technical standpoints, a high-level discussion of how the distributive system works, speed benchmarking, and finally a brief survey of applications, including advanced aberrations for lens-sensitivity and flare studies, optical proximity correction for a bitcell, and an application that allows evaluation of a design's potential for systematic failures during fabrication.

  18. The utility of estimating population-level trajectories of terminal wellbeing decline within a growth mixture modelling framework.

    PubMed

    Burns, R A; Byles, J; Magliano, D J; Mitchell, P; Anstey, K J

    2015-03-01

Mortality-related decline has been identified across multiple domains of human functioning, including mental health and wellbeing. The current study utilised a growth mixture modelling framework to establish whether a single population-level trajectory best describes mortality-related changes in both wellbeing and mental health, or whether subpopulations report quite different mortality-related changes. Participants were older-aged (M = 69.59 years; SD = 8.08 years) deceased females (N = 1,862) from the Dynamic Analyses to Optimise Ageing (DYNOPTA) project. Growth mixture models analysed participants' responses on measures of mental health and wellbeing for up to 16 years before death. Multi-level models confirmed overall terminal decline and terminal drop in both mental health and wellbeing. However, modelling data from the same participants within a latent-class growth mixture framework indicated that most participants reported stability in mental health (90.3%) and wellbeing (89.0%) in the years preceding death. Whilst confirming other population-level analyses that support the terminal decline and drop hypotheses in both mental health and wellbeing, we subsequently identified that most of this effect is driven by a small but significant minority of the population. Instead, most individuals report stable levels of mental health and wellbeing in the years preceding death.
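A full growth mixture model estimates latent trajectory classes jointly (typically by EM over class memberships and per-class growth curves). As a crude stand-in for that idea, one can fit each person's least-squares slope and count how many fall in a "stable" class; the threshold and the trajectories below are illustrative assumptions, not the study's model or data.

```python
def ols_slope(ys):
    """Least-squares slope of a series against time 0..n-1."""
    n = len(ys)
    xm = (n - 1) / 2.0
    ym = sum(ys) / n
    num = sum((x - xm) * (y - ym) for x, y in enumerate(ys))
    den = sum((x - xm) ** 2 for x in range(n))
    return num / den

def share_stable(trajectories, threshold=-0.5):
    """Fraction of people whose wellbeing slope sits above a decline threshold."""
    slopes = [ols_slope(t) for t in trajectories]
    return sum(1 for s in slopes if s > threshold) / len(slopes)

# Illustrative wellbeing trajectories over five waves before death.
people = [
    [70, 69, 71, 70, 70],   # stable
    [68, 68, 67, 68, 69],   # stable
    [72, 70, 66, 61, 55],   # steep terminal decline
    [71, 71, 70, 71, 70],   # stable
]
stable_fraction = share_stable(people)
```

The qualitative point matches the abstract: a population-average slope can look like terminal decline even when most individual trajectories are flat and a small minority declines steeply.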

  19. Quantifying temporal trends in fisheries abundance using Bayesian dynamic linear models: A case study of riverine Smallmouth Bass populations

    USGS Publications Warehouse

    Schall, Megan K.; Blazer, Vicki S.; Lorantas, Robert M.; Smith, Geoffrey; Mullican, John E.; Keplinger, Brandon J.; Wagner, Tyler

    2018-01-01

    Detecting temporal changes in fish abundance is an essential component of fisheries management. Because of the need to understand short-term and nonlinear changes in fish abundance, traditional linear models may not provide adequate information for management decisions. This study highlights the utility of Bayesian dynamic linear models (DLMs) as a tool for quantifying temporal dynamics in fish abundance. To achieve this goal, we quantified temporal trends of Smallmouth Bass Micropterus dolomieu catch per effort (CPE) from rivers in the mid-Atlantic states, and we calculated annual probabilities of decline from the posterior distributions of annual rates of change in CPE. We were interested in annual declines because of recent concerns about fish health in portions of the study area. In general, periods of decline were greatest within the Susquehanna River basin, Pennsylvania. The declines in CPE began in the late 1990s (prior to observations of fish health problems) and began to stabilize toward the end of the time series (2011). In contrast, many of the other rivers investigated did not have the same magnitude or duration of decline in CPE. Bayesian DLMs provide information about annual changes in abundance that can inform management and are easily communicated with managers and stakeholders.
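
    The simplest DLM, a local-level model (random-walk level plus observation noise), can be filtered with a plain Kalman recursion to recover annual rates of change like those described above. This is a hedged sketch, not the authors' model: the CPE series, years, and variance parameters below are invented, and the published analysis estimated its parameters within a full Bayesian framework rather than fixing them.

    ```python
    import numpy as np

    # Synthetic annual CPE index: decline through the late 1990s, then
    # stabilising, mirroring the qualitative pattern described (values invented).
    years = np.arange(1995, 2012)
    true_level = np.concatenate([np.linspace(30, 12, 8), np.full(9, 12.0)])
    rng = np.random.default_rng(1)
    cpe = true_level + rng.normal(0, 1.5, len(years))

    # Kalman filter for a local-level DLM:
    #   level_t = level_{t-1} + w_t,   y_t = level_t + v_t
    q, r = 1.0, 1.5 ** 2          # state and observation variances (assumed known)
    level, var = cpe[0], r        # initialise at the first observation
    filtered = []
    for y in cpe:
        var_pred = var + q                 # predict one year ahead
        k = var_pred / (var_pred + r)      # Kalman gain
        level = level + k * (y - level)    # update with the new observation
        var = (1 - k) * var_pred
        filtered.append(level)
    filtered = np.array(filtered)

    # Annual rate of change in the filtered level: negative values mean decline.
    annual_change = np.diff(filtered)
    print("mean change 1996-2002:", annual_change[:7].mean())
    print("mean change 2005-2011:", annual_change[-7:].mean())
    ```

    The filtered changes are clearly negative during the decline period and near zero once the series stabilises; a Bayesian treatment would additionally carry posterior uncertainty on each annual change, which is what yields the probabilities of decline the study reports.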

  20. Physical activity and motor decline in older persons.

    PubMed

    Buchman, A S; Boyle, P A; Wilson, R S; Bienias, Julia L; Bennett, D A

    2007-03-01

    We tested the hypothesis that physical activity modifies the course of age-related motor decline. More than 850 older participants of the Rush Memory and Aging Project underwent baseline assessment of physical activity and annual motor testing for up to 8 years. Nine strength measures and nine motor performance measures were summarized into composite measures of motor function. In generalized estimating equation models, global motor function declined during follow-up (estimate, -0.072; SE, 0.008; P < 0.001). Each additional hour of physical activity at baseline was associated with about a 5% decrease in the rate of global motor function decline (estimate, 0.004; SE, 0.001; P = 0.007). Secondary analyses suggested that the association of physical activity with motor decline was mostly due to the effect of physical activity on the rate of motor performance decline. Thus, higher levels of physical activity are associated with a slower rate of motor decline in older persons.
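
    The "about 5%" figure follows directly from the two reported GEE estimates: the per-hour activity term (0.004) offsets roughly one-twentieth of the annual decline term (-0.072). A quick arithmetic check:

    ```python
    decline_rate = -0.072     # annual change in global motor function (GEE estimate)
    activity_effect = 0.004   # offset per additional hour of baseline activity

    # Fraction of the annual decline offset by one extra hour of activity
    offset = activity_effect / abs(decline_rate)
    print(f"{offset:.1%}")    # → 5.6%
    ```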
