... Events October 19, Hastings Center Seminar, Garrison: Human Genetic Engineering: What Can We Do? What Should We ...
Disrupting the Education Monopoly: A Conversation with Reed Hastings
ERIC Educational Resources Information Center
Jacobs, Joanne
2015-01-01
This article features an interview with Netflix CEO Reed Hastings. In this interview, Hastings relates that he told the "Wall Street Journal" in 2008 that he started looking at education--trying to figure out why our education is lagging when our technology is increasing at great rates and there's great innovation in so many other…
VIEW OF PIEDMONT AVENUE AT INTERSECTION OF HASTE STREET, CHATEAU ...
VIEW OF PIEDMONT AVENUE AT INTERSECTION OF HASTE STREET, CHATEAU APARTMENTS BY CLARENCE CASEBOLT DAKIN, 1929 AT 2747 HASTE ON WEST SIDE OF PIEDMONT. LOOKING NORTH. Photograph by Fredrica Drotos and Michael Kelly, July 8, 2006 - Piedmont Way & the Berkeley Property Tract, East of College Avenue between Dwight Way & U.C. Memorial Stadium, Berkeley, Alameda County, CA
Not Available
1993-06-30
The decision document presents the selected interim remedial actions for the Well Number 3 ground water operable units. The Well Number 3 Subsite is a subsite of the Hastings Ground Water Contamination Site, Hastings, Nebraska. The interim action ROD addresses two separate areas of groundwater contamination. Plume 1 is characterized by carbon tetrachloride (CCl4) and chloroform (CHCl3) contamination. Plume 2 is characterized primarily by trichloroethene (TCE), 1,1,1-trichloroethane (TCA), tetrachloroethene (PCE) and dichloroethene (DCE) contamination. These interim ground water remedies were developed to protect public health, welfare and the environment by controlling the migration and reducing the volume and mass of contaminants present in the ground water beneath and downgradient from each source area of the Well Number 3 Subsite.
E SERIES MAGAZINES FROM HASTINGS ST. SHOWING ACCESS DRIVE AND ...
E SERIES MAGAZINES FROM HASTINGS ST. SHOWING ACCESS DRIVE AND LOADING PLATFORMS. E 103 MAGAZINES IN FOREGROUND. - Naval Magazine Lualualei, Headquarters Branch, Magazine Type, Eleventh, Thirteenth, Fifteenth, Sixteenth, & Seventeenth Streets, Pearl City, Honolulu County, HI
VIEW OF PIEDMONT AVENUE AT INTERSECTION OF HASTE STREET, NOTE ...
VIEW OF PIEDMONT AVENUE AT INTERSECTION OF HASTE STREET, NOTE RECONSTRUCTION OF MEDIAN FROM PREVIOUS VIEW IN PHOTOGRAPH CA-2-5. LOOKING NW. Photograph by Brian Grogan, July 8, 2007 - Piedmont Way & the Berkeley Property Tract, East of College Avenue between Dwight Way & U.C. Memorial Stadium, Berkeley, Alameda County, CA
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-15
... Energy Regulatory Commission City of Hastings, MN; Notice of Application for Amendment of License and...: Amendment of License. b. Project No.: 4306-035. c. Date filed: May 21, 2012. d. Applicant: City of Hastings... the Mississippi River in the City of Hastings in Dakota County, Minnesota. g. Filed Pursuant...
The Hastings Center and the early years of bioethics.
Callahan, Daniel
2012-02-01
The Hastings Center was founded in 1969 to study ethical problems in medicine and biology. The Center arose from a confluence of three social currents: the increased public scrutiny of medicine and its practices, the concern about the moral problems being generated by technological developments, and the desire of one of its founders (Callahan) to make use of his philosophical training in a more applied way. The early years of the Center were devoted to raising money, developing an early agenda of issues, and identifying a cadre of people around the country interested in the issues. Various stresses and strains in the Center and the field are identified, and some final reflections are offered on the nature and value of the contributions made by bioethics as an academic field. PMID:22198414
Multiple-try Metropolis Hastings for modeling extreme PM10 data
NASA Astrophysics Data System (ADS)
Amin, Nor Azrita Mohd; Adam, Mohd Bakri; Ibrahim, Noor Akma
2014-07-01
Awareness of catastrophic events has brought attention to modeling their behavior through the statistical analysis of Extreme Value Theory (EVT). This study focused on extreme PM10 data using the Gumbel distribution, one of the Extreme Value distributions. The parameters were estimated using a new Bayesian approach for extremes, the Multiple-Try Metropolis-Hastings algorithm. We compared this approach with another Markov Chain Monte Carlo approach, the classical Metropolis-Hastings algorithm, and with the frequentist approach, Maximum Likelihood Estimation. The three approaches provide comparable results. Data are taken from the Pasir Gudang station for the years 1996 to 2010.
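As a minimal illustration of the classical baseline method the abstract compares against, a random-walk Metropolis-Hastings sampler for the two Gumbel parameters might look like the sketch below. The step size, starting values, and flat prior are assumptions for illustration, not taken from the study:

```python
import math
import random

def gumbel_loglik(data, mu, beta):
    """Log-likelihood of a Gumbel(mu, beta) sample."""
    if beta <= 0.05:  # reject non-positive / tiny scales (numerical guard)
        return float("-inf")
    z = [(x - mu) / beta for x in data]
    return sum(-math.log(beta) - zi - math.exp(-zi) for zi in z)

def metropolis_hastings(data, n_iter=5000, step=0.25, seed=1):
    """Random-walk Metropolis-Hastings over (mu, beta) under a flat prior."""
    rng = random.Random(seed)
    mu = sum(data) / len(data)  # moment-based starting point
    beta = 1.0
    logp = gumbel_loglik(data, mu, beta)
    chain = []
    for _ in range(n_iter):
        mu_new = mu + rng.gauss(0.0, step)
        beta_new = beta + rng.gauss(0.0, step)
        logp_new = gumbel_loglik(data, mu_new, beta_new)
        # Accept with probability min(1, posterior ratio)
        if math.log(rng.random() + 1e-300) < logp_new - logp:
            mu, beta, logp = mu_new, beta_new, logp_new
        chain.append((mu, beta))
    return chain
```

After discarding a burn-in, posterior means of the chain serve as the Bayesian point estimates that the study compares against maximum likelihood.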
75 FR 45011 - MainStreet Savings Bank, FSB, Hastings, MI; Notice of Appointment of Receiver
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-30
... Office of Thrift Supervision MainStreet Savings Bank, FSB, Hastings, MI; Notice of Appointment of... Owners' Loan Act, the Office of Thrift Supervision has duly appointed the Federal Deposit Insurance.... Dated: July 23, 2010. By the Office of Thrift Supervision. Sandra E. Evans, Federal Register...
Metropolis-Hastings Robbins-Monro Algorithm for Confirmatory Item Factor Analysis
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
Item factor analysis (IFA), already well established in educational measurement, is increasingly applied to psychological measurement in research settings. However, high-dimensional confirmatory IFA remains a numerical challenge. The current research extends the Metropolis-Hastings Robbins-Monro (MH-RM) algorithm, initially proposed for…
High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
General Metropolis-Hastings jump diffusions for automatic target recognition in infrared scenes
NASA Astrophysics Data System (ADS)
Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.
1997-04-01
To locate and recognize ground-based targets in forward-looking IR (FLIR) images, 3D faceted models with associated pose parameters are formulated to accommodate the variability found in FLIR imagery. Taking a Bayesian approach, scenes are simulated from the emissive characteristics of the CAD models and compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution. To accommodate scenes with variable numbers of targets, the posterior distribution is defined over parameter vectors of varying dimension. An inference algorithm based on Metropolis-Hastings jump-diffusion processes empirically samples from the posterior distribution, generating configurations of templates and transformations that match the collected sensor data with high probability. The jumps accommodate the addition and deletion of targets and the estimation of target identities; diffusions refine the hypotheses by drifting along the gradient of the posterior distribution with respect to the orientation and position parameters. Previous results on jump strategies analogous to the Metropolis acceptance/rejection algorithm, with proposals drawn from the prior and accepted based on the likelihood, are extended to encompass general Metropolis-Hastings proposal densities. In particular, the algorithm proposes moves by drawing from the posterior distribution over computationally tractable subsets of the parameter space. The algorithm is illustrated by an implementation on a Silicon Graphics Onyx/Reality Engine.
ERIC Educational Resources Information Center
Yang, Ji Seung; Cai, Li
2014-01-01
The main purpose of this study is to improve estimation efficiency in obtaining maximum marginal likelihood estimates of contextual effects in the framework of nonlinear multilevel latent variable model by adopting the Metropolis-Hastings Robbins-Monro algorithm (MH-RM). Results indicate that the MH-RM algorithm can produce estimates and standard…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
..., Federal Register (76 FR 15937), GIPSA requested applications for designation to provide official services... Grain Inspection, Packers and Stockyards Administration Designation for the Aberdeen, SD; Decatur, IL; Hastings, NE; Fulton, IL; the State of Missouri, and the State of South Carolina Areas AGENCY:...
Link, W.A.; Barker, R.J.
2008-01-01
Judicious choice of candidate generating distributions improves efficiency of the Metropolis-Hastings algorithm. In Bayesian applications, it is sometimes possible to identify an approximation to the target posterior distribution; this approximate posterior distribution is a good choice for candidate generation. These observations are applied to analysis of the Cormack-Jolly-Seber model and its extensions. © Springer Science+Business Media, LLC 2007.
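Using an approximate posterior as the candidate-generating distribution is the independence-sampler form of Metropolis-Hastings: candidates are drawn from a fixed approximation q, and the acceptance ratio corrects for the mismatch between q and the target. A minimal sketch follows; the Gaussian target and proposal are illustrative assumptions, not the Cormack-Jolly-Seber model of the paper:

```python
import math
import random

def independence_mh(log_target, sample_q, log_q, n_iter=10000, seed=0):
    """Independence Metropolis-Hastings: candidates come from a fixed
    approximation q of the target, independent of the current state."""
    rng = random.Random(seed)
    x = sample_q(rng)
    lp, lq = log_target(x), log_q(x)
    chain = []
    for _ in range(n_iter):
        y = sample_q(rng)
        lp_y, lq_y = log_target(y), log_q(y)
        # Hastings ratio: target ratio divided by proposal ratio
        if math.log(rng.random() + 1e-300) < (lp_y - lp) - (lq_y - lq):
            x, lp, lq = y, lp_y, lq_y
        chain.append(x)
    return chain

# Toy example: target is N(1, 1); the "approximate posterior" proposal is N(0, 2)
chain = independence_mh(
    log_target=lambda x: -0.5 * (x - 1.0) ** 2,
    sample_q=lambda rng: rng.gauss(0.0, 2.0),
    log_q=lambda x: -x * x / 8.0,
)
```

The closer q is to the target, the closer the acceptance rate gets to one, which is exactly why a good posterior approximation makes an efficient proposal.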
Emerson, J.A.; Sweet, J.N.; Peterson, D.W.
1994-05-01
We demonstrate the use of HAST and Assembly Test Chips to evaluate the susceptibility of epoxy molding compounds to moisture-induced corrosion of Al conductors. We show that the procedure is sufficiently sensitive to discriminate between assembly processes used by different molding facilities. Our data show that the location in time of the "knee" in the failure distribution is dependent on material properties of the epoxy. Reducing the failure rate in the early or "extrinsic" region of the time-failure distribution is key to achieving high reliability. We examine the failure modes in the extrinsic region for test chips encapsulated with a number of high quality molding compounds in an attempt to better understand this region.
Metropolis–Hastings thermal state sampling for numerical simulations of Bose–Einstein condensates
Grišins, Pjotrs; Mazets, Igor E.
2014-01-01
We demonstrate the application of the Metropolis–Hastings algorithm to sampling of classical thermal states of one-dimensional Bose–Einstein quasicondensates in the classical fields approximation, in both the untrapped and harmonically trapped cases. The presented algorithm can be easily generalized to higher dimensions and arbitrary trap geometry. For truncated Wigner simulations the quantum noise can be added with conventional methods (half a quantum of energy in every mode). The advantage of the presented method over the usual analytical and stochastic ones lies in its ability to sample not only from canonical and grand canonical distributions, but also from the generalized Gibbs ensemble, which can help to shed new light on thermodynamics of integrable systems. PMID:25843966
Dynamical behavior of fractional-order Hastings-Powell food chain model and its discretization
NASA Astrophysics Data System (ADS)
Matouk, A. E.; Elsadany, A. A.; Ahmed, E.; Agiza, H. N.
2015-10-01
In this work, the dynamical behavior of fractional-order Hastings-Powell food chain model is investigated and a new discretization method of the fractional-order system is introduced. A sufficient condition for existence and uniqueness of the solution of the proposed system is obtained. Local stability of the equilibrium points of the fractional-order system is studied. Furthermore, the necessary and sufficient conditions of stability of the discretized system are also studied. It is shown that the system's fractional parameter has effect on the stability of the discretized system which shows rich variety of dynamical behaviors such as Hopf bifurcation, an attractor crisis and chaotic attractors. Numerical simulations show the tea-cup chaotic attractor of the fractional-order system and the richer dynamical behavior of the corresponding discretized system.
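For reference, the underlying integer-order Hastings-Powell food chain can be integrated directly. The sketch below uses the standard nondimensionalized three-species equations with parameter values commonly cited for the chaotic "tea-cup" regime; the step size and initial state are assumptions, and this is not the fractional-order discretization studied in the paper:

```python
def hastings_powell(state, a1=5.0, b1=3.0, a2=0.1, b2=2.0, d1=0.4, d2=0.01):
    """Right-hand side of the nondimensionalized Hastings-Powell food chain;
    parameter values are commonly cited chaotic ("tea-cup") settings."""
    x, y, z = state
    f1 = a1 * x / (1.0 + b1 * x)  # Holling type-II response: prey -> predator
    f2 = a2 * y / (1.0 + b2 * y)  # Holling type-II response: predator -> top predator
    return (x * (1.0 - x) - f1 * y,
            f1 * y - f2 * z - d1 * y,
            f2 * z - d2 * z)

def rk4_step(f, s, h):
    """One classical fourth-order Runge-Kutta step of size h."""
    k1 = f(s)
    k2 = f(tuple(si + 0.5 * h * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + 0.5 * h * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + h * ki for si, ki in zip(s, k3)))
    return tuple(si + h / 6.0 * (a + 2 * b + 2 * c + d)
                 for si, a, b, c, d in zip(s, k1, k2, k3, k4))

state = (0.8, 0.2, 8.0)  # assumed initial populations
for _ in range(20000):   # integrate to t = 200
    state = rk4_step(hastings_powell, state, 0.01)
```

Plotting the resulting (x, y, z) trajectory over a long run reveals the tea-cup attractor mentioned in the abstract.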
A constrained Metropolis Hastings search for EMRIs in the Mock LISA Data Challenge 1B
NASA Astrophysics Data System (ADS)
Gair, Jonathan R.; Porter, Edward; Babak, Stanislav; Barack, Leor
2008-09-01
We describe a search for the extreme-mass-ratio inspiral sources in the Round 1B Mock LISA Data Challenge data sets. The search algorithm is a Monte Carlo search based on the Metropolis-Hastings algorithm, but also incorporates simulated, thermostated, and time annealing, plus a harmonic identification stage designed to reduce the chance of the chain locking onto secondary maxima. In this paper, we focus on describing the algorithm that we have been developing. We give the results of our search of the Round 1B data, although parameter recovery has improved since that deadline. Finally, we describe several modifications to the search pipeline that we are currently investigating for incorporation in future searches.
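The annealing element of such a search can be sketched in miniature: a Metropolis-Hastings chain with a temperature schedule that starts hot, so the chain can escape secondary maxima, and then cools. The one-dimensional bimodal likelihood below is a hypothetical stand-in for the EMRI posterior, not the actual search pipeline:

```python
import math
import random

def log_target(x):
    """Toy bimodal log-likelihood (hypothetical, not the EMRI likelihood):
    a weak mode near x = -3 and the global mode near x = +3."""
    return math.log(math.exp(-(x + 3.0) ** 2) + 2.0 * math.exp(-(x - 3.0) ** 2))

def annealed_mh(n_iter=5000, t_start=5.0, t_end=0.05, seed=0):
    """Metropolis-Hastings with a geometric cooling schedule: early high
    temperatures let the chain hop between modes; cooling then locks it
    onto the dominant one."""
    rng = random.Random(seed)
    x = -3.0  # start in the secondary mode on purpose
    lp = log_target(x)
    best_x, best_lp = x, lp
    for i in range(n_iter):
        temp = t_start * (t_end / t_start) ** (i / (n_iter - 1))
        x_new = x + rng.gauss(0.0, 1.0)
        lp_new = log_target(x_new)
        # Tempered acceptance: downhill moves accepted with prob exp(dlp / T)
        if math.log(rng.random() + 1e-300) < (lp_new - lp) / temp:
            x, lp = x_new, lp_new
            if lp > best_lp:
                best_x, best_lp = x, lp
    return best_x, best_lp
```

Even though the chain starts in the wrong mode, the hot phase lets it cross the likelihood valley, so the best point found ends up near the global maximum.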
Time Lapse Gravity and Seismic Monitoring of CO2 Injection at the West Hastings Field, Texas
NASA Astrophysics Data System (ADS)
Ferguson, J. F.; Richards, T.; Klopping, F.; MacQueen, J.; Hosseini, S. A.
2015-12-01
Time lapse or 4D gravity and seismic reflection surveys are being conducted at the West Hastings Field near Houston, Texas to monitor the progress of CO2 injection. This Department of Energy supported CO2 sequestration experiment is conducted in conjunction with a Denbury Onshore, LLC tertiary recovery project. The reservoir is at a depth of 1.8 km in the Oligocene Frio sands and has been produced since the 1930s. The goals are to account for and map the injected CO2 and to determine if migration occurs along intra-reservoir faults. An integrated interpretation of the geophysical surveys will be made together with well logs and engineering data. Gravity monitoring of water versus gas replacement has been very successful, but liquid phase CO2 monitoring is problematic due to the smaller density contrast with respect to oil and water. This reservoir has a small volume to depth ratio and hence only a small gravity difference signal is expected on the surface. New borehole gravity technology introduced by Micro-g LaCoste can make gravity measurements at near reservoir depths with a much higher signal to noise ratio. This method has been successfully evaluated on a simulation of the Hastings project. Field operations have been conducted for repeated surface and borehole gravity surveys beginning in 2013. The surface survey of 95 stations covers an area of 3 by 5 km and 22 borehole gravity logs are run in the interval above the Frio formation. 4D seismic reflection surveys are being made at 6 month intervals on the surface and in 3 VSP wells. CO2 injection into the targeted portion of the reservoir only began in early 2015 and monitoring will continue into 2017. To date only the baseline reservoir conditions have been assessed. The overall success of the gravity monitoring will not be determined until 2017.
Chaos control of Hastings-Powell model by combining chaotic motions
NASA Astrophysics Data System (ADS)
Danca, Marius-F.; Chattopadhyay, Joydev
2016-04-01
In this paper, we propose a Parameter Switching (PS) algorithm as a new chaos control method for the Hastings-Powell (HP) system. The PS algorithm is a convergent scheme that switches the control parameter within a set of values while the controlled system is numerically integrated. The attractor obtained with the PS algorithm matches the attractor obtained by integrating the system with the parameter replaced by the averaged value of the switched parameter values. The switching rule can be applied periodically or randomly over a set of given values. In this way, every stable cycle of the HP system can be approximated if its underlying parameter value equals the average of the switching values. Moreover, the PS algorithm can be viewed as a generalization of Parrondo's game, applied here for the first time to the HP system, showing that a losing strategy can win: "losing + losing = winning." If "losing" is replaced with "chaos" and "winning" with "order" (as the opposite of "chaos"), then by switching the parameter value in the HP system between two values that generate chaotic motions, the PS algorithm can approximate a stable cycle, so that symbolically one can write "chaos + chaos = regular." Also, by considering a different parameter control, new complex dynamics of the HP model are revealed.
PMID:27131485
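The core claim of the PS algorithm, that integrating while switching a parameter approximates the system integrated at the averaged parameter value, can be seen on a deliberately simple linear system. This toy stand-in (dx/dt = a - x) is an assumption for illustration, not the HP food chain itself:

```python
def ps_integrate(a_values, h=0.001, n_steps=20000, x0=0.0):
    """Euler-integrate dx/dt = a - x while cycling the parameter a
    through a_values on every step (a periodic switching rule)."""
    x = x0
    for i in range(n_steps):
        a = a_values[i % len(a_values)]
        x += h * (a - x)
    return x

x_switched = ps_integrate([1.0, 3.0])  # parameter switched between 1 and 3
x_averaged = ps_integrate([2.0])       # parameter fixed at the average, 2
```

With fast switching, the switched trajectory settles onto (a small ripple around) the attractor of the averaged system, which is the principle the PS algorithm exploits on the chaotic HP model.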
The Polya Tree Sampler: Towards Efficient and Automatic Independent Metropolis-Hastings Proposals
Hanson, Timothy E.; Monteiro, João V. D.; Jara, Alejandro
2011-01-01
We present a simple, efficient, and computationally cheap sampling method for exploring an un-normalized multivariate density on ℝ^d, such as a posterior density, called the Polya tree sampler. The algorithm constructs an independent proposal based on an approximation of the target density. The approximation is built from a set of (initial) support points – data that act as parameters for the approximation – and the predictive density of a finite multivariate Polya tree. In an initial “warming-up” phase, the support points are iteratively relocated to regions of higher support under the target distribution to minimize the distance between the target distribution and the Polya tree predictive distribution. In the “sampling” phase, samples from the final approximating mixture of finite Polya trees are used as candidates, which are accepted with a standard Metropolis-Hastings acceptance probability. Several illustrations are presented, including comparisons of the proposed approach to Metropolis-within-Gibbs and the delayed-rejection adaptive Metropolis algorithm. PMID:22135487
ERIC Educational Resources Information Center
Yang, Ji Seung; Cai, Li
2013-01-01
The main purpose of this study is to improve estimation efficiency in obtaining full-information maximum likelihood (FIML) estimates of contextual effects in the framework of a nonlinear multilevel latent variable model by adopting the Metropolis-Hastings Robbins-Monro algorithm (MH-RM; Cai, 2008, 2010a, 2010b). Results indicate that the MH-RM…
NASA Astrophysics Data System (ADS)
Mohammadi, F.; Saberi, A. A.; Rouhani, S.
2009-09-01
In this paper, we analyze the scaling behavior of diffusion-limited aggregation (DLA) clusters simulated by the Hastings-Levitov method. We obtain the fractal dimension of the clusters by direct analysis of the geometrical patterns, in good agreement with that obtained from an analytical approach. We compute the two-point density correlation function and show that, in the large-size limit, it agrees with the obtained fractal dimension. These results support the statistical agreement between the patterns and DLA clusters. We also investigate the scaling properties of various length scales, and their fluctuations, related to the boundary of the cluster. We find that the length scales do not all share a simple scaling with the same correction-to-scaling exponent. The fractal dimension of the perimeter is found to be equal to that of the cluster. The growth exponent, computed from the evolution of the interface width, is β = 0.557(2). We also show that the perimeter of the DLA cluster has an asymptotic multiscaling behavior.
PMID:21832341
NASA Astrophysics Data System (ADS)
Cooper, Keith; Boyd, Sian; Eggleton, Jacqueline; Limpenny, David; Rees, Hubert; Vanstaen, Koen
2007-12-01
The aim of this study was to investigate the effect of dredging intensity on the physical and biological recovery times of the seabed following marine aggregate dredging. Two areas of seabed, previously subject to, respectively, relatively high and lower levels of dredging intensity, were identified on the Hastings Shingle Bank. Two reference areas were also selected for comparative purposes. All four sites were monitored annually over the period 2001-2004, using a combination of acoustic, video and grab sampling techniques. Since the site was last dredged in 1996, this was intended to provide a sequence of data 5-8 years after cessation of dredging. However, an unexpected resumption of dredging within the high intensity site, during 2002 and 2003, allowed an additional assessment of the immediate effects and aftermath of renewed dredging at the seabed. The early stages of recovery could then be assessed after dredging ceased in 2003. Results from both dredged sites provide a useful insight into the early and latter stages of physical and biological recovery. A comparison of recent and historic dredge track features provided evidence of track erosion. However, tracks were still visible 8 years after the cessation of dredging. Within the high dredging intensity site, recolonisation was relatively rapid after the cessation of dredging in 2003. Rather than indicating a full recovery, we suggest that this initial 'colonization community' may enter a transition phase before eventually reaching equilibrium. This hypothesis is supported by results from the low intensity site, where biological recovery was judged to have taken 7 years. Further monitoring is needed in order to test this. An alternative explanation is that the rapid recovery may be explained by the settlement of large numbers of Sabellaria spinulosa. As the resumption of dredging within the high intensity site limited our assessment of longer-term recovery it is not yet possible to assume that a 7-year
Schouten, Charlotte S.; de Bree, Remco; van der Putten, Lisa; Noij, Daniel P.; Hoekstra, Otto S.; Comans, Emile F.I.; Witte, Birgit I.; Doornaert, Patricia A.; Leemans, C. René
2014-01-01
Main problem Diffusion-weighted MRI (DW-MRI) has potential to predict chemoradiotherapy (CRT) response in head and neck squamous cell carcinoma (HNSCC) and is generally performed using echo-planar imaging (EPI). However, EPI-DWI is susceptible to geometric distortions. Half-Fourier acquisition single-shot turbo spin-echo (HASTE)-DWI may be an alternative. This prospective pilot study evaluates the potential predictive value of EPI- and HASTE-DWI and 18F-fluorodeoxyglucose PET-CT (18F-FDG-PET-CT) early during CRT for locoregional outcome in HNSCC. Methods Eight patients with advanced HNSCC (7 primary tumors and 25 nodal metastases) scheduled for CRT, underwent DW-MRI (using both EPI- and HASTE-DWI) and 18F-FDG-PET(-CT) pretreatment, early during treatment and three months after treatment. Median follow-up time was 38 months. Results No local recurrences were detected during follow-up. Median Apparent Diffusion Coefficient (ADC)EPI-values in primary tumors increased from 77×10⁻⁵ mm²/s pretreatment, to 113×10⁻⁵ mm²/s during treatment (P=0.02), whereas ADCHASTE did not increase (74×10⁻⁵ and 74×10⁻⁵ mm²/s, respectively). Two regional recurrences were diagnosed. During treatment, ADCEPI tended to be higher for patients with regional control [(117.3±12.1)×10⁻⁵ mm²/s] than for patients with a recurrence [(98.0±4.2)×10⁻⁵ mm²/s]. This difference was not seen with ADCHASTE. No correlations between ΔADCEPI and ΔSUV (Standardized Uptake Value) were found in the primary tumor or nodal metastases. Conclusions HASTE-DWI seems to be inadequate in early CRT response prediction, compared to EPI-DWI which has potential to predict locoregional outcome. EPI-DWI and 18F-FDG-PET-CT potentially provide independent information in the early response to treatment, since no correlations were found between ΔADCEPI and ΔSUV. PMID:25202659
Bauer, Rebecca; Mentré, France; Kaddouri, Halima; Le Bras, Jacques; Le Nagard, Hervé
2014-12-01
Malaria is one of the world's most widespread parasitic diseases. The parasitic protozoans of the genus Plasmodium have developed resistance to several antimalarial drugs. Some patients are therefore infected by two or more strains with different levels of antimalarial drug sensitivity. We previously developed a model to estimate the drug concentration (IC50) that inhibits 50% of the growth of the parasite isolated from a patient infected with one strain. We propose here a new Two-Slopes model for patients infected by two strains. This model involves four parameters: the proportion of each strain and their IC50, and the sigmoidicity parameter. To estimate the parameters of this model, we have developed a new algorithm called PGBO (Population Genetics-Based Optimizer). It is based on the Metropolis-Hastings algorithm and is implemented in the statistical software R. We performed a simulation study and defined three evaluation criteria to evaluate its properties and compare it with three other algorithms (Gauss-Newton, Levenberg-Marquardt, and a simulated annealing). We also evaluated it using in vitro data and three ex vivo datasets from the French Malaria Reference Center. Our evaluation criteria in the simulation show that PGBO gives good estimates of the parameters even if the concentration design is poor. Moreover, our algorithm is less sensitive than Gauss-Newton algorithms to initial values. Although parameter estimation is good, interpretation of the results can be difficult if the proportion of the second strain is close to 0 or 1. For these reasons, this approach cannot yet be implemented routinely. PMID:25450214
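A plausible form for such a Two-Slopes model is a mixture of two sigmoid inhibition curves sharing one sigmoidicity parameter. The exact parameterization used by the authors is not reproduced in the abstract, so the following is an assumed Emax-type sketch with hypothetical parameter values:

```python
def two_slopes(c, p, ic50_a, ic50_b, gamma):
    """Growth (fraction of drug-free control) at concentration c for a
    mixture of two strains: a fraction p with IC50 ic50_a, the remainder
    with IC50 ic50_b. gamma is the shared sigmoidicity parameter."""
    def hill(conc, ic50):
        return 1.0 / (1.0 + (conc / ic50) ** gamma)
    return p * hill(c, ic50_a) + (1.0 - p) * hill(c, ic50_b)

# A 50/50 mixture of a sensitive (IC50 = 5) and a resistant (IC50 = 50) strain:
g = two_slopes(5.0, 0.5, 5.0, 50.0, 2.0)  # sensitive strain half-inhibited here
```

The mixture produces the characteristic two-step ("two-slope") dose-response curve; the four free quantities match the abstract's count of model parameters (mixture proportion, two IC50s, sigmoidicity).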
NASA Astrophysics Data System (ADS)
MacBean, Natasha; Disney, Mathias; Lewis, Philip; Ineson, Phil
2010-05-01
profile as a whole. We present results from an Observing System Simulation Experiment (OSSE) designed to investigate the impact of management and climate change on peatland carbon fluxes, as well as how observations from satellites may be able to constrain modeled carbon fluxes. We use an adapted version of the Carnegie-Ames-Stanford Approach (CASA) model (Potter et al., 1993) that includes a representation of methane dynamics (Potter, 1997). The model formulation is further modified to allow for assimilation of satellite observations of surface soil moisture and land surface temperature. The observations are used to update model estimates using a Metropolis-Hastings Markov Chain Monte Carlo (MCMC) approach. We examine the effect of temporal frequency and precision of satellite observations with a view to establishing how, and at what level, such observations would make a significant improvement in model uncertainty. We compare this with the system characteristics of existing and future satellites. We believe this is the first attempt to assimilate surface soil moisture and land surface temperature into an ecosystem model that includes a full representation of CH4 flux.
References:
- Bubier, J., and T. Moore (1994), An ecological perspective on methane emissions from northern wetlands, TREE, 9, 460-464.
- Charman, D. (2002), Peatlands and Environmental Change, John Wiley and Sons, Ltd, England.
- Gorham, E. (1991), Northern peatlands: Role in the carbon cycle and probable responses to climatic warming, Ecological Applications, 1, 182-195.
- Lai, D. (2009), Methane dynamics in northern peatlands: A review, Pedosphere, 19, 409-421.
- Le Mer, J., and P. Roger (2001), Production, oxidation, emission and consumption of methane by soils: A review, European Journal of Soil Biology, 37, 25-50.
- Limpens, J., F. Berendse, J. Canadell, C. Freeman, J. Holden, N. Roulet, and H. Rydin
- Potter, C. (1997), An ecosystem simulation model for methane production and emission from wetlands, Global Biogeochemical
Avoid haste in defining human muscular Sarcocystosis
Technology Transfer Automated Retrieval System (TEKTRAN)
We appreciate Dr. Italiano’s [1] interest in our article [2] and agree that our case definition, described in our methods as ‘intentionally specific,’ may have resulted in the exclusion of some travelers infected with Sarcocystis nesbitti. Nevertheless, we believe published data from outbreak invest...
Cases in Bioethics from the Hastings Center Report.
ERIC Educational Resources Information Center
Levine, Carol, Ed.; Veatch, Robert M.
Case studies of ethical issues based on real events are followed by comments illustrating how people from various ethical traditions and frameworks and from different academic and professional disciplines analyze the issues and work toward a resolution of the conflict posed. The cases are intended to help the public and professional persons pursue…
Child malnutrition and the Millennium Development Goals: much haste but less speed?
Oruamabo, Raphael S
2015-02-01
The Millennium Development Goals (MDGs) provide a framework for measuring the progress of nations. Several of these goals relate to child malnutrition, which remains an important contributor to child morbidity and mortality, accounting for approximately 45% of child deaths globally. A high proportion of undernourished children still live in Africa and parts of Asia, and the uneven rate of reduction in the prevalence of various types of child malnutrition among different income groups worldwide is worrying. Attempts to reduce child malnutrition should therefore begin from the grassroots by improving primary healthcare services in developing countries with particular focus on basic requirements. Adequate nutrition should be provided from birth, through infancy, preschool and early childhood to adolescence. The overall strategy should be one of careful and meticulous planning involving all development sectors with an emphasis on a bottom-up approach within a stable and disciplined polity; the MDGs will only be useful if they are seen not as narrow objectives with unidirectional interventions but as multifaceted and co-ordinated. The setting of deadlines, whether 2015 or 2035, should not be emphasised so as to avoid hasty decision making. The top priority should be the implementation of the essential social services of basic education, primary healthcare, nutrition, reproductive health care, water and sanitation in partnership with the developed economies. PMID:25613961
On the Uses of the Humanities: Vision and Application. A Report by the Hastings Center.
ERIC Educational Resources Information Center
Hastings Center, Hastings-on-Hudson, NY.
Designed to provide a general assessment of the rapidly growing applied humanities movement in the United States, this report sets forth issues found to be central to an assessment of current work in the applied humanities, presents a general survey of the nature of applied work in a variety of disciplines, and offers conclusions and observations…
Making Haste Slowly: The Evolution of a Unified Qualifications Framework in Scotland
ERIC Educational Resources Information Center
Raffe, David
2007-01-01
The Scottish Credit and Qualifications Framework is often claimed to be a success and an example to other countries. However, if other countries wish to learn from this example they should not only study the current framework; they should also examine the sequence of policy reforms, over a quarter of a century, through which it developed. The…
The Ethics of Legislative Life. A Report by the Hastings Center.
ERIC Educational Resources Information Center
Hastings Center, Hastings-on-Hudson, NY.
Results of a two-year research project (1982-84) on Legislative and Representative Ethics are presented in this report, which analyzes the basic principles of legislative ethics, discusses the special dilemmas and obligations of legislators, and offers conclusions about future steps that could be taken to enhance public discussion and to reinforce…
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2014-01-01
In Ramsay curve item response theory (RC-IRT) modeling, the shape of the latent trait distribution is estimated simultaneously with the item parameters. In its original implementation, RC-IRT is estimated via Bock and Aitkin's EM algorithm, which yields maximum marginal likelihood estimates. This method, however, does not produce the…
When does haste make waste? Speed-accuracy tradeoff, skill level, and the tools of the trade.
Beilock, Sian L; Bertenthal, Bennett I; Hoerger, Michael; Carr, Thomas H
2008-12-01
Novice and skilled golfers took a series of golf putts with a standard putter (Exp. 1) or a distorted funny putter (consisting of an s-shaped and arbitrarily weighted putter shaft; Exp. 2) under instructions either to (a) take as much time as needed to be accurate or (b) putt as fast as possible while still being accurate. Planning and movement time were measured for each putt. In both experiments, novices produced the typical speed-accuracy trade-off: going slower, in terms of both the planning and movement components of execution, improved performance. In contrast, skilled golfers benefited from reduced performance time when using the standard putter in Exp. 1; specifically, taking less time to plan improved performance. In Exp. 2, skilled golfers improved by going slower when using the funny putter, but only when it was unfamiliar. Thus, skilled performance benefits from speed instructions when wielding highly familiar tools (i.e., the standard putter), is harmed when using new tools (i.e., the funny putter), and benefits again from speed instructions as the new tool becomes familiar. Planning time absorbs these changes. PMID:19102617
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-22
...The designations of the official agencies listed below will end on September 30, 2011. We are asking persons or governmental agencies interested in providing official services in the areas presently served by these agencies to submit an application for designation. In addition, we are asking for comments on the quality of services provided by the following designated agencies: Aberdeen Grain......
Haste Makes Waste but Condition Matters: Molt Rate–Feather Quality Trade-Off in a Sedentary Songbird
Vágási, Csongor I.; Pap, Péter L.; Vincze, Orsolya; Benkő, Zoltán; Marton, Attila; Barta, Zoltán
2012-01-01
Background The trade-off between current and residual reproductive values is central to life history theory, although the possible mechanisms underlying this trade-off are largely unknown. The ‘molt constraint’ hypothesis suggests that molt and plumage functionality are compromised by the preceding breeding event, yet this candidate mechanism remains insufficiently explored. Methodology/Principal Findings The seasonal change in photoperiod was manipulated to accelerate the molt rate. This treatment simulates the case of naturally late-breeding birds. House sparrows Passer domesticus experiencing accelerated molt developed shorter flight feathers with more fault bars and body feathers with supposedly lower insulation capacity (i.e. shorter, smaller, with a higher barbule density and fewer plumulaceous barbs). However, the wing, tail and primary feather lengths were shorter in fast-molting birds if they had an inferior body condition, which has been largely overlooked in previous studies. The rachis width of flight feathers was not affected by the treatment, but it was still condition-dependent. Conclusions/Significance This study shows that sedentary birds might face evolutionary costs because of the molt rate–feather quality conflict. This is the first study to experimentally demonstrate that (1) molt rate affects several aspects of body feathers as well as flight feathers and (2) the costly effects of rapid molt are condition-specific. We conclude that molt rate and its association with feather quality might be a major mediator of life history trade-offs. Our findings also suggest a novel advantage of early breeding, i.e. the facilitation of slower molt and the condition-dependent regulation of feather growth. PMID:22808221
ERIC Educational Resources Information Center
Yang, Ji Seung
2012-01-01
Nonlinear multilevel latent variable modeling has been suggested as an alternative to traditional hierarchical linear modeling to more properly handle measurement error and sampling error issues in contextual effects modeling. However, a nonlinear multilevel latent variable model requires significant computational effort because the estimation…
When Does Haste Make Waste? Speed-Accuracy Tradeoff, Skill Level, and the Tools of the Trade
ERIC Educational Resources Information Center
Beilock, Sian L.; Bertenthal, Bennett I.; Hoerger, Michael; Carr, Thomas H.
2008-01-01
Novice and skilled golfers took a series of golf putts with a standard putter (Exp. 1) or a distorted "funny putter" (consisting of an s-shaped and arbitrarily weighted putter shaft; Exp. 2) under instructions either to (a) take as much time as needed to be accurate or (b) putt as fast as possible while still being accurate. Planning and…
pHAST (pH-Driven Aptamer Switch for Thrombin) Catch-and-Release of Target Protein.
McConnell, E M; Bolzon, R; Mezin, P; Frahm, G; Johnston, M; DeRosa, M C
2016-06-15
A pH-driven DNA nanomachine based on the human α-thrombin binding aptamer was designed for the specific catch-and-release of human α-thrombin at neutral and acidic pH, respectively. In neutral conditions, the thrombin aptamer component of the nanomachine is exposed and exists in the G-quadruplex conformation required to bind to the target protein. At slightly acidic pH, the polyadenine tail of the nanomachine becomes partially protonated and A+(anti)•G(syn) mispairing results in a conformational change, causing the target protein to be released. Förster resonance energy transfer (FRET) was used to monitor conformational switching over multiple pH cycles. Electrophoretic mobility shift assay (EMSA) and fluorescence anisotropy were used to show pH dependent protein binding and release by the nanomachine. This approach could be applied generally to existing G-rich aptamers to develop novel biosensors, theranostics, and nanoswitches. PMID:27115292
Structural, Mechanistic, and Antigenic Characterization of the Human Astrovirus Capsid
York, Royce L.; Yousefi, Payam A.; Bogdanoff, Walter; Haile, Sara; Tripathi, Sarvind
2015-01-01
ABSTRACT Human astroviruses (HAstVs) are nonenveloped, positive-sense, single-stranded RNA viruses that are a leading cause of viral gastroenteritis. HAstV particles display T=3 icosahedral symmetry formed by 180 copies of the capsid protein (CP), which undergoes proteolytic maturation to generate infectious HAstV particles. Little is known about the molecular features that govern HAstV particle assembly, maturation, infectivity, and immunogenicity. Here we report the crystal structures of the two main structural domains of the HAstV CP: the core domain at 2.60-Å resolution and the spike domain at 0.95-Å resolution. Fitting of these structures into the previously determined 25-Å-resolution electron cryomicroscopy density maps of HAstV allowed us to characterize the molecular features on the surfaces of immature and mature T=3 HAstV particles. The highly electropositive inner surface of HAstV supports a model in which interaction of the HAstV CP core with viral RNA is a driving force in T=3 HAstV particle formation. Additionally, mapping of conserved residues onto the HAstV CP core and spike domains in the context of the immature and mature HAstV particles revealed dramatic changes to the exposure of conserved residues during virus maturation. Indeed, we show that antibodies raised against mature HAstV have reactivity to both the HAstV CP core and spike domains, revealing for the first time that the CP core domain is antigenic. Together, these data provide new molecular insights into HAstV that have practical applications for the development of vaccines and antiviral therapies. IMPORTANCE Astroviruses are a leading cause of viral diarrhea in young children, immunocompromised individuals, and the elderly. Despite the prevalence of astroviruses, little is known at the molecular level about how the astrovirus particle assembles and is converted into an infectious, mature virus. In this paper, we describe the high-resolution structures of the two main astrovirus
A quasi-Monte Carlo Metropolis algorithm
Owen, Art B.; Tribble, Seth D.
2005-01-01
This work presents a version of the Metropolis–Hastings algorithm using quasi-Monte Carlo inputs. We prove that the method yields consistent estimates in some problems with finite state spaces and completely uniformly distributed inputs. In some numerical examples, the proposed method is much more accurate than ordinary Metropolis–Hastings sampling. PMID:15956207
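The quasi-Monte Carlo variant described above builds on the standard Metropolis–Hastings algorithm, which accepts a proposed move with probability min(1, target(y)/target(x)). A minimal illustrative Python sketch of random-walk Metropolis targeting a standard normal follows; the function name and parameters are assumptions for illustration, and the paper's variant would replace the pseudorandom uniform draws with completely uniformly distributed quasi-Monte Carlo inputs:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0, rng=None):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian
    proposal: accept y with probability min(1, target(y)/target(x)),
    computed in log space for numerical stability."""
    rng = rng or random.Random()
    x = x0
    samples = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)                    # propose a move
        if math.log(rng.random()) < log_target(y) - log_target(x):
            x = y                                       # accept; else keep x
        samples.append(x)
    return samples

# Target: standard normal, known up to a normalizing constant.
log_phi = lambda x: -0.5 * x * x
chain = metropolis_hastings(log_phi, x0=0.0, n_steps=20000, rng=random.Random(1))
mean = sum(chain) / len(chain)
```

Because the proposal is symmetric, the Hastings correction ratio cancels; an asymmetric proposal would require the full acceptance ratio.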
1. Photocopied October 1976, from F.B. Tower, Illustrations of the ...
1. Photocopied October 1976, from F.B. Tower, Illustrations of the Croton Aqueduct, New York: Wiley and Putnam, 1843. CROTON AQUEDUCT AT HASTINGS: ARCH PROVIDED ACCESS TO STONE QUARRY. PLATE XVII, PAGE 106. - Old Croton Aqueduct, Quarry Railroad Bridge, Aqueduct Lane at Williams Street, Hastings-on-Hudson, Westchester County, NY
Negative Emotional Reactions to Challenging Behaviour and Staff Burnout: Two Replication Studies
ERIC Educational Resources Information Center
Rose, David; Horne, Sharon; Rose, John L.; Hastings, Richard P.
2004-01-01
Background: Hastings, R. P. ["American Journal on Mental Retardation" (2002) Vol. 107, pp. 455-467] hypothesized that staff negative emotional reactions to challenging behaviour might accumulate over time to affect staff well-being. Only one previous study (Mitchell, G.& Hastings, R. P. ["American Journal on Mental Retardation" (2001) Vol. 106,…
ERIC Educational Resources Information Center
Dumas, Chad; Kautz, Craig
2014-01-01
In the Hastings, Nebraska, public schools, two of the eight schools have been identified as national models of educational effectiveness. In seven of the eight buildings, in just four years, student test scores have increased from around 60% proficiency to around 80% proficiency or better. At Hastings, central office leaders emphasize three key…
Surveillance of Human Astrovirus Infection in Brazil: The First Report of MLB1 Astrovirus
Xavier, Maria da Penha Trindade Pinheiro; Carvalho Costa, Filipe Aníbal; Rocha, Mônica Simões; de Andrade, Juliana da Silva Ribeiro; Diniz, Fernanda Kreischer Bandeira; de Andrade, Thais Ramos; Miagostovich, Marize Pereira; Leite, José Paulo Gagliardi; Volotão, Eduardo de Mello
2015-01-01
Human astrovirus (HAstV) represents the third most common virus associated with acute diarrhea (AD). This study aimed to estimate the prevalence of HAstV infection in Brazilian children under 5 years of age with AD and to investigate the presence of recently described HAstV strains, through extensive laboratory-based surveillance of enteric viral agents in three Brazilian coastal regions between 2005 and 2011. Using reverse transcription-polymerase chain reaction (RT-PCR), the overall HAstV detection rate reached 7.1% (207/2,913), with percentages varying according to the geographic region: 3.9% (36/921) in the northeast, 7.9% (71/903) in the south and 9.2% (100/1,089) in the southeast (p < 0.001). HAstV was detected in cases from all age groups. Detection rates were slightly higher during the spring. Nucleotide sequence analysis of a 320-bp ORF2 fragment revealed that HAstV-1 was the predominant genotype throughout the seven years of the study. The novel AstV-MLB1 was detected in two children with AD from a subset of 200 samples tested, demonstrating the circulation of this virus in both the northeastern and southeastern regions of Brazil. These results provide additional epidemiological and molecular data on HAstV circulation in three Brazilian coastal regions, highlighting its potential to cause infantile AD. PMID:26274322
Edvardsen, Anne; Ryg, Morten; Akerø, Aina; Christensen, Carl Christian; Skjønsberg, Ole H
2013-11-01
The reduced pressure in an aircraft cabin may cause significant hypoxaemia and respiratory symptoms in patients with chronic obstructive pulmonary disease (COPD). The current study evaluated whether there is a relationship between hypoxaemia obtained during hypoxia-altitude simulation testing (HAST), simulating an altitude of 2438 m, and the reporting of respiratory symptoms during air travel. 82 patients with moderate to very severe COPD answered an air travel questionnaire. Arterial oxygen tensions during HAST (PaO2HAST) in subjects with and without in-flight respiratory symptoms were compared. The same questionnaire was answered within 1 year after the HAST. Mean ± sd PaO2HAST was 6.3 ± 0.6 kPa, and 62 (76%) of the patients had PaO2HAST <6.6 kPa. 38 (46%) patients had experienced respiratory symptoms during air travel. There was no difference in PaO2HAST between those with and those without in-flight respiratory symptoms (6.3 ± 0.7 kPa versus 6.3 ± 0.6 kPa, respectively; p=0.926). 54 (66%) patients travelled by air after the HAST, and patients equipped with supplemental oxygen (n = 23, 43%) reported fewer respiratory symptoms when flying with it than those flying without such treatment (four (17%) versus 11 (48%) patients; p=0.039). In conclusion, no difference in PaO2HAST was found between COPD patients with and without respiratory symptoms during air travel. PMID:23258777
Bonaparte, Rheba S.; Hair, Pamela S.; Banthia, Deepa; Marshall, Dawn M.; Cunnion, Kenji M.; Krishna, Neel K.
2008-01-01
Human astroviruses (HAstVs) belong to a family of nonenveloped, icosahedral RNA viruses that cause noninflammatory gastroenteritis, predominantly in infants. Eight HAstV serotypes have been identified, with a worldwide distribution. While the HAstVs represent a significant public health concern, very little is known about the pathogenesis of and host immune response to these viruses. Here we demonstrate that HAstV type 1 (HAstV-1) virions, specifically the viral coat protein (CP), suppress the complement system, a fundamental component of the innate immune response in vertebrates. HAstV-1 virions and purified CP both suppress hemolytic complement activity. Hemolytic assays utilizing sera depleted of individual complement factors as well as adding back purified factors demonstrated that HAstV CP suppresses classical pathway activation at the first component, C1. HAstV-1 CP bound the A chain of C1q and inhibited serum complement activation, resulting in decreased C4b, iC3b, and terminal C5b-9 formation. Inhibition of complement activation was also demonstrated for HAstV serotypes 2 to 4, suggesting that this phenomenon is a general feature of these human pathogens. Since complement is a major contributor to the initiation and amplification of inflammation, the observed CP-mediated inhibition of complement activity may contribute to the lack of inflammation associated with astrovirus-induced gastroenteritis. Although diverse mechanisms of inhibition of complement activation have been described for many enveloped animal viruses, this is the first report of a nonenveloped icosahedral virus CP inhibiting classical pathway activation at C1. PMID:17959658
Castillo, T.
1994-10-01
Plastic semiconductor packages were characterized as possible alternatives for canned devices, which are susceptible to internal shorts caused by conductive particles. Highly accelerated stress testing (HAST) as well as electrical and mechanical testing were conducted on plastic technology devices.
2. Historic American Buildings Survey Cal State Div. of Beaches ...
2. Historic American Buildings Survey Cal State Div. of Beaches & Parks Collection Sketch of 1857 Rephoto 1960 NORTHEAST CORNER ELEVATION - B. F. Hastings Bank Building, 128-132 J Street, Sacramento, Sacramento County, CA
1. Historic American Buildings Survey Cal. State Div. Beaches & ...
1. Historic American Buildings Survey Cal. State Div. Beaches & Parks Collection Sketch of 1857 Rephoto 1960 NORTHEAST CORNER ELEVATION - B. F. Hastings Bank Building, 128-132 J Street, Sacramento, Sacramento County, CA
BLDG F101, FRONT ELEVATION Naval Magazine Lualualei, Headquarters Branch, ...
BLDG F101, FRONT ELEVATION - Naval Magazine Lualualei, Headquarters Branch, Ammo Rework-Overhall Building Types, Eighteenth Street & Fence Road near Hastings Street intersection, Pearl City, Honolulu County, HI
BLDG F102, INTERIOR VIEW. Naval Magazine Lualualei, Headquarters Branch, ...
BLDG F102, INTERIOR VIEW. - Naval Magazine Lualualei, Headquarters Branch, Ammo Rework-Overhall Building Types, Eighteenth Street & Fence Road near Hastings Street intersection, Pearl City, Honolulu County, HI
BLDG F101, NORTH END AND REAR (EAST) SIDE. Naval ...
BLDG F101, NORTH END AND REAR (EAST) SIDE. - Naval Magazine Lualualei, Headquarters Branch, Ammo Rework-Overhall Building Types, Eighteenth Street & Fence Road near Hastings Street intersection, Pearl City, Honolulu County, HI
BLDG F101, FRONT ELEVATION W/POLE Naval Magazine Lualualei, Headquarters ...
BLDG F101, FRONT ELEVATION W/POLE - Naval Magazine Lualualei, Headquarters Branch, Ammo Rework-Overhall Building Types, Eighteenth Street & Fence Road near Hastings Street intersection, Pearl City, Honolulu County, HI
BLDG F101, SOUTH END AND FRONT (WEST). F102 IN BACKGROUND. ...
BLDG F101, SOUTH END AND FRONT (WEST). F102 IN BACKGROUND. - Naval Magazine Lualualei, Headquarters Branch, Ammo Rework-Overhall Building Types, Eighteenth Street & Fence Road near Hastings Street intersection, Pearl City, Honolulu County, HI
24. BUILDING NO. 266, GENERAL PURPOSE LABORATORY (ORIGINALLY MAGAZINE FOR ...
24. BUILDING NO. 266, GENERAL PURPOSE LABORATORY (ORIGINALLY MAGAZINE FOR HE 'A' PUMP & CHANGE HOUSE), LOOKING EAST AT NORTHWEST SIDE OF BUILDING. - Picatinny Arsenal, 200 Area, Shell Component Loading, State Route 15 near I-80, Dover, Morris County, NJ
A Comparison of Estimation Methods for a Multi-unidimensional Graded Response IRT Model
Kuo, Tzu-Chun; Sheng, Yanyan
2016-01-01
This study compared several parameter estimation methods for multi-unidimensional graded response models using their corresponding statistical software programs and packages. Specifically, we compared two marginal maximum likelihood (MML) approaches (Bock-Aitkin expectation-maximization algorithm, adaptive quadrature approach), four fully Bayesian algorithms (Gibbs sampling, Metropolis-Hastings, Hastings-within-Gibbs, blocked Metropolis), and the Metropolis-Hastings Robbins-Monro (MHRM) algorithm via the use of IRTPRO, BMIRT, and MATLAB. Simulation results suggested that, when the intertrait correlation was low, these estimation methods provided similar results. However, if the dimensions were moderately or highly correlated, Hastings-within-Gibbs had an overall better parameter recovery of item discrimination and intertrait correlation parameters. The performances of these estimation methods with different sample sizes and test lengths are also discussed. PMID:27375545
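Of the fully Bayesian samplers compared above, Hastings-within-Gibbs (often called Metropolis-within-Gibbs) updates one parameter block at a time with a Metropolis–Hastings step targeting that block's full conditional distribution. A minimal illustrative Python sketch on a toy bivariate normal target follows; the function name and the toy target are assumptions for illustration, not the IRT model from the study:

```python
import math
import random

def mh_within_gibbs(n_steps, rho=0.8, step=1.0, seed=7):
    """Metropolis-Hastings-within-Gibbs on a bivariate normal with
    correlation rho: each sweep updates one coordinate at a time with a
    1-D random-walk MH step targeting its full conditional
    x_i | x_j ~ N(rho * x_j, 1 - rho**2)."""
    rng = random.Random(seed)

    def log_cond(xi, xj):
        # Log density (up to a constant) of the full conditional.
        return -0.5 * (xi - rho * xj) ** 2 / (1.0 - rho ** 2)

    x1 = x2 = 0.0
    draws = []
    for _ in range(n_steps):
        y = x1 + rng.gauss(0.0, step)          # update coordinate 1
        if math.log(rng.random()) < log_cond(y, x2) - log_cond(x1, x2):
            x1 = y
        y = x2 + rng.gauss(0.0, step)          # update coordinate 2
        if math.log(rng.random()) < log_cond(y, x1) - log_cond(x2, x1):
            x2 = y
        draws.append((x1, x2))
    return draws

draws = mh_within_gibbs(20000)
m1 = sum(a for a, _ in draws) / len(draws)
m2 = sum(b for _, b in draws) / len(draws)
cov = sum((a - m1) * (b - m2) for a, b in draws) / len(draws)
v1 = sum((a - m1) ** 2 for a, _ in draws) / len(draws)
v2 = sum((b - m2) ** 2 for _, b in draws) / len(draws)
corr = cov / math.sqrt(v1 * v2)   # should recover roughly rho
```

Plain Gibbs sampling would instead draw each coordinate exactly from its conditional; the MH step is useful precisely when, as in IRT models, those conditionals cannot be sampled directly.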
3. Historic American Buildings Survey California State Library Collection Sacramento ...
3. Historic American Buildings Survey California State Library Collection Sacramento Co. History Thompson & West Sketch of 1880 Rephoto 1960 NORTHEAST CORNER - B. F. Hastings Bank Building, 128-132 J Street, Sacramento, Sacramento County, CA
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-27
..., to Stuart M. Strauss, Esq., Clifford Chance US LLP (October 24, 2006); Letter from James A... Markets, to Domenick Pugliese, Esq., Paul, Hastings, Janofsky and Walker LLP (June 27, 2007). At least...
16. Photocopy of postcard (from Ardella Fish Shanks) Frank Stumm, ...
16. Photocopy of postcard (from Ardella Fish Shanks) Frank Stumm, photographer ca. 1908-16 SOUTH FRONT, HASTINGS HOUSE IN BACKGROUND - Riddell Fish House, 245 West K Street, Benicia, Solano County, CA
Espinosa-Hernández, Wendy; Velez-Uriza, Dora; Valdés, Jesús; Vélez-Del Valle, Cristina; Salas-Benito, Juan; Martínez-Contreras, Rebeca; García-Espítia, Matilde; Salas-Benito, Mariana; Vega-Almeida, Tania; De Nova-Ocampo, Mónica
2014-01-01
The 3' untranslated region (3'UTR) of human astroviruses (HAstV) consists of two hairpin structures (helix I and II) joined by a linker harboring a conserved PTB/hnRNP1 binding site. The identification and characterization of cellular proteins that interact with the 3'UTR of HAstV-8 virus will help to uncover cellular requirements for viral functions. To this end, mobility shift assays and UV cross-linking were performed with uninfected and HAstV-8-infected cell extracts and HAstV-8 3'UTR probes. Two RNA-protein complexes (CI and CII) were recruited into the 3'UTR. Complex CII formation was compromised with cold homologous RNA, and seven proteins of 35, 40, 45, 50, 52, 57/60 and 75 kDa were cross-linked to the 3'UTR. Supermobility shift assays indicated that PTB/hnRNP1 is part of this complex, and 3'UTR-crosslinked PTB/hnRNP1 was immunoprecipitated from HAstV-8 infected cell-membrane extracts. Also, immunofluorescence analyses revealed that PTB/hnRNP1 is distributed in the nucleus and cytoplasm of uninfected cells, but it is mainly localized perinuclearly in the cytoplasm of HAstV-8 infected cells. Furthermore, the minimal 3'UTR sequences recognized by recombinant PTB are those conforming helix I, and an intact PTB/hnRNP1-binding site. Finally, small interfering RNA-mediated PTB/hnRNP1 silencing reduced synthesis viral genome and virus yield in CaCo2 cells, suggesting that PTB/hnRNP1 is required for HAstV replication. In conclusion, PTB/hnRNP1 binds to the 3'UTR HAstV-8 and is required or participates in viral replication. PMID:25406089
Pintó, Rosa M.; Guix, Susana
2014-01-01
SUMMARY Human astroviruses (HAstVs) are positive-sense single-stranded RNA viruses that were discovered in 1975. Astroviruses infecting other species, particularly mammalian and avian, were identified and classified into the genera Mamastrovirus and Avastrovirus. Through next-generation sequencing, many new astroviruses infecting different species, including humans, have been described, and the Astroviridae family shows a high diversity and zoonotic potential. Three divergent groups of HAstVs are recognized: the classic (MAstV 1), HAstV-MLB (MAstV 6), and HAstV-VA/HMO (MAstV 8 and MAstV 9) groups. Classic HAstVs contain 8 serotypes and account for 2 to 9% of all acute nonbacterial gastroenteritis in children worldwide. Infections are usually self-limiting but can also spread systemically and cause severe infections in immunocompromised patients. The other groups have also been identified in children with gastroenteritis, but extraintestinal pathologies have been suggested for them as well. Classic HAstVs may be grown in cells, allowing the study of their cell cycle, which is similar to that of caliciviruses. The continuous emergence of new astroviruses with a potential zoonotic transmission highlights the need to gain insights on their biology in order to prevent future health threats. This review focuses on the basic virology, pathogenesis, host response, epidemiology, diagnostic assays, and prevention strategies for HAstVs. PMID:25278582
MathBench Biology Modules: Web-Based Math for All Biology Undergraduates
ERIC Educational Resources Information Center
Nelson, Karen C.; Marbach-Ad, Gili; Schneider, Katie; Thompson, Katerina V.; Shields, Patricia A.; Fagan, William F.
2009-01-01
Historically, biology has not been a heavily quantitative science, but this is changing rapidly (Ewing 2002; Gross 2000; Hastings and Palmer 2003; Jungck 2005; Steen 2005). Quantitative approaches now constitute a key tool for modern biologists, yet undergraduate biology courses remain largely qualitative and descriptive. Although biology majors…
2. AERIAL VIEW SHOWING AQUEDUCT RIGHTOFWAY PASSING OVER RAILROAD LINE ...
2. AERIAL VIEW SHOWING AQUEDUCT RIGHT-OF-WAY PASSING OVER RAILROAD LINE FROM STONE QUARRY. TRACKS ARE GONE BUT RIGHT-OF-WAY IS STILL VISIBLE. - Old Croton Aqueduct, Quarry Railroad Bridge, Aqueduct Lane at Williams Street, Hastings-on-Hudson, Westchester County, NY
Technology Transfer Automated Retrieval System (TEKTRAN)
Genomic-assisted breeding and transgenic approaches to crop improvement are presently targeting phenotypic traits that allegedly confer drought tolerance. A news feature published in Nature Biotechnology last year suggests that these efforts may not be proceeding with sufficient haste, considering t...
VIEW OF PIEDMONT AVENUE NORTH OF DWIGHT WAY. INTERSECTION OF ...
VIEW OF PIEDMONT AVENUE NORTH OF DWIGHT WAY. INTERSECTION OF HASTE STREET SEEN AT CENTER DISTANCE. SEEN FROM WEST SIDE OF PIEDMONT AVE. LOOKING NORTH. Photograph by Fredrica Drotos and Michael Kelly, July 8, 2006 - Piedmont Way & the Berkeley Property Tract, East of College Avenue between Dwight Way & U.C. Memorial Stadium, Berkeley, Alameda County, CA
The Brave New World of the Interim Superintendency
ERIC Educational Resources Information Center
Bigham, Gary D.
2011-01-01
Considering the vital role the superintendent plays in the overall functioning and well-being of any school district, the filling of the top leadership post with a permanent appointment should never be done in haste. The process of advertising, reviewing applications, conducting background checks, interviewing candidates, negotiating contracts,…
Maximum Likelihood Estimation of Nonlinear Structural Equation Models.
ERIC Educational Resources Information Center
Lee, Sik-Yum; Zhu, Hong-Tu
2002-01-01
Developed an EM type algorithm for maximum likelihood estimation of a general nonlinear structural equation model in which the E-step is completed by a Metropolis-Hastings algorithm. Illustrated the methodology with results from a simulation study and two real examples using data from previous studies. (SLD)
Bayesian Analysis of Nonlinear Structural Equation Models with Nonignorable Missing Data
ERIC Educational Resources Information Center
Lee, Sik-Yum
2006-01-01
A Bayesian approach is developed for analyzing nonlinear structural equation models with nonignorable missing data. The nonignorable missingness mechanism is specified by a logistic regression model. A hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm is used to produce the joint Bayesian estimates of…
ERIC Educational Resources Information Center
Culhane, Dara
2003-01-01
The intersection of Main and Hastings streets--known locally as "Pain and Wastings"--marks the heart of Vancouver's inner-city neighborhood: the Downtown Eastside. Since 1997, when the City of Vancouver Health Department declared a public health emergency in response to reports that HIV infection rates among residents exceeded those anywhere else…
The Psychometric Properties of the Difficult Behavior Self-Efficacy Scale
ERIC Educational Resources Information Center
Oh, Hyun-Kyoung; Kozub, Francis M.
2010-01-01
The study was designed to estimate the psychometric properties of Hastings and Brown's (2002a) Difficult Behavior Self-efficacy Scale. Participants were two samples of physical educators teaching in Korea (n = 229) and the United States (U.S.; n = 139). An initial translation of the questionnaire to Korean and pilot study were conducted along with…
The Emotional Reactions to Challenging Behavior Scale-Korean (ERCBS-K): Modification and Validation
ERIC Educational Resources Information Center
Oh, Hyun-Kyoung; Seo, Dong-Chul; Kozub, Francis M.
2010-01-01
The purpose of this study was to explore the original version of Mitchell and Hastings's (1998) Emotional Reaction to Challenging Behavior Scale (ERCBS) and estimate validity and reliability of a revised version containing 29 items. The Emotional Reaction to Challenging Behavior Scale-Korean (ERCBS-K) was studied using 445 in-service physical…
Cobolet, Guy; Garrison, Dan; Vons, Jacqueline; Velut, Stéphane; Nutton, Vivian; Williams, David J
2014-01-01
This session focuses on the Fabrica (1543). Karger Publishers of Basel are producing a new English translation, by Daniel Garrison and Malcolm Hast, to coincide with the quincentenary, while Vivian Nutton's scholarly analysis of a newly discovered second edition indicates that the annotations are by Vesalius himself. PMID:25181777
ERIC Educational Resources Information Center
White, Caroline Jane
2010-01-01
Families of children with an Autism Spectrum Disorder (ASD) exhibit decreases in cohesion and adaptability, increased social isolation (Higgins et al., 2005), higher levels of marital dissatisfaction (Hastings et al., 2005), and overall disruption to daily life (Bristol et al., 1988). Research has provided evidence of higher levels of stress,…
Analyzing the Participatory Repertoire of a U.S. Educated EFL Teacher in Saudi Arabia
ERIC Educational Resources Information Center
Lee-Johnson, Yin Lam
2016-01-01
The KSA has become a popular country for Americans to work as an EFL teacher in the recent years because of the payment and cultural experience (Hastings, 2012). Due to the wide social distance between the KSA and USA, the teachers had to adapt to the expectation and become legitimate participants (Lave and Wenger, 1991) in the local communities.…
The Teaching of Ethics. Vol. 1-9.
ERIC Educational Resources Information Center
1980
The state of ethics teaching at the undergraduate and professional school levels is examined in these comprehensive monographs sponsored by the Institute of Society, Ethics and the Life Sciences/The Hastings Center. "The Teaching of Ethics in Higher Education (I)" encompasses: (1) the number and extent of courses in ethics, (2) the status and…
Making Art Invisible: Visual Education and the Cultural Stagnation of Neo-Liberal Rationality
ERIC Educational Resources Information Center
Peers, Chris
2011-01-01
The popularity of visual literacy may have resulted, in part, from some school authorities rushing the process of determining school curriculum. This article argues that the haste is reflective of pressure placed on educational discourse to conform to neo-liberal reforms of the sector, and is not the result of a careful and complex debate within…
Estimating a Noncompensatory IRT Model Using Metropolis within Gibbs Sampling
ERIC Educational Resources Information Center
Babcock, Ben
2011-01-01
Relatively little research has been conducted with the noncompensatory class of multidimensional item response theory (MIRT) models. A Monte Carlo simulation study was conducted exploring the estimation of a two-parameter noncompensatory item response theory (IRT) model. The estimation method used was a Metropolis-Hastings within Gibbs algorithm…
Technology Transfer Automated Retrieval System (TEKTRAN)
Microplax albofasciata (Costa), a Palearctic (mainly Mediterranean) species of the small family Oxycarenidae, is reported from California as the first record for the New World. Adults of this little-known lygaeoid bug were found in 2012 and 2013 at the Hastings Natural History Reservation in norther...
"Reading" Mixed Methods Research: Contexts for Criticism
ERIC Educational Resources Information Center
Freshwater, Dawn
2007-01-01
Health and social care researchers, in their haste to "belong" to academia, have adopted the system of mixed methodology research, overestimating its ability to reveal the truth and occasionally imprisoning their thought in one system. In this article, some of the assumptions underpinning mixed methodology research and its discourse are subjected…
NASA Astrophysics Data System (ADS)
Suzuki, Soh; Tanahashi, Tadanori; Doi, Takuya; Masuda, Atsushi
2016-02-01
We examined the effects of hyper-hygrothermal stresses with or without air on the degradation of crystalline silicon (c-Si) photovoltaic (PV) modules, to shorten the required duration of a conventional hygrothermal-stress test [i.e., the “damp heat (DH) stress test”, which is conducted at 85 °C/85% relative humidity for 1,000 h]. Interestingly, the encapsulant within a PV module becomes discolored under the air-included hygrothermal conditions achieved using DH stress test equipment and an air-included highly accelerated stress test (air-HAST) apparatus, but not under the air-excluded hygrothermal conditions realized using a highly accelerated stress test (HAST) machine. In contrast, the reduction in the output power of the PV module is accelerated irrespective of air inclusion in hyper-hygrothermal test atmosphere. From these findings, we conclude that the required duration of the DH stress test will at least be significantly shortened using air-HAST, but not HAST.
The need to get smarter on smart grid projects: four Lessons
2010-10-15
Significant investments are being made in gadgets and technologies, sometimes without proper analysis of the costs and potential benefits, often without much thought on how the various components of the effort would interface with one another. As often happens when things are done in great haste, there have been setbacks in a number of these initiatives, three of which are highlighted in the article.
ERIC Educational Resources Information Center
Bissell, Lianne; Phillips, Neil; Kroese, Biza Stenfert
2005-01-01
Carers' behaviour is thought to contribute to the development and maintenance of challenging behaviour in people with learning disabilities (Emerson et al. 1995; Hastings & Remington 1994). The present study sought to investigate the effectiveness of a behavioural intervention in the management of such problem behaviours by means of a long-term…
Zhao, Wei; Niu, Ke; Zhao, Jian; Jin, Yi-ming; Sui, Ting-ting; Wang, Wen
2013-09-01
Human astrovirus (HAstV) is one of the leading causes of acute viral diarrhea in infants. HAstV-induced epithelial cell apoptosis plays an important role in the pathogenesis of HAstV infection. Our previous study indicated that the C-terminal region of the HAstV non-structural protein nsP1a, nsP1a/4, was the major apoptosis-related functional protein and probably contained the main apoptosis domains. In order to screen for astrovirus-encoded apoptotic proteins, nsP1a/4 and six truncated proteins, each possessing a different functional domain of the nsP1a/4 protein, were cloned into the green fluorescent protein (GFP) vector pEGFP-N3. At 24-72 h after transfection, fusion protein expression in BHK21 cells was analyzed by fluorescence microscopy and Western blot. All seven fusion proteins were observed in BHK21 cells 24 h after transfection. Western blot analysis showed that the level of fusion protein expressed in BHK21 cells increased significantly at 72 h compared with 48 h in transfected cells. The successful expression of these deletion mutants of the nsP1a/4 protein provides an important foundation for gaining further insight into the function of the apoptosis domains of nsP1a/4, and a research platform for further confirming the molecular pathogenic mechanism of human astrovirus. PMID:24386845
Extended Mixed-Effects Item Response Models with the MH-RM Algorithm
ERIC Educational Resources Information Center
Chalmers, R. Philip
2015-01-01
A mixed-effects item response theory (IRT) model is presented as a logical extension of the generalized linear mixed-effects modeling approach to formulating explanatory IRT models. Fixed and random coefficients in the extended model are estimated using a Metropolis-Hastings Robbins-Monro (MH-RM) stochastic imputation algorithm to accommodate for…
ERIC Educational Resources Information Center
von Davier, Matthias; Sinharay, Sandip
2009-01-01
This paper presents an application of a stochastic approximation EM-algorithm using a Metropolis-Hastings sampler to estimate the parameters of an item response latent regression model. Latent regression models are extensions of item response theory (IRT) to a 2-level latent variable model in which covariates serve as predictors of the…
75 FR 58350 - Endangered Species; Permit No. 1578-01; and Permit No. 1595-04
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
...Notice is hereby given the following applicants have applied in due form for modifications to permits (Permit Nos. 1578 and 1595-03) to take shortnose sturgeon for purposes of scientific research:Maine Department of Marine Resources (MDMR) (Gail S. Wippelhauser, Principal Investigator), 21 State House Station, Augusta, ME, 04333 (Permit No. 1578); and Michael M. Hastings, University of Maine,......
77 FR 49440 - Membership of the Performance Review Board
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-16
... Defense. DATES: Effective Date: August 2, 2012. FOR FURTHER INFORMATION CONTACT: Michael L. Watson... Desimone Shari Durand Audrey Eckhart Webster Ewell John Hastings Paul Hulley John James, Jr. Clarence... Pontius Angela Rogers James Russell Dennis Savage Richard Sayre Steven Schleien Donna Seymour...
75 FR 43818 - Amendment of VOR Federal Airways V-50, V-251, and V-313 in the Vicinity of Decatur, IL
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... Regulatory Policies and Procedures (44 FR 11034; February 26, 1979); and (3) does not warrant preparation of.... 10854, 24 FR 9565, 3 CFR, 1959-1963 Comp., p. 389. Sec. 71.1 0 2. The incorporation by reference in 14...) Domestic VOR Federal Airways. * * * * * V-50 From Hastings, NE; Pawnee City, NE; St. Joseph, MO;...
Stochastic Approximation Methods for Latent Regression Item Response Models
ERIC Educational Resources Information Center
von Davier, Matthias; Sinharay, Sandip
2010-01-01
This article presents an application of a stochastic approximation expectation maximization (EM) algorithm using a Metropolis-Hastings (MH) sampler to estimate the parameters of an item response latent regression model. Latent regression item response models are extensions of item response theory (IRT) to a latent variable model with covariates…
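The Robbins-Monro component of such stochastic approximation EM schemes can be illustrated in isolation. The sketch below is a toy example under stated assumptions (a generic noisy root-finding problem), not the latent regression algorithm itself: the classic Robbins-Monro iteration with step sizes a_n = 1/n finds the root of an expectation that is only available through noisy samples.

```python
import random

def robbins_monro(noisy_g, x0=0.0, n_steps=5000, seed=0):
    """Robbins-Monro iteration x_{n+1} = x_n + a_n * g_hat(x_n) with a_n = 1/n,
    where g_hat(x) is a noisy, unbiased observation of the function whose
    root is sought."""
    rng = random.Random(seed)
    x = x0
    for n in range(1, n_steps + 1):
        x += (1.0 / n) * noisy_g(x, rng)
    return x

# Toy target: find the root of E[y - x] = 0, i.e., the mean of a noisy stream.
estimate = robbins_monro(lambda x, rng: rng.gauss(3.0, 1.0) - x)
```

With g_hat(x) = y_n - x and a_n = 1/n, the iteration reduces exactly to a running sample mean, which converges to E[y]. In a stochastic approximation EM, the noisy observation would instead be computed from Metropolis-Hastings draws of the latent variables at the current parameter value.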
Watershed Research at the North Appalachian Experimental Watershed at Coshocton, Ohio
Technology Transfer Automated Retrieval System (TEKTRAN)
The North Appalachian Experimental Watershed (NAEW) at Coshocton, Ohio was established during the mid 1930s as one of the first watershed research locations in the US (other locations included Riesel, TX and Hastings, NE). The mission of the outdoor laboratory facility was to determine the effects ...
Comparing Three Estimation Methods for the Three-Parameter Logistic IRT Model
ERIC Educational Resources Information Center
Lamsal, Sunil
2015-01-01
Different estimation procedures have been developed for the unidimensional three-parameter item response theory (IRT) model. These techniques include marginal maximum likelihood estimation, fully Bayesian estimation using Markov chain Monte Carlo simulation techniques, and Metropolis-Hastings Robbins-Monro estimation. With each…
Elevation, looking SE. Concrete and steel bridge with exposed steel ...
Elevation, looking SE. Concrete and steel bridge with exposed steel frame is the central of three bridges crossing Brush Street between east Baltimore and Piquette. The bridge links Old Lake Shore and Michigan Central Main Line on the western side to a New York Central siding on the eastern side - Railroad Overpass, East Milwaukee & Hastings Avenues, Detroit, MI
Nakamura, Noriko; Kobayashi, Shinichi; Minagawa, Hiroko; Matsushita, Tadashi; Sugiura, Wataru; Iwatani, Yasumasa
2016-07-01
Acute gastroenteritis is a critical infectious disease that affects infants and young children throughout the world, including Japan. This retrospective study was conducted from September 2008 to August 2014 (six seasons: 2008/09-2013/14) to investigate the incidence of enteric viruses responsible for 1,871 cases of acute gastroenteritis in Aichi prefecture, Japan. Of the 1,871 cases, 1,100 enteric viruses were detected in 978 samples, of which strains from norovirus (NoV) genogroup II (60.9%) were the most commonly detected, followed by strains of rotavirus A (RVA) (23.2%), adenovirus (AdV) type 41 (8.2%), sapovirus (SaV) (3.6%), human astrovirus (HAstV) (2.8%), and NoV genogroup I (1.3%). Sequencing of the NoV genogroup II (GII) strains revealed that GII.4 was the most common genotype, although four different GII.4 variants were also identified. The most common G-genotype of RVA was G1 (63.9%), followed by G3 (27.1%), G2 (4.7%) and G9 (4.3%). Three genogroups of SaV strains were found: GI (80.0%), GII (15.0%), and GV (5.0%). HAstV strains were genotyped as HAstV-1 (80.6%), HAstV-8 (16.1%), and HAstV-3 (3.2%). These results show that NoV GII was the leading cause of sporadic acute viral gastroenteritis, although a variety of enteric viruses were detected during the six-season surveillance period. PMID:26647761
2013-01-01
Background Upon initial contact with a virus, host cells activate a series of cellular signaling cascades that facilitate viral entry and viral propagation within the cell. Little is known about how the human astrovirus (HAstV) exploits signaling cascades to establish an infection in host cells. Recent studies showed that activation of extracellular signal-regulated kinase 1/2 (ERK1/2) is important for HAstV infection, though the involvement of other signaling cascades remains unclear. Methods A panel of kinase blockers was used to search for cellular signaling pathways important for HAstV1 infection. To determine their impact on the infectious process, we examined viral gene expression, RNA replication, and viral RNA and capsid protein release from host cells. Results Inhibitors of phosphoinositide 3-kinase (PI3K) activation interfered with the infection, independent of their effect on ERK 1/2 activation. Activation of the PI3K signaling cascade occurred at an early phase of the infection, judging from the timeframe of Akt phosphorylation. PI3K inhibition at early times, but not at later times, blocked viral gene expression. However, inhibiting the downstream targets of PI3K activation, Akt and Rac1, did not block infection. Inhibition of protein kinase A (PKA) activation was found to block a later phase of HAstV1 production. Conclusions Our results reveal a previously unknown, essential role of PI3K in the life cycle of HAstV1. PI3K participates in the early stage of infection, possibly during the viral entry process. Our results also reveal the role of PKA in viral production. PMID:23680019
2016-05-01
In May 2016, right around the time that this issue of the Hastings Center Report should be published, The Hastings Center is holding a conference in New York City titled "Bioethics Meets Moral Psychology." The goal of the conference is to consider the lessons that bioethicists should learn from the raft of literature now accumulating on how the mental processes of perception, emotion, and thinking affect things that bioethicists care about, from the education of health care professionals to the conflicts that arise in clinical care, the "culture wars" over bioethical policy issues, the status of different cultures' value systems, and the very understanding of the values that are foundational in moral thinking. The articles in this issue simply provide more evidence that bioethics is meeting moral psychology. PMID:27150409
Donnelley, Strachan; Nolan, Kathleen
1990-01-01
This is a report from The Hastings Center project, "The Ethics of Animal Experimentation and Research." As project members, we wanted to take a fresh look at the complex ethical issues that arise in the scientific use of animals in a non-adversarial and non-ideological forum. We were convinced that these issues required a genuinely interdisciplinary approach. This meant including laboratory and field scientific researchers; veterinarians; philosophers, lawyers, and scientists particularly interested in animal welfare issues; and physicians and philosophers with long-standing bioethical interests but who previously had not confronted the ethics of the human use of animals. This Special Supplement to the Hastings Center Report is the outcome of two years of deliberation. PMID:11650361
Revisiting Additivity Violation of Quantum Channels
NASA Astrophysics Data System (ADS)
Fukuda, Motohisa
2014-12-01
We prove additivity violation of the minimum output entropy of quantum channels by a straightforward application of an ε-net argument and Lévy's lemma. The additivity conjecture was disproved initially by Hastings. Later, a proof via asymptotic geometric analysis was presented by Aubrun, Szarek and Werner, which uses Dudley's bound on Gaussian processes (or Dvoretzky's theorem with Schechtman's improvement). In this paper, we develop another proof along Dvoretzky's theorem in Milman's view, showing additivity violation in broader regimes than the existing proofs. Importantly, Dvoretzky's theorem works well with norms to give strong statements, but these techniques can be extended to functions that have norm-like structure: positive homogeneity and the triangle inequality. A connection between Hastings' method and ours is also discussed. In addition, we make some comments on relations between the regularized minimum output entropy and the classical capacity of quantum channels.
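For context, the additivity statement at issue can be written out explicitly; the following uses the standard definitions from the quantum information literature, not notation taken from this abstract. The minimum output entropy of a channel $\Phi$ and the conjectured (now disproved) additivity are:

```latex
H_{\min}(\Phi) = \min_{\rho} S\bigl(\Phi(\rho)\bigr),
\qquad S(\sigma) = -\operatorname{Tr}\,\sigma \log \sigma,
% Conjectured additivity, for all channels \Phi, \Psi:
H_{\min}(\Phi \otimes \Psi) = H_{\min}(\Phi) + H_{\min}(\Psi).
```

Hastings exhibited random channels for which the tensor product achieves strictly smaller minimum output entropy than the sum, $H_{\min}(\Phi \otimes \bar{\Phi}) < H_{\min}(\Phi) + H_{\min}(\bar{\Phi})$, which is the violation the paper's ε-net argument re-proves in broader regimes.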
Maximum likelihood estimation of population growth rates based on the coalescent.
Kuhner, M K; Yamato, J; Felsenstein, J
1998-01-01
We describe a method for co-estimating 4Nemu (four times the product of effective population size and neutral mutation rate) and population growth rate from sequence samples using Metropolis-Hastings sampling. Population growth (or decline) is assumed to be exponential. The estimates of growth rate are biased upwards, especially when 4Nemu is low; there is also a slight upwards bias in the estimate of 4Nemu itself due to correlation between the parameters. This bias cannot be attributed solely to Metropolis-Hastings sampling but appears to be an inherent property of the estimator and is expected to appear in any approach which estimates growth rate from genealogy structure. Sampling additional unlinked loci is much more effective in reducing the bias than increasing the number or length of sequences from the same locus. PMID:9584114
Kaebnick, Gregory E
2014-09-01
There are three broad themes in this issue of the Hastings Center Report. First, a special report published as a supplement to the issue addresses the medical and health policy issues faced by lesbian, gay, bisexual, and transgender patients. Inside the issue, the two articles take up questions about how caregivers may justify a refusal to provide a medical service that a patient has requested. The issue also contains a set of essays that have emerged from a collaborative effort by The Hastings Center and the Presidential Commission for the Study of Bioethical Issues to promote scholarly engagement with the practical problem of teaching caregivers, researchers, scientists, and others to address bioethical problems. What appears here is the first installment of a series that will appear in the pages of the Report well into the 2015 volume. PMID:25231650
Characterization of Human Astrovirus Cell Entry
Méndez, Ernesto; Muñoz-Yañez, Claudia; Sánchez-San Martín, Claudia; Aguirre-Crespo, Gabriela; Baños-Lara, M. del Rocio; Gutierrez, Michelle; Espinosa, Rafaela; Acevedo, Yunuén; Arias, Carlos F.
2014-01-01
Human astroviruses (HAstV) are a frequent cause of gastroenteritis in young children and immunocompromised patients. To understand the early steps of HAstV infection in the highly permissive Caco-2 cell line, the binding and entry processes of the virus were characterized. The half-time of virus binding to the cell surface was about 10 min, while virus decapsidation took around 130 min. Drugs affecting clathrin-mediated endocytosis, endosome acidification, and actin filament polymerization, as well as those that reduce the presence of cholesterol in the cell membrane, decreased the infectivity of the virus. The infection was also reduced by silencing the expression of the clathrin heavy chain (CHC) by RNA interference or by overexpression of dominant-negative mutants of dynamin 2 and Eps15. Furthermore, the entry of HAstV apparently depends on the maturation of endosomes, since the infection was reduced by silencing the expression of Rab7, a small GTPase involved in the early- to late-endosome maturation. Altogether, our results suggest that HAstV enters Caco-2 cells using a clathrin-dependent pathway and reaches late endosomes to enter cells. Here, we have characterized the mechanism used by human astroviruses, important agents of gastroenteritis in children, to gain entry into their host cells. Using a combination of biochemical and genetic tools, we found that these viruses enter Caco-2 cells using a clathrin-dependent endocytic pathway, where they most likely need to travel to late endosomes to reach the cytoplasm and begin their replication cycle. PMID:24335315
Frémond, M-L; Pérot, P; Muth, E; Cros, G; Dumarest, M; Mahlaoui, N; Seilhean, D; Desguerre, I; Hébert, C; Corre-Catelin, N; Neven, B; Lecuit, M; Blanche, S; Picard, C; Eloit, M
2015-09-01
A boy with X-linked agammaglobulinemia experienced progressive global motor decline, cerebellar syndrome, and epilepsy. All standard polymerase chain reactions for neurotropic viruses were negative on cerebrospinal fluid and brain biopsy. Next-generation sequencing allowed fast identification of a new astrovirus strain (HAstV-VA1/HMO-C-PA), which led to tailor the patient's treatment, with encouraging clinical monitoring over 1 year. PMID:26407445
Cruzan: A hostage to technology.
Cranford, R E
1990-01-01
In its September/October issue, the Hastings Center Report published six brief essays with a short introduction by Courtney S. Campbell under the collective title of "Cruzan: clear and convincing?" These articles present a range of responses from participants, parents, constitutional scholars, and caregivers to the U.S. Supreme Court's decision in Cruzan v. Director, Missouri Department of Health (June 25, 1990). Ronald E. Cranford is a neurologist and a consultant to the Cruzan family. PMID:2228595
The Dawn of State Medicine in Britain
McConaghey, R M S
1967-01-01
Dr R M S McConaghey traces the development of State control in the provision of medical services and also describes the rise in status of the general practitioner, from the early apothecary-surgeons. Mr Paul Vaughan describes the history of the British Medical Association and its development from the Provincial Medical and Surgical Association, founded by Sir Charles Hastings. He considers the relationship between the BMA and the Government, both in the past and present. PMID:5337476
NASA Astrophysics Data System (ADS)
Nakano, Shinya; Suzuki, Kazue; Kawamura, Kenji; Parrenin, Frederic; Higuchi, Tomoyuki
2015-04-01
A technique for estimating the age-depth relationship and its uncertainty in ice cores has been developed. The age-depth relationship is mainly determined by the accumulation of snow at the site of the ice core and the thinning process due to the horizontal stretching and vertical compression of an ice layer. However, neither the accumulation process nor the thinning process is fully known. In order to appropriately estimate the age as a function of depth, it is crucial to incorporate observational information into a model describing the accumulation and thinning processes. In the proposed technique, the age as a function of depth is estimated from age markers and time series of δ18O data. The estimation is achieved using a method combining a sequential Monte Carlo method and the Markov chain Monte Carlo method, as proposed by Andrieu et al. (2010). In this hybrid method, the posterior distributions for the parameters in the models for the accumulation and thinning processes are computed using the standard Metropolis-Hastings method, while sampling from the posterior distribution for the age-depth relationship is achieved by running a sequential Monte Carlo method at each iteration of the Metropolis-Hastings method. A sequential Monte Carlo method normally suffers from the degeneracy problem, especially when the number of steps is large. However, when it is combined with the Metropolis-Hastings method, the degeneracy problem can be overcome by collecting a large number of samples over many iterations of the Metropolis-Hastings method. We demonstrate the result obtained by applying the proposed technique to the ice core data from Dome Fuji in Antarctica.
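The sequential Monte Carlo component of such a particle Metropolis-Hastings scheme can be sketched with a bootstrap particle filter. The model below (a Gaussian random-walk state observed with Gaussian noise) is a stand-in chosen purely for illustration, not the ice-core accumulation/thinning model; the log marginal likelihood it returns is the quantity an outer Metropolis-Hastings loop would use to accept or reject parameter proposals.

```python
import math
import random

def bootstrap_filter(obs, n_particles=500, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = x_{t-1} + N(0, q), y_t = x_t + N(0, r).
    Returns the filtered means and an estimate of the log marginal likelihood."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    log_lik = 0.0
    means = []
    for y in obs:
        # Propagate each particle through the state transition.
        particles = [x + rng.gauss(0.0, math.sqrt(q)) for x in particles]
        # Weight each particle by the observation likelihood.
        weights = [math.exp(-0.5 * (y - x) ** 2 / r) for x in particles]
        total = sum(weights)
        log_lik += math.log(total / n_particles / math.sqrt(2 * math.pi * r))
        means.append(sum(w * x for w, x in zip(weights, particles)) / total)
        # Multinomial resampling to fight weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means, log_lik

# A short synthetic observation sequence near a constant level of 2.0.
obs = [2.0] * 10
means, log_lik = bootstrap_filter(obs)
```

The resampling step at every observation is exactly where the degeneracy problem mentioned in the abstract arises; in a particle-MCMC setting, repeating the filter at each outer Metropolis-Hastings iteration averages that degeneracy away across many accepted trajectories.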
Enhanced Research to Create More Jobs
Agnihotri, Newal K.
2004-03-01
Mr. Doc Hastings, U.S. Congressman from the state of Washington, is interviewed regarding various topics related to the nuclear energy picture in the US. Topics include the level of public support for nuclear energy, differences between the roles of state and federal governments, job creation, clean-up briefings, the status of Yucca Mountain, the status of hydrogen-nuclear efforts, the role of nuclear energy in Kyoto protocol compliance, and the market for power plants.
Prevost, B; Lucas, F S; Ambert-Balay, K; Pothier, P; Moulin, L; Wurtzer, S
2015-10-01
Although clinical epidemiology lists human enteric viruses to be among the primary causes of acute gastroenteritis in the human population, their circulation in the environment remains poorly investigated. These viruses are excreted by the human population into sewers and may be released into rivers through the effluents of wastewater treatment plants (WWTPs). In order to evaluate the viral diversity and loads in WWTP effluents of the Paris, France, urban area, which includes about 9 million inhabitants (approximately 15% of the French population), the seasonal occurrence of astroviruses and noroviruses in 100 WWTP effluent samples was investigated over 1 year. The coupling of these measurements with a high-throughput sequencing approach allowed the specific estimation of the diversity of human astroviruses (human astrovirus genotype 1 [HAstV-1], HAstV-2, HAstV-5, and HAstV-6), 7 genotypes of noroviruses (NoVs) of genogroup I (NoV GI.1 to NoV GI.6 and NoV GI.8), and 16 genotypes of NoVs of genogroup II (NoV GII.1 to NoV GII.7, NoV GII.9, NoV GII.12 to NoV GII.17, NoV GII.20, and NoV GII.21) in effluent samples. Comparison of the viral diversity in WWTP effluents to the viral diversity found by analysis of clinical data obtained throughout France underlined the consistency between the identified genotypes. However, some genotypes were locally present in effluents and were not found in the analysis of the clinical data. These findings could highlight an underestimation of the diversity of enteric viruses circulating in the human population. Consequently, analysis of WWTP effluents could allow the exploration of viral diversity not only in environmental waters but also in a human population linked to a sewerage network in order to better comprehend viral epidemiology and to forecast seasonal outbreaks. PMID:26253673
Novel human astroviruses: Novel human diseases?
Vu, Diem-Lan; Cordey, Samuel; Brito, Francisco; Kaiser, Laurent
2016-09-01
Astroviruses are small, non-enveloped, single-stranded positive RNA viruses that belong to the Astroviridae family. While classical human astroviruses (HAstV) are a well-recognized cause of acute non-bacterial diarrhea among young children worldwide, novel astroviruses, named HAstV-MLB and HAstV-VA/HMO, have been identified recently in humans by molecular assays. They are phylogenetically more related to animal astroviruses than to classical human astroviruses, thus suggesting cross-species transmission. Serological studies demonstrated a surprisingly high seroprevalence in certain populations and highlighted a high infection rate in the early years of life. Although their pathogenic role has not yet been clearly determined, novel astrovirus RNA sequences have been identified in different biological specimens of symptomatic patients, including the feces, plasma, cerebrospinal fluid, and brain biopsies. Thus, there is evidence that they could contribute not only to digestive tract infection, but also to unexpected clinical syndromes, notably encephalitis and meningitis. Severe infections affect mainly immunocompromised patients. These findings indicate that novel astroviruses should be considered in the differential diagnosis of immunocompromised patients with meningitis or encephalitis of unknown origin. PMID:27434149
NASA Astrophysics Data System (ADS)
Hastings, Harold
2007-03-01
We address a long-standing dilemma concerning the stability of large systems. MacArthur (1955) and Hutchinson (1959) argued that more "complex" natural systems tended to be more stable than less complex systems, based upon energy flow. May (1972) argued the opposite, using random matrix models; see Cohen and Newman (1984, 1985), Bai and Yin (1986). We show that in some sense both are right: under reasonable scaling assumptions on interaction strength, Lyapunov stability increases but structural stability decreases as complexity is increased (c.f. Harrison, 1979; Hastings, 1984). We apply this result to a variety of network systems. References: Bai, Z.D. & Yin, Y.Q. 1986. Probab. Th. Rel. Fields 73, 555. Cohen, J.E. & Newman, C.M. 1984. Annals Probab. 12, 283; 1985. Theoret. Biol. 113, 153. Harrison, G.W. 1979. Amer. Natur. 113, 659. Hastings, H.M. 1984. BioSystems 17, 171. Hastings, H.M., Juhasz, F. & Schreiber, M. 1992. Proc. Royal Soc. Ser. B 249, 223. Hutchinson, G.E. 1959. Amer. Natur. 93, 145. MacArthur, R.H. 1955. Ecology 35, 533. May, R.M. 1972. Nature 238, 413.
Comparison of the kinetics of different Markov models for ligand binding under varying conditions
Martini, Johannes W. R.; Habeck, Michael
2015-03-07
We recently derived a Markov model for macromolecular ligand binding dynamics from few physical assumptions and showed that its stationary distribution is the grand canonical ensemble [J. W. R. Martini, M. Habeck, and M. Schlather, J. Math. Chem. 52, 665 (2014)]. The transition probabilities of the proposed Markov process define a particular Glauber dynamics and have some similarity to the Metropolis-Hastings algorithm. Here, we illustrate that this model is the stochastic analog of (pseudo) rate equations and the corresponding system of differential equations. Moreover, it can be viewed as a limiting case of general stochastic simulations of chemical kinetics. Thus, the model links stochastic and deterministic approaches as well as kinetics and equilibrium described by the grand canonical ensemble. We demonstrate that the family of transition matrices of our model, parameterized by temperature and ligand activity, generates ligand binding kinetics that respond to changes in these parameters in a qualitatively similar way as experimentally observed kinetics. In contrast, neither the Metropolis-Hastings algorithm nor the Glauber heat bath reflects changes in the external conditions correctly. Both converge rapidly to the stationary distribution, which is advantageous when the major interest is in the equilibrium state, but fail to describe the kinetics of ligand binding realistically. To simulate cellular processes that involve the reversible stochastic binding of multiple factors, our pseudo rate equation model should therefore be preferred to the Metropolis-Hastings algorithm and the Glauber heat bath if the stationary distribution is not the only quantity of interest. PMID:25747058
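The equilibrium-versus-kinetics distinction drawn here can be illustrated with a minimal toy simulation (an illustration under simplifying assumptions, not the authors' model): independent binding sites updated with either Metropolis or Glauber heat-bath rules converge to the same grand-canonical occupancy, even though their transition probabilities, and hence kinetics, differ. The function name `simulate` and the parameters `eps` (binding energy) and `mu` (chemical potential, standing in for ligand activity) are ours.

```python
import math
import random

def simulate(n_sites=400, beta=1.0, mu=0.5, eps=-1.0,
             steps=20000, rule="metropolis", seed=0):
    """Toy kinetics for independent binding sites.

    Flipping site i changes the grand-canonical energy by
    dE = (eps - mu) * (1 - 2*occ[i]).  'metropolis' accepts with
    min(1, e^{-beta*dE}); 'glauber' uses the heat-bath probability
    1/(1 + e^{beta*dE}).  Returns occupancy time-averaged over the
    second half of the run.
    """
    rng = random.Random(seed)
    occ = [0] * n_sites
    n_occ = 0
    total = 0
    for t in range(steps):
        i = rng.randrange(n_sites)
        dE = (eps - mu) * (1 - 2 * occ[i])
        if rule == "metropolis":
            p = min(1.0, math.exp(-beta * dE))
        else:  # Glauber heat bath
            p = 1.0 / (1.0 + math.exp(beta * dE))
        if rng.random() < p:
            occ[i] ^= 1
            n_occ += 2 * occ[i] - 1
        if t >= steps // 2:
            total += n_occ
    return total / ((steps - steps // 2) * n_sites)
```

Both rules approach the Fermi-function occupancy 1/(1 + exp(beta*(eps - mu))), in line with the shared stationary distribution noted above, while their transient behavior differs.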
Tolentino-Ruiz, R; Montoya-Varela, D; García-Espitia, M; Salas-Benito, M; Gutiérrez-Escolano, A; Gómez-García, C; Figueroa-Arredondo, P; Salas-Benito, J; De Nova-Ocampo, M
2012-10-01
Acute gastroenteritis (AGE) is a major cause of childhood morbidity and mortality worldwide; the etiology of AGE includes viruses, bacteria, and parasites. A multiplex PCR assay to simultaneously identify human astrovirus (HAstV), calicivirus (HuCVs), Entamoeba histolytica (E. histolytica), and enteroinvasive Escherichia coli (EIEC) in stool samples is described. A total of 103 samples were individually analyzed by ELISA (enzyme-linked immunosorbent assay) and RT-PCR/PCR. HAstV and HuCVs were detected in four out of 103 samples (3.8%) by RT-PCR, but ELISA found only one sample positive for HuCVs (2.5%). E. histolytica was identified in two out of 19 samples (10.5%) and EIEC in 13 out of 20 samples (70%) by PCR, and all PCR products were sequenced to verify their identities. Our multiplex PCR results demonstrate the simultaneous amplification of different pathogens such as HAstV, EIEC, and E. histolytica in the same reaction, though the HuCVs signal was weak in every replicate. Regardless, this multiplex PCR protocol represents a novel tool for the identification of distinct pathogens and may provide support for the diagnosis of AGE in children. PMID:22711331
Chen, Bixiao; Morioka, Sahya; Nakagawa, Tomoyuki; Hayakawa, Takashi
2016-10-01
The effects of resistant starch (RS) and konjac mannan (KM) on maintaining and improving the large intestinal environment were compared. Wistar SPF rats were fed the following diets for 4 weeks: a negative control diet (C diet), a tyrosine-supplemented positive control diet (T diet), and luminacoid-supplemented diets containing either high-molecular konjac mannan A (KMAT diet), low-molecular konjac mannan B (KMBT diet), high-amylose cornstarch (HAST diet), or heat-moisture-treated starch (HMTST diet). The luminacoid-fed groups had an increased content of short-chain fatty acids in the cecum. HAS caused a significant decrease in p-cresol content in the cecum, whereas KM did not. Urinary p-cresol was reduced in the HAST group compared with the T group, but not in the KM-fed groups. Deterioration of the large intestinal environment was completely improved only in the HAST and HMTST groups, suggesting that RS is considerably more effective than KM in maintaining the large intestinal environment. PMID:27296718
Upchurch, Paul; Mannion, Philip D.; Taylor, Michael P.
2015-01-01
The sauropod dinosaur “Pelorosaurus” becklesii was named in 1852 on the basis of an associated left humerus, ulna, radius and skin impression from the Early Cretaceous (Berriasian-Valanginian) Hastings Beds Group, near Hastings, East Sussex, southeast England, United Kingdom. The taxonomy and nomenclature of this specimen have a complex history, but most recent workers have agreed that “P.” becklesii represents a distinct somphospondylan (or at least a titanosauriform) and is potentially the earliest titanosaur body fossil from Europe or even globally. The Hastings specimen is distinct from the approximately contemporaneous Pelorosaurus conybeari from Tilgate Forest, West Sussex. “P.” becklesii can be diagnosed on the basis of five autapomorphies, such as: a prominent anteriorly directed process projecting from the anteromedial corner of the distal humerus; the proximal end of the radius is widest anteroposteriorly along its lateral margin; and the unique combination of a robust ulna and slender radius. The new generic name Haestasaurus is therefore erected for “P.” becklesii. Three revised and six new fore limb characters (e.g. the presence/absence of condyle-like projections on the posterodistal margin of the radius) are discussed and added to three cladistic data sets for Sauropoda. Phylogenetic analysis confirms that Haestasaurus becklesii is a macronarian, but different data sets place this species either as a non-titanosauriform macronarian, or within a derived clade of titanosaurs that includes Malawisaurus and Saltasauridae. This uncertainty is probably caused by several factors, including the incompleteness of the Haestasaurus holotype and rampant homoplasy in fore limb characters. Haestasaurus most probably represents a basal macronarian that independently acquired the robust ulna, enlarged olecranon, and other states that have previously been regarded as synapomorphies of clades within Titanosauria. There is growing evidence that basal
Rodríguez-Díaz, J.; Querales, L.; Caraballo, L.; Vizzi, E.; Liprandi, F.; Takiff, H.; Betancourt, W. Q.
2009-01-01
The detection and molecular characterization of pathogenic human viruses in urban sewage have been used extensively to derive information on circulating viruses in given populations throughout the world. In this study, a similar approach was applied to provide an overview of the epidemiology of waterborne gastroenteritis viruses circulating in urban areas of Caracas, the capital city of Venezuela in South America. Dry season sampling was conducted in sewers and in a major river severely polluted with urban sewage discharges. Nested PCR was used for detection of human adenoviruses (HAds), while reverse transcription plus nested or seminested PCR was used for detection of enteroviruses (HuEVs), rotaviruses (HRVs), noroviruses (HuNoVs), and astroviruses (HAstVs). HRVs were fully characterized with genotype-specific primers for VP4 (genotype P), VP7 (genotype G), and the rotavirus nonstructural protein 4 (NSP4). HuNoVs and HAstVs were characterized by sequencing and phylogenetic analysis. The detection rates of all viruses were ≥50%, and all sampling events were positive for at least one of the pathogenic viruses studied. The predominant HRV types found were G1, P[8], P[4], and NSP4A and -B. Genogroup II of HuNoVs and HAstV type 8 were frequently detected in sewage and sewage-polluted river waters. This study reveals relevant epidemiological data on the distribution and persistence of human pathogenic viruses in sewage-polluted waters and addresses the potential health risks associated with transmission of these viruses through water-related environmental routes. PMID:19028907
NASA Astrophysics Data System (ADS)
Scherer, Claiton M. S.; Goldberg, Karin; Bardola, Tatiana
2015-06-01
The Barbalha Formation (Aptian) records deposition in fluvial and lacustrine environments accumulated in an early post-rift sag basin. Characterization of the facies architecture and sequence stratigraphic framework of the alluvial succession was carried out through detailed description and interpretation of outcrops and cored wells. The development of depositional sequences in this unit reflects variation in the accommodation-to-sediment supply (A/S) ratio. Two depositional sequences, showing an overall fining-upward trend, are preserved within the succession. The sequences are bounded by regional subaerial unconformities formed during negative A/S ratio, and may be subdivided into Low-accommodation Systems Tracts (LAST) (positive A/S ratio close to zero) and High-accommodation Systems Tracts (HAST) (A/S ratio between 0.5 and 1). Sequence 1, with a minimum thickness of 100 m, is characterized by amalgamated, multi-storey, braided fluvial channel sand bodies, defining a LAST. These are interlayered with crevasse splay and floodplain deposits toward the top, passing to open lacustrine deposits, defining a HAST. Sequence 2, with minimum thickness ranging from 50 to 90 m, overlies the organic-rich lacustrine deposits. At the base, this sequence is composed of amalgamated, multistorey braided fluvial channel sand bodies (LAST), similar to Sequence 1, overlain by well-drained floodplain with fixed fluvial channel deposits, interpreted as an anastomosed fluvial system, which are in turn capped by lacustrine deposits, both grouped into a HAST. Paleocurrent data on fluvial deposits of sequences 1 and 2 show a consistent paleoflow to the SE. Sedimentological evidence indicates humid to sub-humid climatic conditions during deposition of sequences 1 and 2. Accumulation of fluvial sequences 1 and 2 was mainly controlled by tectonics. Variation in A/S ratios must be related to tectonic subsidence and uplift of the basin.
Eigenvalue analysis of an irreversible random walk with skew detailed balance conditions.
Sakai, Yuji; Hukushima, Koji
2016-04-01
An irreversible Markov-chain Monte Carlo (MCMC) algorithm with skew detailed balance conditions originally proposed by Turitsyn et al. is extended to general discrete systems on the basis of the Metropolis-Hastings scheme. To evaluate the efficiency of our proposed method, the relaxation dynamics of the slowest mode and the asymptotic variance are studied analytically in a random walk on one dimension. It is found that the performance in irreversible MCMC methods violating the detailed balance condition is improved by appropriately choosing parameters in the algorithm. PMID:27176439
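A rough sketch of the lifting construction behind skew detailed balance can be given for a ring (a textbook-style illustration, not necessarily the authors' exact algorithm; the function name `lifted_mcmc` is ours): the walker carries a direction variable that reverses only on rejection, yielding an irreversible chain whose marginal over positions is still the target distribution.

```python
import random

def lifted_mcmc(weights, steps=200000, seed=1):
    """Irreversible 'lifted' MCMC on a ring of len(weights) states.

    State is (x, sigma): position plus a direction sigma in {+1, -1}.
    A move in direction sigma is accepted with the usual Metropolis
    ratio; on rejection, sigma is flipped instead of staying put.
    One can check the skew detailed balance condition
    pi(x) T+(x -> y) = pi(y) T-(y -> x), so the empirical position
    distribution converges to pi(x) ~ weights[x].
    """
    rng = random.Random(seed)
    L = len(weights)
    x, sigma = 0, 1
    counts = [0] * L
    for _ in range(steps):
        y = (x + sigma) % L
        if rng.random() < min(1.0, weights[y] / weights[x]):
            x = y
        else:
            sigma = -sigma  # reverse direction instead of rejecting outright
        counts[x] += 1
    return [c / steps for c in counts]
```

The point of such lifted chains, as the abstract notes, is that breaking detailed balance can shorten relaxation times relative to a reversible random walk.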
Aragão, Glicélia Cruz; Mascarenhas, Joana D'Arc Pereira; Kaiano, Jane Haruko Lima; de Lucena, Maria Silvia Sousa; Siqueira, Jones Anderson Monteiro; Fumian, Túlio Machado; Hernandez, Juliana das Mercês; de Oliveira, Consuelo Silva; Oliveira, Darleise de Souza; Araújo, Eliete da Cunha; Soares, Luana da Silva; Linhares, Alexandre Costa; Gabbay, Yvone Benchimol
2013-01-01
Norovirus (NoV), sapovirus (SaV) and human astrovirus (HAstV) are viral pathogens that are associated with outbreaks and sporadic cases of gastroenteritis. However, little is known about the occurrence of these pathogens in relatively isolated communities, such as the remnants of African-descendant villages (“Quilombola”). The objective of this study was the frequency determination of these viruses in children under 10 years, with and without gastroenteritis, from a “Quilombola” Community, Northern Brazil. A total of 159 stool samples were obtained from April/2008 to July/2010 and tested by an enzyme immunoassay (EIA) and reverse transcription-polymerase chain reaction (RT-PCR) to detect NoV, SaV and HAstV, and further molecular characterization was performed. These viruses were detected only in the diarrheic group. NoV was the most frequent viral agent detected (19.7%-16/81), followed by SaV (2.5%-2/81) and HAstV (1.2%-1/81). Of the 16 NoV-positive samples, 14 were sequenced with primers targeting the B region of the polymerase (ORF1) and the D region of the capsid (ORF2). The results showed a broad genetic diversity of NoV, with 12 strains being classified as GII-4 (5–41.7%), GII-6 (3–25%), GII-7 (2–16.7%), GII-17 (1–8.3%) and GI-2 (1–8.3%), as based on the polymerase region; 12 samples were classified, based on the capsid region, as GII-4 (6–50%, being 3–2006b variant and 3–2010 variant), GII-6 (3–25%), GII-17 (2–16.7%) and GII-20 (1–8.3%). One NoV-strain showed dual genotype specificity, based on the polymerase and capsid region (GII-7/GII-20). This study provides, for the first time, epidemiological and molecular information on the circulation of NoV, SaV and HAstV in African-descendant communities in Northern Brazil and identifies NoV genotypes that were different from those detected previously in studies conducted in the urban area of Belém. It remains to be determined why a broader NoV diversity was observed in such a semi-isolated community. PMID:23457593
Eutrophication control and the fallacy of nitrogen removal
Sincero, A.P.
1984-11-01
There has been a great deal of controversy over the issue of nitrogen control in sewage treatment plant discharges to alleviate excessive algal growth in receiving bodies of water. Some of the controversy seems to have arisen from a thorough misunderstanding of the microbiology involved in the utilization of nitrogen by microbes. In their haste to control eutrophication, some regulators have required the removal of nitrogen from the effluent of sewage treatment plants; e.g., the Patuxent Nitrogen Removal Policy of the State of Maryland.
Atrial fibrillation: state of the art.
Hasun, Matthias; Gatterer, Eduard; Weidinger, Franz
2014-11-01
Atrial fibrillation (AF) is by far the most frequent heart rhythm disorder and is associated with a significantly increased risk of stroke, heart failure and death. Despite improvements in prevention and treatment, the prognosis has not changed significantly. To use new and promising pharmacological and interventional concepts for thromboembolic prophylaxis and treatment of AF, as well as prevention of recurrence, patient compliance has to be improved, physicians have to be trained and experience has to be gained. A consistently carried 'anticoagulation pass' might be a promising piece of the puzzle. PMID:25409952
NASA Astrophysics Data System (ADS)
Zhang, Li-Sheng; Deng, Min-Yi; Kong, Ling-Jiang; Liu, Mu-Ren; Tang, Guo-Ning
2010-01-01
Using the Greenberg-Hastings cellular automata model, we study the properties of target waves in excitable media under no-flux boundary conditions. For a system with only one excited state, computer simulation and analysis lead to the following conclusions: the number of refractory states does not influence the wave-front speed; the wave-front speed decreases as the excitation threshold increases and increases as the neighbor radius increases; the period of target waves is equal to the number of cell states; and the excitation condition for target waves is that the wave-front speed must be bigger than half of the neighbor radius.
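A minimal Greenberg-Hastings update of the kind described can be sketched as follows (our own illustrative variant, assuming a Chebyshev neighborhood and clamped edges as a stand-in for no-flux boundaries; the name `gh_step` is ours):

```python
def gh_step(grid, n_states=3, threshold=1, radius=1):
    """One synchronous Greenberg-Hastings update.

    State 0 = rest, 1 = excited, 2..n_states-1 = refractory.
    A resting cell fires when at least `threshold` cells within the
    given Chebyshev radius are excited; every other state advances
    cyclically (excited -> refractory -> ... -> rest).
    """
    h, w = len(grid), len(grid[0])
    new = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s = grid[i][j]
            if s == 0:
                excited = sum(
                    grid[a][b] == 1
                    for a in range(max(0, i - radius), min(h, i + radius + 1))
                    for b in range(max(0, j - radius), min(w, j + radius + 1))
                    if (a, b) != (i, j)
                )
                new[i][j] = 1 if excited >= threshold else 0
            else:
                new[i][j] = (s + 1) % n_states
    return new
```

Seeding a single excited cell and iterating `gh_step` produces an expanding ring of excitation, the wave front discussed above; a periodically re-firing source generates the concentric target waves.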
Driving forces behind integration: weigh your options.
Friend, P M; Meighan, S
1994-01-01
Collaborative relationships between hospitals and physicians can take many forms. Before you choose your strategy, consider the benefits and drawbacks of each. Many of America's hospitals and physicians are rushing to integrate their services through a variety of collaborative options. Their haste has been encouraged by many factors. Before hospitals and physicians react to the driving forces around them, they should carefully consider the pros and cons of four types of collaborative options: 1. management service organizations, 2. physician-hospital organizations, 3. practice acquisition models, 4. equity models. PMID:10133599
Monte Carlo simulation of a noisy quantum channel with memory.
Akhalwaya, Ismail; Moodley, Mervlyn; Petruccione, Francesco
2015-10-01
The classical capacity of quantum channels is well understood for channels with uncorrelated noise. For the case of correlated noise, however, there are still open questions. We calculate the classical capacity of a forgetful channel constructed by Markov switching between two depolarizing channels. Techniques have previously been applied to approximate the output entropy of this channel and thus its capacity. In this paper, we use a Metropolis-Hastings Monte Carlo approach to numerically calculate the entropy. The algorithm is implemented in parallel and its performance is studied and optimized. The effects of memory on the capacity are explored and previous results are confirmed to higher precision. PMID:26565361
Cruzan: On taking substituted judgment seriously.
Baron, C
1990-01-01
In its September/October 1990 issue, the Hastings Center Report published six brief essays with a short introduction by Courtney S. Campbell under the collective title of "Cruzan: clear and convincing?" These articles present a range of responses from participants, parents, constitutional scholars, and caregivers to the U.S. Supreme Court's decision in Cruzan v. Director, Missouri Department of Health (June 25, 1990). Legal scholar Charles Baron, though an advocate for patients' rights in general and the right to die in particular, argues that the Supreme Court rendered the right decision in Cruzan. PMID:2228593
Padé approximations for Painlevé I and II transcendents
NASA Astrophysics Data System (ADS)
Novokshenov, V. Yu.
2009-06-01
We use a version of the Fair-Luke algorithm to find the Padé approximate solutions of the Painlevé I and II equations. We find the distributions of poles for the well-known Ablowitz-Segur and Hastings-McLeod solutions of the Painlevé II equation. We show that the Boutroux tritronquée solution of the Painlevé I equation has poles only in the critical sector of the complex plane. The algorithm allows checking other analytic properties of the Painlevé transcendents, such as the asymptotic behavior at infinity in the complex plane.
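For readers unfamiliar with Padé approximants, the generic construction can be sketched: solve a small linear system for the denominator coefficients, then convolve to get the numerator. This is the textbook method, not the Fair-Luke algorithm the paper uses; the helper names `solve` and `pade` are ours.

```python
def solve(A, rhs):
    """Gaussian elimination with partial pivoting (dense, tiny systems)."""
    n = len(rhs)
    M = [row[:] + [r] for row, r in zip(A, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def pade(c, m, n):
    """[m/n] Pade approximant from Taylor coefficients c[0..m+n].

    Matching coefficients of x^{m+1}..x^{m+n} in c(x)*Q(x) gives a
    linear system for the denominator b (with b[0] = 1); the numerator
    a then follows by convolution.  Returns (a, b) in increasing powers.
    """
    A = [[c[m + k - j] if m + k - j >= 0 else 0.0 for j in range(1, n + 1)]
         for k in range(1, n + 1)]
    rhs = [-c[m + k] for k in range(1, n + 1)]
    b = [1.0] + solve(A, rhs)
    a = [sum(b[j] * c[i - j] for j in range(0, min(i, n) + 1))
         for i in range(m + 1)]
    return a, b
```

For the Taylor coefficients of e^x, `pade([1.0, 1.0, 0.5], 1, 1)` recovers the classical (1 + x/2)/(1 - x/2); the poles of such rational approximants are what trace out the pole distributions of the transcendents studied above.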
Rapid recipe formulation for plasma etching of new materials
NASA Astrophysics Data System (ADS)
Chopra, Meghali; Zhang, Zizhuo; Ekerdt, John; Bonnecaze, Roger T.
2016-03-01
A fast and inexpensive scheme for etch rate prediction using flexible continuum models and Bayesian statistics is demonstrated. Bulk etch rates of MgO are predicted using a steady-state model with volume-averaged plasma parameters and classical Langmuir surface kinetics. Plasma particle and surface kinetics are modeled within a global plasma framework using single-component Metropolis-Hastings methods and limited data. The accuracy of these predictions is evaluated with synthetic and experimental etch rate data for magnesium oxide in an ICP-RIE system. This approach is compared with, and shown to be superior to, factorial models generated from JMP, a software package frequently employed for recipe creation and optimization.
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2006-01-01
Microcircuits encapsulated in three plastic package styles were stored in different environments at temperatures varying from 130 C to 225 C for up to 4,000 hours in some cases. To assess the effect of oxygen, the parts were aged at high temperatures in air and in vacuum chambers. The effect of humidity was evaluated during long-term highly accelerated temperature and humidity stress testing (HAST) at temperatures of 130 C and 150 C. High temperature storage testing of decapsulated microcircuits in air, vacuum, and HAST chambers was carried out to evaluate the role of molding compounds in the environmentally-induced degradation and failure of wire bonds (WB). This paper reports on accelerating factors of environment and molding compound on WB failures. It has been shown that all environments, including oxygen, moisture, and the presence of molding compounds, reduce time-to-failure compared to unencapsulated devices in vacuum conditions. The mechanism of the environmental effect on WB degradation is discussed.
AN AFFINE-INVARIANT SAMPLER FOR EXOPLANET FITTING AND DISCOVERY IN RADIAL VELOCITY DATA
Hou Fengji; Hogg, David W.; Goodman, Jonathan; Weare, Jonathan; Schwab, Christian
2012-02-01
Markov chain Monte Carlo (MCMC) proves to be powerful for Bayesian inference and in particular for exoplanet radial velocity fitting because MCMC provides more statistical information and makes better use of data than common approaches like chi-square fitting. However, the nonlinear density functions encountered in these problems can make MCMC time-consuming. In this paper, we apply an ensemble sampler respecting affine invariance to orbital parameter extraction from radial velocity data. This new sampler has only one free parameter, and does not require much tuning for good performance, which is important for automatization. The autocorrelation time of this sampler is approximately the same for all parameters and far smaller than Metropolis-Hastings, which means it requires many fewer function calls to produce the same number of independent samples. The affine-invariant sampler speeds up MCMC by hundreds of times compared with Metropolis-Hastings in the same computing situation. This novel sampler would be ideal for projects involving large data sets such as statistical investigations of planet distribution. The biggest obstacle to ensemble samplers is the existence of multiple local optima; we present a clustering technique to deal with local optima by clustering based on the likelihood of the walkers in the ensemble. We demonstrate the effectiveness of the sampler on real radial velocity data.
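The stretch move underlying affine-invariant ensemble samplers of this kind (popularized by the emcee package) is compact enough to sketch. The helper below is our own minimal serial version, with the single free stretch-scale parameter `a` mentioned in the abstract; it returns second-half walker positions as samples.

```python
import math
import random

def stretch_sample(log_prob, walkers, n_steps=600, a=2.0, seed=3):
    """Goodman & Weare affine-invariant stretch move (minimal sketch).

    Each walker k proposes a move along the line through itself and a
    randomly chosen complementary walker j, scaled by z drawn from
    g(z) proportional to 1/sqrt(z) on [1/a, a], and accepts with
    probability min(1, z^(dim-1) * pi(prop)/pi(current)).
    """
    rng = random.Random(seed)
    walkers = [list(w) for w in walkers]
    n, dim = len(walkers), len(walkers[0])
    samples = []
    for step in range(n_steps):
        for k in range(n):
            j = rng.randrange(n - 1)
            if j >= k:
                j += 1  # complementary walker, j != k
            z = (1.0 + (a - 1.0) * rng.random()) ** 2 / a
            prop = [walkers[j][d] + z * (walkers[k][d] - walkers[j][d])
                    for d in range(dim)]
            log_r = ((dim - 1) * math.log(z)
                     + log_prob(prop) - log_prob(walkers[k]))
            if math.log(rng.random() + 1e-300) < log_r:
                walkers[k] = prop
        if step >= n_steps // 2:  # discard first half as burn-in
            samples.extend(tuple(w) for w in walkers)
    return samples
```

Because proposals are generated from the ensemble itself, the scheme is invariant under affine reparameterizations, which is why, as the abstract notes, it needs essentially no tuning.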
Inscribed matter as an energy-efficient means of communication with an extraterrestrial civilization
NASA Astrophysics Data System (ADS)
Rose, Christopher; Wright, Gregory
2004-09-01
It is well known that electromagnetic radiation (radio waves) can in principle be used to communicate over interstellar distances. By contrast, sending physical artefacts has seemed extravagantly wasteful of energy, and imagining human travel between the stars even more so. The key consideration in earlier work, however, was the perceived need for haste. If extraterrestrial civilizations existed within a few tens of light years, radio could be used for two-way communication on timescales comparable to human lifetimes (or at least the longevities of human institutions). Here we show that if haste is unimportant, sending messages inscribed on some material can be strikingly more energy efficient than communicating by electromagnetic waves. Because messages require protection from cosmic radiation and small messages could be difficult to find among the material clutter near a recipient, 'inscribed matter' is most effective for long archival messages (as opposed to potentially short "we exist" announcements). The results suggest that our initial contact with extraterrestrial civilizations may be more likely to occur through physical artefacts (essentially messages in a bottle) than via electromagnetic communication.
Fitting complex population models by combining particle filters with Markov chain Monte Carlo.
Knape, Jonas; de Valpine, Perry
2012-02-01
We show how a recent framework combining Markov chain Monte Carlo (MCMC) with particle filters (PFMCMC) may be used to estimate population state-space models. With the purpose of utilizing the strengths of each method, PFMCMC explores hidden states by particle filters, while process and observation parameters are estimated using an MCMC algorithm. PFMCMC is exemplified by analyzing time series data on a red kangaroo (Macropus rufus) population in New South Wales, Australia, using MCMC over model parameters based on an adaptive Metropolis-Hastings algorithm. We fit three population models to these data: a density-dependent logistic diffusion model with environmental variance, an unregulated stochastic exponential growth model, and a random-walk model. Bayes factors and posterior model probabilities show that there is little support for density dependence and that the random-walk model is the most parsimonious model. The particle filter Metropolis-Hastings algorithm is a brute-force method that may be used to fit a range of complex population models. Implementation is straightforward and less involved than standard MCMC for many models, and marginal densities for model selection can be obtained with little additional effort. The cost is mainly computational, resulting in long running times that may be improved by parallelizing the algorithm. PMID:22624307
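The ingredient that makes PFMCMC work is a particle-filter estimate of the likelihood, which is then plugged into the Metropolis-Hastings acceptance ratio in place of the intractable exact likelihood. A bootstrap-filter sketch for a toy log-scale logistic-growth model (our own stand-in, not the paper's kangaroo models; the names `particle_loglik`, `r`, `q`, `s` are ours) is:

```python
import math
import random

def particle_loglik(y_obs, n_particles=500, r=0.4, q=0.1, s=0.1, seed=0):
    """Bootstrap particle filter log-likelihood estimate.

    Toy state-space model on the log scale:
        process:     x' = x + r * (1 - exp(x)) + Normal(0, q)
        observation: y  = x + Normal(0, s)
    Propagate, weight by the observation density, average the weights
    (their product over time estimates the likelihood), then resample.
    """
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    loglik = 0.0
    for y in y_obs:
        # propagate particles through the process model
        parts = [x + r * (1.0 - math.exp(x)) + rng.gauss(0.0, math.sqrt(q))
                 for x in parts]
        # weight each particle by the observation density
        ws = [math.exp(-(y - x) ** 2 / (2 * s)) / math.sqrt(2 * math.pi * s)
              for x in parts]
        loglik += math.log(sum(ws) / n_particles)
        # multinomial resampling
        parts = rng.choices(parts, weights=ws, k=n_particles)
    return loglik
```

In a PFMCMC scheme, a Metropolis-Hastings step would propose new process and observation parameters and accept or reject them using this (unbiased, in likelihood rather than log-likelihood) estimate.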
Effect of oxygen pressure on glycogen synthesis by rat-liver slices
Figueroa, Enrique; Vallejos, Rodolfo; Pfeifer, Ariana; Kahler, Cecilia
1966-01-01
1. Glycogen synthesized by rat-liver slices 0.5 mm thick incubated at 1 atm oxygen pressure in Hastings medium with glucose was localized in the cells of the periphery of the slice. Cells of the interior of the slice do not synthesize glycogen. 2. Inner cells of thin slices (about 0.3 mm thick) can synthesize glycogen when such slices are incubated under the same conditions, but oxygen pressures higher than 1 atm are required if inner cells of slices 0.5 mm or more thick are to be able to synthesize glycogen. 3. Localization of newly synthesized glycogen in rat-liver slices incubated in Hastings medium with glucose does not depend on glucose concentration. 4. Calculation of the minimum oxygen pressure required to synthesize glycogen gives values between 0.09 and 0.17 atm. 5. The advantages of high oxygen pressures for the study of the synthesis of glycogen and other compounds that require ATP are discussed. PMID:5938650
Modeling the Sinoatrial Node by Cellular Automata with Irregular Topology
NASA Astrophysics Data System (ADS)
Makowiec, Danuta
The role of irregularity in intercellular connections is studied in the sinoatrial node, the heart's natural pacemaker, by modeling with Greenberg-Hastings cellular automata. The modeling is guided by findings from modern physiology of the sinoatrial node. Heterogeneity of cell connections is reproduced by a rewiring procedure applied to a square lattice. The Greenberg-Hastings rule, representing the intrinsic cellular dynamics, is modified to imitate the self-excitation of each pacemaker cell. Moreover, interactions with nearest neighbors are made heterogeneous by enhancing horizontal connections. Stationary states of the modeled system emerge as self-organized, robust oscillatory states. Since the sinoatrial node's function relies on the cyclic activity of single cells, properties of single cells are studied. It appears that the strength and diversity of cellular oscillations depend directly on the intrinsic cellular dynamics, but these oscillations also depend on the underlying topology. Moderate nonuniformity of intercellular connections is found to be vital for proper function of the sinoatrial node, namely for producing robust oscillatory states that can respond effectively to autonomic control.
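For readers unfamiliar with the Greenberg-Hastings rule, a minimal sketch of the classic three-state automaton on a square lattice follows. The rewiring, self-excitation, and anisotropic coupling described above are the paper's modifications and are not reproduced here; this is the plain nearest-neighbour rule only.

```python
import numpy as np

def greenberg_hastings_step(grid, threshold=1):
    """One synchronous update of the classic three-state Greenberg-Hastings
    rule (0 = resting, 1 = excited, 2 = refractory) on a square lattice
    with periodic boundaries."""
    excited = (grid == 1)
    # count excited von Neumann neighbours
    n_exc = (np.roll(excited, 1, 0) + np.roll(excited, -1, 0)
             + np.roll(excited, 1, 1) + np.roll(excited, -1, 1))
    new = np.zeros_like(grid)
    new[grid == 1] = 2                           # excited -> refractory
    new[grid == 2] = 0                           # refractory -> resting
    new[(grid == 0) & (n_exc >= threshold)] = 1  # resting -> excited if enough
    return new                                   # excited neighbours

rng = np.random.default_rng(1)
grid = rng.integers(0, 3, (32, 32))              # random initial condition
for _ in range(10):
    grid = greenberg_hastings_step(grid)
print(np.bincount(grid.ravel(), minlength=3))    # counts of states 0, 1, 2
```

Depending on the initial condition, the rule either dies out (all cells resting) or sustains travelling excitation waves, which is the self-organized oscillatory behaviour the paper builds on.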
Astrovirus MLB1 Is Not Associated with Diarrhea in a Cohort of Indian Children
Holtz, Lori R.; Bauer, Irma K.; Rajendran, Priya; Kang, Gagandeep; Wang, David
2011-01-01
Astroviruses are a known cause of human diarrhea. Recently the highly divergent astrovirus MLB1 (MLB1) was identified in a stool sample from a patient with diarrhea. It has subsequently been detected in stool from individuals with and without diarrhea. To determine whether MLB1 is associated with diarrhea, we conducted a case-control study of MLB1. In parallel, the prevalence of the classic human astroviruses (HAstVs) was determined in the same case-control cohort. A total of 400 cases and 400 paired controls from a longitudinal birth cohort in Vellore, India were analyzed by RT-PCR. While HAstVs were associated with diarrhea (p = 0.029) in this cohort, MLB1 was not: 14 controls and 4 cases were positive for MLB1. Furthermore, MLB1 viral load did not differ significantly between cases and controls. The role of MLB1 in human health remains unknown, and future studies are needed. PMID:22174853
Markov chain Monte Carlo methods: an introductory example
NASA Astrophysics Data System (ADS)
Klauenberg, Katy; Elster, Clemens
2016-02-01
When the Guide to the Expression of Uncertainty in Measurement (GUM) and methods from its supplements are not applicable, the Bayesian approach may be a valid and welcome alternative. Evaluating the posterior distribution, estimates or uncertainties involved in Bayesian inferences often requires numerical methods to avoid high-dimensional integrations. Markov chain Monte Carlo (MCMC) sampling is such a method: powerful, flexible and widely applied. Here, a concise introduction is given, illustrated by a simple, typical example from metrology. The Metropolis-Hastings algorithm is the most basic and yet flexible MCMC method. Its underlying concepts are explained and the algorithm is given step by step. The few lines of software code required for its implementation invite interested readers to get started. Diagnostics to evaluate the performance, and common algorithmic choices, are illustrated to calibrate the Metropolis-Hastings algorithm for efficiency. Routine application of MCMC algorithms is currently hindered by the difficulty of assessing the convergence of MCMC output and thus of assuring the validity of results. An example points to the importance of convergence and initiates discussion of advantages as well as areas of research. Available software tools are mentioned throughout.
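As the abstract notes, the Metropolis-Hastings algorithm fits in a few lines of code. A minimal random-walk sampler for a standard normal target is sketched below; the target here is a stand-in for any unnormalized posterior, not the metrology example of the article.

```python
import numpy as np

rng = np.random.default_rng(42)

def log_target(x):
    """Unnormalized log-density of the target; a standard normal,
    standing in for a posterior of interest."""
    return -0.5 * x * x

def metropolis_hastings(n_samples=20000, step=1.0, x0=0.0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and
    accept with probability min(1, p(x') / p(x))."""
    x, lp = x0, log_target(x0)
    samples = np.empty(n_samples)
    accepted = 0
    for i in range(n_samples):
        x_prop = x + step * rng.normal()
        lp_prop = log_target(x_prop)
        # symmetric proposal, so the ratio reduces to the target densities
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = x_prop, lp_prop
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples

samples, acc = metropolis_hastings()
print(samples.mean(), samples.std(), acc)
```

The sample mean and standard deviation should approach 0 and 1 respectively; monitoring the acceptance rate while varying `step` is the simplest form of the calibration the article discusses.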
Miller, Elizabeth Christina; Lin, Hsiu-Chin; Hastings, Philip A
2016-03-01
The triplefin blennies (Teleostei: Tripterygiidae) are a diverse group of small-bodied benthic fishes associated with rocky or coral reefs. The Neotropics contain four genera and 26 species, many of which have only been recently described. A recent molecular phylogeny (Lin and Hastings, 2013) contrasts with previous phylogenies based on morphology in recovering the four Neotropical genera as a single clade with respect to the Indo-Pacific genera; however, relationships within and among genera were poorly resolved. This study reports a novel topology based on an expanded seven-loci molecular dataset. Individual gene trees have poor resolution, but concatenated analyses show strong support for most nodes, likely due to emergent support from concatenation. Consistent with Lin and Hastings (2013), three of the Neotropical genera, Axoclinus, Enneanectes, and Crocodilichthys, form a well-supported clade, but relationships of the fourth (Lepidonectes) are not confidently resolved. The monophyly of Axoclinus is well supported, but Enneanectes is paraphyletic with the inclusion of Axoclinus and Crocodilichthys. Improved resolution allows for reinterpretation of the biogeography of the Neotropical Tripterygiidae. Broader taxon sampling is still necessary for resolving the relationships within Tripterygiidae globally. PMID:26718057
Frontoparietal white matter integrity predicts haptic performance in chronic stroke
Borstad, Alexandra L.; Choi, Seongjin; Schmalbrock, Petra; Nichols-Larsen, Deborah S.
2015-01-01
Frontoparietal white matter supports information transfer between brain areas involved in complex haptic tasks such as somatosensory discrimination. The purpose of this study was to gain an understanding of the relationship between the microstructural integrity of frontoparietal network white matter and haptic performance in persons with chronic stroke, and to compare frontoparietal network integrity in participants with stroke and age-matched control participants. Nineteen individuals with stroke and 16 controls participated. Haptic performance was quantified using the Hand Active Sensation Test (HASTe), an 18-item match-to-sample test of weight and texture discrimination. Three-tesla MRI was used to obtain diffusion-weighted and high-resolution anatomical images of the whole brain. Probabilistic tractography was used to define 10 frontoparietal tracts in total: four intrahemispheric tracts measured bilaterally, 1) thalamus to primary somatosensory cortex (T-S1), 2) thalamus to primary motor cortex (T-M1), 3) primary to secondary somatosensory cortex (S1 to SII), and 4) primary somatosensory cortex to middle frontal gyrus (S1 to MFG), and two interhemispheric tracts, S1-S1 and precuneus interhemispheric. A control tract outside the network, the cuneus interhemispheric tract, was also examined. The diffusion metrics fractional anisotropy (FA), mean diffusivity (MD), and axial (AD) and radial diffusivity (RD) were quantified for each tract. Diminished FA and elevated MD values are associated with poorer white matter integrity in chronic stroke. Nine of the 10 tracts quantified in the frontoparietal network had diminished structural integrity poststroke compared to the controls. The precuneus interhemispheric tract was not significantly different between groups. Principal component analysis across all frontoparietal white matter tract MD values indicated that a single factor explained 47% and 57% of the variance in tract mean diffusivity in the stroke and control groups, respectively. Age
Iacono, Antonio Dello; Eliakim, Alon; Meckel, Yoav
2015-03-01
The present study was designed to compare the effects of high-intensity intermittent training (HIIT) and small-sided games (SSGs) training on fitness variables of elite handball players. Eighteen highly trained players (mean age ± SD: 25.6 ± 0.5 years) were assigned to either the HIIT or the SSGs training protocol, performed twice per week for 8 weeks. The HIIT consisted of 12-24 × 15-second high-intensity runs interspersed with 15 seconds of recovery. The SSGs training consisted of 3-against-3 small-sided handball games. Both training methods were matched for exercise duration and recovery at each training session. Before and after the 8 weeks of training, the following fitness variables were assessed: speed (10- and 20-m sprint time), agility (handball agility specific test, HAST), upper-body strength (1 repetition maximum (1RM) bench press test), lower-limb power (counter-movement jump tests with (CMJarm) and without (CMJ) arm movement), and aerobic fitness (yo-yo intermittent recovery test level 1, YYIRTL1). Significant improvement was found in the YYIRTL1 (23.3 and 26.3%, respectively), 10-m sprint (2.3 and 4.1%, respectively), 20-m sprint (2.1 and 4%, respectively), HAST (1.1 and 2.2%, respectively), 1RM bench press (6.8 and 12.3%, respectively), CMJ (7.4 and 10.8%, respectively), and CMJarm (6.4 and 8.9%, respectively) following training in both groups (p ≤ 0.05 for all). There was a significantly greater improvement in the 10- and 20-m sprint, HAST, 1RM, CMJ, and CMJarm following the SSGs training compared with the HIIT (p ≤ 0.05 for all). These results indicate that both HIIT and SSGs are effective training methods for fitness development among elite adult handball players. However, SSGs training may be considered the preferred training regimen for improving handball-specific fitness variables during the in-season period. PMID:25226326
NASA Astrophysics Data System (ADS)
Wanas, H. A.; Sallam, E.; Zobaa, M. K.; Li, X.
2015-11-01
This study documents the depositional facies, sequence-stratigraphic and paleoclimatic characteristics of the Mid-Eocene (Bartonian) continental succession exposed at Gebel El-Goza El-Hamra (Shabrawet Area, NE Eastern Desert, Egypt). The studied succession consists of siliciclastic rocks followed upward by carbonate rocks. Detailed field observation and petrographic investigation indicate accumulation in floodplain-dominated alluvial and shallow lacustrine systems. The floodplain-dominated alluvial facies (45 m thick) is composed mainly of carbonate-nodule-bearing, mottled mudrock with subordinate sandstone and conglomerate beds. The conglomerate and pebbly sandstone bodies are interpreted as ephemeral braided-channel deposits. The massive, laminated, planar cross-bedded, fine- to medium-grained sandstone bodies interlayered within mudstone reflect sheet-flood deposits. The mudrocks associated with paleosols represent distal floodplain deposits. The shallow lacustrine facies (15 m thick) is made up of an alternation of marlstone, micritic limestone, dolostone and mudrock beds with charophytes and small gastropods. Both the alluvial and lacustrine facies show evidence of macro- and micro-pedogenic features. Pollen assemblages, stable δ¹⁸O and δ¹³C isotopes, and paleopedogenic features reflect the prevalence of arid to semi-arid climatic conditions during the Bartonian. The sequence-stratigraphic framework shows an overall fining-upward depositional sequence, consisting of Low- and High-accommodation Systems Tracts (LAST, HAST) and bounded by two sequence boundaries (SB-1, SB-2). Conglomerate and pebbly sandstone deposits (braided-channel and sheet-flood deposits) of the lower part of the alluvial facies reflect a LAST. Mudrock and silty claystone facies (distal floodplain deposits) of the upper part of the alluvial facies and the overlying lacustrine facies correspond to a HAST. The LAST, HAST and SB were formed during different accommodation-to-sediment supply (A
NASA Astrophysics Data System (ADS)
Dong, Kyung-Rae; Goo, Eun-Hoe; Lee, Jae-Seung; Chung, Woon-Kwan
2013-01-01
A consecutive series of 50 patients (28 males and 22 females) who underwent hepatic magnetic resonance imaging (MRI) from August to December 2011 were enrolled in this study. The appropriate parameters for abdominal MRI scans were determined by comparing images (TE = 90 and 128 msec) produced using the half-Fourier acquisition single-shot turbo spin-echo (HASTE) technique at different signal acquisition times. The patients comprised 15 normal patients, 25 patients with a hepatoma and 10 patients with a hemangioma. The TE for a given patient was set to either 90 msec or 128 msec. This was followed by measurements using the four normal rendering methods of the biliary tract system and of the background signal intensity, using the maximal-signal-intensity technique, in the liver, spleen, pancreas, gallbladder, fat, muscles and hemangioma. The signal-to-noise and contrast-to-noise ratios were obtained. The image quality was assessed subjectively, and the results were compared. The signal-to-noise and contrast-to-noise ratios were significantly higher at TE = 128 msec than at TE = 90 msec for the liver, spleen, pancreas, gallbladder, fat and muscles, hepatocellular carcinomas and hemangiomas, and for rendering of the hepatobiliary tract system based on the maximum-signal-intensity technique (p < 0.05). In addition, the presence of artifacts, the image clarity and the overall image quality were rated excellent at TE = 128 msec (p < 0.05). In abdominal MRI, the breath-hold HASTE technique was found to be effective in depicting the abdominal organs at TE = 128 msec. The image quality at TE = 128 msec was better than that at TE = 90 msec due to improved signal-to-noise (SNR) and contrast-to-noise (CNR) ratios. Overall, the HASTE technique for abdominal MRI at a high magnetic field (3.0 T) with a TE of 128 msec can provide useful data.
DNA Microarray for Detection of Gastrointestinal Viruses
Martínez, Miguel A.; Soto-del Río, María de los Dolores; Gutiérrez, Rosa María; Chiu, Charles Y.; Greninger, Alexander L.; Contreras, Juan Francisco; López, Susana; Arias, Carlos F.
2014-01-01
Gastroenteritis is a clinical illness of humans and other animals that is characterized by vomiting and diarrhea and caused by a variety of pathogens, including viruses. An increasing number of viral species have been associated with gastroenteritis or have been found in stool samples as new molecular tools have been developed. In this work, a DNA microarray capable in theory of parallel detection of more than 100 viral species was developed and tested. Initial validation was done with 10 different virus species, and an additional 5 species were validated using clinical samples. Detection limits of 1 × 10³ virus particles of Human adenovirus C (HAdV), Human astrovirus (HAstV), and group A Rotavirus (RV-A) were established. Furthermore, when exogenous RNA was added, the limit for RV-A detection decreased by one log. In a small group of clinical samples from children with gastroenteritis (n = 76), the microarray detected at least one viral species in 92% of the samples. Single infection was identified in 63 samples (83%), and coinfection with more than one virus was identified in 7 samples (9%). The most abundant virus species were RV-A (58%), followed by Anellovirus (15.8%), HAstV (6.6%), HAdV (5.3%), Norwalk virus (6.6%), Human enterovirus (HEV) (9.2%), Human parechovirus (1.3%), Sapporo virus (1.3%), and Human bocavirus (1.3%). To further test the specificity and sensitivity of the microarray, the results were verified by reverse transcription-PCR (RT-PCR) detection of 5 gastrointestinal viruses. The RT-PCR assay detected a virus in 59 samples (78%). The microarray showed good performance for detection of RV-A, HAstV, and calicivirus, while the sensitivity for HAdV and HEV was low. Furthermore, some discrepancies in detection of mixed infections were observed and were addressed by reverse transcription-quantitative PCR (RT-qPCR) of the viruses involved. It was observed that differences in the amount of genetic material favored the detection of the most abundant
Hastings, Matthew B
2009-01-01
We show how to combine the light-cone and matrix product algorithms to simulate quantum systems far from equilibrium for long times. For the case of the XXZ spin chain at Δ = 0.5, we simulate to a time of approximately 22.5. While part of the long simulation time is due to the use of the light-cone method, we also describe a modification of the infinite time-evolving block decimation algorithm with improved numerical stability, and we describe how to incorporate symmetry into this algorithm. While statistical sampling error means that we are not yet able to make a definite statement, the behavior of the simulation at long times indicates the appearance of either 'revivals' in the order parameter as predicted by Hastings and Levitov (e-print arXiv:0806.4283) or of a distinct shoulder in the decay of the order parameter.
A Markov-Chain Monte-Carlo Based Method for Flaw Detection in Beams
Glaser, R E; Lee, C L; Nitao, J J; Hickling, T L; Hanley, W G
2006-09-28
A Bayesian inference methodology using a Markov Chain Monte Carlo (MCMC) sampling procedure is presented for estimating the parameters of computational structural models. This methodology combines prior information, measured data, and forward models to produce a posterior distribution for the system parameters of structural models that is most consistent with all available data. The MCMC procedure is based upon a Metropolis-Hastings algorithm that is shown to function effectively with noisy data, incomplete data sets, and mismatched computational nodes/measurement points. A series of numerical test cases based upon a cantilever beam is presented. The results demonstrate that the algorithm is able to estimate model parameters utilizing experimental data for the nodal displacements resulting from specified forces.
Link, W.A.; Barker, R.J.
2005-01-01
We present a hierarchical extension of the Cormack-Jolly-Seber (CJS) model for open population capture-recapture data. In addition to recaptures of marked animals, we model first captures of animals and losses on capture. The parameter set includes capture probabilities, survival rates, and birth rates. The survival rates and birth rates are treated as a random sample from a bivariate distribution; thus the model explicitly incorporates correlation in these demographic rates. A key feature of the model is that the likelihood function, which includes a CJS model factor, is expressed entirely in terms of identifiable parameters; losses on capture can be factored out of the model. Since the computational complexity of classical likelihood methods is prohibitive, we use Markov chain Monte Carlo in a Bayesian analysis. We describe an efficient candidate-generation scheme for Metropolis-Hastings sampling of CJS models and extensions. The procedure is illustrated using mark-recapture data for the moth Gonodontis bidentata.
Large-eddy simulation of flow around an airfoil on a structured mesh
NASA Technical Reports Server (NTRS)
Kaltenbach, Hans-Jakob; Choi, Haecheon
1995-01-01
The diversity of flow characteristics encountered in a flow over an airfoil near maximum lift taxes the presently available statistical turbulence models. This work describes our first attempt to apply the technique of large-eddy simulation to a flow of aeronautical interest. The challenge for this simulation comes from the high Reynolds number of the flow as well as the variety of flow regimes encountered, including a thin laminar boundary layer at the nose, transition, boundary layer growth under adverse pressure gradient, incipient separation near the trailing edge, and merging of two shear layers at the trailing edge. The flow configuration chosen is a NACA 4412 airfoil near maximum lift. The corresponding angle of attack was determined independently by Wadcock (1987) and Hastings & Williams (1984, 1987) to be close to 12 deg. The simulation matches the chord Reynolds number U∞c/ν = 1.64 × 10⁶ of Wadcock's experiment.
Generalized Dynamic Factor Models for Mixed-Measurement Time Series
Cui, Kai; Dunson, David B.
2013-01-01
In this article, we propose generalized Bayesian dynamic factor models for jointly modeling mixed-measurement time series. The framework allows mixed-scale measurements associated with each time series, with different measurements having different distributions in the exponential family conditionally on time-varying latent factor(s). Efficient Bayesian computational algorithms are developed for posterior inference on both the latent factors and model parameters, based on a Metropolis-Hastings algorithm with adaptive proposals. The algorithm relies on a Greedy Density Kernel Approximation (GDKA) and parameter expansion with latent factor normalization. We tested the framework and algorithms in simulated studies and applied them to the analysis of intertwined credit and recovery risk for Moody's rated firms from 1982-2008, illustrating the importance of jointly modeling mixed-measurement time series. The article has supplemental materials available online. PMID:24791133
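A Metropolis-Hastings proposal can be made adaptive by tuning its scale toward a target acceptance rate as sampling proceeds. The sketch below uses a simple diminishing-adaptation (Robbins-Monro style) scheme on a one-dimensional toy target; it illustrates the general idea of adaptive proposals only, not the specific GDKA-based algorithm of the article.

```python
import numpy as np

rng = np.random.default_rng(7)

def adaptive_mh(log_target, n_iter=5000, target_acc=0.44):
    """Random-walk Metropolis-Hastings whose step size is nudged up after
    each acceptance and down after each rejection, steering the chain
    toward a target acceptance rate. The 1/sqrt(i+1) factor makes the
    adaptation diminish over time."""
    x, lp, log_step = 0.0, log_target(0.0), 0.0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        x_prop = x + np.exp(log_step) * rng.normal()
        lp_prop = log_target(x_prop)
        accept = np.log(rng.uniform()) < lp_prop - lp
        if accept:
            x, lp = x_prop, lp_prop
        # diminishing adaptation: large corrections early, tiny ones later
        log_step += (float(accept) - target_acc) / np.sqrt(i + 1)
        samples[i] = x
    return samples, np.exp(log_step)

# toy target: a normal with standard deviation 3
samples, step = adaptive_mh(lambda z: -0.5 * (z / 3.0) ** 2)
print(samples[2000:].std(), step)
```

The target acceptance rate of 0.44 is the standard choice for one-dimensional random-walk proposals; in higher dimensions a rate near 0.234 is usually targeted instead.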
Optimal timing for managed relocation of species faced with climate change
McDonald Madden, Eve; Runge, Michael C.; Possingham, Hugh P.; Martin, Tara G.
2011-01-01
Managed relocation is a controversial climate-adaptation strategy to combat negative climate change impacts on biodiversity. While the scientific community debates the merits of managed relocation, species are already being moved to new areas predicted to be more suitable under climate change. To inform these moves, we construct a quantitative decision framework to evaluate the timing of relocation in the face of climate change. We find that the optimal timing depends on many factors, including the size of the population, the demographic costs of translocation and the expected carrying capacities over time in the source and destination habitats. In some settings, such as when a small population would benefit from time to grow before risking translocation losses, haste is ill advised. We also find that active adaptive management is valuable when the effect of climate change on source habitat is uncertain, and leads to delayed movement.
31. WEST TO PARTS AND TOOLS LOCATED DIRECTLY OPPOSITE FROM ...
31. WEST TO PARTS AND TOOLS LOCATED DIRECTLY OPPOSITE FROM THE BLACKSMITH SHOP AREA IN THE NORTHEAST QUADRANT OF THE FACTORY. ON THE FLOOR AT THE LEFT SIDE IS A MANUAL PIPE THREADER FOR LARGE-DIAMETER PIPE (AS DROP PIPE IN WELLS FOR WATER SYSTEMS). BENEATH THE BENCH ARE UNMACHINED NEW OLD STOCK MAIN CASTINGS FOR ELI WINDMILLS, TOGETHER WITH A USED MAIN SHAFT/WHEEL HUB/CRANK PLATE ASSEMBLY WITH 1920S-1930S OIL RESERVOIR FROM ELI WINDMILL. THE CIRCULAR CASTING WITH CRESCENT-SHAPED PATTERNS IS A PORTION OF THE CAM MECHANISM FROM A 'WESTERN GEARED GEARLESS' WINDMILL MADE BY THE WESTERN LAND ROLLER CO., HASTINGS, NEB. TO THE RIGHT ON THE BENCH IS A GEARED TIRE BENDER USED TO GIVE CURVATURE TO WHEEL RIMS OF ELI WINDMILLS. IN THE BACKGROUND ARE ... - Kregel Windmill Company Factory, 1416 Central Avenue, Nebraska City, Otoe County, NE
Tail decay for the distribution of the endpoint of a directed polymer
NASA Astrophysics Data System (ADS)
Bothner, Thomas; Liechty, Karl
2013-05-01
We obtain an asymptotic expansion for the tails of the random variable T = argmax_{u ∈ R} (A_2(u) - u^2), where A_2 is the Airy_2 process. Using the formula of Schehr (2012 J. Stat. Phys. 149 385) that connects the density function of T to the Hastings-McLeod solution of the second Painlevé equation, we prove that as t → ∞, P(|T| > t) = C e^{-(4/3)φ(t)} t^{-145/32} (1 + O(t^{-3/4})), where φ(t) = t^3 - 2t^{3/2} + 3t^{3/4}, and the constant C is given explicitly.
Modeling interdependent animal movement in continuous time.
Niu, Mu; Blackwell, Paul G; Skarin, Anna
2016-06-01
This article presents a new approach to modeling group animal movement in continuous time. The movement of a group of animals is modeled as a multivariate Ornstein-Uhlenbeck diffusion process in a high-dimensional space. Each individual of the group is attracted to a leading point which is generally unobserved, and the movement of the leading point is also an Ornstein-Uhlenbeck process attracted to an unknown attractor. The Ornstein-Uhlenbeck bridge is applied to reconstruct the location of the leading point. All movement parameters are estimated using Markov chain Monte Carlo sampling, specifically a Metropolis-Hastings algorithm. We apply the method to a small group of simultaneously tracked reindeer, Rangifer tarandus tarandus, showing that the method detects dependency in movement between individuals. PMID:26812666
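The hierarchical structure described above (individuals attracted to an unobserved leading point, which is itself attracted to an attractor) can be simulated with a simple Euler-Maruyama scheme. All rates, noise levels, and the attractor location below are illustrative assumptions, and the inference step (OU bridge plus Metropolis-Hastings) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_group(n_ind=4, n_steps=400, dt=0.1,
                   beta_ind=1.0, beta_lead=0.3, sigma=0.5,
                   attractor=np.array([10.0, 5.0])):
    """Euler-Maruyama sketch of a hierarchical Ornstein-Uhlenbeck model:
    each individual drifts toward an unobserved leading point, and the
    leading point drifts toward a fixed attractor."""
    lead = np.zeros(2)                          # leading point starts at origin
    x = rng.normal(0.0, 1.0, (n_ind, 2))        # initial individual positions
    traj = np.empty((n_steps, n_ind, 2))
    for t in range(n_steps):
        # OU drift of the leading point toward the attractor, plus noise
        lead = lead + beta_lead * (attractor - lead) * dt \
               + sigma * np.sqrt(dt) * rng.normal(size=2)
        # OU drift of each individual toward the leading point, plus noise
        x = x + beta_ind * (lead - x) * dt \
            + sigma * np.sqrt(dt) * rng.normal(size=(n_ind, 2))
        traj[t] = x
    return traj

traj = simulate_group()
print(traj[-1].mean(axis=0))   # group centroid drifts toward the attractor
```

Because every individual shares the same leading point, the simulated tracks are correlated; detecting exactly this kind of dependency from real tracks is what the paper's estimation procedure is for.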
Setting mental health priorities: problems and possibilities.
Callahan, D
1994-01-01
A recent project at the Hastings Center examined the question of priority setting in the provision of mental health services. A central issue was whether those services should be prioritized independently of other health services. The answer to that question was no: they should have full parity. Even so, priority setting can be a complex venture. At the heart of any such effort will be the relationship between empirical evidence on treatment outcomes and efficacy and the political and ethical interests that legitimately bear on interpreting and using that evidence. An argument is made that priority should be given to those whose suffering and inability to function in ordinary life are most pronounced, even if the available treatment for them is comparatively less efficacious than for other conditions. PMID:7935242
Reproductive technology: in Britain, the debate after the Warnock Report.
Gillon, Raanan
1987-06-01
Gillon contributes an article on Great Britain to the Hastings Center Report series on reproductive technologies outside the United States. In 1984 the Warnock Committee's report represented the first attempt by a national government to formulate a policy on reproductive issues such as artificial insemination, in vitro fertilization, surrogate mothers, and research on human embryos. Reaction to the Warnock report has focused on its recommendations to ban commercial surrogacy and to allow experimentation on embryos up to 14 days after fertilization. Legislation on surrogacy was passed in 1985, while bills banning embryo research failed in 1986. A 1986 government consultation paper called for discussion of other aspects of the Warnock report, including its recommendation that a statutory licensing authority to regulate reproductive technologies be established. Gillon predicts that no new legislation will be enacted under the present government. PMID:11644023
Lessons not yet learned from the Fukushima disaster
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2014-05-01
The Fukushima nuclear catastrophe has led to a widespread international discussion on how seismic and tsunami hazards can be better predicted and adverse consequences prevented. In some countries the event led to a complete phase-out of nuclear energy. The lessons drawn by different organisations, including earth scientists, earthquake engineers, and non-governmental and governmental organisations, are reviewed from an independent position. The review covers the following areas: 1) hazard assessment, 2) engineering design and defense-in-depth concepts, and 3) emergency preparedness. It is shown that not all important lessons from the catastrophe have been drawn. In particular, the need for a holistic approach to hazard assessment, and the implementation of defense-in-depth and diversity-of-design principles for critical infrastructures such as nuclear power plants, has to be more strongly emphasized to prevent similar disasters.
Is there life after Roe v. Wade?
Mahowald, M B
1989-01-01
Mahowald's article is one of three in this issue of the Hastings Center Report under the overall title of "Abortion: searching for common ground." The articles were occasioned by the impending U.S. Supreme Court decision in Webster v. Reproductive Health Services (decided 3 Jul 1989), which was widely regarded as the Court's reconsideration of Roe v. Wade (1973). The debate in the United States over abortion has become intransigent and polarized since Roe, and the HCR articles by Mahowald, M. Glendon, and N. Rhoden represent an effort to find areas of agreement between advocates on both sides of the abortion question. Mahowald identifies several points of convergence and proposes modifications to Roe that might better accommodate competing interests of woman, fetus, and society. PMID:2663777
Rhoden, N K
1989-01-01
Rhoden's article is one of three on "Abortion: searching for common ground" in this issue of the Hastings Center Report. Her article, together with those by M. Mahowald and M. Glendon, was prompted by the expectation that the impending U.S. Supreme Court decision in Webster v. Reproductive Health Services (3 July 1989) would overturn or restrict Roe v. Wade (1973). Rhoden, an advocate for the pro-choice position, asks whether a compromise leading to an acceptable regulatory policy is possible or desirable among those on opposite sides of the abortion issue. She identifies several reasons why the Roe decision is vulnerable to review, but argues that effective education about sexuality and comprehensive social support of women are better approaches to abortion than restrictive legislation. PMID:2663778
Coping and positive perceptions in Irish mothers of children with intellectual disabilities.
Greer, Felicity A; Grey, Ian M; McClean, Brian
2006-09-01
Thirty-six mothers of children aged between 5 and 8 years with intellectual disabilities completed five self-report questionnaires measuring variables related to behavioural and emotional difficulties, levels of care demand, family supports, coping and positive perceptions. The relationships among these variables were investigated using a working model proposed by Hastings and Taunt (2002). Child behavioural and emotional problems in the non-clinical range predicted low levels of care demand. Formal social support was an effective form of support for mothers; helpfulness of formal social support predicted mobilizing the family to acquire and accept help in the community; and mobilizing the family predicted levels of strength and family closeness. The majority of respondents rated agreement with statements that their child was: a source of happiness or fulfilment; a source of strength and family closeness; and a source of personal growth and maturity. The theoretical and clinical implications of these results are discussed. PMID:16916848
Constraints on topological order in mott insulators.
Zaletel, Michael P; Vishwanath, Ashvin
2015-02-20
We point out certain symmetry induced constraints on topological order in Mott insulators (quantum magnets with an odd number of spin 1/2 moments per unit cell). We show, for example, that the double-semion topological order is incompatible with time reversal and translation symmetry in Mott insulators. This sharpens the Hastings-Oshikawa-Lieb-Schultz-Mattis theorem for 2D quantum magnets, which guarantees that a fully symmetric gapped Mott insulator must be topologically ordered, but is silent about which topological order is permitted. Our result applies to the kagome lattice quantum antiferromagnet, where recent numerical calculations of the entanglement entropy indicate a ground state compatible with either toric code or double-semion topological order. Our result rules out the latter possibility. PMID:25763971
Molecular dynamics simulations of field emission from a planar nanodiode
Torfason, Kristinn; Valfells, Agust; Manolescu, Andrei
2015-03-15
High resolution molecular dynamics simulations with full Coulomb interactions of electrons are used to investigate field emission in planar nanodiodes. The effects of space-charge and emitter radius are examined and compared to previous results concerning the transition from Fowler-Nordheim to Child-Langmuir current [Y. Y. Lau, Y. Liu, and R. K. Parker, Phys. Plasmas 1, 2082 (1994) and Y. Feng and J. P. Verboncoeur, Phys. Plasmas 13, 073105 (2006)]. The Fowler-Nordheim law is used to determine the current density injected into the system, and the Metropolis-Hastings algorithm to find a favourable point of emission on the emitter surface. A simple fluid-like model is also developed and its results are in qualitative agreement with the simulations.
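The elementary Fowler-Nordheim law used above to set the injected current density can be written down compactly. The sketch below uses the textbook form without image-charge corrections; the constants and function name are illustrative, not drawn from the paper:

```python
import math

# Standard elementary Fowler-Nordheim constants (no image-charge correction);
# values are textbook figures, not taken from the paper.
A_FN = 1.541434e-6   # A eV V^-2
B_FN = 6.830890e9    # eV^(-3/2) V m^-1

def fn_current_density(field, work_function):
    """Emitted current density J (A/m^2) for a local surface field
    `field` (V/m) and `work_function` (eV), using
    J = (A/phi) * F^2 * exp(-B * phi^(3/2) / F)."""
    F, phi = field, work_function
    return (A_FN / phi) * F ** 2 * math.exp(-B_FN * phi ** 1.5 / F)
```

The strong exponential dependence on the local field is what makes emitter geometry (and hence the choice of emission point) so important in such simulations.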
MRI in the assessment of pregnancy related intrauterine bleeding: a valuable adjunct to ultrasound?
Verswijvel, G; Grieten, M; Gyselaers, W; Van Holsbeke, C; Vandevenne, J; Horvath, M; Gelin, G; Palmers, Y
2002-01-01
MR imaging using ultrafast MR sequences is a useful method in assessing pregnancies at risk. This is especially the case for fetal imaging. However, reports of imaging of the placenta or the uterus are rare. We report the MR findings in 8 pregnant patients with vaginal blood loss in whom the obstetrical ultrasound was equivocal. MR imaging was performed with a 1.5 T magnet and consisted of T2- (HASTE), fat-suppressed gradient echo T1- and gradient echo T2-weighted images. Adequate anatomical visualisation of the uterus, the placental tissue and the intrauterine bleeding, irrespective of size and location of the latter, was obtained in all cases. PMID:12403387
Efficient estimation of decay parameters in acoustically coupled-spaces using slice sampling.
Jasa, Tomislav; Xiang, Ning
2009-09-01
Room-acoustic energy decay analysis of acoustically coupled-spaces within the Bayesian framework has proven valuable for architectural acoustics applications. This paper describes an efficient algorithm termed slice sampling Monte Carlo (SSMC) for room-acoustic decay parameter estimation within the Bayesian framework. This work combines the SSMC algorithm and a fast search algorithm in order to efficiently determine decay parameters, their uncertainties, and inter-relationships with a minimum amount of required user tuning and interaction. The large variations in the posterior probability density functions over multidimensional parameter spaces imply that an adaptive exploration algorithm such as SSMC can have advantages over the existing importance sampling Monte Carlo and Metropolis-Hastings Markov Chain Monte Carlo algorithms. This paper discusses implementation of the SSMC algorithm, its initialization, and convergence using experimental data measured from acoustically coupled-spaces. PMID:19739741
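For readers unfamiliar with the slice sampling step at the core of SSMC, a minimal univariate version with step-out bracketing can be sketched as follows; this is a generic illustration, not the authors' room-acoustic implementation, and all names are illustrative:

```python
import math
import random

def slice_sample(log_f, x0, w=1.0, n=1000):
    """Univariate slice sampler with 'step-out' bracketing.

    log_f is the unnormalized log-density; w is the initial bracket width.
    """
    samples = []
    x = x0
    for _ in range(n):
        # Auxiliary height drawn uniformly under the density (in log space).
        log_y = log_f(x) + math.log(1.0 - random.random())
        # Step out until the bracket covers the slice {x: log_f(x) > log_y}.
        left = x - w * random.random()
        right = left + w
        while log_f(left) > log_y:
            left -= w
        while log_f(right) > log_y:
            right += w
        # Sample uniformly from the bracket, shrinking it on rejection.
        while True:
            x_new = random.uniform(left, right)
            if log_f(x_new) > log_y:
                x = x_new
                break
            if x_new < x:
                left = x_new
            else:
                right = x_new
        samples.append(x)
    return samples
```

Unlike Metropolis-Hastings, the bracket width w is self-adjusting through the step-out and shrinkage loops, which is the "minimum user tuning" property the abstract highlights.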
Hastings, M. B.
2009-09-15
We show how to combine the light-cone and matrix product algorithms to simulate quantum systems far from equilibrium for long times. For the case of the XXZ spin chain at Δ=0.5, we simulate to a time of ≈22.5. While part of the long simulation time is due to the use of the light-cone method, we also describe a modification of the infinite time-evolving block decimation algorithm with improved numerical stability, and we describe how to incorporate symmetry into this algorithm. While statistical sampling error means that we are not yet able to make a definite statement, the behavior of the simulation at long times indicates the appearance of either 'revivals' in the order parameter as predicted by Hastings and Levitov (e-print arXiv:0806.4283) or of a distinct shoulder in the decay of the order parameter.
Kroening, Sharon E.; Lee, Kathy E.; Goldstein, R.M.
2003-01-01
Most sites had pronounced seasonal variations in dissolved nitrite plus nitrate nitrogen and dissolved ammonia nitrogen concentrations. At most sites, dissolved nitrite plus nitrate nitrogen concentrations were greatest in the winter and spring and least during the summer and fall. In contrast, the greatest dissolved nitrite plus nitrate nitrogen concentrations in the Little Cobb River near Beauford, Minnesota; Minnesota River near Jordan, Minnesota; and Mississippi River at Hastings and Red Wing, Minnesota occurred during the spring and summer. These seasonal variations in dissolved nitrite plus nitrate nitrogen concentrations may be the result of nitrogen cycling in the soils, as well as crop uptake and hydrologic conditions. The greatest concentrations of dissolved ammonia nitrogen at all sites occurred in the winter and spring. The maximum contaminant level for nitrate of 10 milligrams per liter (mg/L) as nitrogen set by the U.S. Environmental Protection Agency (USEPA) for drinking water was exceeded in 20 percent of the
Application of Thermo-Mechanical Measurements of Plastic Packages for Reliability Evaluation of PEMS
NASA Technical Reports Server (NTRS)
Sharma, Ashok K.; Teverovsky, Alexander
2004-01-01
Thermo-mechanical analysis (TMA) is typically employed for measurements of the glass transition temperature (Tg) and coefficients of thermal expansion (CTE) in molding compounds used in plastic encapsulated microcircuits (PEMs). Application of TMA measurements directly to PEMs allows anomalies to be revealed in deformation of packages with temperature, and thus indicates possible reliability concerns related to thermo-mechanical integrity and stability of the devices. In this work, temperature dependencies of package deformation were measured in several types of PEMs that failed environmental stress testing including temperature cycling, highly accelerated stress testing (HAST) in humid environments, and burn-in (BI) testing. Comparison of thermo-mechanical characteristics of packages and molding compounds in the failed parts allowed for explanation of the observed failures. The results indicate that TMA of plastic packages might be used for quality evaluation of PEMs intended for high-reliability applications.
Phase-Change Modelling in Severe Nuclear Accidents
NASA Astrophysics Data System (ADS)
Pain, Christopher; Pavlidis, Dimitrios; Xie, Zhihua; Percival, James; Gomes, Jefferson; Matar, Omar; Moatamedi, Moji; Tehrani, Ali; Jones, Alan; Smith, Paul
2014-11-01
This paper describes progress on a consistent approach for multi-phase flow modelling with phase-change. Although the developed methods are general purpose, the applications presented here cover core melt phenomena at the lower vessel head. These include corium pool formation, coolability and solidification. With respect to external cooling, comparison with the LIVE experiments (from Karlsruhe) is undertaken. Preliminary re-flooding simulation results are also presented. These include water injection into porous media (debris bed) and boiling. Numerical simulations follow IRSN's PEARL experimental programme on quenching/re-flooding. The authors wish to thank Prof. Timothy Haste of IRSN. Dr. D. Pavlidis is funded by EPSRC Consortium ``Computational Modelling for Advanced Nuclear Plants,'' Grant Number EP/I003010/1.
Bokma, Folmer
2008-11-01
Algorithms are presented to simultaneously estimate probabilities of speciation and extinction, rates of anagenetic and cladogenetic phenotypic evolution, as well as ancestral character states, from a complete ultrametric species-level phylogeny with dates assigned to all bifurcations and one or more phenotypes in three or more extant species, using Metropolis-Hastings Markov Chain Monte Carlo sampling. The algorithms also estimate missing phenotypes of extant species and numbers of speciation events that occurred on all branches of the phylogeny. The algorithms are discussed and their performance is evaluated using simulated data. That evaluation shows that precise estimation of rates of evolution of one or a few phenotypes requires large phylogenies. Estimation accuracy improves with the number of species on the phylogeny. PMID:18752617
Bayesian inference of subglacial topography using mass conservation
NASA Astrophysics Data System (ADS)
Brinkerhoff, Douglas; Aschwanden, Andy; Truffer, Martin
2016-02-01
We develop a Bayesian model for estimating ice thickness given sparse observations coupled with estimates of surface mass balance, surface elevation change, and surface velocity. These fields are related through mass conservation. We use the Metropolis-Hastings algorithm to sample from the posterior probability distribution of ice thickness for three cases: a synthetic mountain glacier, Storglaciären, and Jakobshavn Isbræ. Use of continuity in interpolation improves thickness estimates where relative velocity and surface mass balance errors are small, a condition difficult to maintain in regions of slow flow and surface mass balance near zero. Estimates of thickness uncertainty depend sensitively on spatial correlation. When this structure is known, we suggest a thickness measurement spacing of one to two times the correlation length to take best advantage of continuity based interpolation techniques. To determine ideal measurement spacing, the structure of spatial correlation must be better quantified.
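The Metropolis-Hastings sampler referred to above is the standard random-walk form; a minimal generic sketch follows (not the authors' ice-thickness implementation; the target density, step size, and names here are illustrative):

```python
import math
import random

def metropolis_hastings(log_post, x0, step=1.0, n=5000):
    """Generic random-walk Metropolis-Hastings sampler.

    Proposes x' ~ N(x, step^2) and accepts with probability
    min(1, post(x') / post(x)); works entirely in log space.
    """
    x = x0
    chain = []
    for _ in range(n):
        xp = x + random.gauss(0.0, step)
        log_ratio = log_post(xp) - log_post(x)
        if log_ratio >= 0 or random.random() < math.exp(log_ratio):
            x = xp  # accept the proposal
        chain.append(x)
    return chain
```

For a standard normal target one would pass log_post(x) = -x**2/2; in the paper's setting the target is instead the multivariate posterior over gridded ice thickness constrained by mass conservation.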
A joint model for boundaries of multiple anatomical parts
NASA Astrophysics Data System (ADS)
Kerr, Grégoire; Kurtek, Sebastian; Srivastava, Anuj
2011-03-01
The use of joint shape analysis of multiple anatomical parts is a promising area of research with applications in medical diagnostics, growth evaluations, and disease characterizations. In this paper, we consider several features (shapes, orientations, scales, and locations) associated with anatomical parts and develop probability models that capture interactions between these features and across objects. The shape component is based on elastic shape analysis of continuous boundary curves. The proposed model is a second order model that considers principal coefficients in tangent spaces of joint manifolds as multivariate normal random variables. Additionally, it models interactions across objects using area-interaction processes. Using given observations of four anatomical parts: caudate, hippocampus, putamen and thalamus, on one side of the brain, we first estimate the model parameters and then generate random samples from them using the Metropolis-Hastings algorithm. The plausibility of these random samples validates the proposed models.
NASA Astrophysics Data System (ADS)
Williams, Gwyn P.; Revesz, Peter; Arp, Uwe
2014-03-01
Conference Chairs: Gwyn Williams (Jefferson Lab), Peter Revesz (Cornell High Energy Synchrotron Source), Uwe Arp (Synchrotron Ultraviolet Radiation Facility). Programme Committee: Alastair MacDowell (Advanced Light Source), Tom Toellner (Advanced Photon Source), Amitava D. Roy (Center for Advanced Microstructures and Devices), Tom Ellis (Canadian Light Source), Roberta Santarosa (Laboratório Nacional de Luz Síncrotron), Jerry (Jerome) Hastings (Linac Coherent Light Source), Steven Hulbert (National Synchrotron Light Source), Thomas A. Rabedeau (Stanford Synchrotron Radiation Lightsource), Mark Bissen (Synchrotron Radiation Center), Gwyn Williams (Jefferson Lab), Peter Revesz (Cornell High Energy Synchrotron Source), Uwe Arp (Synchrotron Ultraviolet Radiation Facility).
Reproductive technology: in the Netherlands, tolerance and debate.
De Wachter, Maurice A M; De Wert, Guido MWR
1987-06-01
Two ethicists from the Netherlands' Institute for Bioethics file a report on their country in one of six Hastings Center Report articles on the status of reproductive technologies around the world. The situation in the Netherlands reflects the tolerant attitudes of the Dutch toward what are regarded as private matters. Artificial insemination, in vitro fertilization, and surrogate motherhood are available, and research on embryos is in the planning stages. Facilities offering reproductive services are regulated by the Minister of Health, with advice from the independent Health Council on Artificial Reproduction, the National Council for Public Health, and various insurance companies and professional medical organizations. Public policy debates center around such issues as the value of parenthood; involvement of third parties; secrecy about a child's genetic origins; privacy for semen, ovum, and embryo donors; access to services; and insurance coverage of treatment. PMID:11644022
Small Commercial Program DOE Project: Impact evaluation. Final report
Bathgate, R.; Faust, S.
1992-08-12
In 1991, Washington Electric Cooperative (WEC) implemented a Department of Energy grant to conduct a small commercial energy conservation project. The small commercial "Mom and Pop" grocery stores within WEC's service territory were selected as the target market for the project. Energy & Solid Waste Consultant's (E&SWC) Impact Evaluation is documented here. The evaluation was based on data gathered from a variety of sources, including load profile metering, kWh submeters, elapsed time indicators, and billing histories. Five stores were selected to receive measures under this program: Waits River General Store, Joe's Pond Store, Hastings Store, Walden General Store, and Adamant Cooperative. Specific measures installed in each store and description of each are included.
Quantitative K-Theory Related to Spin Chern Numbers
NASA Astrophysics Data System (ADS)
Loring, Terry A.
2014-07-01
We examine the various indices defined on pairs of almost commuting unitary matrices that can detect pairs that are far from commuting pairs. We do this in two symmetry classes, that of general unitary matrices and that of self-dual matrices, with an emphasis on quantitative results. We determine which values of the norm of the commutator guarantee that the indices are defined, where they are equal, and what quantitative results on the distance to a pair with a different index are possible. We validate a method of computing spin Chern numbers that was developed with Hastings and only conjectured to be correct. Specifically, the Pfaffian-Bott index can be computed by the "log method" for commutator norms up to a specific constant.
Tang, An-Min; Tang, Nian-Sheng
2015-02-28
We propose a semiparametric multivariate skew-normal joint model for multivariate longitudinal and multivariate survival data. One main feature of the posited model is that we relax the commonly used normality assumption for random effects and within-subject error by using a centered Dirichlet process prior to specify the random effects distribution and using a multivariate skew-normal distribution to specify the within-subject error distribution and model trajectory functions of longitudinal responses semiparametrically. A Bayesian approach is proposed to simultaneously obtain Bayesian estimates of unknown parameters, random effects and nonparametric functions by combining the Gibbs sampler and the Metropolis-Hastings algorithm. Particularly, a Bayesian local influence approach is developed to assess the effect of minor perturbations to within-subject measurement error and random effects. Several simulation studies and an example are presented to illustrate the proposed methodologies. PMID:25404574
Understanding the agreements and controversies surrounding childhood psychopharmacology
Parens, Erik; Johnston, Josephine
2008-01-01
The number of children in the US taking prescription drugs for emotional and behavioral disturbances is growing dramatically. This growth in the use of psychotropic drugs in pediatric populations has given rise to multiple controversies, ranging from concerns over off-label use and long-term safety to debates about the societal value and cultural meaning of pharmacological treatment of childhood behavioral and emotional disorders. This commentary summarizes the authors' eight main findings from the first of five workshops that seek to understand and produce descriptions of these controversies. The workshop series is convened by The Hastings Center, a bioethics research institute located in Garrison, New York, U.S.A. PMID:18261228
A Bayesian Approach to Learning Scoring Systems.
Ertekin, Şeyda; Rudin, Cynthia
2015-12-01
We present a Bayesian method for building scoring systems, which are linear models with coefficients that have very few significant digits. Usually the construction of scoring systems involves manual effort: humans invent the full scoring system without using data, or they choose how logistic regression coefficients should be scaled and rounded to produce a scoring system. These kinds of heuristics lead to suboptimal solutions. Our approach is different in that humans need only specify the prior over what the coefficients should look like, and the scoring system is learned from data. For this approach, we provide a Metropolis-Hastings sampler that tends to pull the coefficient values toward their "natural scale." Empirically, the proposed method achieves a high degree of interpretability of the models while maintaining competitive generalization performance. PMID:27441407
NASA Astrophysics Data System (ADS)
Jenke, Peter Alexander
2009-01-01
Occultation is a technique that enables image reconstruction and source identification with a non-imaging detector. Such an approach is well suited for a future survey mission in nuclear astrophysics. In particular, the Lunar Occultation Technique (LOT) utilizes the Moon as an occulting object and is the basis of a new gamma-ray survey mission concept, the Lunar OCcultation Observer (LOCO). Techniques utilizing the LOT to detect spatially extended emission, from the Galactic plane or Galactic Center region, have been developed. Given knowledge of detector position in lunar orbit, combined with lunar ephemeris and relevant coordinate transformations, occultation time series can be used to reconstruct skymaps of these extended Galactic emitters. Markov chain Monte Carlo (MCMC) methods, incorporating the Metropolis-Hastings algorithm for parametric model testing, form the basis of the technique. Performance of the imaging methodology, and its application to nuclear astrophysics will be presented.
2016-05-01
For the last six months or so, some of us at The Hastings Center have been participating in a kind of short-term book group. Together we have been thinking about the contribution of moral psychology to bioethics. One of our questions is whether bioethics' understanding of moral values should draw on what moral psychology tells us about moral values. Bioethics tends to look to philosophy for guidance. Can it learn from insights in moral psychology into the biological, environmental, and cultural influences on morality? The question can be taken in many directions. One that I've wrestled with has to do with debates about genetic engineering, where a common concern is that genetic alteration of other organisms, and maybe also of humans, doesn't sit well with the kind of relationship that people want to have to nature. PMID:27150424
Searching for massive black hole binaries in the first Mock LISA Data Challenge
NASA Astrophysics Data System (ADS)
Cornish, Neil J.; Porter, Edward K.
2007-10-01
The Mock LISA Data Challenge is a worldwide effort to solve the LISA data analysis problem. We present here our results for the massive black hole binary (BBH) section of round 1. Our results cover challenge 1.2.1, where the coalescence of the binary is seen, and challenge 1.2.2, where the coalescence occurs after the simulated observational period. The data stream is composed of Gaussian instrumental noise plus an unknown BBH waveform. Our search algorithm is based on a variant of the Markov chain Monte Carlo method that uses Metropolis-Hastings sampling and thermostated frequency annealing. We present results from the training data sets where we know the parameter values a priori and the blind data sets where we were informed of the parameter values after the challenge had finished. We demonstrate that our algorithm is able to rapidly locate the sources, accurately recover the source parameters and provide error estimates for the recovered parameters.
An algorithm for the detection of extreme mass ratio inspirals in LISA data
NASA Astrophysics Data System (ADS)
Babak, Stanislav; Gair, Jonathan R.; Porter, Edward K.
2009-07-01
The gravitational wave signal from a compact object inspiralling into a massive black hole (MBH) is considered to be one of the most difficult sources to detect in the LISA data stream. Due to the large parameter space of possible signals and many orbital cycles spent in the sensitivity band of LISA, it has been estimated that ~10^35 templates would be required to carry out a fully coherent search using a template grid, which is computationally impossible. Here we describe an algorithm based on a constrained Metropolis-Hastings stochastic search which allows us to find and accurately estimate parameters of isolated EMRI signals buried in Gaussian instrumental noise. We illustrate the effectiveness of the algorithm with results from searches of the Mock LISA Data Challenge round 1B data sets.
NASA Astrophysics Data System (ADS)
Gair, Jonathan R.; Porter, Edward K.
2009-11-01
We describe a hybrid evolutionary algorithm that can simultaneously search for multiple supermassive black hole binary (SMBHB) inspirals in LISA data. The algorithm mixes evolutionary computation, Metropolis-Hastings methods and Nested Sampling. The inspiral of SMBHBs presents an interesting problem for gravitational wave data analysis since, due to the LISA response function, the sources have a bi-modal sky solution. We show here that it is possible not only to detect multiple SMBHBs in the data stream, but also to investigate simultaneously all the various modes of the global solution. In all cases, the algorithm returns parameter determinations within 5σ (as estimated from the Fisher matrix) of the true answer, for both the actual and antipodal sky solutions.
Intensity coding in two-dimensional excitable neural networks
NASA Astrophysics Data System (ADS)
Copelli, Mauro; Kinouchi, Osame
2005-04-01
In the light of recent experimental findings that gap junctions are essential for low level intensity detection in the sensory periphery, the Greenberg-Hastings cellular automaton is employed to model the response of a two-dimensional sensory network to external stimuli. We show that excitable elements (sensory neurons) with a small dynamical range give rise to a large collective dynamical range. Therefore the network transfer (gain) function (which is Hill or Stevens law-like) is an emergent property generated from a pool of small dynamical range cells, providing a basis for a "neural psychophysics". The growth of the dynamical range with the system size is approximately logarithmic, suggesting a functional role for electrical coupling. For a fixed number of neurons, the dynamical range displays a maximum as a function of the refractory period, which suggests experimental tests for the model. A biological application to ephaptic interactions in olfactory nerve fascicles is proposed.
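The Greenberg-Hastings cellular automaton used in this model follows a simple excitable-medium rule (quiescent, excited, refractory). A minimal 2D sketch, with lattice boundary conditions, neighbourhood, and stimulus handling chosen for illustration rather than taken from the paper:

```python
import random

def gh_step(grid, p_ext=0.0, n_states=3):
    """One synchronous update of a Greenberg-Hastings automaton on a
    periodic 2D lattice. States: 0 = quiescent, 1 = excited,
    2..n_states-1 = refractory."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            s = grid[i][j]
            if s == 0:
                # A quiescent cell fires if a von Neumann neighbour is
                # excited, or spontaneously with probability p_ext
                # (modelling the external stimulus intensity).
                neighbours = (grid[(i - 1) % rows][j], grid[(i + 1) % rows][j],
                              grid[i][(j - 1) % cols], grid[i][(j + 1) % cols])
                if 1 in neighbours or random.random() < p_ext:
                    new[i][j] = 1
            else:
                # Excited/refractory states advance, then relax to quiescent.
                new[i][j] = (s + 1) % n_states
    return new
```

Sweeping p_ext and recording the mean firing rate over many steps yields the network transfer (gain) function discussed in the abstract; the refractory period corresponds to n_states - 2 update steps.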
Meat and its place in the diet.
MacDonald, H B
1991-01-01
Canadians are becoming increasingly aware of the importance of nutrition in their long-term health prospects. With this increased awareness, however, has come an abundance of misconceptions including the notion that meat is "bad" for you. In their haste to avoid saturated fat, physicians and the public alike have lost sight of the fact that lean meat in reasonable serving sizes poses no threat to health and is an extremely important source of many nutrients. The mistaken notion that only animal fats are saturated has resulted in a change in the source of fat but not the quantity. Health professionals must work together to educate the public about the many nutrient-dense, low-fat food choices available in a well-balanced diet. PMID:1768992
Madelin, Guillaume; Grucker, Daniel; Franconi, Jean-Michel; Thiaudiere, Eric
2006-07-01
In this study, magnetic resonance imaging (MRI) is used to visualize acoustic streaming in liquids. A single-shot spin echo sequence (HASTE) with a saturation band perpendicular to the acoustic beam permits the acquisition of an instantaneous image of the flow due to the application of ultrasound. An average acoustic streaming velocity can be estimated from the MR images, from which the ultrasonic absorption coefficient and the bulk viscosity of different glycerol-water mixtures can be deduced. In the same way, this MRI method could be used to assess the acoustic field and time-average power of ultrasonic transducers in water (or other liquids with known physical properties), after calibration of a geometrical parameter that is dependent on the experimental setup. PMID:16650447
Bayesian Estimation of Latently-grouped Parameters in Undirected Graphical Models
Liu, Jie; Page, David
2014-01-01
In large-scale applications of undirected graphical models, such as social networks and biological networks, similar patterns occur frequently and give rise to similar parameters. In this situation, it is beneficial to group the parameters for more efficient learning. We show that even when the grouping is unknown, we can infer these parameter groups during learning via a Bayesian approach. We impose a Dirichlet process prior on the parameters. Posterior inference usually involves calculating intractable terms, and we propose two approximation algorithms, namely a Metropolis-Hastings algorithm with auxiliary variables and a Gibbs sampling algorithm with “stripped” Beta approximation (Gibbs_SBA). Simulations show that both algorithms outperform conventional maximum likelihood estimation (MLE). Gibbs_SBA’s performance is close to Gibbs sampling with exact likelihood calculation. Models learned with Gibbs_SBA also generalize better than the models learned by MLE on real-world Senate voting data. PMID:25404848
Spain: from the decree to the proposal.
Gracia, Diego
1987-06-01
This is one in a series of four country reports published together in the Hastings Center Report. Gracia, a bioethicist, compares health care policy before and after Franco's dictatorship. Under Franco, compulsory health insurance was enacted, and modern hospitals were built at the expense of primary services. Patient care was governed by the principle of beneficence "in its extreme and paternalistic sense." Medicine in the democratic post-Franco period has reflected changes in Spanish society as political freedom has led to an increased moral pluralism and the formation of public policy through debate and compromise. Gracia identifies three bioethical issues where changes in attitudes and policies have been the greatest: resource allocation, abortion, and organ transplantation. He concludes his report by briefly describing the role bioethics plays in public policy formation in Spain today. PMID:11644028
Doug Cathro
2010-06-30
The Lake Charles CCS Project is a large-scale industrial carbon capture and sequestration (CCS) project which will demonstrate advanced technologies that capture and sequester carbon dioxide (CO2) emissions from industrial sources into underground formations. Specifically, the Lake Charles CCS Project will accelerate commercialization of large-scale CO2 storage from industrial sources by leveraging synergy between a proposed petroleum coke to chemicals plant (the LCC Gasification Project) and the largest integrated anthropogenic CO2 capture, transport, and monitored sequestration program in the U.S. Gulf Coast Region. The Lake Charles CCS Project will promote the expansion of EOR in Texas and Louisiana and supply greater energy security by expanding domestic energy supplies. The capture, compression, pipeline, injection, and monitoring infrastructure will continue to sequester CO2 for many years after the completion of the term of the DOE agreement. The objectives of this project are expected to be fulfilled by working through two distinct phases. The overall objective of Phase 1 was to develop a fully definitive project basis for a competitive Renewal Application process to proceed into Phase 2 - Design, Construction and Operations. Phase 1 includes the studies attached hereto that will establish: the engineering design basis for the capture, compression and transportation of CO2 from the LCC Gasification Project, and the criteria and specifications for a monitoring, verification and accounting (MVA) plan at the Hastings oil field in Texas. The overall objective of Phase 2, provided a successful competitive down-selection, is to execute design, construction and operations of three capital projects: (1) the CO2 capture and compression equipment, (2) a Connector Pipeline from the LCC Gasification Project to the Green Pipeline owned by Denbury and an affiliate of Denbury, and (3) a comprehensive MVA system at the Hastings oil field.
Mitui, Marcelo Takahiro; Bozdayi, Gulendam; Ahmed, Selim; Matsumoto, Takashi; Nishizono, Akira; Ahmed, Kamruddin
2014-07-01
The incidence and mortality caused by diarrhea differ among countries. The prevalence of different enteric viruses, their molecular characteristics, and infections with multiple viruses might affect the disease incidence and mortality caused by diarrhea. The objective of this study was to determine the distribution and molecular characteristics of enteric viruses in children with diarrhea in Turkey and Bangladesh. A total of 288 stool samples that were negative for group A rotavirus were collected from children aged <5 years with acute diarrhea who presented to hospitals in Turkey and Bangladesh. The samples were screened for human bocavirus (HBoV), astrovirus (HAstV), norovirus (NoV), and adenovirus (AdV). Phylogenetic analyses of the targeted virus genes were performed. In Turkey, viruses were detected in 87/150 samples (58%), which included 69 (79.3%) with single viruses and 18 (20.7%) with multiple viruses. AdV was the most common virus, followed by HBoV. In Bangladesh, viruses were detected in 123/138 samples (89.1%), which included 29 (23.6%) with single viruses and 94 (76.4%) with multiple viruses. NoV GII was the most common, followed by AdV. The dominant genotypes among the virus species were HBoV 2A, HAstV 1, NoV GI type 1, and AdV 40. For NoV GII, the Hunter variant of genotype 4 in Turkey and genotype 17 in Bangladesh were the most common among the sequenced strains. It was concluded that the distribution of the viruses associated with diarrhea in Turkish and Bangladeshi children was different. Enteric viruses and mixed infections were more prevalent in Bangladesh than in Turkey. PMID:24105741
TU-F-BRF-06: 3D Pancreas MRI Segmentation Using Dictionary Learning and Manifold Clustering
Gou, S; Rapacchi, S; Hu, P; Sheng, K
2014-06-15
Purpose: The recent advent of MRI-guided radiotherapy machines has provided an exciting platform for soft tissue target localization during treatment. However, tools to efficiently utilize MRI images for this purpose have not been developed. Specifically, to efficiently quantify organ motion, we developed an automated segmentation method using dictionary learning and manifold clustering (DLMC). Methods: Fast 3D HASTE and VIBE MR images of 2 healthy volunteers and 3 patients were acquired. A bounding box was defined to include the pancreas and surrounding normal organs, including the liver, duodenum and stomach. The first slice of the MRI was used for dictionary learning based on mean-shift clustering and K-SVD sparse representation. Subsequent images were iteratively reconstructed until the error was less than a preset threshold. The preliminary segmentation was subject to the constraints of manifold clustering. The segmentation results were compared with the mean shift merging (MSM), level set (LS) and manual segmentation methods. Results: DLMC resulted in consistently higher accuracy and robustness than the comparison methods. Using manual contours as the ground truth, the mean Dice indices for all subjects were 0.54, 0.56 and 0.67 for MSM, LS and DLMC, respectively, based on the HASTE images. The mean Dice indices were 0.70, 0.77 and 0.79 for the three methods based on the VIBE images. DLMC is clearly more robust on the patients with diseased pancreases, while LS and MSM tend to over-segment the pancreas. DLMC also achieved higher sensitivity (0.80) and specificity (0.99) combining both imaging techniques. LS achieved equivalent sensitivity on VIBE images but was computationally less efficient. Conclusion: We showed that the pancreas and surrounding normal organs can be reliably segmented from fast MRI using DLMC. This method will facilitate both planning volume definition and imaging guidance during treatment.
Asteroid orbital inversion using a virtual-observation Markov-chain Monte Carlo method
NASA Astrophysics Data System (ADS)
Muinonen, Karri; Granvik, Mikael; Oszkiewicz, Dagmara; Pieniluoma, Tuomo; Pentikäinen, Hanna
2012-12-01
A novel virtual-observation Markov-chain Monte Carlo method (MCMC) is presented for the asteroid orbital inverse problem posed by small to moderate numbers of astrometric observations. In the method, the orbital-element proposal probability density is chosen to mimic the convolution of the a posteriori density by itself: first, random errors are simulated for each observation, resulting in a set of virtual observations; second, least-squares orbital elements are derived for the virtual observations using the Nelder-Mead downhill simplex method; third, repeating the procedure gives a difference between two sets of what can be called virtual least-squares elements; and, fourth, the difference obtained constitutes a symmetric proposal in a random-walk Metropolis-Hastings algorithm, avoiding the explicit computation of the proposal density. In practice, the proposals are based on a large number of pre-computed sets of orbital elements. Virtual-observation MCMC is thus based on the characterization of the phase-space volume of solutions before the actual MCMC sampling. Virtual-observation MCMC is compared to MCMC orbital ranging, a random-walk Metropolis-Hastings algorithm based on sampling with the help of Cartesian positions at two observation dates, in the case of the near-Earth asteroid (85640) 1998 OX4. In the present preliminary comparison, the methods yield similar results for a 9.1-day observational time interval extracted from the full current astrometry of the asteroid. In the future, both of the methods are to be applied to the astrometric observations of the Gaia mission.
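The random-walk Metropolis-Hastings step at the core of both methods can be sketched in simplified one-dimensional form. This is a toy illustration with a stand-in Gaussian target, not the orbital-element posterior; the key property it shows is that a symmetric proposal cancels from the acceptance ratio, which is exactly what lets the virtual-observation scheme avoid computing the proposal density explicitly:

```python
import math
import random

def metropolis_hastings(log_target, x0, step, n_samples, rng):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.

    Because x' = x + step * eps with eps ~ N(0, 1) is symmetric in (x, x'),
    the proposal density cancels and the acceptance probability reduces to
    min(1, target(x') / target(x)).
    """
    x = x0
    samples = []
    for _ in range(n_samples):
        x_new = x + step * rng.gauss(0.0, 1.0)
        # Accept/reject on the log scale for numerical stability.
        if math.log(rng.random()) < log_target(x_new) - log_target(x):
            x = x_new
        samples.append(x)
    return samples

# Toy target: a standard normal "posterior" (log density up to a constant).
rng = random.Random(42)
chain = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 2.0, 20000, rng)
mean = sum(chain) / len(chain)
var = sum((s - mean) ** 2 for s in chain) / len(chain)
```

The chain's sample mean and variance should approach the target's 0 and 1 as the number of samples grows.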
NASA Astrophysics Data System (ADS)
Phillips, G.; Robinson, J.; Glen, R.; Roberts, J.
2016-05-01
The middle to late Permian Hunter Bowen Event is credited with the development of orogenic curvature in the southern New England Orogen, yet contention surrounds the structural dynamics responsible for the development of this curvature. Debate largely centres on the roles of orogen-parallel strike-slip and orogen-normal extension and contraction in explaining the development of curvature. To evaluate the dynamic history of the Hunter Bowen Event, we present new kinematic reconstructions of the Tamworth Belt. The Tamworth Belt formed as a Carboniferous forearc basin and was subsequently inverted during the Hunter Bowen Event. The kinematic reconstructions are based on new maps and cross-sections built from a synthesis of the best available mapping, chronostratigraphic data and new interpretations of depth-converted seismic data. The following conclusions are drawn from our study: (i) the Hunter Bowen Event was dominantly driven by margin-normal contraction (east-west shortening; present-day coordinates), and (ii) variations in structural style along the strike of the Tamworth Belt can be explained by orthogonal vs. oblique inversion, which reflects the angular relationship between the principal shortening vector and the continental-arc margin. Given these conclusions, we suggest that curvature around the controversial Manning Bend was influenced by the presence of primary curvature in the continental margin, and that the Hastings Block was translated along a sinistral strike-slip fault system that formed along this oblique (with respect to the regional east-west extension and convergence direction) part of the margin. Given the available temporal data, the translation of the Hastings Block took place in the Early Permian (Asselian) and therefore preceded the Hunter Bowen Event. Accordingly, we suggest that the Hunter Bowen Event dominantly enhanced curvature that was either primary in origin or associated with fault-block translation.
NASA Technical Reports Server (NTRS)
Sharma, Ashok K.; Teverovsky, Alexander; Dowdy, Terry W.; Hamilton, Brett
2000-01-01
A major reliability issue for all advanced nonvolatile memory (NVM) technology devices including FRAMs (Ferroelectric random access memories) is the data retention characteristics over extended period of time, under environmental stresses and exposure to total ionizing dose (TID) radiation effects. For this testing, 256 Kb FRAMs in 28-pin plastic DIPS, rated for industrial grade temperature range of -40 C to +85 C, were procured. These are two-transistor, two-capacitor (2T-2C) design FRAMs. In addition to data retention characteristics, the parts were also evaluated for imprint failures, which are defined as the failure of cells to change from a "preferred" state, where it has been for a significant period of time to an opposite state (e.g., from 1 to 0, or 0 to 1). These 256 K FRAMs were subjected to scanning acoustic microscopy (C-SAM); 1,000 temperature cycles from -65 C to +150 C; high temperature aging at 150 C, 175 C, and 200 C for 1,000 hours; highly accelerated stress test (HAST) for 500 hours; 1,000 hours of operational life test at 125 C; and total ionizing dose radiation testing. As a preconditioning, 10 K read/write cycles were performed on all devices. Interim electrical measurements were performed throughout this characterization, including special imprint testing and final electrical testing. Some failures were observed during high temperature aging test at 200 C, during HAST testing, and during 1,000 hours of operational life at 125 C. The parts passed 10 Krad exposure, but began showing power supply current increases during the dose increment from 10 Krad to 30 Krad, and at 40 Krad severe data retention and parametric failures were observed. Failures from various environmental group testing are currently being analyzed.
Diversity in the Enteric Viruses Detected in Outbreaks of Gastroenteritis from Mumbai, Western India
Chitambar, Shobha; Gopalkrishna, Varanasi; Chhabra, Preeti; Patil, Pooja; Verma, Harsha; Lahon, Anismrita; Arora, Ritu; Tatte, Vaishali; Ranshing, Sujata; Dhale, Ganesh; Kolhapure, Rajendra; Tikute, Sanjay; Kulkarni, Jagannath; Bhardwaj, Renu; Akarte, Sulbha; Pawar, Sashikant
2012-01-01
Faecal specimens collected from two outbreaks of acute gastroenteritis that occurred in southern Mumbai, India in March and October, 2006 were tested for seven different enteric viruses. Among the 218 specimens tested, 95 (43.6%) were positive, 73 (76.8%) for a single virus and 22 (23.2%) for multiple viruses. Single viral infections in both, March and October showed predominance of enterovirus (EV, 33.3% and 40%) and rotavirus A (RVA, 33.3% and 25%). The other viruses detected in these months were norovirus (NoV, 12.1% and 10%), rotavirus B (RVB, 12.1% and 10%), enteric adenovirus (AdV, 6.1% and 7.5%), Aichivirus (AiV, 3% and 7.5%) and human astrovirus (HAstV, 3% and 0%). Mixed viral infections were largely represented by two viruses (84.6% and 88.9%), a small proportion showed presence of three (7.7% and 11%) and four (7.7% and 0%) viruses in the two outbreaks. Genotyping of the viruses revealed predominance of RVA G2P[4], RVB G2 (Indian Bangladeshi lineage), NoV GII.4, AdV-40, HAstV-8 and AiV B types. VP1/2A junction region based genotyping showed presence of 11 different serotypes of EVs. Although no virus was detected in the tested water samples, examination of both water and sewage pipelines in gastroenteritis affected localities indicated leakages and possibility of contamination of drinking water with sewage water. Coexistence of multiple enteric viruses during the two outbreaks of gastroenteritis emphasizes the need to expand such investigations to other parts of India. PMID:22690171
Cieszanowski, Andrzej; Lisowska, Antonina; Dabrowska, Marta; Korczynski, Piotr; Zukowska, Malgorzata; Grudzinski, Ireneusz P.; Pacho, Ryszard; Rowinski, Olgierd; Krenke, Rafal
2016-01-01
Objective: The aims of this study were to assess the sensitivity of various magnetic resonance imaging (MRI) sequences for the diagnosis of pulmonary nodules and to estimate the accuracy of MRI for the measurement of lesion size, as compared to computed tomography (CT). Methods: Fifty patients with 113 pulmonary nodules diagnosed by CT underwent lung MRI and CT. MRI studies were performed on a 1.5T scanner using the following sequences: T2-TSE, T2-SPIR, T2-STIR, T2-HASTE, T1-VIBE, and T1-out-of-phase. CT and MRI data were analyzed independently by two radiologists. Results: The overall sensitivity of MRI for the detection of pulmonary nodules was 80.5%, and by nodule size: 57.1% for nodules ≤4 mm, 75% for nodules >4-6 mm, 87.5% for nodules >6-8 mm and 100% for nodules >8 mm. The MRI sequences yielded the following sensitivities: 69% (T1-VIBE), 54.9% (T2-SPIR), 48.7% (T2-TSE), 48.7% (T1-out-of-phase), 45.1% (T2-STIR) and 25.7% (T2-HASTE). There was very strong agreement between the maximum diameter of pulmonary nodules measured by CT and MRI (mean difference -0.02 mm; 95% CI: -1.6 to 1.57 mm; Bland-Altman analysis). Conclusions: MRI yielded high sensitivity for the detection of pulmonary nodules and enabled accurate assessment of their diameter. Therefore, it may be considered an alternative to CT for follow-up of some lung lesions. However, due to a significant number of false positive diagnoses, it is not ready to replace CT as a tool for lung nodule detection. PMID:27258047
Exponential Decay of Correlations Implies Area Law
NASA Astrophysics Data System (ADS)
Brandão, Fernando G. S. L.; Horodecki, Michał
2015-01-01
We prove that a finite correlation length, i.e., exponential decay of correlations, implies an area law for the entanglement entropy of quantum states defined on a line. The entropy bound is exponential in the correlation length of the state, thus reproducing as a particular case Hastings's proof of an area law for ground states of 1D gapped Hamiltonians. As a consequence, we show that 1D quantum states with exponential decay of correlations have an efficient classical approximate description as a matrix product state of polynomial bond dimension, thus giving an equivalence between injective matrix product states and states with a finite correlation length. The result can be seen as a rigorous justification, in one dimension, of the intuition that states with exponential decay of correlations, usually associated with non-critical phases of matter, are simple to describe. It also has implications for quantum computing: it shows that unless a pure state quantum computation involves states with long-range correlations, decaying at most algebraically with the distance, it can be efficiently simulated classically. The proof relies on several previous tools from quantum information theory—including entanglement distillation protocols achieving the hashing bound, properties of single-shot smooth entropies, and the quantum substate theorem—and also on some newly developed ones. In particular we derive a new bound on correlations established by local random measurements, and we give a generalization to the max-entropy of a result of Hastings concerning the saturation of mutual information in multiparticle systems. The proof can also be interpreted as providing a limitation on the phenomenon of data hiding in quantum states.
Constraining Moment Deficit Rate on Crustal Faults from Geodetic Data
NASA Astrophysics Data System (ADS)
Maurer, J.; Bradley, A. M.; Segall, P.
2014-12-01
Constraining moment deficit rates on crustal faults using geodetic data is currently an under-utilized but powerful method for estimating the potential seismic hazard presented by crustal faults. Two previous approaches to moment-bounding, bootstrapping and Metropolis-Hastings sampling, can both fail catastrophically when estimating the probability distribution of moment given data, p(Mo|d). Straightforward application of traditional Metropolis-Hastings sampling with uniform prior probabilities on slip leads to a mesh-dependent estimate of moment with a variance inversely related to the number of model elements. Moment thus estimated exhibits an "effective prior" on p(Mo) that tends toward a delta function halfway between the bounds as the fault discretization becomes finer! Thus, it is incorrect to estimate the uncertainty in moment directly from the uncertainty in slip. Bootstrapping can produce optimistic bounds and give biased results. A third approach is functional moment bounding (FMB), which obtains bounds on moment by minimizing the data misfit over slip for all possible values of Mo and accepting only those values with a total misfit less than some threshold. We present a modified version of this method that creates a probability distribution function on Mo from the misfit and uses this pdf to obtain confidence bounds. We also present a fourth method that we term Probabilistic Moment Bounding (PMB), which we derive within a Bayesian framework and which incorporates a smoothed slip prior. Both of these approaches produce conservative results and do not exhibit mesh dependence. We compare the results from FMB and PMB to those obtained from the other methods and assess the results.
Characterization and adhesion measurement of ceramic-coated nickel and titanium alloys
NASA Astrophysics Data System (ADS)
Gruss, Kimberly Ann
Chemically inert ceramic coatings are currently being investigated to extend the lifetime of metallic components operating in severe environments. As part of this effort, the characterization and adhesion measurement of zirconium nitride and silicon carbide coatings deposited on two nickel alloys and one titanium alloy were conducted. Polycrystalline ZrN and amorphous Si0.57C0.43 coatings were deposited by cathodic arc evaporation and by PACVD, respectively, on Incoloy 825 (Inc.), Hastelloy C22 (Hast.) and Titanium Grade 12 (Ti.) metal substrates. Analysis of the ZrN coatings by scanning electron microscopy and Auger electron spectroscopy (AES) revealed the presence of 1-8 μm diameter macroparticles composed of zirconium metal. Residual stress analyses were performed on the ZrN coatings via XRD using the sin²Ψ method. Compressive stresses of 4.06 GPa, 3.88 GPa and 2.69 GPa were found in the ZrN coatings deposited on the Inc., Hast. and Ti. substrates, respectively. Residual stresses in the Si0.57C0.43 coatings were estimated from reports in the literature. Nanoindentation testing was employed to assess the Young's modulus and hardness of the coatings. The Young's modulus and hardness of the ZrN coatings were 458 GPa and 27.65 GPa, respectively, while the corresponding values for the Si0.57C0.43 coatings were 212.15 GPa and 21.97 GPa. X-ray photoelectron spectroscopy was employed to measure the coating composition. The ZrN coatings were composed of 58.41 at.% Zr and 41.59 at.% N. The composition of the Si0.57C0.43 coatings was 57.29 at.% Si and 42.18 at.% C. Studies of the interfacial chemistry via Auger electron spectroscopy and transmission electron microscopy revealed chemically abrupt interfaces. In addition, there was good compositional uniformity throughout the thickness of both the ZrN and Si0.57C0.43 coatings. Scratch tests were employed to assess the critical load for interfacial failure and
NASA Astrophysics Data System (ADS)
Buckman, S.; Nutman, A.
2013-12-01
(sinistral) displacement of the Port Macquarie and Hastings Blocks and the dextral displacement of the Coffs Harbour Block associated with the Texas orocline is only apparent, due more to vertical displacements of an extensive, thin-skinned oceanic terrane that underlies the Tablelands Complex than to extensive lateral movements. Thus, there is no need to invoke large-scale 'oroclinal' folding or significant sinistral faulting to explain the repetition of the Hastings and Port Macquarie blocks in the southern New England.
Modeling and simulation of cascading contingencies
NASA Astrophysics Data System (ADS)
Zhang, Jianfeng
This dissertation proposes a new approach to model and study cascading contingencies in large power systems. The most important contribution of the work involves the development and validation of a heuristic analytic model to assess the likelihood of cascading contingencies, and the development and validation of a uniform search strategy. We model the probability of cascading contingencies as a function of power flow and power flow changes. Utilizing logistic regression, the proposed model is calibrated using real industry data. This dissertation analyzes random search strategies for Monte Carlo simulations and proposes a new uniform search strategy based on the Metropolis-Hastings Algorithm. The proposed search strategy is capable of selecting the most significant cascading contingencies, and it is capable of constructing an unbiased estimator to provide a measure of system security. This dissertation makes it possible to reasonably quantify system security and justify security operations when economic concerns conflict with reliability concerns in the new competitive power market environment. It can also provide guidance to system operators about actions that may be taken to reduce the risk of major system blackouts. Various applications can be developed to take advantage of the quantitative security measures provided in this dissertation.
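The logistic model of cascading probability described above can be illustrated with a minimal sketch. The functional form (a logistic function of line loading and post-contingency flow change) follows the description; the coefficients and input values here are invented for illustration and are not the dissertation's calibrated parameters:

```python
import math

def cascade_probability(loading, delta_loading, b0, b1, b2):
    """Logistic model of the kind described: the probability that a
    component trips in a cascade, as a function of its power flow
    (loading, per unit) and the change in flow after a prior outage.
    Coefficients b0, b1, b2 are hypothetical, not calibrated values."""
    z = b0 + b1 * loading + b2 * delta_loading
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative coefficients: heavily loaded lines and large
# post-contingency flow jumps are assumed more likely to trip.
p_light = cascade_probability(0.40, 0.05, -6.0, 4.0, 10.0)
p_heavy = cascade_probability(0.95, 0.30, -6.0, 4.0, 10.0)
```

With these made-up coefficients, a lightly loaded line has a small trip probability while a heavily loaded line seeing a large flow jump has a high one, which is the qualitative behavior logistic calibration against outage data is meant to capture.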
Medical triage for WMD incidents: an adaptation of daily triage.
Donohue, Dave
2008-05-01
It's 2000 HRS on a Friday evening. You're assigned to an ALS engine company, and you're just settling down after a busy day when you're dispatched, along with a BLS ambulance, to a report of a sick person outside a local club where they're holding a concert. During your response, dispatch advises that they're receiving multiple calls on the incident and are dispatching a second BLS ambulance to the call. As you turn the corner and approach the scene, you notice a haze in the air coming from an industrial site on the same side of the street and see approximately 200 people exiting the club in haste. Several dozen patrons line the street between the club and the subway station. They're coughing and crying, and several are vomiting. The driver stops the engine in front of the subway entrance, which is located approximately 500 feet from the club, uphill and upwind from the haze. The scene is overwhelming, even to the captain, who turns to you, as the paramedic on the crew, and asks what you want done first. Your first thought is: Triage. But you know that triaging these patients is more complicated than your everyday two-car collision. PMID:18482652
A theoretical approach to calibrate radiation portal monitor (RPM) systems.
Nafee, Sherif S; Abbas, Mahmoud I
2008-10-01
Radiation portal monitor (RPM) systems are widely used at international border crossings, where they are applied to the task of detecting nuclear devices, special nuclear material, and radiation dispersal device materials that could appear at borders. The requirements and constraints on RPM systems deployed at high-volume border crossings are significantly different from those at weapons facilities or steel recycling plants, the former being required to rapidly detect localized sources of radiation with a very high detection probability and low false-alarm rate, while screening all of the traffic without impeding the flow of commerce [Chambers, W.H., Atwater, H.F., Fehlau, P.E., Hastings, R.D., Henry, C.N., Kunz, W.E., Sampson, T.E., Whittlesey, T.H., Worth, G.M., 1974. Portal Monitor for Diversion Safeguards. LA-5681, Los Alamos Scientific Laboratory, Los Alamos, NM]. In the present work, compact analytical formulae are derived and used to calibrate two RPM systems with isotropic radiating sources: (i) polyvinyltoluene (PVT, or plastic) and (ii) thallium-doped crystalline sodium iodide, NaI(Tl), gamma-ray detector materials. The calculated efficiencies are compared to measured values reported in the literature, showing very good agreement. PMID:18486482
Painlevé Representation of the Tracy-Widom β Distribution for β = 6
NASA Astrophysics Data System (ADS)
Rumanov, Igor
2016-03-01
In Rumanov (J Math Phys 56:013508, 2015), we found explicit Lax pairs for the soft edge of beta ensembles with even integer values of β. Using this general result, the case β = 6 is further considered here. This is the smallest even β for which the corresponding Lax pair and its relation to Painlevé II (PII) were not previously known, unlike the cases β = 2 and 4. It turns out that again everything can be expressed in terms of the Hastings-McLeod solution of PII. In particular, a second-order nonlinear ordinary differential equation (ODE) for the logarithmic derivative of the Tracy-Widom distribution for β = 6, involving the PII function in the coefficients, is found, which allows one to compute asymptotics for the distribution function. The ODE is a consequence of a linear system of three ODEs, for which local singularity analysis yields series solutions with exponents in the set {4/3, 1/3, -2/3}.
Effect of gaseous anaesthesia on blood carbon dioxide measurements
Ogilvie, R. R.; Howie, G. F. A.
1965-01-01
The present study of the effect of two common anaesthetic gases on blood acid-base parameters shows that the micro-Astrup measurement of carbon dioxide tension is not invalidated by the presence of nitrous oxide. This result was anticipated from the theoretical aspect of this technique. The mean error involved in estimating plasma carbon dioxide content in the presence of nitrous oxide using the volumetric Van Slyke apparatus without absorption of carbon dioxide by sodium hydroxide can be of the order of 25%. No such effect was measurable in estimating carbon dioxide contents in the presence of halothane. The degree of respiratory alkalosis during anaesthesia reported in earlier papers (Walker, Morgan, Breckenridge, Watt, Ogilvie, and Douglas, 1963; Morgan, Ogilvie, and Walker, 1963) was greater than had been originally appreciated. A 'false' increase in carbon dioxide content will also falsely increase buffer base or 'base excess' as calculated from standard nomograms (Singer and Hastings, 1948; Davenport, 1958; Siggaard-Andersen, 1963). PMID:14304255
The Full Monte Carlo: A Live Performance with Stars
NASA Astrophysics Data System (ADS)
Meng, Xiao-Li
2014-06-01
Markov chain Monte Carlo (MCMC) is being applied increasingly often in modern Astrostatistics. It is indeed incredibly powerful, but also very dangerous. It is popular because of its apparent generality (from simple to highly complex problems) and simplicity (the availability of out-of-the-box recipes). It is dangerous because it always produces something, but there is no surefire way to verify or even diagnose that the "something" is remotely close to what the MCMC theory predicts or one hopes. Using very simple models (e.g., conditionally Gaussian), this talk starts with a tutorial of the two most popular MCMC algorithms, namely the Gibbs sampler and the Metropolis-Hastings algorithm, and illustrates their good, bad, and ugly implementations via live demonstration. The talk ends with a story of how a recent advance, the Ancillary-Sufficient Interweaving Strategy (ASIS) (Yu and Meng, 2011, http://www.stat.harvard.edu/Faculty_Content/meng/jcgs.2011-article.pdf), reduces the danger. It was discovered almost by accident during a Ph.D. student's (Yaming Yu) struggle with fitting a Cox process model for detecting changes in the source intensity of photon counts observed by the Chandra X-ray telescope from a (candidate) neutron/quark star.
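A minimal Gibbs sampler for the kind of conditionally Gaussian model the tutorial uses can be sketched as follows; the bivariate-normal target and its correlation value are illustrative assumptions, chosen because both full conditionals are available in closed form:

```python
import random

def gibbs_bivariate_normal(rho, n_samples, rng):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal
    with correlation rho. Each full conditional is Gaussian:
    x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x."""
    x, y = 0.0, 0.0
    sd = (1.0 - rho * rho) ** 0.5
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x from its full conditional
        y = rng.gauss(rho * x, sd)  # draw y from its full conditional
        samples.append((x, y))
    return samples

rng = random.Random(7)
draws = gibbs_bivariate_normal(0.8, 20000, rng)
mean_x = sum(x for x, _ in draws) / len(draws)
corr_est = sum(x * y for x, y in draws) / len(draws)
```

As rho approaches 1 the chain mixes more and more slowly, which is one of the "bad and ugly" behaviors such a live demonstration can exhibit.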
Jewell, J. B.; O'Dwyer, I. J.; Huey, Greg; Gorski, K. M.; Eriksen, H. K.; Wandelt, B. D.
2009-05-20
We present a new Markov chain Monte Carlo (MCMC) algorithm for cosmic microwave background (CMB) analysis in the low signal-to-noise regime. This method builds on and complements the previously described CMB Gibbs sampler, and effectively solves the low signal-to-noise inefficiency problem of the direct Gibbs sampler. The new algorithm is a simple Metropolis-Hastings sampler with a general proposal rule for the power spectrum, C_l, followed by a particular deterministic rescaling operation of the sky signal, s. The acceptance probability for this joint move depends on the sky map only through the difference in χ² between the original and proposed sky sample, which is close to unity in the low signal-to-noise regime. The algorithm is completed by alternating this move with a standard Gibbs move. Together, these two proposals constitute a computationally efficient algorithm for mapping out the full joint CMB posterior, in both the high and low signal-to-noise regimes.
Chemical analyses of surface waters in Oklahoma, September - December, 1944
U.S. Geological Survey
1945-01-01
Red River at Denison Dam, Texas. Spot samples were collected at the remainder of the stations. The analyses of the spot samples were made largely in a laboratory provided by the Oklahoma A. & M. College, under the supervision of Dr. O.M. Smith, Head, Department of Chemistry; Dr. S.R. Wood, Associate Professor of Chemistry; and W.W. Hastings, U.S. Geological Survey. The daily samples were analyzed in the water resources laboratory of the Geological Survey at Austin, Texas. These data have been summarized in a report to the Oklahoma Planning and Resources Board prepared by the U.S. Geological Survey, March 1, 1945. The streams of Oklahoma are classified into two major drainage basins: the Arkansas River and the Red River and their tributaries. The attached analyses are arranged in geographical order for their respective drainage basins, with records listed in downstream order for stations on the main stem first, followed by the analyses for the tributaries. When available, the mean daily discharge is given for the analyses.
Learn-as-you-go acceleration of cosmological parameter estimates
NASA Astrophysics Data System (ADS)
Aslanyan, Grigor; Easther, Richard; Price, Layne C.
2015-09-01
Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning-based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set, and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update it on the fly.
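The learn-as-you-go idea can be sketched in a few lines. The nearest-neighbour lookup and distance-based error model below are our own simplifications for illustration, not the Cosmo++ implementation:

```python
class LearnAsYouGoEmulator:
    """Toy learn-as-you-go emulator: answer from the nearest stored training
    point when it is close enough, otherwise call the exact (slow) function
    and grow the training set with the new evaluation."""

    def __init__(self, exact_fn, tol=0.05):
        self.exact_fn = exact_fn
        self.tol = tol            # crude error model: distance to nearest point
        self.train = []           # list of (x, f(x)) pairs
        self.n_exact = 0          # number of exact evaluations performed

    def __call__(self, x):
        if self.train:
            x0, f0 = min(self.train, key=lambda p: abs(p[0] - x))
            if abs(x0 - x) < self.tol:
                return f0         # estimate judged reliable: emulate
        f = self.exact_fn(x)      # estimate unreliable: compute exactly
        self.train.append((x, f))
        self.n_exact += 1
        return f

def slow_loglike(x):              # stand-in for an expensive likelihood
    return -0.5 * x * x

emu = LearnAsYouGoEmulator(slow_loglike, tol=0.05)
xs = [i / 100.0 for i in range(-300, 300)]
vals = [emu(x) for x in xs]       # first sweep: mixture of exact and emulated
n_first = emu.n_exact
vals2 = [emu(x) for x in xs]      # second sweep: fully covered by training set
no_new_exact_calls = emu.n_exact == n_first
```

The first sweep over the parameter range pays for some exact evaluations; by the second sweep the training set covers the range, so no further exact calls are needed, which is the source of the speedup the abstract reports.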
Cheng, Gordon L F; Lee, Tatia M C
2016-08-01
The prefrontal cortex (PFC) subserves complex cognitive abilities, including risky decision-making; modulation of this brain area has been shown to alter the way people take risks. Yet neuromodulation of the PFC in relation to risk-taking behavior remains relatively understudied. Moreover, the psychological variables that influence such neuromodulation remain poorly understood. To address these issues, 16 participants took part in 3 experimental sessions on separate days. They received: (i) left anodal-right cathodal transcranial direct current stimulation (tDCS); (ii) left cathodal-right anodal stimulation; or (iii) sham stimulation while they completed two risk-taking tasks. They were also measured on several cognitive-affective abilities and personality traits. It was revealed that left cathodal-right anodal stimulation led to significantly reduced risk-taking under a context of haste. The reduction of risk-taking (relative to sham) correlated with state and trait impulsivity, such that the effect was larger in more impulsive individuals. For these individuals, the tDCS effect size was considered to be large (generalized partial η² > .17). The effect of prefrontal neuromodulation in reducing risk-taking was influenced by baseline impulsivity, reflecting a state-dependent effect of neuromodulation on the PFC. The results of this study carry important insights into the use of neuromodulation to alter higher cognition. PMID:26343527
A non-enteric adenovirus A12 gastroenteritis outbreak in Rio de Janeiro, Brazil.
Portes, Silvana Augusta Rodrigues; Volotão, Eduardo de Mello; Rocha, Monica Simões; Rebelo, Maria Cristina; Xavier, Maria da Penha Trindade Pinheiro; Assis, Rosane Maria de; Rose, Tatiana Lundgren; Miagostovich, Marize Pereira; Leite, José Paulo Gagliardi; Carvalho-Costa, Filipe Anibal
2016-05-24
A gastroenteritis outbreak that occurred in 2013 in a low-income community in Rio de Janeiro was investigated for the presence of enteric viruses, including species A rotavirus (RVA), norovirus (NoV), astrovirus (HAstV), bocavirus (HBoV), aichivirus (AiV), and adenovirus (HAdV). Five of nine stool samples (83%) from patients were positive for HAdV, and no other enteric viruses were detected. Polymerase chain reaction products were sequenced and subjected to phylogenetic analysis, which revealed four strains and one strain of non-enteric HAdV-A12 and HAdV-F41, respectively. The HAdV-A12 nucleotide sequences shared 100% nucleotide similarity. Viral load was assessed using a TaqMan real-time PCR assay. Stool samples that were positive for HAdV-A12 had high viral loads (mean 1.9 × 10⁷ DNA copies/g stool). All four patients with HAdV-A12 were < 25 months of age and had symptoms of fever and diarrhoea. Evaluation of enteric virus outbreaks allows the characterisation of novel or unique diarrhoea-associated viruses in regions where RVA vaccination is routinely performed. PMID:27223654
Electric utility privatization: What we can learn from the British experience
Hyman, L.S.
1997-10-01
The most famous privatization effort, that of the Thatcher government, put the concept on the front pages. It embraced privatization with zeal. The government raked in billions of pounds. Millions of new investors bought shares in dozens of companies. But the privatizations left a legacy of problems. One can learn from them. Privatizations shook up complacent enterprises, increasing their efficiency and decreasing the prices that customers paid (with the exception of water consumers). The efforts helped to revitalize London as a financial center, and launched enterprises that have now ventured forth from England's green and pleasant land into the rest of the world. The UK also made sure that the British remained in control of all the new companies upon privatization. On the other side of the balance, the British government, on occasion, acted in haste, with meeting a politically imposed deadline more important than getting the structure right. It left a great deal of money on the table after each sale. It confused the appearance of competition with effective competition. It created a string of regulatory agencies that lacked the tools to effectively control the utilities absent the effective competition that was supposed to supplement light regulation. And the privatizations put a lot of people out of work in the regulated industries and the businesses that supplied them.
GRID-BASED EXPLORATION OF COSMOLOGICAL PARAMETER SPACE WITH SNAKE
Mikkelsen, K.; Næss, S. K.; Eriksen, H. K.
2013-11-10
We present a fully parallelized grid-based parameter estimation algorithm for investigating multidimensional likelihoods, called Snake, and apply it to cosmological parameter estimation. The basic idea is to map out the likelihood grid-cell by grid-cell according to decreasing likelihood, and stop when a certain threshold has been reached. This approach improves vastly on the 'curse of dimensionality' problem plaguing standard grid-based parameter estimation simply by disregarding grid cells with negligible likelihood. The main advantages of this method compared to standard Metropolis-Hastings Markov Chain Monte Carlo methods include (1) trivial extraction of arbitrary conditional distributions; (2) direct access to Bayesian evidences; (3) better sampling of the tails of the distribution; and (4) nearly perfect parallelization scaling. The main disadvantage is, as in the case of brute-force grid-based evaluation, a dependency on the number of parameters, N_par. One of the main goals of the present paper is to determine how large N_par can be, while still maintaining reasonable computational efficiency; we find that N_par = 12 is well within the capabilities of the method. The performance of the code is tested by comparing cosmological parameters estimated using Snake and the WMAP-7 data with those obtained using CosmoMC, the current standard code in the field. We find fully consistent results, with similar computational expenses, but shorter wall time due to the perfect parallelization scheme.
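The cell-by-cell strategy can be sketched with a max-heap frontier that always expands the most likely unexplored cell and stops at a threshold. This toy 2-D version is our own illustration of the idea, not the Snake code itself:

```python
import heapq

def snake_explore(loglike, start, step, log_threshold):
    """Explore a 2-D grid cell-by-cell in order of decreasing likelihood,
    stopping once every frontier cell falls below `log_threshold` relative
    to the best cell found. Returns {cell: loglike} for the kept region."""
    best = loglike(start[0] * step, start[1] * step)
    visited = {start: best}
    frontier = [(-best, start)]        # max-heap via negated log-likelihood
    while frontier:
        neg_ll, cell = heapq.heappop(frontier)
        if -neg_ll < best + log_threshold:
            break                      # heap order: remaining cells are less likely
        i, j = cell
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (i + di, j + dj)
            if nb not in visited:
                ll = loglike(nb[0] * step, nb[1] * step)
                visited[nb] = ll
                best = max(best, ll)
                heapq.heappush(frontier, (-ll, nb))
    # Keep only cells above the threshold relative to the best cell.
    return {c: ll for c, ll in visited.items() if ll >= best + log_threshold}

# Gaussian toy likelihood: only cells within 8 log-units of the peak survive.
grid = snake_explore(lambda x, y: -0.5 * (x * x + y * y),
                     start=(0, 0), step=0.5, log_threshold=-8.0)
```

Because cells are popped in decreasing likelihood order, the first below-threshold pop guarantees that every remaining cell is negligible, which is how the curse of dimensionality is avoided.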
Rape and the prevalence of hybrids in broadly sympatric species: a case study using albatrosses.
Rohwer, Sievert; Harris, Rebecca B; Walsh, Hollie E
2014-01-01
Conspecific rape often increases male reproductive success. However, the haste and aggression of forced copulations suggest that males may sometimes rape heterospecific females, thus making rape a likely, but undocumented, source of hybrids between broadly sympatric species. We present evidence that heterospecific rape may be the source of hybrids between Black-footed and Laysan Albatrosses (Phoebastria nigripes and P. immutabilis, respectively). Extensive field studies have shown that paired (but not unpaired) males of both of these albatross species use rape as a supplemental reproductive strategy. Between-species differences in size, timing of laying, and aggressiveness suggest that Black-footed Albatrosses should be more successful than Laysan Albatrosses in heterospecific rape attempts, and male Black-footed Albatrosses have been observed attempting to force copulations on female Laysan Albatrosses. Nuclear markers showed that the six hybrids we studied were F1s, and mitochondrial markers showed that male Black-footed Albatrosses sired all six hybrids. Long-term gene exchange between these species has been from Black-footed Albatrosses into Laysan Albatrosses, suggesting that the siring asymmetry found in our hybrids has long persisted. If hybrids are sired in heterospecific rapes, they presumably would be raised and sexually imprinted on Laysan Albatrosses, and two unmated hybrids in a previous study courted only Laysan Albatrosses. PMID:24949232
Dettmer, Jan; Dosso, Stan E
2013-05-01
This paper develops a probabilistic two-dimensional (2D) inversion for geoacoustic seabed and water-column parameters in a strongly range-dependent environment. Range-dependent environments in shelf and shelf-break regions are of increasing importance to the acoustical-oceanography community, and recent advances in nonlinear inverse theory and sampling methods are applied here for efficient probabilistic range-dependent inversion. The 2D seabed and water column are parameterized using highly efficient, self-adapting irregular grids which intrinsically match the local resolving power of the data and provide parsimonious solutions requiring few parameters to capture complex environments. The self-adapting parameterization is achieved by implementing the irregular grid as a trans-dimensional hierarchical Bayesian model with an unknown number of nodes which is sampled with the Metropolis-Hastings-Green algorithm. To improve sampling, population Monte Carlo is applied with a large number of interacting parallel Markov chains with adaptive proposal distributions. The inversion is applied to simulated data for a vertical-line array and several source locations to several kilometers range. Complex acoustic-pressure fields are computed using a parabolic equation model and results are considered in terms of 2D ensemble parameter estimates and credibility intervals. PMID:23654369
Emulation of a coupled atmosphere-ocean general circulation model with a simple climate model
NASA Astrophysics Data System (ADS)
Ishizaki, Y.; Emori, S.; Oki, T.; Shiogama, H.; Yokohata, T.; Yoshimori, M.
2013-12-01
Simple climate models have been used to investigate the uncertainty of future projections under a very wide range of emission scenarios, because the use of atmosphere-ocean general circulation models (AOGCMs) requires huge computational resources to project future climate changes under many different socio-economic scenarios. We developed a simple climate model and investigated its ability to emulate the global mean surface air temperature (SAT) changes of an AOGCM (MIROC5) under a representative concentration pathway (RCP8.5). Previous research indicated that climate sensitivity, ocean vertical diffusion, and anthropogenic aerosol forcing (direct and indirect effects of sulfate aerosol, black carbon, and organic carbon) are essential factors for emulating the global mean SAT changes of AOGCMs. We therefore estimated these important factors in the simple climate model using a Metropolis-Hastings Markov chain Monte Carlo (MCMC) approach, and compared the results of the emulation with those of the AIM/impact[policy] simple climate model. Although the root mean square error (RMSE) in decadal means of global mean SAT changes during the period 2001-2100 is large (0.6) for the AIM/impact[policy] simple climate model, the RMSE for our new simple climate model is dramatically improved (0.02). Thus, the estimation of these important factors by MCMC is very useful for emulating AOGCMs with simple climate models.
Coercivity enhancement of sintered Nd-Fe-B magnets by chemical bath deposition of TbCl{sub 3}
Guo, Shuai; Zhang, Xiaofeng; Ding, Guangfei; Chen, Renjie; Yan, Aru; Lee, Don
2014-05-07
The chemical bath deposition (CBD) and grain boundary diffusion methods were combined to diffuse the heavy rare earth, in order to obtain thick magnets with high coercivity and low heavy-rare-earth content. The jet-milled powders were soaked in an alcohol solution of 0.2 wt. % TbCl₃, coating a thin layer of TbCl₃ onto the surface of the (PrNd)₂Fe₁₄B powder particles. The coercivity of the magnet increased from 11.89 kOe to 14.72 kOe without significant reduction of remanence after grain boundary diffusion during the sintering and annealing processes. The temperature coefficients of the remanence and the coercivity are improved by the substitution of PrNd by Tb at the surface of the grains. The highly accelerated temperature/humidity stress test (HAST) results indicate that the CBD magnet has poor corrosion resistance, attributable to the presence of Cl atoms in the grain boundaries.
Bustamante, Carlos D.; Valero-Cuevas, Francisco J.
2010-01-01
The field of complex biomechanical modeling has begun to rely on Monte Carlo techniques to investigate the effects of parameter variability and measurement uncertainty on model outputs, search for optimal parameter combinations, and define model limitations. However, advanced stochastic methods to perform data-driven explorations, such as Markov chain Monte Carlo (MCMC), become necessary as the number of model parameters increases. Here, we demonstrate the feasibility of, and what is to our knowledge the first use of, an MCMC approach to improve the fitness of realistically large biomechanical models. We used a Metropolis–Hastings algorithm to search increasingly complex parameter landscapes (3, 8, 24, and 36 dimensions) to uncover underlying distributions of anatomical parameters of a “truth model” of the human thumb on the basis of simulated kinematic data (thumbnail location, orientation, and linear and angular velocities) polluted by zero-mean, uncorrelated multivariate Gaussian “measurement noise.” Driven by these data, ten Markov chains searched each model parameter space for the subspace that best fit the data (posterior distribution). As expected, the convergence time increased, more local minima were found, and marginal distributions broadened as the parameter space complexity increased. In the 36-D scenario, some chains found local minima, but the majority of chains converged to the true posterior distribution (confirmed using a cross-validation dataset), thus demonstrating the feasibility and utility of these methods for realistically large biomechanical problems. PMID:19272906
Primary charge separation in isolated photosystem II reaction centers
Seibert, M.; Toon, S.; Govindjee; O`Neil, M.P.; Wasielewski, M.R.
1992-08-24
Primary charge separation in the isolated bacterial reaction center (RC) complex occurs in 2.8 ps at room temperature and 0.7-1.2 ps at 10 K. Because of similarities between the bacterial and photosystem II (PSII) RCs, it has been of considerable interest to obtain analogous charge-separation rates in the higher plant system. Our previous femtosecond transient absorption studies used PSII RC material stabilized with PEG or by exchanging dodecyl maltoside (DM) for Triton in the isolation procedure. These materials gave charge-separation 1/e times of 3.0 ± 0.6 ps at 4 °C and 1.4 ± 0.2 ps at 15 K, based on the risetime of transient absorption kinetics at 820 nm. These values were thought to represent the time required for formation of the P680⁺-Pheo⁻ state. Recent results of Hastings et al., obtained at high data acquisition rates and low flash intensities, suggest that the Pheo⁻ state may form more slowly. In light of this work, we have carried out additional time-domain studies of both electron transport and energy transfer phenomena in stabilized DM PSII RCs at room temperature. We used a 1-kHz repetition rate femtosecond transient absorption spectrometer with a 200 fs instrumental time resolution and compared the results with those obtained by others using frequency-domain hole-burning techniques.
Post-traumatic stress disorder. Does it exist?
Sparr, L F
1995-05-01
Facing the inevitable, psychiatry formally acquired PTSD as a diagnostic entity in 1980. It then discovered that PTSD had a bevy of nasty lay-legal relatives (e.g., disability and personal injury claims). In response, psychiatrists have been continuously trying to refine the PTSD criteria. There have even been cogent arguments that psychiatrists should take their own forensic medicine and formally address legally relevant behavior in the DSM. In the meantime, prosecutors, defense attorneys, and adjudicators sometimes stretch and pull the DSM-III-R PTSD diagnosis beyond justifiable limits, trying to fit square pegs of psychiatric testimony into round holes of legal rules. Ultimately, however, lawyers cannot be blamed for misusing the PTSD diagnosis, because only clinicians can make it. Casual diagnosticians may fail to apply the requisite symptomatic criteria or do so only superficially. In their haste to eliminate bogus stress claims, clinicians should not throw out the baby (authentic PTSD) with the bathwater (idiosyncratic "stress" disorders and careless PTSD diagnoses). PMID:7643834
Automated Antenna Orientation For Wireless Data Transfer Using Bayesian Modeling
NASA Astrophysics Data System (ADS)
Guttman, Rotem D.
2009-12-01
The problem of attaining a usable wireless connection at an arbitrary location is one of great concern to mobile end users. The majority of antennae currently in use for mobile devices conducting two way communications are omnidirectional. The use of a directional antenna allows for increased effective coverage area without increasing power consumption. However, directional antennae must be oriented toward a wireless network access point in order for their benefits to be realized. This paper outlines a system for determining the optimal orientation of a directional antenna without the need for additional hardware. The response of the antenna is described by the use of a parameterized model corresponding to the sum of a set of cardioid functions. Signal strength is measured at several antenna orientations and is used by a Metropolis-Hastings search algorithm to estimate the model parameter values that best describe the antenna's response pattern. Using this model the antenna can be oriented to respond optimally to the wireless network access point's broadcast pattern.
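A hedged sketch of the approach described above: fit the orientation of an assumed cardioid response to noisy signal-strength measurements with a random-walk Metropolis-Hastings search. The model form, parameter values, and names below are ours, not the paper's:

```python
import math
import random

def cardioid_gain(theta, theta0, amplitude):
    """Idealized cardioid antenna response with peak gain toward theta0."""
    return amplitude * (1.0 + math.cos(theta - theta0)) / 2.0

# Simulate noisy signal-strength readings from a "true" orientation of 1.2 rad.
rng = random.Random(42)
true_theta0, amp, noise = 1.2, 10.0, 0.3
angles = [2 * math.pi * k / 12 for k in range(12)]
meas = [cardioid_gain(a, true_theta0, amp) + rng.gauss(0, noise) for a in angles]

def log_post(theta0):
    # Gaussian measurement noise, flat prior on the orientation parameter.
    return -sum((m - cardioid_gain(a, theta0, amp)) ** 2
                for a, m in zip(angles, meas)) / (2 * noise ** 2)

# Random-walk Metropolis-Hastings over the single orientation parameter.
theta, samples = 0.0, []
for _ in range(5000):
    prop = theta + rng.gauss(0, 0.2)
    if math.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
est = sum(samples[1000:]) / len(samples[1000:])   # posterior mean orientation
```

Once the posterior over the orientation concentrates, the antenna can simply be pointed at the estimated mode; in practice the paper fits a sum of cardioids rather than the single lobe used here.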
Transport map-accelerated Markov chain Monte Carlo for Bayesian parameter inference
NASA Astrophysics Data System (ADS)
Marzouk, Y.; Parno, M.
2014-12-01
We introduce a new framework for efficient posterior sampling in Bayesian inference, using a combination of optimal transport maps and the Metropolis-Hastings rule. The core idea is to use transport maps to transform typical Metropolis proposal mechanisms (e.g., random walks, Langevin methods, Hessian-preconditioned Langevin methods) into non-Gaussian proposal distributions that can more effectively explore the target density. Our approach adaptively constructs a lower triangular transport map—i.e., a Knothe-Rosenblatt re-arrangement—using information from previous MCMC states, via the solution of an optimization problem. Crucially, this optimization problem is convex regardless of the form of the target distribution. It is solved efficiently using Newton or quasi-Newton methods, but the formulation is such that these methods require no derivative information from the target probability distribution; the target distribution is instead represented via samples. Sequential updates using the alternating direction method of multipliers enable efficient and parallelizable adaptation of the map even for large numbers of samples. We show that this approach uses inexact or truncated maps to produce an adaptive MCMC algorithm that is ergodic for the exact target distribution. Numerical demonstrations on a range of parameter inference problems involving both ordinary and partial differential equations show multiple order-of-magnitude speedups over standard MCMC techniques, measured by the number of effectively independent samples produced per model evaluation and per unit of wallclock time.
NASA Astrophysics Data System (ADS)
Nakagawa, K.; Tanaka, T.; Suzuki, T.
2015-10-01
This paper presents the fabrication of a new energy harvesting module that uses a thermoelectric device (TED), built using molding technology. Through molding technology, the TED and circuit board can be properly protected, and a heat-radiating fin structure can be constructed at the same time. The output voltage per unit heater temperature of the TED module at 20 °C ambient temperature is 8 mV K⁻¹, similar to the result obtained with an aluminum heat sink of almost the same fin size as the TED module. Accelerated environmental tests were performed: a damp heat test, which is an aging test under high temperature and high humidity; a highly accelerated temperature and humidity stress test (HAST), for the purpose of evaluating electrical reliability in harsh environments; a cold test; and a thermal cycle test, to evaluate degradation when cycling between two temperatures. All test results indicate that the TED and circuit board can be properly protected from harsh temperature and humidity by using molding technology, because the output voltage of the tested modules is reduced by less than 5%. This study presents a novel fabrication method for a high-reliability TED-installed module appropriate for machine-to-machine wireless sensor networks.
NASA Astrophysics Data System (ADS)
Nakagawa, K.; Tanaka, T.; Suzuki, T.
2014-11-01
This paper presents the fabrication of a new energy harvesting module that uses a thermoelectric device (TED), built using molding technology. The output voltage per unit heater temperature of the TED module at 20 °C ambient temperature is 8 mV/K, similar to the result obtained with an aluminium heat sink of almost the same fin size as the TED module. Accelerated environmental tests were performed: a damp heat test, which is an aging test under high temperature and high humidity; a cold test; and a highly accelerated temperature and humidity stress test (HAST), for the purpose of evaluating electrical reliability in harsh environments. All test results indicate that the TED and circuit board can be properly protected from harsh temperature and humidity by using molding technology, because the output voltage of the tested modules is reduced by less than 5%. This study presents a novel fabrication method for a high-reliability TED-installed module appropriate for machine-to-machine wireless sensor networks.
Evolution and revolution as organizations grow. 1972.
Greiner, L E
1998-01-01
The influence of history on an organization is a powerful but often overlooked force. Managers, in their haste to build companies, frequently fail to ask such critical developmental questions as, Where has our organization been? Where is it now? and What do the answers to these questions mean for where it is going? Instead, when confronted with problems, managers fix their gaze outward on the environment and toward the future, as if more precise market projections will provide the organization with a new identity. In this HBR Classic, Larry Greiner identifies a series of developmental phases that companies tend to pass through as they grow. He distinguishes the phases by their dominant themes: creativity, direction, delegation, coordination, and collaboration. Each phase begins with a period of evolution, steady growth, and stability, and ends with a revolutionary period of organizational turmoil and change. The critical task for management in each revolutionary period is to find a new set of organizational practices that will become the basis for managing the next period of evolutionary growth. Those new practices eventually outlast their usefulness and lead to another period of revolution. Managers therefore experience the irony of seeing a major solution in one period become a major problem in a later period. Originally published in 1972, the article's argument and insights remain relevant to managers today. Accompanying the original article is a commentary by the author updating his earlier observations. PMID:10179654
Saline as the Sole Contrast Agent for Successful MRI-guided Epidural Injections
Deli, Martin; Mateiescu, Serban; Busch, Martin; Becker, Jan; Garmer, Marietta; Groenemeyer, Dietrich
2013-06-15
Purpose. To assess the performance of sterile saline solution as the sole contrast agent for percutaneous magnetic resonance imaging (MRI)-guided epidural injections at 1.5 T. Methods. A retrospective analysis of two different techniques of MRI-guided epidural injections was performed with either gadolinium-enhanced saline solution or sterile saline solution for documentation of the epidural location of the needle tip. T1-weighted spoiled gradient echo (FLASH) images or T2-weighted single-shot turbo spin echo (HASTE) images visualized the test injectants. Methods were compared by technical success rate, image quality, table time, and rate of complications. Results. 105 MRI-guided epidural injections (12 of 105 with gadolinium-enhanced saline solution and 93 of 105 with sterile saline solution) were performed successfully and without complications. Visualization of sterile saline solution and gadolinium-enhanced saline solution was sufficient, good, or excellent in all 105 interventions. For either test injectant, quantitative image analysis demonstrated comparably high contrast-to-noise ratios of test injectants to adjacent body substances with reliable statistical significance levels (p < 0.001). The mean table time was 22 ± 9 min in the gadolinium-enhanced saline solution group and 22 ± 8 min in the saline solution group (p = 0.75). Conclusion. Sterile saline is suitable as the sole contrast agent for successful and safe percutaneous MRI-guided epidural drug delivery at 1.5 T.
A coupled hidden Markov model for disease interactions.
Sherlock, Chris; Xifara, Tatiana; Telfer, Sandra; Begon, Mike
2013-08-01
To investigate interactions between parasite species in a host, a population of field voles was studied longitudinally, with presence or absence of six different parasites measured repeatedly. Although trapping sessions were regular, a different set of voles was caught at each session, leading to incomplete profiles for all subjects. We use a discrete time hidden Markov model for each disease with transition probabilities dependent on covariates via a set of logistic regressions. For each disease the hidden states for each of the other diseases at a given time point form part of the covariate set for the Markov transition probabilities from that time point. This allows us to gauge the influence of each parasite species on the transition probabilities for each of the other parasite species. Inference is performed via a Gibbs sampler, which cycles through each of the diseases, first using an adaptive Metropolis-Hastings step to sample from the conditional posterior of the covariate parameters for that particular disease given the hidden states for all other diseases and then sampling from the hidden states for that disease given the parameters. We find evidence for interactions between several pairs of parasites and of an acquired immune response for two of the parasites. PMID:24223436
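The disease-by-disease cycling described above is an instance of Gibbs sampling from full conditionals. As a self-contained illustration of that cycling on a toy bivariate normal (our own example, not the paper's coupled hidden Markov model):

```python
import math
import random

def gibbs_bivariate_normal(rho, n_steps, seed=1):
    """Gibbs sampler for a bivariate normal with unit variances and
    correlation rho: alternately draw each coordinate from its full
    conditional, x | y ~ N(rho * y, 1 - rho**2), and symmetrically for y."""
    rng = random.Random(seed)
    x = y = 0.0
    sd = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    draws = []
    for _ in range(n_steps):
        x = rng.gauss(rho * y, sd)    # update x given the current y
        y = rng.gauss(rho * x, sd)    # update y given the new x
        draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8, n_steps=30000)
mx = sum(x for x, _ in draws) / len(draws)
my = sum(y for _, y in draws) / len(draws)
cov = sum(x * y for x, y in draws) / len(draws) - mx * my
```

In the paper each "coordinate" is itself a block (one disease's parameters or hidden states), and the conditional draw is replaced by an adaptive Metropolis-Hastings step when it cannot be sampled directly; the cycling structure is the same.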
Lee, Sik-Yum; Song, Xin-Yuan
2004-05-01
Missing data are very common in behavioural and psychological research. In this paper, we develop a Bayesian approach in the context of a general nonlinear structural equation model with missing continuous and ordinal categorical data. In the development, the missing data are treated as latent quantities, and provision for the incompleteness of the data is made by a hybrid algorithm that combines the Gibbs sampler and the Metropolis-Hastings algorithm. We show by means of a simulation study that the Bayesian estimates are accurate. A Bayesian model comparison procedure based on the Bayes factor and path sampling is proposed. The required observations from the posterior distribution for computing the Bayes factor are simulated by the hybrid algorithm in Bayesian estimation. Our simulation results indicate that the correct model is selected more frequently when the incomplete records are used in the analysis than when they are ignored. The methodology is further illustrated with a real data set from a study concerned with an AIDS preventative intervention for Filipina sex workers. PMID:15171804
Virtual Goods Recommendations in Virtual Worlds
Chen, Kuan-Yu; Liao, Hsiu-Yu; Chen, Jyun-Hung; Liu, Duen-Ren
2015-01-01
Virtual worlds (VWs) are computer-simulated environments which allow users to create their own virtual character as an avatar. With the rapidly growing user volume in VWs, platform providers launch virtual goods in haste and press users to buy them in order to increase sales revenue. However, this rapid development produces unrelated virtual items that are difficult to remarket. It not only wastes the companies' resources, but also makes it difficult for users to find virtual goods that suit their virtual homes in daily virtual life. In the VWs, users decorate their houses, visit others' homes, create families, host parties, and so forth. Users establish their social life circles through these activities. This research proposes a novel virtual goods recommendation method based on these social interactions. The contact strength and contact influence result from interactions with social neighbors and influence users' buying intention. Our research highlights the importance of social interactions in virtual goods recommendation. The experiment's data were retrieved from an online VW platform, and the results show that the proposed method, considering social interactions and social life circles, has better performance than existing recommendation methods. PMID:25834837
Non-Intersecting Squared Bessel Paths at a Hard-Edge Tacnode
NASA Astrophysics Data System (ADS)
Delvaux, Steven
2013-12-01
The squared Bessel process is a 1-dimensional diffusion process related to the squared norm of a higher-dimensional Brownian motion. We study a model of n non-intersecting squared Bessel paths, with all paths starting at the same point a > 0 at time t = 0 and ending at the same point b > 0 at time t = 1. Our interest lies in the critical regime ab = 1/4, for which the paths are tangent to the hard edge at the origin at a critical time t*. The critical behavior of the paths for n → ∞ is studied in a scaling limit with time t = t* + O(n^{-1/3}) and temperature T = 1 + O(n^{-2/3}). This leads to a critical correlation kernel that is defined via a new Riemann-Hilbert problem of size 4 × 4. The Riemann-Hilbert problem gives rise to a new Lax pair representation for the Hastings-McLeod solution to the inhomogeneous Painlevé II equation q''(x) = xq(x) + 2q³(x) − ν, where ν = α + 1/2 with α > −1 the parameter of the squared Bessel process. These results extend our recent work with Kuijlaars and Zhang (Comm Pure Appl Math 64:1305-1383, 2011) for the homogeneous case ν = 0.
A non-enteric adenovirus A12 gastroenteritis outbreak in Rio de Janeiro, Brazil
Portes, Silvana Augusta Rodrigues; Volotão, Eduardo de Mello; Rocha, Monica Simões; Rebelo, Maria Cristina; Xavier, Maria da Penha Trindade Pinheiro; de Assis, Rosane Maria; Rose, Tatiana Lundgren; Miagostovich, Marize Pereira; Leite, José Paulo Gagliardi; Carvalho-Costa, Filipe Anibal
2016-01-01
A gastroenteritis outbreak that occurred in 2013 in a low-income community in Rio de Janeiro was investigated for the presence of enteric viruses, including species A rotavirus (RVA), norovirus (NoV), astrovirus (HAstV), bocavirus (HBoV), aichivirus (AiV), and adenovirus (HAdV). Five of nine stool samples from patients were positive for HAdV, and no other enteric viruses were detected. Polymerase chain reaction products were sequenced and subjected to phylogenetic analysis, which revealed four strains of non-enteric HAdV-A12 and one strain of HAdV-F41. The HAdV-A12 nucleotide sequences shared 100% nucleotide similarity. Viral load was assessed using a TaqMan real-time PCR assay. Stool samples that were positive for HAdV-A12 had high viral loads (mean 1.9 × 10^7 DNA copies/g stool). All four patients with HAdV-A12 were < 25 months of age and had symptoms of fever and diarrhoea. Evaluation of enteric virus outbreaks allows the characterisation of novel or unique diarrhoea-associated viruses in regions where RVA vaccination is routinely performed. PMID:27223654
New Kinematic Model in comparing with Langevin equation and Fokker Planck Equation
NASA Astrophysics Data System (ADS)
Lee, Kyoung; Wang, Zhijian; Gardner, Robin
2010-03-01
An analytic approximate solution of the New Kinematic Model with boundary conditions is developed for the incompressible packing condition in Pebble Bed Reactors. It is based on a velocity description of the packing density in the hopper. The packing structure can be represented with a jamming phenomenon from flow types. The gravity-driven macroscopic motions are governed not only by the geometry and external boundary conditions of silos and hoppers, but also by flow properties of granular materials, such as friction, viscosity and porosity. The analytical formulas for the quasi-linear diffusion and convection coefficients of the velocity profile are obtained. Since it was found that the New Kinematic Model depends on the granular packing density distribution, we are motivated to study the Langevin equation with friction under the influence of the gravitational field. We also discuss the relation with the Fokker-Planck equation using detailed balance and the Metropolis-Hastings algorithm. The Markov chain Monte Carlo methods are shown to yield a non-Maxwellian distribution function, with the mean velocity of the field particles having an effective temperature.
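The detailed-balance acceptance step invoked above is the core of the Metropolis-Hastings algorithm. As a generic illustration only (not the reactor model of this abstract), a minimal random-walk sampler targeting a standard normal density can be written in a few lines; the target, step size and seed here are arbitrary choices for the sketch:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings sampler for a 1-D log-density.

    The Gaussian proposal is symmetric, so the Hastings ratio reduces to
    target(x') / target(x); accepting with that probability enforces
    detailed balance, making `log_target` the stationary density."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        x_prop = x + rng.gauss(0.0, step)
        # Accept with probability min(1, pi(x') / pi(x)).
        if math.log(rng.random()) < log_target(x_prop) - log_target(x):
            x = x_prop
        samples.append(x)
    return samples

# Sample a standard normal; its log-density is -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With enough steps the empirical mean and variance approach 0 and 1, the moments of the target density.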
NASA Astrophysics Data System (ADS)
Grayver, Alexander V.; Kuvshinov, Alexey V.
2016-05-01
This paper presents a methodology to sample the equivalence domain (ED) in nonlinear partial differential equation (PDE)-constrained inverse problems. For this purpose, we first applied the state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how performance of the standard Metropolis-Hastings Markov chain Monte Carlo algorithm can be substantially improved by using information CMAES provides. This methodology was tested by using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
Potential effect of resonant scattering from multiple swimbladders on audition in juvenile fish
NASA Astrophysics Data System (ADS)
Hastings, Mardi C.
2003-04-01
The swimbladder, a gas-filled chamber in the abdominal cavity of most bony fishes, is a hydrostatic organ that enables fish to maintain neutral buoyancy; however, it also responds to acoustic pressure and radiates a secondary acoustic field that enhances detection capability of the inner ear. Recent experiments have indicated that resonant response of the swimbladder may control the auditory bandwidth in at least four species of fish [Hastings et al., J. Acoust. Soc. Am. 110, 2640 (2001)]. The auditory bandwidths of these fishes, however, do not change appreciably while they grow even though the resonance frequency of the swimbladder decreases with increasing body length. Results of an analysis inspired by Feuillade et al. [J. Acoust. Soc. Am. 112, 2206 (2002)] show that the downward shift and broadening associated with resonance of the aggregate scattered field from multiple fish is perhaps sufficient to account for this discrepancy. Effects of resonant characteristics of a single swimbladder, fish length, and number of fish on the changes in the collective scattered field are presented. Thus the resonant scattered field created by relatively large schools of juvenile fish may enhance their auditory capability.
NASA Astrophysics Data System (ADS)
Wu, Zu-guang; Tian, Zhan-jun; Liu, Hui; Huang, Rui; Zhu, Guo-hua
2009-07-01
Being the only listed telecom operator on the A-share market, China Unicom has attracted many institutional investors in recent years under the concept of 3G, which itself carries a strong expectation of technical progress. Do institutional investors, or the expectation of technical progress, have a significant effect on improving a firm's operating efficiency? Reviewing the literature on operating efficiency, we find that scholars have studied this problem using regression analysis based on traditional production functions, data envelopment analysis (DEA), financial-index analysis, marginal functions, capital-labor ratio coefficients, etc. All these methods rely mainly on macro data. In this paper we use company-level micro data to evaluate operating efficiency. Using factor analysis based on financial indices and comparing factor scores for the three years from 2005 to 2007, we find that China Unicom's operating efficiency is below the average level of the benchmark corporations and has not improved under the 3G concept over this period. In other words, institutional investors and the expectation of technical progress have had little effect on changes in China Unicom's operating efficiency. Selecting benchmark corporations as a yardstick for evaluating operating efficiency is a characteristic of this method, which is simple and direct. The method is also suitable for evaluating the operating efficiency of listed agricultural companies, which likewise face technical progress and marketing concepts such as tax exemption.
Hamilton, Kirk L
2014-04-15
An old proverb states "necessity is the mother of invention." This certainly holds true in science as one pursues research questions. Experimental techniques have evolved as scientists have asked more specific research questions. Indeed, techniques such as the Ussing chamber, the perfused renal tubule preparation, patch-clamp, polymerase chain reaction, and site-directed mutagenesis have been developed over the past 60 years. However, sometimes, simple techniques may be useful and still very informative, and this certainly applies to intestinal physiology. Indeed, Gerald Wiseman and Thomas Hastings Wilson described the intestinal everted sac preparation some 60 years ago. Since then, this technique has been used for numerous research purposes including determining ion, amino acid, water and solute transport across the intestinal epithelium; and drug metabolism, absorption, and interactions in pharmaceutical/pharmacological research and even in education. This article provides a historical review of the development of the in vitro intestinal preparation that led to the everted sac preparation and its use in science. PMID:24573083
Aggarwal, Abhishek; Azad, Rajiv; Ahmad, Armeen; Arora, Pankaj; Gupta, Puneet
2012-01-01
Objective: To validate the additional merits of two-dimensional (2D) single thick-slice Magnetic Resonance Myelography (MRM) in spinal imaging. Materials and Methods: 2D single thick-slice MRM was performed using the T2 half-Fourier acquisition single-shot turbo spin-echo (HASTE) sequence in addition to routine Magnetic resonance (MR) sequences for spine in 220 patients. The images were evaluated for additional diagnostic information in spinal and extra-spinal regions. A three-point grading system was adopted depending upon the utility of MRM in contributing to the detection of spinal or extra-spinal findings. Grade 1 represented no contribution of MRM, while grade 3 indicated that it was essential to the detection of findings. Results: Utility of MRM in spine was categorized as grade 3 in 10.9% of cases (24/220), grade 2 in 21.8% (48/220), and grade 1 in 67.3% (148/220). Thus, the overall additional merit of MRM in spine was seen in 32.7% (72/220) of cases. In addition, extra-spinal pathologies were identified in 14.1% of cases (31/220). Conclusion: 2D single thick-slice MRM could have additional merits in spinal imaging when used as an adjunct to routine MR sequences. PMID:23393640
A simulation approach for change-points on phylogenetic trees.
Persing, Adam; Jasra, Ajay; Beskos, Alexandros; Balding, David; De Iorio, Maria
2015-01-01
We observe n sequences at each of m sites and assume that they have evolved from an ancestral sequence that forms the root of a binary tree of known topology and branch lengths, but the sequence states at internal nodes are unknown. The topology of the tree and branch lengths are the same for all sites, but the parameters of the evolutionary model can vary over sites. We assume a piecewise constant model for these parameters, with an unknown number of change-points and hence a transdimensional parameter space over which we seek to perform Bayesian inference. We propose two novel ideas to deal with the computational challenges of such inference. Firstly, we approximate the model based on the time machine principle: the top nodes of the binary tree (near the root) are replaced by an approximation of the true distribution; as more nodes are removed from the top of the tree, the cost of computing the likelihood is reduced linearly in n. The approach introduces a bias, which we investigate empirically. Secondly, we develop a particle marginal Metropolis-Hastings (PMMH) algorithm that employs a sequential Monte Carlo (SMC) sampler and can use the first idea. Our time-machine PMMH algorithm copes well with one of the bottlenecks of standard computational algorithms: the transdimensional nature of the posterior distribution. The algorithm is implemented on simulated and real data examples, and we empirically demonstrate its potential to outperform competing methods based on approximate Bayesian computation (ABC) techniques. PMID:25506749
Flow and chloride transport in the tidal Hudson River, NY
Weiss, Lawrence A.; Schaffranek, Raymond W.; de Vries, M. Peter
1994-01-01
A one-dimensional dynamic-flow model and a one-dimensional solute-transport model were used to evaluate the effects of hypothetical public-supply water withdrawals on saltwater intrusion in a 133-mile reach of the tidal Hudson River between Green Island dam, near Troy, N.Y., and Hastings-on-Hudson, N.Y. Regression techniques were used in analyses of current and extreme historical conditions, and numerical models were used to investigate the effect of various water withdrawals. Of four withdrawal scenarios investigated, simulations of a 27-day period during which discharges at Green Island dam averaged 7,090 ft³/s indicate that increasing the present Chelsea pumping-station withdrawal rate of 100 Mgal/d (million gallons per day) to 300 Mgal/d would have the least effect on upstream saltwater movement. A 90-day simulation, during which discharges at Green Island dam averaged 25,200 ft³/s, indicates that withdrawals of 1,940 Mgal/d at Chelsea would not measurably increase chloride concentrations at Chelsea under normal tidal and meteorological conditions, but withdrawals of twice that rate (3,880 Mgal/d) could increase the chloride concentration at Chelsea to 250 mg/L.
A consideration of the operation of automatic production machines
HOSHI, Toshiro; SUGIMOTO, Noboru
2015-01-01
At worksites, various automatic production machines are in use to release workers from muscular labor or labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention, and points out two types of machine operation: operation for which quick performance is required (operation that is not permitted to be delayed), and operation for which composed performance is required (operation that is not permitted to be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics can be evaluated as “asymmetric on the time-axis”. Here, in order for workers to accept the risk of automatic production machines, it is generally a precondition that harm should be sufficiently small or avoidance of harm easy. In this connection, this paper shows the possibility of facilitating the acceptance of the risk of automatic production machines by enhancing the asymmetry on the time-axis. PMID:25739898
Holzer, T.L.
1990-09-01
The extensive network of geodetic leveling lines in the Houston-Galveston, Texas, area, where at least 110 oil and gas fields have been developed, provides the most comprehensive opportunity in the Gulf Coast to search for the occurrence of land subsidence caused by withdrawal of oil and gas. Although the evaluation is complicated by regional subsidence caused by a decline of ground-water level in aquifers beneath the area, subsidence caused by oil and gas withdrawal can be examined by searching for local increases of subsidence at oil and gas fields crossed by leveling lines. Twenty-nine fields are crossed by lines with repeated leveling surveys. Subsidence profiles across these fields indicate local increases of subsidence at six fields: Alco-Mag, Chocolate Bayou, Goose Creek, Hastings, Mykawa, and South Houston. Although ground-water withdrawal is undoubtedly the most important factor contributing to the total subsidence at each field, oil and gas withdrawal may be partly responsible for the local increases. Except for Chocolate Bayou, the volume of petroleum production at each field was sufficient to account for the increase. The volume of petroleum production, however, in general is not a reliable index for predicting the local increase because land within many fields with significant production did not show local increases of subsidence. With the exception of the 1 m subsidence caused by petroleum withdrawal at Goose Creek (1917-1925), local increases of subsidence were less than 0.3 m.
Ghanem, Roger G. (E-mail: ghanem@usc.edu); Doostan, Alireza (E-mail: doostan@jhu.edu)
2006-09-01
This paper investigates the predictive accuracy of stochastic models. In particular, a formulation is presented for the impact of data limitations associated with the calibration of parameters for these models, on their overall predictive accuracy. In the course of this development, a new method for the characterization of stochastic processes from corresponding experimental observations is obtained. Specifically, polynomial chaos representations of these processes are estimated that are consistent, in some useful sense, with the data. The estimated polynomial chaos coefficients are themselves characterized as random variables with known probability density function, thus permitting the analysis of the dependence of their values on further experimental evidence. Moreover, the error in these coefficients, associated with limited data, is propagated through a physical system characterized by a stochastic partial differential equation (SPDE). This formalism permits the rational allocation of resources in view of studying the possibility of validating a particular predictive model. A Bayesian inference scheme is relied upon as the logic for parameter estimation, with its computational engine provided by a Metropolis-Hastings Markov chain Monte Carlo procedure.
Innovative Composite Wall System for Sheathing Masonry Walls
Wendt, Robert L.; Cavallo, James
1997-09-25
Existing Housing - Much of the older multifamily housing stock in the United States includes units in structures with uninsulated masonry walls. Included in this stock are two- and three-story walk-up apartments, larger apartment complexes, and public housing (both high-rise and townhouse). This older multifamily housing has seen years of heavy use that may have left the plaster wall marred or damaged. Long-term building settlement or movement may have cracked the plaster, sometimes severely. Moisture from unvented kitchens and baths may have caused condensation on uninsulated exterior walls. At best this condensation has left stains on the paint or wallpaper. At worst it has supported mold and mildew growth, fouling the air and creating unhealthy living conditions. Deteriorating plaster and flaking paint also result from wet walls. The presence of flaking, lead-based paint in older (pre-1978) housing is a major public health concern. Children can suffer permanent mental handicaps and psychological disorders if they are subjected to elevated levels of lead, while adults can suffer hypertension and other maladies. Studies have found that, in some urban communities with older housing stocks, over 35% of children tested have elevated blood lead levels (Hastings, et al.: 1997). Nationally, nearly 22% of black, non-Hispanic children living in pre-1946 housing were found to have elevated levels of lead in their blood (MMWR article, February 21, 1997). The deterioration of many of these walls is to the point that lead can freely enter the living space.
Rape and the prevalence of hybrids in broadly sympatric species: a case study using albatrosses
Harris, Rebecca B.; Walsh, Hollie E.
2014-01-01
Conspecific rape often increases male reproductive success. However, the haste and aggression of forced copulations suggest that males may sometimes rape heterospecific females, thus making rape a likely, but undocumented, source of hybrids between broadly sympatric species. We present evidence that heterospecific rape may be the source of hybrids between Black-footed and Laysan Albatrosses (Phoebastria nigripes and P. immutabilis, respectively). Extensive field studies have shown that paired (but not unpaired) males of both of these albatross species use rape as a supplemental reproductive strategy. Between-species differences in size, timing of laying, and aggressiveness suggest that Black-footed Albatrosses should be more successful than Laysan Albatrosses in heterospecific rape attempts, and male Black-footed Albatrosses have been observed attempting to force copulations on female Laysan Albatrosses. Nuclear markers showed that the six hybrids we studied were F1s, and mitochondrial markers showed that male Black-footed Albatrosses sired all six hybrids. Long-term gene exchange between these species has been from Black-footed Albatrosses into Laysan Albatrosses, suggesting that the siring asymmetry found in our hybrids has long persisted. If hybrids are sired in heterospecific rapes, they presumably would be raised and sexually imprinted on Laysan Albatrosses, and two unmated hybrids in a previous study courted only Laysan Albatrosses. PMID:24949232
Grid-based Exploration of Cosmological Parameter Space with Snake
NASA Astrophysics Data System (ADS)
Mikkelsen, K.; Næss, S. K.; Eriksen, H. K.
2013-11-01
We present a fully parallelized grid-based parameter estimation algorithm for investigating multidimensional likelihoods called Snake, and apply it to cosmological parameter estimation. The basic idea is to map out the likelihood grid-cell by grid-cell according to decreasing likelihood, and stop when a certain threshold has been reached. This approach improves vastly on the "curse of dimensionality" problem plaguing standard grid-based parameter estimation simply by disregarding grid cells with negligible likelihood. The main advantages of this method compared to standard Metropolis-Hastings Markov Chain Monte Carlo methods include (1) trivial extraction of arbitrary conditional distributions; (2) direct access to Bayesian evidences; (3) better sampling of the tails of the distribution; and (4) nearly perfect parallelization scaling. The main disadvantage is, as in the case of brute-force grid-based evaluation, a dependency on the number of parameters, N_par. One of the main goals of the present paper is to determine how large N_par can be, while still maintaining reasonable computational efficiency; we find that N_par = 12 is well within the capabilities of the method. The performance of the code is tested by comparing cosmological parameters estimated using Snake and the WMAP-7 data with those obtained using CosmoMC, the current standard code in the field. We find fully consistent results, with similar computational expenses, but shorter wall time due to the perfect parallelization scheme.
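The cell-by-cell idea can be illustrated with a toy best-first grid walk (a sketch of the strategy only, not the Snake code itself; the step size, threshold convention and 2-D Gaussian test likelihood are invented for this example):

```python
import heapq

def snake_explore(loglike, start, threshold, step=0.1):
    """Map a likelihood grid cell-by-cell in order of decreasing
    log-likelihood, starting from a high-likelihood cell.  Cells whose
    log-likelihood falls more than |threshold| below the starting peak
    are evaluated at most once and never expanded, so the bulk of a
    high-dimensional grid is simply ignored."""
    ndim = len(start)
    peak = loglike(start)
    visited = {start: peak}
    frontier = [(-peak, start)]          # max-heap via negation
    while frontier:
        neg_ll, cell = heapq.heappop(frontier)
        if -neg_ll < peak + threshold:   # best remaining cell too poor: stop
            break
        for d in range(ndim):
            for sign in (-1, 1):
                nb = list(cell)
                nb[d] = round(nb[d] + sign * step, 10)
                nb = tuple(nb)
                if nb not in visited:
                    visited[nb] = loglike(nb)
                    heapq.heappush(frontier, (-visited[nb], nb))
    return visited

# 2-D Gaussian log-likelihood: only cells within 5 log-units of the
# peak (a disc of radius sqrt(10)) get explored, regardless of how
# large the nominal grid would be.
cells = snake_explore(lambda p: -0.5 * (p[0] ** 2 + p[1] ** 2), (0.0, 0.0), -5.0)
```

Because exploration follows decreasing likelihood, the evaluated cells directly provide conditional slices and a brute-force evidence estimate over the retained region.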
HZAR: hybrid zone analysis using an R software package.
Derryberry, Elizabeth P; Derryberry, Graham E; Maley, James M; Brumfield, Robb T
2014-05-01
We present a new software package (HZAR) that provides functions for fitting molecular genetic and morphological data from hybrid zones to classic equilibrium cline models using the Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm. The software applies likelihood functions appropriate for different types of data, including diploid and haploid genetic markers and quantitative morphological traits. The modular design allows flexibility in fitting cline models of varying complexity. To facilitate hypothesis testing, an autofit function is included that allows automated model selection from a set of nested cline models. Cline parameter values, such as cline centre and cline width, are estimated and may be compared statistically across clines. The package is written in the R language and is available through the Comprehensive R Archive Network (CRAN; http://cran.r-project.org/). Here, we describe HZAR and demonstrate its use with a sample data set from a well-studied hybrid zone in western Panama between white-collared (Manacus candei) and golden-collared manakins (M. vitellinus). Comparisons of our results with previously published results for this hybrid zone validate the HZAR software. We extend analysis of this hybrid zone by fitting additional models to molecular data where appropriate. PMID:24373504
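The centre/width parameterization can be made concrete with a toy fit. The sigmoid below is one standard cline form (width defined as the inverse of the maximum slope); the grid-search least-squares fit and the synthetic data are purely illustrative, since HZAR itself estimates parameters by MCMC:

```python
import math

def cline(x, center, width):
    """Sigmoid cline: expected allele frequency at transect position x.
    `width` follows the usual convention of 1 / (maximum slope)."""
    z = -4.0 * (x - center) / width
    z = max(min(z, 50.0), -50.0)      # clamp to keep exp() in range
    return 1.0 / (1.0 + math.exp(z))

def fit_cline(xs, freqs):
    """Least-squares grid search over (center, width) -- purely for
    illustration; a real analysis would sample the posterior instead."""
    best = None
    for c10 in range(-50, 51):        # centers -5.0 .. 5.0 by 0.1
        for w10 in range(1, 101):     # widths   0.1 .. 10.0 by 0.1
            c, w = c10 / 10.0, w10 / 10.0
            sse = sum((cline(x, c, w) - f) ** 2 for x, f in zip(xs, freqs))
            if best is None or sse < best[0]:
                best = (sse, c, w)
    return best[1], best[2]

xs = [i / 2.0 for i in range(-10, 11)]      # 21 sites along the transect
freqs = [cline(x, 1.0, 2.0) for x in xs]    # noiseless synthetic data
center, width = fit_cline(xs, freqs)        # recovers center=1.0, width=2.0
```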
NASA Astrophysics Data System (ADS)
Grayver, Alexander V.; Kuvshinov, Alexey V.
2016-02-01
This paper presents a methodology to sample the equivalence domain (ED) in non-linear PDE-constrained inverse problems. For this purpose, we first applied the state-of-the-art stochastic optimization algorithm called Covariance Matrix Adaptation Evolution Strategy (CMAES) to identify low-misfit regions of the model space. These regions were then randomly sampled to create an ensemble of equivalent models and quantify uncertainty. CMAES is aimed at exploring model space globally and is robust on very ill-conditioned problems. We show that the number of iterations required to converge grows at a moderate rate with respect to the number of unknowns and the algorithm is embarrassingly parallel. We formulated the problem by using the generalized Gaussian distribution. This enabled us to seamlessly use arbitrary norms for residual and regularization terms. We show that various regularization norms facilitate studying different classes of equivalent solutions. We further show how performance of the standard Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm can be substantially improved by using information CMAES provides. This methodology was tested by using individual and joint inversions of magnetotelluric, controlled-source electromagnetic (EM) and global EM induction data.
Moisture in multilayer ceramic capacitors
NASA Astrophysics Data System (ADS)
Donahoe, Daniel Noel
When both precious metal electrode and base metal electrode (BME) capacitors were subjected to autoclave (120°C/100% RH) testing, it was found that the precious metal capacitors aged according to a well-known aging mechanism (less than 3% from their starting values), but the BME capacitors degraded to below the -30% criterion at 500 hours of exposure. The reasons for this new failure mechanism are complex, and two theories were hypothesized. The first was that there could be oxidation or corrosion of the nickel plates. The other hypothesis was that the loss of capacitance was due to molecular changes in the barium titanate. This thesis presents the evaluation of these hypotheses and the physics of the degradation mechanism. It is concluded, by elimination, that there are molecular changes in the barium titanate. Furthermore, the continuous reduction in capacitor size makes the newer base metal electrode capacitors more vulnerable to moisture degradation than the older generation precious metal capacitors. In addition, standard humidity life testing, such as JESD-22 THB and HAST, will likely not uncover this problem. Therefore, poor reliability due to degradation of base metal electrode multilayer ceramic capacitors may catch manufacturers and consumers by surprise.
Bayesian Statistical Approach To Binary Asteroid Orbit Determination
NASA Astrophysics Data System (ADS)
Dmitrievna Kovalenko, Irina; Stoica, Radu S.
2015-08-01
Orbit determination from observations is one of the classical problems in celestial mechanics. Deriving the trajectory of a binary asteroid with high precision is much more complicated than that of a simple asteroid. Here we present a method of orbit determination based on Markov chain Monte Carlo (MCMC). This method can be used for preliminary orbit determination with a relatively small number of observations, or for adjustment of a previously determined orbit. The problem consists in determining a conditional a posteriori probability density given the observations. Applying Bayesian statistics, the a posteriori probability density of the binary asteroid orbital parameters is proportional to the product of the a priori and likelihood probability densities. The likelihood function is related to the noise probability density and can be calculated from O-C deviations (Observed minus Calculated positions). The optional a priori probability density takes into account information about the population of discovered asteroids and is used to constrain the phase space of possible orbits. As the MCMC method, the Metropolis-Hastings algorithm has been applied, with a globally convergent coefficient added. The sequence of possible orbits is derived through sampling of each orbital parameter and acceptance criteria. The method allows determination of the phase space of every possible orbit considering each parameter. It can also be used to derive the single orbit with the highest probability density of orbital elements.
Land subsidence associated with hydrocarbon production, Texas Gulf Coast
Kreitler, C.W.; White, W.A.; Akhter, M.S.
1988-01-01
Although ground-water withdrawal has been the predominant cause of land subsidence in the Texas Gulf Coast, localized subsidence and faulting have also resulted from hydrocarbon production. Subsidence was documented as early as the 1920s over the Goose Creek field. Since then, subsidence and/or faulting have been identified over the Saxet, South Houston, Chocolate Bayou, Hastings, Alco-Mag, Clinton, Mykawa, Blue Ridge, Webster, and Caplen oil fields. Oil-production-related subsidence over these fields generally creates few environmental or engineering problems. One exception is the subsidence and faulting over the Caplen oil field on Bolivar Peninsula, where more than 1,000 ac of saltwater marsh has been replaced by subaqueous flats. Subsidence may be occurring over other fields but has not been identified because of limited releveled benchmark data. An evaluation of drill-stem and bottom-hole pressure data for the Frio Formation in Texas indicates extensive depressurization presumably from hydrocarbon production. Nearly 12,000 measurements from a pressure data base of 17,000 measurements indicate some depressurization. Some of the Frio zones have pressure declines of more than 1,500 psi from original hydrostatic conditions. Subsidence and faulting may be associated with these fields in the Frio as well as other Tertiary formations where extensive hydrocarbon production and subsequent depressurization have occurred.
Successive equimarginal approach for optimal design of a pump and treat system
NASA Astrophysics Data System (ADS)
Guo, Xiaoniu; Zhang, Chuan-Mian; Borthwick, John C.
2007-08-01
An economic concept-based optimization method is developed for groundwater remediation design. Design of a pump and treat (P&T) system is viewed as a resource allocation problem constrained by specified cleanup criteria. An optimal allocation of resources requires that the equimarginal principle, a fundamental economic principle, must hold. The proposed method is named successive equimarginal approach (SEA), which continuously shifts a pumping rate from a less effective well to a more effective one until equal marginal productivity for all units is reached. Through the successive process, the solution evenly approaches the multiple inequality constraints that represent the specified cleanup criteria in space and in time. The goal is to design an equal protection system so that the distributed contaminant plumes can be equally contained without bypass and overprotection is minimized. SEA is a hybrid of the gradient-based method and the deterministic heuristics-based method, which allows flexibility in dealing with multiple inequality constraints without using a penalty function and in balancing computational efficiency with robustness. This method was applied to design a large-scale P&T system for containment of multiple plumes at the former Blaine Naval Ammunition Depot (NAD) site, near Hastings, Nebraska. To evaluate this method, the SEA results were also compared with those using genetic algorithms.
Catching supermassive black hole binaries without a net
NASA Astrophysics Data System (ADS)
Cornish, Neil J.; Porter, Edward K.
2007-01-01
The gravitational wave signals from coalescing Supermassive Black Hole Binaries are prime targets for the Laser Interferometer Space Antenna (LISA). With optimal data processing techniques, the LISA observatory should be able to detect black hole mergers anywhere in the Universe. The challenge is to find ways to dig the signals out of a combination of instrument noise and the large foreground from stellar mass binaries in our own galaxy. The standard procedure of matched filtering against a grid of templates can be computationally prohibitive, especially when the black holes are spinning or the mass ratio is large. Here we develop an alternative approach based on Metropolis-Hastings sampling and simulated annealing that is orders of magnitude cheaper than a grid search. For the first time, we show that it is possible to detect and characterize the signals from binary systems of Schwarzschild Black Holes that are embedded in instrument noise and a foreground containing millions of galactic binaries. Our technique is computationally efficient, robust, and applicable to both high and low signal-to-noise ratio systems.
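The combination of Metropolis-Hastings moves with simulated annealing can be sketched on a one-dimensional toy "likelihood" (the temperature schedule, step size, seed and test function below are invented for the illustration; the actual LISA search operates on a high-dimensional waveform parameter space):

```python
import math
import random

def annealed_search(log_like, x0, n_steps=20000, t0=10.0, seed=1):
    """Metropolis-Hastings moves with a falling temperature.  Early,
    hot steps accept most moves and roam the parameter space; as the
    temperature anneals toward 1 the chain settles onto the dominant
    mode.  The best point seen is tracked, so the routine doubles as
    a cheap global maximizer -- far cheaper than templating a grid."""
    rng = random.Random(seed)
    x, best = x0, x0
    for i in range(n_steps):
        temp = 1.0 + (t0 - 1.0) * (1.0 - i / n_steps)  # anneal t0 -> 1
        x_prop = x + rng.gauss(0.0, 0.1 * math.sqrt(temp))
        # Tempered acceptance: divide the log-likelihood gap by temp.
        if math.log(rng.random()) < (log_like(x_prop) - log_like(x)) / temp:
            x = x_prop
        if log_like(x) > log_like(best):
            best = x
    return best

# Toy 'likelihood': global peak at x = 3, small decoy bump at x = -2.
f = lambda x: -0.5 * (x - 3.0) ** 2 + 2.0 * math.exp(-8.0 * (x + 2.0) ** 2)
best = annealed_search(f, -2.0)
```

Even when started on the decoy feature, the hot early phase lets the chain climb out and the tracked best point ends up near the global peak.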
Local perturbations perturb—exponentially–locally
De Roeck, W.; Schütz, M.
2015-06-15
We elaborate on the principle that for gapped quantum spin systems with local interaction, “local perturbations [in the Hamiltonian] perturb locally [the groundstate].” This principle was established by Bachmann et al. [Commun. Math. Phys. 309, 835–871 (2012)], relying on the “spectral flow technique” or “quasi-adiabatic continuation” [M. B. Hastings, Phys. Rev. B 69, 104431 (2004)] to obtain locality estimates with sub-exponential decay in the distance to the spatial support of the perturbation. We use ideas of Hamza et al. [J. Math. Phys. 50, 095213 (2009)] to similarly obtain a transformation between gapped eigenvectors and their perturbations that is local with exponential decay. This allows us to improve locality bounds on the effect of perturbations on the low lying states in certain gapped models with a unique “bulk ground state” or “topological quantum order.” We also give estimates of the exponential decay of correlations in models with impurities where some relevant correlations decay faster than one would naively infer from the global gap of the system, as one also expects in disordered systems with a localized groundstate.
The Atmosphere of WASP-14b Revealed by Three Spitzer Eclipses
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, J.; Stevenson, K. B.; Madhusudhan, N.; Hardy, R. A.; Campo, C. J.; Bowman, W. C.; Nymeyer, S.; Cubillos, P.; WASP Consortium
2010-10-01
WASP-14b is a hot Jupiter planet (Y. C. Joshi et al. 2009, MNRAS 392, 1532-1538) with mass (7.3 Jupiter masses) and density (4.6 g/cm3) that exceed those of most known extrasolar planets. It is very close to its host star (semimajor axis = 0.036 AU), giving it a typical equilibrium temperature of 1800 K for zero albedo, circular orbit, and uniform reemission. In addition, its significant orbital eccentricity (0.09) suggests the possibility of a companion planet. All of the above makes this object very interesting for atmospheric study. Spitzer program 60021 (H. Knutson, PI) obtained a secondary eclipse light curve at 3.6 microns on 2010-03-19 and the Spitzer Exoplanet Target of Opportunity Program (program 50517, J. Harrington, PI) obtained eclipse data at 4.5 and 8.0 microns on 2009-03-18. We fit the data with analytic light curve models using a Metropolis-Hastings Markov Chain Monte Carlo (MCMC) algorithm that incorporates corrections for systematic effects. We present estimates of infrared brightness temperatures and constraints on atmospheric composition and thermal structure. This work is based on observations made with the Spitzer Space Telescope, which is operated by the Jet Propulsion Laboratory, California Institute of Technology under a contract with NASA. Support for this work was provided by NASA through an award issued by JPL/Caltech.
NASA Astrophysics Data System (ADS)
Allard, Alexandre; Fischer, Nicolas; Ebrard, Géraldine; Hay, Bruno; Harris, Peter; Wright, Louise; Rochais, Denis; Mattout, Jeremie
2016-02-01
The determination of thermal diffusivity is at the heart of modern materials characterisation. The evaluation of the associated uncertainty is difficult because the determination is performed in an indirect way, in the sense that the thermal diffusivity cannot be measured directly. The well-known GUM uncertainty framework does not provide a reliable evaluation of measurement uncertainty for such inverse problems, because in that framework the underlying measurement model is supposed to be a direct relationship between the measurand (the quantity intended to be measured) and the input quantities on which the measurand depends. This paper is concerned with the development of a Bayesian approach to evaluate the measurement uncertainty associated with thermal diffusivity. A Bayesian model is first developed for a single thermogram and is then extended to the case of several thermograms obtained under repeatability and reproducibility conditions. This multi-thermogram based model is able to take into consideration a large set of influencing quantities that occur during the measurements and yields a more reliable uncertainty evaluation than the one obtained from a single thermogram. Different aspects of the Bayesian model are discussed, including the sensitivity to the choice of the prior distribution, the Metropolis-Hastings algorithm used for the inference and the convergence of the Markov chains.
Gronemus, Jenny Q; Hair, Pamela S; Crawford, Katrina B; Nyalwidhe, Julius O; Cunnion, Kenji M; Krishna, Neel K
2010-01-01
Previous work from our laboratories has demonstrated that purified, recombinant human astrovirus coat protein (HAstV CP) binds C1q and mannose-binding lectin (MBL) inhibiting activation of the classical and lectin pathways of complement, respectively. Analysis of the 787 amino acid CP molecule revealed that residues 79-139 share limited sequence homology with human neutrophil defensin-1 (HNP-1), a molecule previously demonstrated to bind C1q and MBL, inhibiting activation of the classical and lectin pathways of complement, respectively. A 30 amino acid peptide derived from this region of the CP molecule competitively inhibited the binding of wild-type CP to C1q. The parent peptide and various derivatives were subsequently assayed for C1q binding, inhibition of C1 and C4 activation as well as suppression of complement activation in hemolytic assays. The parent peptide and several derivatives inhibited complement activation in these functional assays to varying degrees. One peptide derivative in particular (E23A) displayed superior inhibition of complement activation in multiple assays of classical complement pathway activation. Further analysis revealed homology to a plant defensin allowing development of a proposed structural model for E23A. Based upon these findings, we hypothesize that further rational optimization of E23A may result in a promising therapeutic inhibitor for the treatment of inflammatory and autoimmune diseases in which dysregulated activation of the classical and lectin pathways of complement contribute to pathogenesis. PMID:20728940
Alternative Test Methods for Electronic Parts
NASA Technical Reports Server (NTRS)
Plante, Jeannette
2004-01-01
It is common practice within NASA to test electronic parts at the manufacturing lot level to demonstrate, statistically, that parts from the lot tested will not fail in service using generic application conditions. The test methods and the generic application conditions used have been developed over the years through cooperation between NASA, DoD, and industry in order to establish a common set of standard practices. These common practices, found in MIL-STD-883, MIL-STD-750, military part specifications, EEE-INST-002, and other guidelines are preferred because they are considered to be effective and repeatable and their results are usually straightforward to interpret. These practices can sometimes be unavailable to some NASA projects due to special application conditions that must be addressed, such as schedule constraints, cost constraints, logistical constraints, or advances in the technology that make the historical standards an inappropriate choice for establishing part performance and reliability. Alternate methods have begun to emerge and to be used by NASA programs to test parts individually or as part of a system, especially when standard lot tests cannot be applied. Four alternate screening methods will be discussed in this paper: Highly accelerated life test (HALT), forward voltage drop tests for evaluating wire-bond integrity, burn-in options during or after highly accelerated stress test (HAST), and board-level qualification.
The High Flux Beam Reactor at Brookhaven National Laboratory
Shapiro, S.M.
1994-12-31
Brookhaven National Laboratory's High Flux Beam Reactor (HFBR) was built because scientists always want 'more'. In the mid-50s the Brookhaven Graphite Reactor was churning away, producing a number of new results, when the current generation of scientists, led by Donald Hughes, realized the need for a high flux reactor and started down the political, scientific, and engineering path that led to the HFBR. The effort was joined by a number of engineers and scientists, among them Chernick, Hastings, Kouts, and Hendrie, who came up with the novel design of the HFBR. The two innovative features that have been incorporated in nearly all other research reactors built since are: (i) an under-moderated core arrangement, which enables the thermal flux to peak outside the core region where beam tubes can be placed, and (ii) beam tubes that are tangential to the core, which decrease the fast neutron background without affecting the thermal beam intensity. Construction began in the fall of 1961 and four years later, at a cost of $12 million, criticality was achieved on Halloween night, 1965. Thus began 30 years of scientific accomplishments.
Performance of Thermal Mass Flow Meters in a Variable Gravitational Environment
NASA Technical Reports Server (NTRS)
Brooker, John E.; Ruff, Gary A.
2004-01-01
The performance of five thermal mass flow meters, MKS Instruments 179A and 258C, Unit Instruments UFM-8100, Sierra Instruments 830L, and Hastings Instruments HFM-200, were tested on the KC-135 Reduced Gravity Aircraft in orthogonal, coparallel, and counterparallel orientations relative to gravity. Data was taken throughout the parabolic trajectory where the g-level varied from 0.01 to 1.8 times normal gravity. Each meter was calibrated in normal gravity in the orthogonal position prior to flight followed by ground testing at seven different flow conditions to establish a baseline operation. During the tests, the actual flow rate was measured independently using choked-flow orifices. Gravitational acceleration and attitude had a unique effect on the performance of each meter. All meters operated within acceptable limits at all gravity levels in the calibrated orthogonal position. However, when operated in other orientations, the deviations from the reference flow became substantial for several of the flow meters. Data analysis indicated that the greatest source of error was the effect of orientation, followed by the gravity level. This work emphasized that when operating thermal flow meters in a variable gravity environment, it is critical to orient the meter in the same direction relative to gravity in which it was calibrated. Unfortunately, there was no test in normal gravity that could predict the performance of a meter in reduced gravity. When operating in reduced gravity, all meters indicated within 5 percent of the full scale reading at all flow conditions and orientations.
Makita, Hiroki; Hastings, Gary
2016-01-01
Time-resolved visible and infrared absorption difference spectroscopy data at both 298 and 77 K were obtained using cyanobacterial menB− mutant photosystem I particles with several non-native quinones incorporated into the A1 binding site. Data was obtained for photosystem I particles with phylloquinone (2-methyl-3-phytyl-1,4-naphthoquinone), 2-bromo-1,4-naphthoquinone, 2-chloro-1,4-naphthoquinone, 2-methyl-1,4-naphthoquinone, 2,3-dibromo-1,4-naphthoquinone, 2,3-dichloro-1,4-naphthoquinone, and 9,10-anthraquinone incorporated. Transient absorption data were obtained at 487 and 703 nm in the visible spectral range, and 1950–1100 cm−1 in the infrared region. Time constants obtained from fitting the time-resolved infrared and visible data are in good agreement. The measured time constants are crucial for the development of appropriate kinetic models that can describe electron transfer processes in photosystem I, “Modeling Electron Transfer in Photosystem I” Makita and Hastings (2016) [1]. PMID:27182540
Exact sampling of the unobserved covariates in Bayesian spline models for measurement error problems
Carroll, Raymond J.
2015-01-01
In truncated polynomial spline or B-spline models where the covariates are measured with error, a fully Bayesian approach to model fitting requires the covariates and model parameters to be sampled at every Markov chain Monte Carlo iteration. Sampling the unobserved covariates poses a major computational problem and usually Gibbs sampling is not possible. This forces the practitioner to use a Metropolis–Hastings step which might suffer from unacceptable performance due to poor mixing and might require careful tuning. In this article we show that, for truncated polynomial spline or B-spline models of degree one, the complete conditional distribution of the covariates measured with error is available explicitly as a mixture of double-truncated normals, thereby enabling a Gibbs sampling scheme. We demonstrate via a simulation study that our technique performs favorably in terms of computational efficiency and statistical performance. Our results indicate increases of up to 62% and 54% in mean integrated squared error efficiency compared to existing alternatives when using truncated polynomial splines and B-splines, respectively. Furthermore, there is evidence that the gain in efficiency increases with the measurement error variance, indicating the proposed method is a particularly valuable tool for challenging applications that present high measurement error. We conclude with a demonstration on a nutritional epidemiology data set from the NIH-AARP study and by pointing out some possible extensions of the current work. PMID:27418743
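The degree-one result makes the awkward Metropolis-Hastings step unnecessary precisely because a mixture of double-truncated normals can be sampled directly. A minimal sketch of such a Gibbs draw follows, with placeholder component parameters rather than the paper's actual conditionals:

```python
import random

def sample_truncnorm(mu, sigma, lo, hi):
    """Draw from N(mu, sigma^2) truncated to [lo, hi] by rejection
    sampling (adequate for a sketch; inverse-CDF methods are preferable
    when the interval sits far out in a tail)."""
    while True:
        x = random.gauss(mu, sigma)
        if lo <= x <= hi:
            return x

def gibbs_draw_mixture(weights, mus, sigmas, los, his):
    """One Gibbs draw from a mixture of truncated normals -- the closed
    form the paper derives for the complete conditional of a mismeasured
    covariate under degree-one splines. The component parameters passed
    in below are illustrative placeholders, not the paper's."""
    u = random.random() * sum(weights)   # pick a component by its weight
    acc = 0.0
    for w, mu, s, lo, hi in zip(weights, mus, sigmas, los, his):
        acc += w
        if u <= acc:
            return sample_truncnorm(mu, s, lo, hi)
    return sample_truncnorm(mus[-1], sigmas[-1], los[-1], his[-1])

# Hypothetical two-component conditional: weights 0.3/0.7, components
# N(0, 0.5^2) truncated to [-1, 1] and N(2, 0.5^2) truncated to [1, 3].
random.seed(0)
draws = [gibbs_draw_mixture([0.3, 0.7], [0.0, 2.0], [0.5, 0.5],
                            [-1.0, 1.0], [1.0, 3.0]) for _ in range(5000)]
```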
Mathieu, Jordane A; Hatté, Christine; Balesdent, Jérôme; Parent, Éric
2015-11-01
The response of soil carbon dynamics to climate and land-use change will affect both the future climate and the quality of ecosystems. Deep soil carbon (>20 cm) is the primary component of the soil carbon pool, but the dynamics of deep soil carbon remain poorly understood. Therefore, radiocarbon activity (Δ14C), which is a function of the age of carbon, may help to understand the rates of soil carbon biodegradation and stabilization. We analyzed the published 14C contents in 122 profiles of mineral soil that were well distributed in most of the large world biomes, except for the boreal zone. With a multivariate extension of a linear mixed-effects model whose inference was based on the parallel combination of two algorithms, the expectation-maximization (EM) and Metropolis-Hastings algorithms, we expressed soil Δ14C profiles as a four-parameter function of depth. The four-parameter model produced insightful predictions of soil Δ14C as dependent on depth, soil type, climate, vegetation, land-use and date of sampling (R2=0.68). Further analysis with the model showed that the age of topsoil carbon was primarily affected by climate and cultivation. By contrast, the age of deep soil carbon was affected more by soil taxa than by climate and thus illustrated the strong dependence of soil carbon dynamics on other pedologic traits such as clay content and mineralogy. PMID:26119088
Characterization of Interstellar Organic Molecules
Gencaga, Deniz; Knuth, Kevin H.; Carbon, Duane F.
2008-11-06
Understanding the origins of life has been one of the greatest dreams throughout history. It is now known that star-forming regions contain complex organic molecules, known as Polycyclic Aromatic Hydrocarbons (PAHs), each of which has particular infrared spectral characteristics. By understanding which PAH species are found in specific star-forming regions, we can better understand the biochemistry that takes place in interstellar clouds. Identifying and classifying PAHs is not an easy task: we can only observe a single superposition of PAH spectra at any given astrophysical site, with the PAH species perhaps numbering in the hundreds or even thousands. This is a challenging source separation problem since we have only one observation composed of numerous mixed sources. However, it is made easier with the help of a library of hundreds of PAH spectra. In order to separate PAH molecules from their mixture, we need to identify the specific species and their unique concentrations that would produce the given mixture. We develop a Bayesian approach for this problem where sources are separated from their mixture by a Metropolis-Hastings algorithm. Separated PAH concentrations are provided with their error bars, illustrating the uncertainties involved in the estimation process. The approach is demonstrated on synthetic spectral mixtures using spectral resolutions from the Infrared Space Observatory (ISO). Performance of the method is tested for different noise levels.
The artificial nutrition debate: still an issue... after all these years.
Monturo, Cheryl
2009-01-01
Debate over withdrawal or withholding of artificial nutrition appeared a distant discussion until the furor over the Schiavo case and a Papal Allocution reignited this ethical dilemma. The purpose of this article is to provide a review of the bioethical opinion regarding artificial nutrition, as published in the Hastings Center Report from 1971 until 2007. A clinical and religious history of the evolution and use of artificial nutrition prefaces the review containing common themes and categories framed within a chronology of bioethical and legal events. Finally, an interpretative philosophical discussion is offered on the resurgence of the ethical dilemma concerning withdrawal or withholding of artificial nutrition. Through a combination of classic content analysis and grounded theory, 8 inductively derived categories emerged from a sample of 63 articles/letters with a primary focus on artificial nutrition, enteral nutrition or parenteral nutrition. These categories included illness/treatment trajectory, personhood, family, provider, cost, religion, legal, and ethics and morality. In more than 35 years, surprisingly little has changed with regard to withdrawal or withholding of artificial nutrition. As the Schiavo case revealed, despite a sense in bioethics of a firm consensus about handling the withdrawal of food and water, many are still searching for answers to this dilemma. PMID:19321894
Measurement Scale of the SOLIS Vector Spectromagnetograph
NASA Technical Reports Server (NTRS)
Jones, Harrison P.; Harvey, John W.; Henney, Carl J.; Keller, Christoph U.; Malanushenko, Olena M.
2004-01-01
Longitudinal magnetograms obtained with the SOLIS Vector Spectromagnetograph (VSM) during a cross-calibration period are compared with similar data from the NASA/NSO Spectromagnetograph (SPM) at the NSO/Kitt Peak Vacuum Telescope as well as with SOHO/MDI and GONG magnetograms. The VSM began observation at the University of Arizona agricultural test site, and collaborative observations were obtained with both the VSM and SPM from 2003 Aug 05 through 2003 Sep 21, when the SPM was officially retired. The VSM replaces the SPM and continues the 30-year NSO/Kitt Peak synoptic magnetogram record. Magnetograms are compared by equating histograms and, for selected examples, by pixel-by-pixel comparison of co-registered images. The VSM was not corrected for polarization crosstalk and was operated without fast guiding. Solar activity was at best moderate during this period. Over the range of observed fields, the VSM magnetograms show greatly improved sensitivity but are otherwise virtually identical with "raw" SPM magnetograms. GONG magnetograms are also closely comparable with the SPM, while MDI flux values tend to be stronger by a factor of 1.2-1.4. Dependence of the results on seeing will be discussed. Partial funding for this work was provided through Solar and Heliospheric Research Supporting Research and Technology grants from NASA's Office of Space Sciences.
Algorithmic and architectural optimizations for computationally efficient particle filtering.
Sankaranarayanan, Aswin C; Srivastava, Ankur; Chellappa, Rama
2008-05-01
In this paper, we analyze the computational challenges in implementing particle filtering, especially as applied to video sequences. Particle filtering is a technique used for filtering nonlinear dynamical systems driven by non-Gaussian noise processes. It has found widespread application in detection, navigation, and tracking problems. Although particle filtering methods generally yield improved results, it is difficult to achieve real-time performance. In this paper, we analyze the computational drawbacks of traditional particle filtering algorithms and present a method for implementing the particle filter using the independent Metropolis-Hastings sampler, which is highly amenable to pipelined implementations and parallelization. We analyze implementations of the proposed algorithm and, in particular, concentrate on implementations that have minimum processing times. It is shown that the design parameters for the fastest implementation can be chosen by solving a set of convex programs. The proposed computational methodology was verified using a cluster of PCs for the application of visual tracking. We demonstrate a linear speed-up of the algorithm using the methodology proposed in the paper. PMID:18390378
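The independence of the proposals is what makes this sampler hardware-friendly: candidate draws and their importance-style weights do not depend on the chain state, so they can be generated and evaluated ahead of the strictly sequential accept/reject pass. A minimal sketch under a toy Gaussian target (not the paper's visual-tracking likelihood):

```python
import math, random

def independent_mh(log_target, log_prop, draw_prop, n):
    """Independent Metropolis-Hastings: proposals come from a fixed
    distribution q, independent of the current state, so the acceptance
    ratio reduces to a ratio of importance-style weights w = p/q.
    Sketch only; in a pipelined implementation the draw/weight steps
    would be batched ahead of the accept/reject loop."""
    x = draw_prop()
    lw_x = log_target(x) - log_prop(x)     # log weight of current state
    out = []
    for _ in range(n):
        y = draw_prop()                    # drawn independently of x
        lw_y = log_target(y) - log_prop(y)
        if math.log(random.random()) < lw_y - lw_x:
            x, lw_x = y, lw_y
        out.append(x)
    return out

# Toy target N(1, 0.5^2) with a wider N(0, 2^2) proposal; normalizing
# constants cancel in the acceptance ratio and are omitted.
random.seed(7)
log_t = lambda x: -0.5 * ((x - 1.0) / 0.5) ** 2
log_q = lambda x: -0.5 * (x / 2.0) ** 2
chain = independent_mh(log_t, log_q, lambda: random.gauss(0.0, 2.0), 20000)
```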
Hydrogen Safety Knowledge Tools
Fassbender, Linda L.
2011-01-31
With hydrogen gaining acceptance as an energy carrier for fuel cell vehicles and stationary fuel cell applications, a new community of hydrogen users is emerging and continues to grow. With this growth has come the need to spread the word about safe practices for handling, storing, and using hydrogen. Like all energy forms, hydrogen can be used safely through proper procedures and engineering techniques. However, hydrogen involves a degree of risk that must be respected, and the importance of avoiding complacency or haste in the safe conduct and performance of projects involving hydrogen cannot be overstated. To encourage and promote the safe use of hydrogen, Pacific Northwest National Laboratory (PNNL) has developed and continues to enhance two software tools in support of the U.S. Department of Energy's Fuel Cell Technologies Program: the Hydrogen Safety Best Practices online manual (www.H2BestPractices.org) and the Hydrogen Incident Reporting and Lessons Learned database (www.H2Incidents.org).
Trowell, Stephen C; Dacres, Helen; Dumancic, Mira M; Leitch, Virginia; Rickards, Rodney W
2016-09-16
Bioluminescence is the emission of visible light by living organisms. Here we describe the isolation and characterisation of a cDNA encoding a MW ≈ 59,000 Da luciferase from the Australian glow-worm, Arachnocampa richardsae. The enzyme is a member of the acyl-CoA ligase superfamily and produces blue light on addition of D-luciferin. These results are contrary to earlier reports (Lee, J., Photochem Photobiol 24, 279-285 (1976), Viviani, V. R., Hastings, J. W. & Wilson, T., Photochem Photobiol 75, 22-27 (2002)), which suggested glow-worm luciferase has MW ≈ 36,000 Da and is unreactive with beetle luciferin. There are more than 2000 species of firefly, which all produce emissions from D-luciferin in the green to red regions of the electromagnetic spectrum. Although blue-emitting luciferases are known from marine organisms, they belong to different structural families and use a different substrate. The observation of blue emission from a D-luciferin-using enzyme is therefore unprecedented. PMID:27457804
The search for massive black hole binaries with LISA
NASA Astrophysics Data System (ADS)
Cornish, Neil J.; Porter, Edward K.
2007-12-01
In this work we focus on the search and detection of massive black hole binary (MBHB) systems, including systems at high redshift. As well as expanding on previous works where we used a variant of Markov chain Monte Carlo (MCMC), called Metropolis-Hastings Monte Carlo, with simulated annealing, we introduce a new search method based on frequency annealing which leads to a more rapid and robust detection. We compare the two search methods on systems where we do and do not see the merger of the black holes. In the non-merger case, we also examine the posterior distribution exploration using a 7D MCMC algorithm. We demonstrate that this method is effective in dealing with the high correlations between parameters, has a higher acceptance rate than previously proposed methods and produces posterior distribution functions that are close to the prediction from the Fisher information matrix. Finally, after carrying out searches where there is only one binary in the data stream, we examine the case where two black hole binaries are present in the same data stream. We demonstrate that our search algorithm can accurately recover both binaries and, more importantly, show that we can safely extract the MBHB sources without contaminating the rest of the data stream.
A Hamiltonian Monte-Carlo method for Bayesian inference of supermassive black hole binaries
NASA Astrophysics Data System (ADS)
Porter, Edward K.; Carré, Jérôme
2014-07-01
We investigate the use of a Hamiltonian Monte-Carlo to map out the posterior density function for supermassive black hole binaries. While previous Markov Chain Monte Carlo (MCMC) methods, such as Metropolis-Hastings MCMC, have been successfully employed for a number of different gravitational wave sources, these methods are essentially random walk algorithms. The Hamiltonian Monte-Carlo treats the inverse likelihood surface as a ‘gravitational potential’ and, by introducing canonical positions and momenta, dynamically evolves the Markov chain by solving Hamilton's equations of motion. This method is not as widely used as other MCMC algorithms due to the necessity of calculating gradients of the log-likelihood, which for most applications results in a bottleneck that makes the algorithm computationally prohibitive. We circumvent this problem by using accepted initial phase-space trajectory points to analytically fit for each of the individual gradients. Eliminating the waveform generation needed for the numerical derivatives reduces the total number of required templates for a 10^6-iteration chain from ~10^9 to ~10^6. The result is an implementation of the Hamiltonian Monte-Carlo that is faster, and more efficient by a factor of approximately the dimension of the parameter space, than a Hessian MCMC.
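The mechanics of a single Hamiltonian Monte-Carlo step can be sketched with the standard leapfrog integrator. One deliberate simplification: the paper's key trick is fitting the gradients analytically from accepted trajectory points, whereas this sketch, for a toy one-dimensional Gaussian target, simply calls an exact gradient.

```python
import math, random

def hmc_step(x, log_post, grad, eps=0.1, n_leap=20):
    """One HMC step: sample a momentum, run the leapfrog integrator on
    H = -log_post(x) + p^2/2, then accept or reject on the energy error.
    The exact gradient call stands in for the paper's analytic gradient
    fit; everything here is a 1-D toy sketch."""
    p = random.gauss(0.0, 1.0)
    x_new, p_new = x, p
    p_new += 0.5 * eps * grad(x_new)       # initial momentum half-step
    for i in range(n_leap):
        x_new += eps * p_new               # full position step
        if i != n_leap - 1:
            p_new += eps * grad(x_new)     # full momentum step
    p_new += 0.5 * eps * grad(x_new)       # final momentum half-step
    h_old = -log_post(x) + 0.5 * p * p
    h_new = -log_post(x_new) + 0.5 * p_new * p_new
    if math.log(random.random()) < h_old - h_new:
        return x_new                       # accept the trajectory end
    return x                               # reject: stay put

# Toy posterior: standard normal, so grad of the log-posterior is -x.
random.seed(3)
log_post = lambda x: -0.5 * x * x
grad = lambda x: -x
x, samples = 0.0, []
for _ in range(5000):
    x = hmc_step(x, log_post, grad)
    samples.append(x)
```

Because each trajectory covers a long arc of phase space, successive samples are nearly independent, which is the efficiency advantage over random-walk MCMC that the abstract describes.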
Villeirs, Geert M.; Meerleer, Gert O. de; Verstraete, Koenraad L.; Neve, Wilfried J. de
2004-12-01
Purpose: To measure prostate motion with magnetic resonance imaging (MRI) during a course of intensity-modulated radiotherapy. Methods and materials: Seven patients with prostate carcinoma were scanned supine on a 1.5-Tesla MRI system with weekly pretreatment and on-treatment HASTE T2-weighted images in 3 orthogonal planes. The bladder and rectal volumes and position of the prostatic midpoint (PMP) and margins relative to the bony pelvis were measured. Results: All pretreatment positions were at the mean position as computed from the on-treatment scans in each patient. The PMP variability (given as 1 SD) in the anterior-posterior (AP), superior-inferior (SI), and right-left (RL) directions was 2.6, 2.4, and 1.0 mm, respectively. The largest variabilities occurred at the posterior (3.2 mm), superior (2.6 mm), and inferior (2.6 mm) margins. A strong correlation was found between large rectal volume (>95th percentile) and anterior PMP displacement. A weak correlation was found between bladder volume and superior PMP displacement. Conclusions: All pretreatment positions were representative of the subsequent on-treatment positions. A clinical target volume (CTV) expansion of 5.3 mm in any direction was sufficient to ascertain a 95% coverage of the CTV within the planning target volume (PTV), provided that a rectal suppository is administered to avoid rectal overdistension and that the patient has a comfortably filled bladder (<300 mL).
Improving Bayesian analysis for LISA Pathfinder using an efficient Markov Chain Monte Carlo method
NASA Astrophysics Data System (ADS)
Ferraioli, Luigi; Porter, Edward K.; Armano, Michele; Audley, Heather; Congedo, Giuseppe; Diepholz, Ingo; Gibert, Ferran; Hewitson, Martin; Hueller, Mauro; Karnesis, Nikolaos; Korsakova, Natalia; Nofrarias, Miquel; Plagnol, Eric; Vitale, Stefano
2014-02-01
We present a parameter estimation procedure based on a Bayesian framework by applying a Markov Chain Monte Carlo algorithm to the calibration of the dynamical parameters of the LISA Pathfinder satellite. The method is based on the Metropolis-Hastings algorithm and a two-stage annealing treatment in order to ensure an effective exploration of the parameter space at the beginning of the chain. We compare two versions of the algorithm with an application to a LISA Pathfinder data analysis problem. The two algorithms share the same heating strategy, but one moves in coordinate directions using proposals from a multivariate Gaussian distribution, while the other uses the natural logarithm of some parameters and proposes jumps in the eigen-space of the Fisher Information matrix. The algorithm proposing jumps in the eigen-space of the Fisher Information matrix demonstrates a higher acceptance rate and a slightly better convergence towards the equilibrium parameter distributions in the application to LISA Pathfinder data. For this experiment, we return parameter values that are all within ~1σ of the injected values. When we analyse the accuracy of our parameter estimation in terms of the effect it has on the force-per-unit-mass noise, we find that the induced errors are three orders of magnitude less than the expected experimental uncertainty in the power spectral density.
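The eigen-space proposal strategy can be sketched directly: decompose the Fisher information matrix, pick a random eigendirection, and scale the jump by one over the square root of the eigenvalue, so tightly constrained parameter combinations receive small moves and poorly constrained ones receive large moves. The 2x2 matrix below is an illustrative stand-in, not LISA Pathfinder's.

```python
import numpy as np

def fisher_jump(x, fisher, scale=1.0, rng=np.random.default_rng(0)):
    """Propose an MCMC jump along a random eigendirection of the Fisher
    information matrix, with step size scale/sqrt(eigenvalue). The
    shared default rng (created once at definition time) is deliberate,
    so repeated calls draw fresh values. Sketch only."""
    vals, vecs = np.linalg.eigh(fisher)       # eigenvalues, ascending
    k = rng.integers(len(vals))               # random eigendirection
    step = rng.normal() * scale / np.sqrt(vals[k])
    return x + step * vecs[:, k]

# Two parameters with very different constraints (curvatures 1e4 and 1):
# jumps along the stiff direction come out ~100x smaller than along the
# soft one, matching the local shape of the likelihood.
fim = np.array([[1.0e4, 0.0], [0.0, 1.0]])
x_new = fisher_jump(np.zeros(2), fim)
```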
Ma, Jianming; Kockelman, Kara M; Damien, Paul
2008-05-01
Numerous efforts have been devoted to investigating crash occurrence as related to roadway design features, environmental factors and traffic conditions. However, most of the research has relied on univariate count models; that is, traffic crash counts at different levels of severity are estimated separately, which may neglect shared information in unobserved error terms, reduce efficiency in parameter estimates, and lead to potential biases in sample databases. This paper offers a multivariate Poisson-lognormal (MVPLN) specification that simultaneously models crash counts by injury severity. The MVPLN specification allows for a more general correlation structure as well as overdispersion. This approach addresses several questions that are difficult to answer when estimating crash counts separately. Thanks to recent advances in crash modeling and Bayesian statistics, parameter estimation is done within the Bayesian paradigm, using a Gibbs Sampler and the Metropolis-Hastings (M-H) algorithms for crashes on Washington State rural two-lane highways. Estimation results from the MVPLN approach show statistically significant correlations between crash counts at different levels of injury severity. The non-zero diagonal elements suggest overdispersion in crash counts at all levels of severity. The results lend themselves to several recommendations for highway safety treatments and design policies. For example, wide lanes and shoulders are key for reducing crash frequencies, as are longer vertical curves. PMID:18460364
Estimation of seabed shear-wave velocity profiles using shear-wave source data.
Dong, Hefeng; Nguyen, Thanh-Duong; Duffaut, Kenneth
2013-07-01
This paper estimates seabed shear-wave velocity profiles and their uncertainties using interface-wave dispersion curves extracted from data generated by a shear-wave source. The shear-wave source generated a seismic signature over a frequency range between 2 and 60 Hz and was polarized in both in-line and cross-line orientations. Low-frequency Scholte and Love waves were recorded. Dispersion curves of the Scholte and Love waves for the fundamental mode and higher-order modes are extracted by three time-frequency analysis methods. Both the vertically and horizontally polarized shear-wave velocity profiles in the sediment are estimated from the Scholte- and Love-wave dispersion curves, respectively. A Bayesian approach is utilized for the inversion. Differential evolution, a global search algorithm, is applied to estimate the most-probable shear-velocity models. Marginal posterior probability profiles are computed by Metropolis-Hastings sampling. The estimated vertically and horizontally polarized shear-wave velocity profiles fit well with the core and in situ measurements. PMID:23862796
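Differential evolution, used above to find the most-probable models, can be sketched in a few lines. The misfit below is a toy 2-D quadratic and the control settings are invented, not the paper's dispersion-curve objective; the mutation/crossover/selection loop is the standard DE/rand/1 scheme.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy misfit standing in for a dispersion-curve objective (illustrative).
def misfit(v):
    return (v[0] - 1.2) ** 2 + (v[1] - 3.4) ** 2

pop = rng.uniform(-10.0, 10.0, size=(30, 2))   # candidate models
F, CR = 0.8, 0.9                               # DE weight, crossover rate
for _ in range(200):
    for i in range(len(pop)):
        # Mutation: combine three randomly drawn population members.
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        mutant = a + F * (b - c)
        # Crossover: mix mutant and current member gene-by-gene.
        cross = rng.uniform(size=2) < CR
        cross[rng.integers(2)] = True          # at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection keeps whichever model fits better.
        if misfit(trial) <= misfit(pop[i]):
            pop[i] = trial

best = pop[np.argmin([misfit(v) for v in pop])]
```

The global search returns a most-probable model; the Metropolis-Hastings stage mentioned in the abstract would then map out the posterior around it.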
Jalli, Reza; Jafari, Seyed Hamed; Sefidbakht, Sepideh; Kazemi, Kourosh
2015-01-01
Background: Hepatocellular carcinoma (HCC) is a neoplasm usually arising in a cirrhotic liver by a multistep carcinogenesis process. Early detection of HCC and accurate assessment of tumor burden are crucial to successful treatment planning and long-term survival. Objectives: In this study, we compared the accuracy of diffusion-weighted imaging (DWI) combined with a limited-sequence magnetic resonance imaging (MRI) set, as a potentially quick and practical MR candidate, with ultrasonography (US) for screening of HCC in patients with cirrhosis. Patients and Methods: Of 96 patients with cirrhosis, 30 who had concomitant HCC proved by pathology were selected. MRI, DWI, and US of the liver were performed for the patients. Sensitivity, specificity, and accuracy of DWI alone, limited-sequence MRI alone, their combination, and US were calculated for the detection of HCC in these patients, and the modalities were then compared. Results: The combination of limited-sequence MRI and DWI had the highest accuracy (94.79%), followed by DWI alone and then limited-sequence MRI alone. US had the lowest accuracy (78.12%), with a statistically significant difference. Conclusion: Given the significant improvement in the treatment of early-stage HCC compared to the previous decade, we suggest a fast, non-invasive, more accurate, but more expensive method (HASTE and OP/IP T1W sequence MRI combined with DWI) rather than US for the screening of HCC in liver cirrhosis. PMID:25785178
Compressible generalized hybrid Monte Carlo
NASA Astrophysics Data System (ADS)
Fang, Youhan; Sanz-Serna, J. M.; Skeel, Robert D.
2014-05-01
One of the most demanding calculations is to generate random samples from a specified probability distribution (usually with an unknown normalizing prefactor) in a high-dimensional configuration space. One often has to resort to using a Markov chain Monte Carlo method, which converges only in the limit to the prescribed distribution. Such methods typically inch through configuration space step by step, with acceptance of a step based on a Metropolis(-Hastings) criterion. An acceptance rate of 100% is possible in principle by embedding configuration space in a higher dimensional phase space and using ordinary differential equations. In practice, numerical integrators must be used, lowering the acceptance rate. This is the essence of hybrid Monte Carlo methods. Presented is a general framework for constructing such methods under relaxed conditions: the only geometric property needed is (weakened) reversibility; volume preservation is not needed. The possibilities are illustrated by deriving a couple of explicit hybrid Monte Carlo methods, one based on barrier-lowering variable-metric dynamics and another based on isokinetic dynamics.
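A minimal hybrid (Hamiltonian) Monte Carlo sampler of the kind described, leapfrog integration in an extended phase space followed by a Metropolis accept/reject on the integrator's energy error, might look like this. The target is a 1-D standard Gaussian and all settings are illustrative; the leapfrog scheme is reversible and volume-preserving, the geometric properties the accept/reject step relies on.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: standard 1-D Gaussian, so U(q) = q^2/2 and grad U = q.
def U(q):
    return 0.5 * q * q

def grad_U(q):
    return q

def hmc_step(q, eps=0.2, n_leap=10):
    p = rng.normal()                       # resample auxiliary momentum
    q_new, p_new = q, p
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new -= 0.5 * eps * grad_U(q_new)
    for _ in range(n_leap - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)
    # Accept or reject on the integrator's total-energy error: an exact
    # integrator would give dH = 0 and 100% acceptance.
    dH = (U(q_new) + 0.5 * p_new**2) - (U(q) + 0.5 * p**2)
    return q_new if np.log(rng.uniform()) < -dH else q

q, draws = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    draws.append(q)
draws = np.array(draws)
```

The smaller the discretization error of the integrator, the closer the acceptance rate gets to the ideal 100% mentioned above.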
The ethics of using quality improvement methods in health care.
Lynn, Joanne; Baily, Mary Ann; Bottrell, Melissa; Jennings, Bruce; Levine, Robert J; Davidoff, Frank; Casarett, David; Corrigan, Janet; Fox, Ellen; Wynia, Matthew K; Agich, George J; O'Kane, Margaret; Speroff, Theodore; Schyve, Paul; Batalden, Paul; Tunis, Sean; Berlinger, Nancy; Cronenwett, Linda; Fitzmaurice, J Michael; Dubler, Nancy Neveloff; James, Brent
2007-05-01
Quality improvement (QI) activities can improve health care but must be conducted ethically. The Hastings Center convened leaders and scholars to address ethical requirements for QI and their relationship to regulations protecting human subjects of research. The group defined QI as systematic, data-guided activities designed to bring about immediate improvements in health care delivery in particular settings and concluded that QI is an intrinsic part of normal health care operations. Both clinicians and patients have an ethical responsibility to participate in QI, provided that it complies with specified ethical requirements. Most QI activities are not human subjects research and should not undergo review by an institutional review board; rather, appropriately calibrated supervision of QI activities should be part of professional supervision of clinical practice. The group formulated a framework that would use key characteristics of a project and its context to categorize it as QI, human subjects research, or both, with the potential of a customized institutional review board process for the overlap category. The group recommended a period of innovation and evaluation to refine the framework for ethical conduct of QI and to integrate that framework into clinical practice. PMID:17438310
δ 18Osw estimate for Globigerinoides ruber from core-top sediments in the East China Sea
NASA Astrophysics Data System (ADS)
Horikawa, Keiji; Kodaira, Tomohiro; Zhang, Jing; Murayama, Masafumi
2015-12-01
The paired analyses of the Mg/Ca ratio and oxygen isotopic composition (δ18Oc) of surface-dwelling planktonic foraminifera have become a widely used method for reconstructing the oxygen isotopic composition of ambient seawater (δ18Osw) as a robust proxy for surface salinity. Globigerinoides ruber (G. ruber) is a mixed-layer dweller, and its fossil shell is an ideal archive for recording past sea surface water conditions, such as those caused by variability in the East Asian summer monsoon (EASM). Here, we investigate the validity of shell-derived δ18Osw estimates for G. ruber using core-top sediments from the East China Sea (ECS). First, we determined a local δ18Osw-salinity equation for the eastern part of the ECS in July [δ18Osw = -7.74 + 0.23 × salinity]. Then, we calculated δ18Osw from core-top δ18Oc and Mg/Ca values in G. ruber using the δ18Oc-temperature equation of Bemis et al. (Paleoceanography 13(2):150-160, 1998) and the Mg/Ca-temperature equation of Hastings et al. (EOS 82:PP12B-10, 2001). The core-top δ18Osw and salinity were estimated to be in the ranges of -0.2 to +0.39 ‰ and 33.7 to 34.5, respectively, which fall close to the local δ18Osw-salinity regression line. The core-top data showed that the Mg/Ca-temperature calibration by Hastings et al. (EOS 82:PP12B-10, 2001) and the δ18Oc-temperature equation by Bemis et al. (Paleoceanography 13(2):150-160, 1998) are appropriate for calculating δ18Osw in the ECS. Furthermore, we measured core-top Ba/Ca ratios of G. ruber (Ba/Ca_(G. ruber)), which ranged from 0.66 to 2.82 μmol mol-1. There was no significant relationship between salinity and the Ba/Ca_(G. ruber) ratios due to the highly variable Ba/Ca_(G. ruber) data. Given the seawater Ba/Ca data and the published partition coefficient for Ba (D_Ba = 0.15-0.22), pristine Ba/Ca_(G. ruber) ratios at northern Okinawa Trough sites should be less than 0.84 μmol mol-1. Anomalously high Ba/Ca_(G. ruber) ratios (>0.84 μmol mol-1) might
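As a quick numeric check, the local δ18Osw-salinity relation quoted above can be evaluated and inverted. The coefficients are those given in the abstract; the helper functions are ours.

```python
# Local ECS relation from the abstract: d18Osw = -7.74 + 0.23 * salinity.
def d18Osw_from_salinity(S):
    return -7.74 + 0.23 * S

def salinity_from_d18Osw(d18Osw):
    # Algebraic inversion of the linear relation above.
    return (d18Osw + 7.74) / 0.23

# A mid-range ECS salinity of 34.0 gives a d18Osw value that sits inside
# the reported core-top range (-0.2 to +0.39 permil).
d = d18Osw_from_salinity(34.0)   # -7.74 + 7.82 = 0.08 permil
S = salinity_from_d18Osw(d)      # round-trips to 34.0
```

In practice the paper runs this in the other direction: δ18Osw is derived from paired Mg/Ca temperatures and δ18Oc, and the regression line is used to judge whether the result is a plausible salinity proxy.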
NASA Astrophysics Data System (ADS)
Provencher, Guillaume
This thesis is concerned with the study of critical phenomena for two-dimensional models on the lattice. Its results are contained in two articles: a first one devoted to measuring geometric exponents, and a second one to the construction of idempotents for the XXZ spin chain projecting on indecomposable modules of the Temperley-Lieb algebra. Monte Carlo experiments, for a family of loop models in their dilute phase, are presented in the first article. Coined dilute loop models (DLM), this family is based upon an O(n) model introduced by Nienhuis (1990). It is defined by two coprime integers p, p′ and an anisotropy parameter. In the continuum limit, DLM(p, p′) is expected to yield a logarithmic conformal field theory of central charge c(κ) = 13 − 6(κ + κ⁻¹), where the ratio κ = p/p′ is related to the loop gas fugacity β = −2 cos(πκ). Critical exponents pertaining to valuable geometrical objects, namely the hull, external perimeter and red bonds, were measured. The Metropolis-Hastings algorithm, as well as several methods improving its efficiency, are presented. Despite the extrapolation of curves presenting large slopes, agreement with the theoretical predictions to three or four digits was attained through rigorous statistical analysis. The second article describes the decomposition of the XXZ spin chain Hilbert space (C²)⊗n using idempotents. The model of interest (Pasquier & Saleur (1990)) is described by a parameter-dependent Hamiltonian H_XXZ(q), q ∈ C^×, expressible as a sum of elements of the Temperley-Lieb algebra TL_n(q). The spectrum of H_XXZ(q) in the continuum limit is also believed to be related to conformal field theories whose central charge is set by q. Using the quantum Schur-Weyl duality, an expression for the primitive idempotents of End_{TL_n}((C²)⊗n), involving U_q(sl₂) elements, is obtained. These idempotents allow for the explicit construction of the indecomposable TL_n-modules of (C²)⊗n, all of which are
Predicting the global warming potential of agro-ecosystems in Europe
NASA Astrophysics Data System (ADS)
Lehuger, S.; Gabrielle, B.; Chaumartin, F.
2009-04-01
Nitrous oxide, carbon dioxide and methane are the main biogenic greenhouse gases contributing to the global warming potential (GWP) of agro-ecosystems. Evaluating the impact of agriculture on climate requires a capacity to predict the net exchanges of these gases in an integrated manner, as related to pedo-climatic conditions and crop management. The biophysical crop model CERES-EGC is designed to predict the productivity and GWP of agro-ecosystems at the plot scale. Here we applied a Bayesian calibration to both of its sub-models, for N2O emissions and CO2 fluxes, to address parameterization and uncertainty analysis. The N2O emission module of CERES-EGC was calibrated against chamber measurements from 7 arable sites in France, and the CO2 flux module was calibrated against eddy-covariance measurements from 3 sites in Europe. Measurements from the various sites were assimilated into the posterior probability density functions of the different parameters, using a Bayesian calibration method based on the Metropolis-Hastings algorithm. The model was subsequently extrapolated to predict CO2 and N2O fluxes over entire crop rotations at 3 European experimental sites of the NitroEurope-IP network. Indirect GHG emissions arising from the production of agricultural inputs and from cropping operations were also added to the final GWP. Such a modelling approach makes it possible to test various agronomic management scenarios, in order to design productive agro-ecosystems with low global warming potential. The model could be extrapolated from plot to regional scale, with the ultimate goal of generating spatialized GHG inventories. Differentiating the emissions in space would thus make it possible to target critical zones in mitigation scenarios at the regional scale.
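The Metropolis-Hastings calibration step can be illustrated on a one-parameter toy model. Everything below is invented for illustration: the linear "flux" model, its true slope and noise level. CERES-EGC itself is far richer, but the chain structure, a random-walk proposal accepted on the posterior ratio, is the same.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic observations from a one-parameter toy model y = k * x
# (a stand-in for a flux sub-model; k_true and sigma are invented).
k_true, sigma = 1.5, 0.3
x = np.linspace(0.0, 5.0, 40)
y_obs = k_true * x + rng.normal(0.0, sigma, x.size)

def log_posterior(k):
    # Flat prior on k; Gaussian measurement likelihood.
    resid = y_obs - k * x
    return -0.5 * np.sum(resid**2) / sigma**2

k, chain = 1.0, []
for _ in range(10000):
    k_prop = k + rng.normal(0.0, 0.05)       # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(k_prop) - log_posterior(k):
        k = k_prop
    chain.append(k)

post = np.array(chain[2000:])                # discard burn-in
```

The retained samples approximate the posterior of the calibrated parameter; in the multi-parameter CERES-EGC case the same loop runs over a parameter vector.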
Abortion funding: legal and moral questions.
Altman, A
1978-04-01
M. Segers and G. Annas' (Hastings Center Report, August 1977) criticisms of the U.S. Supreme Court's recent abortion decisions are thought to be unpersuasive. Any sound argument against the Court's decision must avoid the conclusion that the government, either state or federal, is constitutionally required to finance any activity which is constitutionally protected if the person wishing to engage in the activity is unable to finance the activity. The argument given by Segers does not avoid this implausible conclusion. She contends that the lack of legislation providing for the public financing of elective abortions "plainly discriminates against a social class, since a right guaranteed to the rich is denied in practice to the poor." Annas' reasoning is considered better, for he implies that the failure publicly to finance elective abortion constitutes unconstitutional interference with the indigent woman's right to an abortion, citing the Doe v. Bolton ruling which struck down the Georgia law requiring the concurrence of 2 physicians before an abortion lawfully could be performed. Of the 3 articles in the Report for August, it is felt that Annas' comes closest to recognizing the true nature of the constitutional issue raised by these abortion cases, but even his argument eventually moves into viewing the issue as one of "the rich" vs. "the poor." Possibly there is an issue here of social justice which can be viewed in terms of "the rich" vs. "the poor," and the demands of justice might categorically require the financing of all abortions for the indigent so that they can exercise this important legal right. However, the Constitution is not a document that incorporates all of the principles of social justice and does not impose such a requirement. PMID:649373
Secchi, Francesco; Resta, Elda Chiara; Di Leo, Giovanni; Petrini, Marcello; Messina, Carmelo; Carminati, Mario; Sardanelli, Francesco
2014-08-01
Our aim was to compare two different approaches for segmentation of a single ventricle (SV) on cardiac magnetic resonance (CMR) cine images. We retrospectively studied 30 consecutive patients (23 males; aged 27 ± 10 years) with a treated SV who underwent 1.5-T CMR using ECG-triggered axial true-FISP, HASTE and cine true-FISP sequences. We classified patients for visceroatrial situs, cardiac axis orientation, ventricular loop, morphology of the SV and position of the great arteries. One experienced reader segmented cine images twice, first including only the systemic ventricle, then including both the systemic and accessory ventricles. Ejection fraction (EF), indexed end-diastolic volume (EDVI), end-systolic volume (ESVI), and stroke volume (SVI) were calculated. Data are presented as medians and interquartile intervals. Four patients presented dextrocardia and one patient mesocardia. Two had situs ambiguus with asplenia and one situs ambiguus with polysplenia. Four patients showed right morphology of the SV and three a levo-ventricular loop. We found 14 levo-transposition of the great arteries (TGA), 4 levo-malposition of the great arteries (MGA), four dextro-MGA, two dextro-TGA, and one inverted vessel position. When segmenting only the systemic ventricle, EDVI (mL/m2) was 65 (50-91); when segmenting both ventricles, 76 (58-110) (P < 0.001); ESVI (mL/m2) was 32 (24-45) and 45 (33-60), respectively (P < 0.001); EF (%) was 49 (43-57) and 33 (24-47), respectively (P = 0.003); SVI (mL/m2) was 34 (17-48) and 33 (24-47) (P = 0.070). The inclusion of the accessory ventricle in the segmentation of the SV produces a biased, lower EF, showing a very low contribution to the pump function. PMID:24801178
Fit for high altitude: are hypoxic challenge tests useful?
2011-01-01
Altitude travel results in acute variations of barometric pressure, which induce different degrees of hypoxia, changing the gas contents in body tissues and cavities. Non-ventilated air-containing cavities may give rise to barotrauma of the lung (pneumothorax), sinuses and middle ear, with pain, vertigo and hearing loss. Commercial airplanes keep their cabin pressure at an equivalent altitude of about 2,500 m. This leads to an increased respiratory drive, which may also result in symptoms of emotional hyperventilation. In patients with preexisting respiratory pathology due to lung, cardiovascular, pleural, thoracic, neuromuscular or obesity-related diseases (i.e. obstructive sleep apnea), an additional hypoxic stress may induce respiratory pump and/or heart failure. Clinical pre-altitude assessment must be disease-specific and includes spirometry, pulse oximetry, ECG, and pulmonary and systemic hypertension assessment. In patients with abnormal values we need, in addition, measurements of hemoglobin, pH, base excess, PaO2, and PaCO2 to evaluate whether O2 and CO2 transport is sufficient. Instead of the hypoxia altitude simulation test (HAST), which is not without danger for patients with respiratory insufficiency, we prefer primarily a hyperoxic challenge. The supplementation of normobaric O2 gives us information on the acute reversibility of the arterial hypoxemia and the reduction of ventilation and pulmonary hypertension, as well as on the efficiency of the additional O2 flow needed during altitude exposure. For difficult judgements, performance of the test in a hypobaric chamber with and without supplemental O2 breathing remains the gold standard. The increasing number of drugs to treat acute pulmonary hypertension due to altitude exposure (acetazolamide, dexamethasone, nifedipine, sildenafil) or to other etiologies (anticoagulants, prostanoids, phosphodiesterase-5 inhibitors, endothelin receptor antagonists), including mechanical aids to reduce periodical or
Probabilistic graphical model representation in phylogenetics.
Höhna, Sebastian; Heath, Tracy A; Boussau, Bastien; Landis, Michael J; Ronquist, Fredrik; Huelsenbeck, John P
2014-09-01
Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis-Hastings or Gibbs sampling of the posterior distribution. PMID:24951559
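The core factorization idea above, a joint distribution broken into conditionally independent pieces along a directed graph, can be shown with the textbook rain/sprinkler/wet-grass example. All probabilities below are invented for illustration; the point is only that the joint is the product of each node's conditional given its parents.

```python
# Tiny directed graphical model: rain -> sprinkler, (rain, sprinkler) -> wet.
# Joint factorizes as P(r, s, w) = P(r) * P(s | r) * P(w | r, s).
P_r = {True: 0.2, False: 0.8}
P_s_given_r = {True: {True: 0.01, False: 0.99},
               False: {True: 0.4, False: 0.6}}
P_w_given_rs = {(True, True): 0.99, (True, False): 0.8,
                (False, True): 0.9, (False, False): 0.0}

def joint(r, s, w):
    # Product of the conditional distributions along the graph.
    p_w = P_w_given_rs[(r, s)]
    return P_r[r] * P_s_given_r[r][s] * (p_w if w else 1.0 - p_w)

# The factorization defines a valid distribution: all 8 states sum to 1.
total = sum(joint(r, s, w)
            for r in (True, False)
            for s in (True, False)
            for w in (True, False))
```

A phylogenetic model graph works the same way, just with many more nodes (tree, branch lengths, substitution parameters) and with the tree-plate construct handling the tree-shaped subgraph.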
Statistical and numerical methods to improve the transient divided bar method
NASA Astrophysics Data System (ADS)
Bording, Thue; Bom Nielsen, Søren; Balling, Niels
2014-05-01
A key element in studying subsurface heat transfer processes is accurate knowledge of the thermal properties. These properties include thermal conductivity, thermal diffusivity and heat capacity. The divided bar method is commonly used to estimate the thermal conductivity of rock samples. In the method's simplest form, a fixed temperature difference is imposed on a stack consisting of the rock sample and a standard material with known thermal conductivity. Temperature measurements along the stack are used to estimate the temperature gradients, and the thermal conductivity of the sample can then be found by Fourier's law. We present several improvements to this method that allow for simultaneous measurement of both thermal conductivity and thermal diffusivity. The divided bar setup is run in a transient mode, and a time-dependent temperature profile is measured at four points along the stack: on either side of the sample and at the top and bottom of the stack. To induce a thermal signal, a time-varying temperature is imposed at one end of the stack during measurements. Using the measured temperatures at both ends as Dirichlet boundary conditions, a finite element procedure is used to model the temperature profile. This procedure serves as the forward model. A Markov chain Monte Carlo Metropolis-Hastings algorithm is used for the inverse modelling. The unknown parameters are the thermal conductivity and volumetric heat capacity of the sample and the contact resistances between the elements in the stack. The contact resistances are not resolved and must be made as small as possible by careful sample preparation and stack assembly. Histograms of the unknown parameters are produced. The ratio of thermal conductivity to volumetric heat capacity yields a histogram of thermal diffusivity. Since density can be measured independently, the specific heat capacity is also obtained. The main improvement with this method is that not only are we able to measure thermal
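The post-processing step described above, turning joint MCMC samples of conductivity and volumetric heat capacity into histograms of diffusivity and specific heat, can be sketched as follows. The sample distributions below are invented stand-ins, not measured data, but the sample-wise propagation is the actual mechanism.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in posterior samples for thermal conductivity lambda and volumetric
# heat capacity rho_c, as an MCMC run would return (values illustrative).
lam = rng.normal(2.5, 0.1, 50000)          # W m^-1 K^-1
rho_c = rng.normal(2.2e6, 0.1e6, 50000)    # J m^-3 K^-1

# Thermal diffusivity is the sample-wise ratio: each joint MCMC draw of
# (lambda, rho_c) maps to one draw of kappa, so the histogram of kappa
# carries the full (correlated) uncertainty.
kappa = lam / rho_c                        # m^2 s^-1

# With an independently measured density, each draw also yields a draw of
# specific heat capacity.
density = 2650.0                           # kg m^-3, measured separately
c_p = rho_c / density                      # J kg^-1 K^-1
```

Histograms of `kappa` and `c_p` are then read off directly, with no Gaussian error-propagation approximation needed.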
NASA Astrophysics Data System (ADS)
Dutta, Rishabh; Jónsson, Sigurjón
2016-04-01
Earthquake faults are generally considered planar (or of other simple geometry) in earthquake source parameter estimations. However, simplistic fault geometries likely result in biases in estimated slip distributions and increased fault slip uncertainties. In the case of large subduction zone earthquakes, these biases and uncertainties propagate into tsunami waveform modeling and other calculations related to postseismic studies, Coulomb failure stresses, etc. In this research, we parameterize a 3D non-planar fault geometry for the 2011 Tohoku-Oki earthquake (Mw 9.1) and estimate these geometrical parameters along with fault slip parameters from onland and offshore GPS data using Bayesian inference. This non-planar fault is formed using several 3rd-degree polynomials in the along-strike (X-Y plane) and along-dip (X-Z plane) directions that are tied together using a triangular mesh. The coefficients of these polynomials constitute the fault geometrical parameters. We use the trench and locations of past seismicity as a priori information to constrain these fault geometrical parameters, and the Laplacian to characterize the fault slip smoothness. Hyper-parameters associated with these a priori constraints are estimated empirically, and the posterior probability distribution of the model (fault geometry and slip) parameters is sampled using an adaptive Metropolis-Hastings algorithm. The across-strike uncertainties in the fault geometry (effectively the local fault location) around high-slip patches increase from 6 km at 10 km depth to about 35 km at 50 km depth, whereas around low-slip patches the uncertainties are larger (from 7 km to 70 km). Uncertainties in reverse slip are found to be higher at high-slip patches than at low-slip patches. In addition, there appears to be high correlation between adjacent patches of high slip. Our results demonstrate that we can constrain complex non-planar fault geometry together with fault slip from GPS data using past seismicity as a priori
Assessment of Flood Disaster Impacts in Cambodia: Implications for Rapid Disaster Response
NASA Astrophysics Data System (ADS)
Ahamed, Aakash; Bolten, John; Doyle, Colin
2016-04-01
Disaster monitoring systems can provide near real time estimates of population and infrastructure affected by sudden onset natural hazards. This information is useful to decision makers allocating lifesaving resources following disaster events. Floods are the world's most common and devastating disasters (UN, 2004; Doocy et al., 2013), and are particularly frequent and severe in the developing countries of Southeast Asia (Long and Trong, 2001; Jonkman, 2005; Kahn, 2005; Stromberg, 2007; Kirsch et al., 2012). Climate change, a strong regional monsoon, and widespread hydropower construction contribute to a complex and unpredictable regional hydrodynamic regime. As such, there is a critical need for novel techniques to assess flood impacts to population and infrastructure with haste during and following flood events in order to enable governments and agencies to optimize response efforts following disasters. Here, we build on methods to determine regional flood extent in near real time and develop systems that automatically quantify the socioeconomic impacts of flooding in Cambodia. Software developed on cloud based, distributed processing Geographic Information Systems (GIS) is used to demonstrate spatial and numerical estimates of population, households, roadways, schools, hospitals, airports, agriculture and fish catch affected by severe monsoon flooding occurring in the Cambodian portion of Lower Mekong River Basin in 2011. Results show modest agreement with government and agency estimates. Maps and statistics generated from the system are intended to complement on the ground efforts and bridge information gaps to decision makers. The system is open source, flexible, and can be applied to other disasters (e.g. earthquakes, droughts, landslides) in various geographic regions.
NASA Astrophysics Data System (ADS)
Tak, Hyungsuk; Mandel, Kaisey; van Dyk, David A.; Kashyap, Vinay; Meng, Xiao-Li; Siemiginowska, Aneta
2016-01-01
The gravitational field of a galaxy can act as a lens and deflect the light emitted by a more distant object such as a quasar. If the galaxy is a strong gravitational lens, it can produce multiple images of the same quasar in the sky. Since the light in each gravitationally lensed image traverses a different path length and gravitational potential from the quasar to the Earth, fluctuations in the source brightness are observed in the several images at different times. We infer the time delay between these fluctuations in the brightness time series data of each image, which can be used to constrain cosmological parameters. Our model is based on a state-space representation for irregularly observed time series data generated from a latent continuous-time Ornstein-Uhlenbeck process. We account for microlensing variations via a polynomial regression in the model. Our Bayesian strategy adopts scientifically motivated hyper-prior distributions and a Metropolis-Hastings within Gibbs sampler. We improve the sampler by using an ancillarity-sufficiency interweaving strategy, and adaptive Markov chain Monte Carlo. We introduce a profile likelihood of the time delay as an approximation to the marginal posterior distribution of the time delay. The Bayesian and profile likelihood approaches complement each other, producing almost identical results; the Bayesian method is more principled but the profile likelihood is faster and simpler to implement. We demonstrate our estimation strategy using simulated data of doubly- and quadruply-lensed quasars from the Time Delay Challenge, and observed data of quasars Q0957+561 and J1029+2623.
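The latent Ornstein-Uhlenbeck light curve underlying the state-space model above can be simulated exactly at irregular observation times, because the OU transition density is Gaussian. The parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# OU parameters (illustrative): long-run mean, relaxation timescale in days,
# and the stationary standard deviation.
mu, tau, sigma = 18.0, 50.0, 0.1

t = np.sort(rng.uniform(0.0, 1000.0, 500))   # irregular observation epochs
x = np.empty(t.size)
x[0] = rng.normal(mu, sigma)                 # start from the stationary law
for i in range(1, t.size):
    a = np.exp(-(t[i] - t[i - 1]) / tau)     # decay factor over the gap
    # Conditional law of the OU process over a gap is exactly Gaussian:
    # mean relaxes toward mu, variance grows toward the stationary value.
    mean = mu + a * (x[i - 1] - mu)
    sd = sigma * np.sqrt(1.0 - a * a)
    x[i] = rng.normal(mean, sd)
```

This exact discretization is why the model handles irregular sampling naturally: no interpolation is needed, only the per-gap Gaussian transition, which is also what the Gibbs updates exploit.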
Bayesian network reconstruction using systems genetics data: comparison of MCMC methods.
Tasaki, Shinya; Sauerwine, Ben; Hoff, Bruce; Toyoshiba, Hiroyoshi; Gaiteri, Chris; Chaibub Neto, Elias
2015-04-01
Reconstructing biological networks using high-throughput technologies has the potential to produce condition-specific interactomes. But are these reconstructed networks a reliable source of biological interactions? Do some network inference methods offer dramatically improved performance on certain types of networks? To facilitate the use of network inference methods in systems biology, we report a large-scale simulation study comparing the ability of Markov chain Monte Carlo (MCMC) samplers to reverse engineer Bayesian networks. The MCMC samplers we investigated included foundational and state-of-the-art Metropolis-Hastings and Gibbs sampling approaches, as well as novel samplers we have designed. To enable a comprehensive comparison, we simulated gene expression and genetics data from known network structures under a range of biologically plausible scenarios. We examine the overall quality of network inference via different methods, as well as how their performance is affected by network characteristics. Our simulations reveal that network size, edge density, and strength of gene-to-gene signaling are major parameters that differentiate the performance of various samplers. Specifically, more recent samplers including our novel methods outperform traditional samplers for highly interconnected large networks with strong gene-to-gene signaling. Our newly developed samplers show comparable or superior performance to the top existing methods. Moreover, this performance gain is strongest in networks with biologically oriented topology, which indicates that our novel samplers are suitable for inferring biological networks. The performance of MCMC samplers in this simulation framework can guide the choice of methods for network reconstruction using systems genetics data. PMID:25631319
[Improvement of gynecologic radium therapy through the afterloading method using cesium 137].
Fournier, D V; Senf, W; Kuttig, H; Kubli, F
1976-03-01
For all centers performing gynecological contact irradiation, the use of afterloading techniques is urgently required, since they eliminate any radiation exposure to the personnel. The radiotherapist may position and control the empty applicators, still free from radiation, without haste. This procedure diminishes the occurrence of overdosages and underdosages. The care of the patients is possible without radiation exposure, and the morbidity of contact therapy can be reduced by occasional mobilization of the patient, the applicator without sources remaining in place. The fully automatic apparatus "Curietron" using cesium-137 sources (0.662 MeV gamma emission, half-life 26.6 years) with an equivalent source activity (factor 2.6) yields the dose distribution demanded in the gynecologic field, which in practice is identical to that of Ra-226 (mean gamma emission 1 MeV, half-life 1620 years). With similar dose distribution, a biological and therapeutic effect like that of Ra-226 may be expected from Cs-137. In comparison with Ra-226, the following advantages of Cs-137 may be mentioned: a lower half-value thickness of 5.5 mm lead (lower expenses for radioprotection), less danger in handling (no emanation of radioactive gases), and lower contamination risks in case of breakage. The measured dose distributions at equivalent source activity and similar applicator geometry showed that, for all techniques of gynecologic irradiation utilized in our field, similar relative and absolute dose distributions can be achieved by means of the Cs-137 afterloading technique. While short-term afterloading irradiation with highly active sources, whose radiobiological effectiveness is not yet ascertained, has to be tested at appropriate scientific centers, it is necessary to demand afterloading techniques, with dosages and durations of irradiation approved over decades, for all centers of gynecological contact therapy because of radiation
Markov Chain Monte-Carlo Orbit Computation for Binary Asteroids
NASA Astrophysics Data System (ADS)
Oszkiewicz, D.; Hestroffer, D.; Pedro, David C.
2013-11-01
We present a novel method of orbit computation for resolved binary asteroids. The method combines the Thiele-Innes-van den Bos method with a Markov chain Monte Carlo (MCMC) technique. The classical Thiele-van den Bos method has been commonly used in multiple applications before, including orbits of binary stars and asteroids; likewise, this novel method can be used for the analysis of binary stars and of other gravitationally bound binaries. The method requires a minimum of three observations (observing times and relative positions - Cartesian or polar) made at the same tangent plane - or close enough to enable a first approximation. Further, the use of the MCMC technique for statistical inversion yields the whole bundle of possible orbits, including the most probable one. In this new method, we make use of the Metropolis-Hastings algorithm to sample the parameters of the Thiele-van den Bos method, that is, the orbital period (or equivalently the double areal constant) together with three randomly selected observations from the same tangent plane. The observations are sampled within their observational errors (with an assumed distribution), and the orbital period is the only parameter that has to be tuned during the sampling procedure. We run multiple chains to ensure that the parameter phase space is well sampled and that the solutions have converged. After the sampling is completed we perform convergence diagnostics. The main advantage of the novel approach is that the orbital period does not need to be known in advance and the entire region of possible orbital solutions is sampled, resulting in a maximum likelihood solution and confidence regions. We have tested the new method on several known binary asteroids and find good agreement with the results obtained with other methods. The new method has been implemented into the Gaia DPAC data reduction pipeline and can be used to confirm the binary nature of a suspected system, and for deriving
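Stripped of the orbital mechanics, the sampling loop this abstract describes is a standard random-walk Metropolis-Hastings update on one tuned parameter. The following is a minimal sketch, not the authors' implementation: the Gaussian target, step size, and chain length are illustrative assumptions standing in for the orbital-period posterior.

```python
import numpy as np

def metropolis_hastings(log_post, x0, step, n_samples, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + N(0, step^2),
    accept with probability min(1, p(x')/p(x))."""
    rng = np.random.default_rng(seed)
    x = x0
    lp = log_post(x)
    chain = np.empty(n_samples)
    accepted = 0
    for i in range(n_samples):
        x_prop = x + step * rng.normal()
        lp_prop = log_post(x_prop)
        # Symmetric proposal: the Hastings correction cancels,
        # leaving only the posterior ratio in the accept test.
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = x_prop, lp_prop
            accepted += 1
        chain[i] = x
    return chain, accepted / n_samples

# Toy target: a standard normal log-posterior (illustration only).
chain, acc_rate = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0,
                                      step=1.0, n_samples=5000)
```

Working in log densities avoids underflow; running several such chains from dispersed starting points, as the abstract describes, supports the convergence diagnostics mentioned there.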
Cosmological Parameters from CMB Maps without Likelihood Approximation
NASA Astrophysics Data System (ADS)
Racine, B.; Jewell, J. B.; Eriksen, H. K.; Wehus, I. K.
2016-03-01
We propose an efficient Bayesian Markov chain Monte Carlo (MCMC) algorithm for estimating cosmological parameters from cosmic microwave background (CMB) data without the use of likelihood approximations. It builds on a previously developed Gibbs sampling framework that allows for exploration of the joint CMB sky signal and power spectrum posterior, P(s, C_ℓ | d), and addresses a long-standing problem of efficient parameter estimation simultaneously in regimes of high and low signal-to-noise ratio. To achieve this, our new algorithm introduces a joint Markov chain move in which both the signal map and power spectrum are synchronously modified, by rescaling the map according to the proposed power spectrum before evaluating the Metropolis-Hastings accept probability. Such a move was already introduced by Jewell et al., who used it to explore low signal-to-noise posteriors. However, they also found that the same algorithm is inefficient in the high signal-to-noise regime, since a brute-force rescaling operation does not account for phase information. This problem is mitigated in the new algorithm by subtracting the Wiener filter mean field from the proposed map prior to rescaling, leaving high signal-to-noise information invariant in the joint step, and effectively only rescaling the low signal-to-noise component. To explore the full posterior, the new joint move is then interleaved with a standard conditional Gibbs move for the sky map. We apply our new algorithm to simplified simulations for which we can evaluate the exact posterior to study both its accuracy and its performance, and find good agreement with the exact posterior; marginal means agree to ≲0.006σ and standard deviations to better than ~3%. The Markov chain correlation length is of the same order of magnitude as those obtained by other standard samplers in the field.
An Introduction to Thinking about Trustworthy Research into the Genetics of Intelligence.
Parens, Erik; Appelbaum, Paul S
2015-01-01
The advent of new technologies has rekindled some hopes that it will be possible to identify genetic variants that will help to explain why individuals are different with respect to complex traits. At least one leader in the development of "whole genome sequencing"-the Chinese company BGI-has been quite public about its commitment to using the technique to investigate the genetics of intelligence in general and high intelligence in particular. Because one needs large samples to detect the small effects associated with small genetic differences in the sequence of those base pairs, to make headway with the new sequencing technologies, one also needs to enlist much larger numbers of study participants than geneticists have enrolled before. In an effort to increase the size of a sample, one team of researchers approached the Center for Talented Youth at Johns Hopkins University. They wanted to gain access to records concerning participants in CTY's ongoing Study of Exceptional Talent, and they wanted to approach those individuals to see if they would be willing to share samples of their DNA. We agreed that CTY's dilemma about whether to give the researchers access to those records raised larger questions about the ethics of research into the genetics of intelligence, and we decided to hold a workshop at The Hastings Center that could examine those questions. Our purpose was to create what, borrowing from Sarah Richardson, we came to call a "transformative conversation" about research into the genetics of general cognitive ability-a conversation that would take a wide and long view and would involve a diverse group of stakeholders, including both people who have been highly critical of the research and people who engage in it. This collection of essays, which grew out of that workshop, is intended to provide an introduction to and exploration of this complex and important area. PMID:26413943
Watanabe, Kunihiro; Ishimori, Yoshiyuki; Sakurai, Hitoshi; Iwai, Yuji; Miida, Kazuo; Kurita, Kouki
2016-04-01
Manganese chloride tetrahydrate (MCT) is one of the oral negative contrast agents that is indispensable for imaging in magnetic resonance cholangiopancreatography (MRCP). In this study, improvement of the image quality of MRCP by using low-temperature MCT was verified. All MR imaging was performed using a 1.5 T scanner. The T(1) and T(2) values of MCT at different temperatures were measured in a phantom study. Different concentrations of MCT-doped water (30%, 50%, 70%, and 90%) were measured at several temperature conditions (10°C, 15°C, 23°C, 35°C, and 40°C). As a result, the T(1) and T(2) values became larger with a temperature rise. This was more remarkable in low-concentration MCT. Then, 17 healthy subjects were scanned twice with MCT at different temperatures: normal-temperature (23°C) and low-temperature (10°C) MCT were taken on 2 consecutive days. The contrast between the stomach and the spleen was significantly higher in 2D half Fourier acquisition single shot turbo spin echo (HASTE) images with use of the low-temperature MCT. The contrast between the common bile duct and the adjacent background was significantly higher in the source images of 3D MRCP with use of the low-temperature MCT. In addition, 76% of subjects answered in the questionnaire that the low-temperature MCT was easier to drink. The low-temperature MCT improves the image quality of MRCP and contributes to performing a noninvasive examination. PMID:27097992
Quantifying Uncertainty in Velocity Models using Bayesian Methods
NASA Astrophysics Data System (ADS)
Hobbs, R.; Caiado, C.; Majdański, M.
2008-12-01
Quantifying uncertainty in models derived from observed data is a major issue. Public and political understanding of uncertainty is poor, and for industry inadequate assessment of risk costs money. In this talk we will examine the geological structure of the subsurface; however, our principal exploration tool, controlled source seismology, gives its data in time. Inversion tools exist to map these data into a depth model, but a full exploration of the uncertainty of the model is rarely done because robust strategies do not exist for large non-linear complex systems. There are two principal sources of uncertainty: the first comes from the input data, which are noisy and bandlimited; the second, and more sinister, is from the model parameterisation and forward algorithms themselves, which approximate the physics to make the problem tractable. To address these issues we propose a Bayesian approach. One philosophy is to estimate the uncertainty in a possible model derived using standard inversion tools. During the inversion stage we can use our geological prejudice to derive an acceptable model. Then we use a local random walk based on the Metropolis-Hastings algorithm to explore the model space immediately around a possible solution. For models with a limited number of parameters we can use the forward modeling step from the inversion code. However, as the number of parameters increases and/or the cost of the forward modeling step becomes significant, we need to use fast emulators to act as proxies so that a sufficient number of iterations can be performed on which to base our statistical measures of uncertainty. In this presentation we show examples of uncertainty estimation using both pre- and post-critical seismic data. In particular, we will demonstrate uncertainty introduced by the approximation of the physics by using a tomographic inversion of bandlimited data and show that uncertainty increases as the central frequency of the data decreases. This is consistent with the
NASA Astrophysics Data System (ADS)
Jersild, A.; Thomas, R. Q.; Brooks, E.; Teskey, R. O.; Wynne, R. H.; Arthur, D.; Gonzalez, C.; Thomas, V. A.; Fox, T. D.; Smallman, L.
2015-12-01
Predictions of how forest productivity and carbon sequestration will respond to climate change are essential for assisting land managers in adapting to future climate. However, current predictions can include considerable uncertainty that is often not well quantified. To address the need for better quantification of uncertainty, we calculated and compared parameter and climate prediction uncertainty for predictions of Southeastern U.S. pine forest productivity. We used a Metropolis-Hastings Markov chain Monte Carlo-based data assimilation technique to fuse regionally widespread and diverse datasets with the Physiological Principles Predicting Growth (3PG) model. The datasets incorporated include biomass observations from forest research plots that are part of the Pine Integrated Network: Education, Mitigation, and Adaptation project (PINEMAP), photosynthesis and evaporation observations from loblolly pine AmeriFlux sites, and productivity responses to elevated CO2 from the Duke Free Air C site. These spatially and temporally diverse datasets give our analysis a more accurately measured uncertainty by constraining complementary components of the model. In our analysis, parameter uncertainty was quantified using simulations that integrate across the posterior parameter distributions, while climate model uncertainty was quantified using downscaled RCP 8.5 simulations from twenty different CMIP5 climate models. Overall, we found that the uncertainty in future productivity of Southeastern U.S. managed pine forests associated with parameterization is comparable to the uncertainty associated with climate simulations. Our results indicate that reducing parameterization uncertainty in ecosystem model development can improve future predictions of forest productivity and carbon sequestration, but uncertainties in future climate predictions also need to be properly quantified and communicated to forest owners and managers.
NASA Astrophysics Data System (ADS)
Jakkareddy, Pradeep S.; Balaji, C.
2016-05-01
This paper reports the results of an experimental study to estimate the heat flux and convective heat transfer coefficient using liquid crystal thermography and Bayesian inference in a heat-generating sphere enclosed in a cubical Teflon block. The geometry considered for the experiments comprises a heater inserted in a hollow hemispherical aluminium ball, resulting in a volumetric heat generation source that is placed at the center of the Teflon block. Calibrated thermochromic liquid crystal (TLC) sheets are used to capture the temperature distribution at the front face of the Teflon block. The forward model is the three-dimensional conduction equation, which is solved within the Teflon block to obtain steady-state temperatures using COMSOL. Match-up experiments are carried out for various velocities by minimizing the residual between TLC and simulated temperatures for every assumed loss coefficient, to obtain a correlation of average Nusselt number against Reynolds number. This is used for prescribing the boundary condition for the solution to the forward model. A surrogate model, obtained by an artificial neural network built upon the data from COMSOL simulations, is used to drive a Markov chain Monte Carlo based Metropolis-Hastings algorithm to generate the samples. Bayesian inference is adopted to solve the inverse problem for determination of heat flux and heat transfer coefficient from the measured temperature field. Point estimates of the posterior, such as the mean, maximum a posteriori and standard deviation of the retrieved heat flux and convective heat transfer coefficient, are reported. Additionally, the effect of the number of samples on the performance of the estimation process has been investigated.
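The surrogate-driven sampling described in this abstract can be sketched with any cheap callable standing in for the trained network. Everything below is a hypothetical stand-in, not the paper's COMSOL/ANN model: the linear `surrogate` response, the true parameter values, the noise level, and the proposal scales are all assumed for illustration.

```python
import numpy as np

def surrogate(params):
    """Stand-in for the trained neural-network surrogate: maps the two unknowns
    (heat flux q, heat transfer coefficient h) to three predicted liquid-crystal
    temperatures. The linear response here is purely illustrative."""
    q, h = params
    return 300.0 + q * np.array([0.10, 0.08, 0.06]) - h * np.array([1.0, 0.5, 0.2])

def log_likelihood(params, t_meas, sigma=0.1):
    """Gaussian measurement model with assumed noise level sigma."""
    resid = t_meas - surrogate(params)
    return -0.5 * np.sum((resid / sigma) ** 2)

rng = np.random.default_rng(1)
t_meas = surrogate((50.0, 4.0)) + 0.1 * rng.normal(size=3)  # synthetic "measurement"

theta = np.array([40.0, 3.0])   # initial guess for (q, h)
lp = log_likelihood(theta, t_meas)
samples = []
for _ in range(4000):
    prop = theta + rng.normal(scale=[1.0, 0.1])   # random-walk proposal
    lp_prop = log_likelihood(prop, t_meas)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis-Hastings accept
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.asarray(samples)
q_mean, h_mean = samples.mean(axis=0)             # posterior point estimates
```

Because the surrogate costs microseconds rather than a full finite-element solve, thousands of MCMC iterations become affordable; this is the design rationale the abstract describes.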
NASA Astrophysics Data System (ADS)
LIU, B.; Liang, Y.
2015-12-01
Markov chain Monte Carlo (MCMC) simulation is a powerful statistical method for solving inverse problems that arise from a wide range of applications, such as nuclear physics, computational biology, and financial engineering, among others. In the Earth sciences, applications of MCMC are primarily in the field of geophysics [1]. The purpose of this study is to introduce MCMC to geochemical inverse problems related to trace element fractionation during concurrent melting, melt transport and melt-rock reaction in the mantle. The MCMC method has several advantages over linearized least squares methods in inverting trace element patterns in basalts and mantle rocks. First, MCMC can handle equations that have no explicit analytical solutions, which linearized least squares methods require for gradient calculation. Second, MCMC converges to the global minimum, while linearized least squares methods may be stuck at a local minimum or converge slowly due to nonlinearity. Furthermore, MCMC can provide insight into uncertainties of model parameters with non-normal trade-off. We use MCMC to invert for the extent of melting, the amount of trapped melt, and the extent of chemical disequilibrium between the melt and residual solid from REE data in abyssal peridotites from the Central Indian Ridge and Mid-Atlantic Ridge. In the first step, we conduct forward calculation of REE evolution with melting models in a reasonable model space. We then build up a chain of melting models according to the Metropolis-Hastings algorithm to represent the probability of a specific model. We show that chemical disequilibrium is likely to play an important role in fractionating LREE in residual peridotites. In the future, MCMC will be applied to more realistic but also more complicated melting models in which partition coefficients, diffusion coefficients, as well as melting and melt suction rates vary as functions of temperature, pressure and mineral compositions. [1]. Sambridge & Mosegaard [2002] Rev. Geophys.
Towards interpreting nitrate-δ15N records in ice cores in terms of nitrogen oxide sources
NASA Astrophysics Data System (ADS)
Hastings, M. G.; Buffen, A. M.
2011-12-01
The isotopic composition of nitrate preserved in ice cores offers unique potential for reconstructing past contributions of nitrogen oxides (NOx = NO and NO2) to the atmosphere. Sources of NOx imprint a nitrogen stable isotopic (δ15N) signature, which can be conserved during subsequent oxidation to form nitrate. Major sources of NOx include fossil fuel combustion, biomass burning, microbial processes in soils, and lightning, and thus a quantitative tracer of emissions would help detail connections between the atmosphere, the biosphere, and climate. Unfortunately, the δ15N signatures of most NOx sources are not yet well enough constrained to allow for quantitative partitioning, though new methodology for directly collecting NOx for isotopic analysis is promising (Fibiger and Hastings, A43D-0265, AGU 2010). Still, a growing network of ice core δ15N records may offer insight into source signatures, as different sources are important in different regions of the world. For example, a 300-year ice core record of nitrate-δ15N from Summit, Greenland shows a clear and significant 12‰ (vs. N2) decrease since the Preindustrial that reflects emissions from fossil fuel combustion and/or soils related to changing agricultural practices in North America and Europe. Over the same time period, Antarctic ice cores show no such trend in δ15N. This would be consistent with previous work suggesting that biomass burning and/or stratospheric intrusion of NOx produced from N2O oxidation are dominant sources for nitrate formation at high southern latitudes. In comparison to the polar records, nitrate in tropical ice cores should represent more significant inputs from lightning, microbial processes in soils, and biomass burning. This may be reflected in new results from a high-elevation site in the Peruvian Andes that show strong seasonal δ15N cycles of up to 15‰ (vs. N2). We compare and contrast these records in an effort to evaluate the contribution of NOx sources to nitrate over
Felix, J V; Papathanasopoulos, M A; Smith, A A; von Holy, A; Hastings, J W
1994-10-01
Leuconostoc (Lc.) carnosum Ta11a, isolated from vacuum-packaged processed meats, produced a bacteriocin designated leucocin B-Ta11a. The crude bacteriocin was heat stable and sensitive to proteolytic enzymes, but not to catalase, lysozyme, or chloroform. It was active against Listeria monocytogenes and several lactic acid bacteria. Leucocin B-Ta11a was optimally produced at 25 degrees C in MRS broth at an initial pH of 6.0 or 6.5. An 8.9-MDa plasmid in Leuconostoc carnosum Ta11a hybridized to a 36-mer oligonucleotide probe (JF-1) that was homologous to leucocin A-UAL 187. A 4.9-kb Sau3A fragment from a partial digest of the 8.9-MDa plasmid was cloned into pUC118. The 8.1-kb recombinant plasmid (pJF8.1) was used for sequencing and revealed the presence of two open reading frames (ORFs). ORF1 codes for a protein of 61 amino acids comprising a 37-amino-acid bacteriocin that was determined to be the leucocin B-Ta11a structural gene by virtue of its homology to leucocin A-UAL 187 (Hastings et al. 1991. J. Bacteriol. 173:7491-7500). The 24-amino-acid N-terminal extension, however, differs from that of leucocin A-UAL 187 by seven residues. The predicted protein of ORF2 has 113 amino acids and is identical to the amino acid sequence of the cognate ORF of the leucocin A-UAL 187 operon. PMID:7765496
Bayesian inference in an item response theory model with a generalized student t link function
NASA Astrophysics Data System (ADS)
Azevedo, Caio L. N.; Migon, Helio S.
2012-10-01
In this paper we introduce a new item response theory (IRT) model with a generalized Student t link function with unknown degrees of freedom (df), named the generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two-parameter logit and probit models, since the degrees of freedom play a similar role to the discrimination parameter. However, the behavior of the curves of the GtL is different from those of the two-parameter models and the usual Student t link, since in GtL the curves obtained from different df's can cross the probit curves in more than one latent trait level. The GtL model has properties similar to those of generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within a Gibbs sampling algorithm. We consider a prior sensitivity analysis concerning the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two-parameter models.
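The "Metropolis-Hastings step within a Gibbs sampling algorithm" pattern is easy to illustrate on a toy normal model: the parameter with a closed-form conditional is drawn directly (the Gibbs step), while the other is updated by an embedded MH step. Everything here is assumed for illustration, not the paper's IRT model; the scale parameter is treated as if it lacked a closed-form conditional purely to show the embedded move.

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(2.0, 1.5, size=200)   # synthetic data, true mu=2.0, sigma=1.5
n = y.size

def log_cond(mu, log_s):
    """Log density of the data given (mu, log sigma), flat priors assumed."""
    return -n * log_s - 0.5 * np.sum((y - mu) ** 2) / np.exp(2 * log_s)

mu, log_s = 0.0, 0.0
mu_chain, s_chain = [], []
for _ in range(3000):
    # Gibbs step: mu | sigma, y is conjugate normal under a flat prior.
    mu = rng.normal(y.mean(), np.exp(log_s) / np.sqrt(n))
    # Metropolis-Hastings step for log sigma, embedded inside the Gibbs sweep.
    prop = log_s + 0.1 * rng.normal()
    if np.log(rng.uniform()) < log_cond(mu, prop) - log_cond(mu, log_s):
        log_s = prop
    mu_chain.append(mu)
    s_chain.append(np.exp(log_s))
```

Only the non-conjugate coordinate pays the cost of an accept/reject step; the rest of the sweep remains exact draws, which is why this hybrid is the standard recipe for models like the GtL where a single parameter (here, by analogy, the df) breaks conjugacy.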
The value of fast MR imaging as an adjunct to ultrasound in prenatal diagnosis.
Breysem, L; Bosmans, H; Dymarkowski, S; Schoubroeck, D Van; Witters, I; Deprest, J; Demaerel, P; Vanbeckevoort, D; Vanhole, C; Casaer, P; Smet, M
2003-07-01
The aim of this study was to evaluate the role of MR imaging of the fetus to improve sonographic prenatal diagnosis of congenital anomalies. In 40 fetuses (not consecutive cases) with an abnormality diagnosed with ultrasound, additional MR imaging was performed. The basic sequence was a T2-weighted single-shot half Fourier (HASTE) technique. Head, neck, spinal, thoracic, urogenital, and abdominal fetal pathologies were found. This retrospective, observational study compared MR imaging findings with ultrasonographic findings regarding detection, topography, and etiology of the pathology. The MR findings were evaluated as superior, equal, or inferior to US, in agreement with the referring gynecologists. The role of these findings in relation to pregnancy management was studied and compared with postnatal follow-up in 30 of 40 babies. The fetal MRI technique was successful in 36 of 39 examinations and provided additional information in 21 of 40 fetuses (one twin pregnancy with two members to evaluate). More precise anatomy and location of fetal pathology (20 of 40 cases) and additional etiologic information (8 of 40 cases) were substantial advantages in cerebrospinal abnormalities [ventriculomegaly, encephalocele, vein of Galen malformation, callosal malformations, meningo(myelo)cele], in retroperitoneal abnormalities (lymphangioma, renal agenesis, multicystic renal dysplasia), and in neck/thoracic pathology [cervical cystic teratoma, congenital hernia diaphragmatica, congenital cystic adenomatoid lung malformation (CCAM)]. This improved parental counseling and pregnancy management in 15 pregnancies. In 3 cases, prenatal MRI findings did not correlate with prenatal ultrasonographic findings or neonatal diagnosis. MRI provided a more detailed description of and insight into fetal anatomy, pathology, and etiology in the vast majority of these selected cases. This improved prenatal parental counseling and postnatal therapeutic planning. PMID:12695920
Bayesian Network Reconstruction Using Systems Genetics Data: Comparison of MCMC Methods
Tasaki, Shinya; Sauerwine, Ben; Hoff, Bruce; Toyoshiba, Hiroyoshi; Gaiteri, Chris; Chaibub Neto, Elias
2015-01-01
Reconstructing biological networks using high-throughput technologies has the potential to produce condition-specific interactomes. But are these reconstructed networks a reliable source of biological interactions? Do some network inference methods offer dramatically improved performance on certain types of networks? To facilitate the use of network inference methods in systems biology, we report a large-scale simulation study comparing the ability of Markov chain Monte Carlo (MCMC) samplers to reverse engineer Bayesian networks. The MCMC samplers we investigated included foundational and state-of-the-art Metropolis–Hastings and Gibbs sampling approaches, as well as novel samplers we have designed. To enable a comprehensive comparison, we simulated gene expression and genetics data from known network structures under a range of biologically plausible scenarios. We examine the overall quality of network inference via different methods, as well as how their performance is affected by network characteristics. Our simulations reveal that network size, edge density, and strength of gene-to-gene signaling are major parameters that differentiate the performance of various samplers. Specifically, more recent samplers including our novel methods outperform traditional samplers for highly interconnected large networks with strong gene-to-gene signaling. Our newly developed samplers show comparable or superior performance to the top existing methods. Moreover, this performance gain is strongest in networks with biologically oriented topology, which indicates that our novel samplers are suitable for inferring biological networks. The performance of MCMC samplers in this simulation framework can guide the choice of methods for network reconstruction using systems genetics data. PMID:25631319
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.
2014-11-01
Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules, high-temperature H2O, TiO, and VO, and a preprocessor for adding additional line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm, and in addition uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
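The differential-evolution proposal of ter Braak (2006) that underlies this approach moves each chain along the difference of two other randomly chosen chains, so the proposal automatically adapts to the posterior's scale and correlation structure. The sketch below is a minimal generic version on an assumed 2-D Gaussian target, not BART's radiative-transfer posterior; chain count, target, and iteration count are illustrative.

```python
import numpy as np

def demc_step(X, log_post, rng, gamma=None, eps=1e-4):
    """One differential-evolution MC update of all chains (ter Braak 2006):
    chain i jumps by gamma * (X[a] - X[b]) for two other chains a, b,
    plus a small jitter, then applies the usual Metropolis accept test."""
    n_chains, d = X.shape
    if gamma is None:
        gamma = 2.38 / np.sqrt(2 * d)   # standard near-optimal scaling
    for i in range(n_chains):
        a, b = rng.choice([j for j in range(n_chains) if j != i],
                          size=2, replace=False)
        prop = X[i] + gamma * (X[a] - X[b]) + eps * rng.normal(size=d)
        if np.log(rng.uniform()) < log_post(prop) - log_post(X[i]):
            X[i] = prop
    return X

# Toy 2-D standard normal posterior; chains started overdispersed.
rng = np.random.default_rng(3)
target = lambda x: -0.5 * np.sum(x ** 2)
X = rng.normal(scale=3.0, size=(8, 2))
samples = []
for _ in range(2000):
    X = demc_step(X, target, rng)
    samples.append(X.copy())
samples = np.concatenate(samples)
```

Because the difference vectors are drawn from the ensemble itself, correlated or anisotropic posteriors need no hand-tuned proposal covariance; the small `eps` jitter keeps proposals from being confined to the subspace spanned by the initial chains.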
Duality in Phase Space and Complex Dynamics of an Integrated Pest Management Network Model
NASA Astrophysics Data System (ADS)
Yuan, Baoyin; Tang, Sanyi; Cheke, Robert A.
Fragmented habitat patches between which plants and animals can disperse can be modeled as networks with varying degrees of connectivity. A predator-prey model with network structures is proposed for integrated pest management (IPM) with impulsive control actions. The model was analyzed using numerical methods to investigate how factors such as the impulsive period, the releasing constant of natural enemies and the mode of connections between the patches affect pest outbreak patterns and the success or failure of pest control. The concept of the cluster as defined by Holland and Hastings is used to describe variations in results ranging from global synchrony, when all patches have identical fluctuations, to n-cluster solutions with all patches having different dynamics. Heterogeneity in the initial densities of either pest or natural enemy generally resulted in a variety of cluster oscillations. Surprisingly, if n > 1, the clusters fall into two groups, one with low-amplitude fluctuations and the other with high-amplitude fluctuations (i.e. duality in phase space), implying that control actions radically alter the system's characteristics by inducing duality and more complex dynamics. When the impulsive period is small enough, i.e. the control strategy is undertaken frequently, the pest can be eradicated. As the period increases, the pest's dynamics shift from a steady state to become chaotic with periodic windows, and more multicluster oscillations arise for heterogeneous initial density distributions. Period-doubling bifurcations and period-halving cascades occur as the releasing constant of the natural enemy increases. For the same ecological system with five differently connected networks, as the randomness of the connectedness increases, the transient duration becomes smaller and the probability of multicluster oscillations appearing becomes higher.
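A single-patch caricature of the impulsive control scheme makes the structure concrete: continuous predator-prey dynamics between pulses, with a fixed release of natural enemies at every impulsive period T. All rate constants, the Lotka-Volterra form, and the forward-Euler integration below are illustrative assumptions, not the paper's network model or parameterization.

```python
import numpy as np

def simulate_ipm(T=4.0, release=3.0, horizon=60.0, dt=0.001,
                 r=1.0, a=0.8, c=0.5, m=0.4, x0=1.0, y0=0.5):
    """Single-patch sketch: Lotka-Volterra pest (x) / natural-enemy (y)
    dynamics integrated by forward Euler, with `release` enemies added
    impulsively every T time units."""
    steps_per_pulse = int(round(T / dt))
    n_steps = int(round(horizon / dt))
    x, y = x0, y0
    peak_pest = x
    for k in range(n_steps):
        dx = (r - a * y) * x          # pest growth minus predation
        dy = (c * a * x - m) * y      # enemy growth from predation minus death
        x += dt * dx
        y += dt * dy
        if (k + 1) % steps_per_pulse == 0:
            y += release              # impulsive release of natural enemies
        peak_pest = max(peak_pest, x)
    return x, y, peak_pest

x_end, y_end, peak = simulate_ipm()
```

With a sufficiently large releasing constant or short impulsive period the pest density is driven toward zero, mirroring the eradication regime described above; weakening either control parameter lets the pest rebound between pulses.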
NASA Astrophysics Data System (ADS)
MacBean, N.; Disney, M.; Gomez-Dans, J.; Lewis, P.; Ineson, P.
2010-12-01
Peatlands are important stores of carbon through the partial decomposition of organic matter. However peatlands not only sequester CO2 but they are the main natural source of methane (CH4) due to anaerobic microbial activity under waterlogged conditions. Northern wetlands contribute about 35 Tg CH4 yr-1 [1]. The uncertainty on this estimate is large (from 1 mg CH4 m-2 y-1 to 2200 mg CH4 m-2 y-1), therefore there is a need to better quantify CH4 emissions and their role in the net carbon balance of peatlands. A correct representation of the hydrology of the system is necessary for modelling CH4 flux as the water table depth controls the area where methanogenic bacteria are active. One of the key variables in the calculation of water table depth is the soil moisture. Soil moisture also affects the decomposition rates of carbon in the soil and influences the water and energy fluxes at the land surface-atmosphere boundary. Microwave measurements of surface soil moisture from satellites can theoretically be used to improve estimates predicted by models. Results from an Observing System Simulation Experiment (OSSE), designed to investigate how observations from satellites may be able to constrain modelled carbon fluxes, are presented. An adapted version of the Carnegie-Ames-Stanford Approach (CASA) model [2] is used that includes a representation of methane dynamics [3]. Synthetic satellite observations of soil moisture are used to update model estimates using a Metropolis-Hastings Markov chain Monte Carlo (MCMC) approach. The effect of temporal frequency and spacing, and precision of observations, is examined with a view to establishing the set of observations that would make a significant improvement in model uncertainty. The results are compared with the system characteristics of existing satellite soil moisture measurements. We believe this is the first attempt to assimilate surface soil moisture into an ecosystem model that includes a full representation of CH4 flux.
Fit for high altitude: are hypoxic challenge tests useful?
Matthys, Heinrich
2011-01-01
Altitude travel results in acute variations of barometric pressure, which induce different degrees of hypoxia, changing the gas contents in body tissues and cavities. Non-ventilated air-containing cavities may induce barotrauma of the lung (pneumothorax), sinuses and middle ear, with pain, vertigo and hearing loss. Commercial airplanes keep their cabin pressure at an equivalent altitude of about 2,500 m. This leads to an increased respiratory drive, which may also result in symptoms of emotional hyperventilation. In patients with preexisting respiratory pathology due to lung, cardiovascular, pleural, thoracic neuromuscular or obesity-related diseases (i.e. obstructive sleep apnea), an additional hypoxic stress may induce respiratory pump and/or heart failure. Clinical pre-altitude assessment must be disease-specific, and it includes spirometry, pulse oximetry, ECG, and pulmonary and systemic hypertension assessment. In patients with abnormal values we need, in addition, measurements of hemoglobin, pH, base excess, PaO2, and PaCO2 to evaluate whether O2 and CO2 transport is sufficient. Instead of the hypoxia altitude simulation test (HAST), which is not without danger for patients with respiratory insufficiency, we prefer primarily a hyperoxic challenge. The supplementation of normobaric O2 gives us information on the acute reversibility of the arterial hypoxemia and the reduction of ventilation and pulmonary hypertension, as well as about the efficiency of the additional O2 flow needed during altitude exposure. For difficult judgements the performance of the test in a hypobaric chamber, with and without supplemental O2 breathing, remains the gold standard. The increasing numbers of drugs to treat acute pulmonary hypertension due to altitude exposure (acetazolamide, dexamethasone, nifedipine, sildenafil) or to other etiologies (anticoagulants, prostanoids, phosphodiesterase-5-inhibitors, endothelin receptor antagonists), including mechanical aids to reduce periodical or
The Brief Kinesthesia test is feasible and sensitive: a study in stroke
Borstad, Alexandra; Nichols-Larsen, Deborah S.
2016-01-01
BACKGROUND: Clinicians lack a quantitative measure of kinesthetic sense, an important contributor to sensorimotor control of the hand and arm. OBJECTIVES: The objective here was to determine the feasibility of administering the Brief Kinesthesia Test (BKT) and begin to validate it by 1) reporting BKT scores from persons with chronic stroke and a healthy comparison group and 2) examining the relationship between the BKT scores and other valid sensory and motor measures. METHOD: Adults with stroke and mild to moderate hemiparesis (N=12) and an age-, gender-, and handedness-matched healthy comparison group (N=12) completed the BKT by reproducing three targeted reaching movements per hand with vision occluded. OTHER MEASURES: The Hand Active Sensation Test (HASTe), Touch-Test(tm) monofilament aesthesiometer, 6-item Wolf Motor Function Test (Wolf), the Motor Activity Log (MAL), and the Box and Blocks Test (BBT). A paired t-test compared BKT scores between groups. Pearson product-moment correlation coefficients assessed the relationship between BKT scores and other measures. RESULTS: Post-stroke participants performed more poorly on the BKT than comparison participants with both their contralesional and ipsilesional upper extremities. The mean difference for the contralesional upper extremity was 3.7 cm (SE=1.1, t=3.34; p<0.008). The BKT score for the contralesional limb was strongly correlated with the MAL-how much (r=0.84, p=0.001), the MAL-how well (r=0.76, p=0.007), the Wolf (r=0.69, p=0.02), and the BBT (r=0.77, p=0.006). CONCLUSIONS: The BKT was feasible to administer and sensitive to differences in reaching accuracy between persons with stroke and a comparison group. With further refinement, the BKT may become a valuable clinical measure of post-stroke kinesthetic impairment. PMID:26786083
Quantum dot transport in soil, plants, and insects.
Al-Salim, Najeh; Barraclough, Emma; Burgess, Elisabeth; Clothier, Brent; Deurer, Markus; Green, Steve; Malone, Louise; Weir, Graham
2011-08-01
Environmental risk assessment of nanomaterials requires information not only on their toxicity to non-target organisms, but also on their potential exposure pathways. Here we report on the transport and fate of quantum dots (QDs) in the total environment: from soils, through their uptake into plants, to their passage through insects following ingestion. Our QDs are nanoparticles with an average particle size of 6.5 nm. Breakthrough curves obtained with CdTe/mercaptopropionic acid QDs applied to columns of top soil from a New Zealand organic apple orchard, a Hastings silt loam, showed there to be preferential flow through the soil's macropores. Yet the effluent recovery of QDs was just 60%, even after several pore volumes, indicating that about 40% of the influent QDs were filtered and retained by the soil column via some unknown exchange/adsorption/sequestration mechanism. Glycine-, mercaptosuccinic acid-, cysteine-, and amine-conjugated CdSe/ZnS QDs were visibly transported to a limited extent in the vasculature of ryegrass (Lolium perenne), onion (Allium cepa) and chrysanthemum (Chrysanthemum sp.) plants when cut stems were placed in aqueous QD solutions. However, they were not seen to be taken up at all by rooted whole plants of ryegrass, onion, or Arabidopsis thaliana placed in these solutions. Leafroller (Lepidoptera: Tortricidae) larvae fed with these QDs for two or four days showed fluorescence along the entire gut, in their frass (larval feces), and, at a lower intensity, in their haemolymph. Fluorescent QDs were also observed and elevated cadmium levels detected inside the bodies of adult moths that had been fed QDs as larvae. These results suggest that exposure scenarios for QDs in the total environment could be quite complex and variable in each environmental domain. PMID:21632093
Kramerov, A A; Golub, A G; Bdzhola, V G; Yarmoluk, S M; Ahmed, K; Bretner, M; Ljubimov, A V
2011-03-01
Ubiquitous protein kinase CK2 is a key regulator of cell migration, proliferation and tumor growth. CK2 is abundant in retinal astrocytes, and its inhibition suppresses retinal neovascularization in a mouse retinopathy model. In human astrocytes, CK2 co-distributes with GFAP-containing intermediate filaments, which implies its association with the cytoskeleton. Contrary to astrocytes, CK2 is co-localized in microvascular endothelial cells (HBMVEC) with microtubules and actin stress fibers, but not with vimentin-containing intermediate filaments. Specific CK2 inhibitors (TBB, TBI, TBCA and DMAT) and nine novel CK2 inhibiting compounds (TID43, TID46, Quinolone-7, Quinolone-39, FNH28, FNH62, FNH64, FNH68 and FNH74) were tested at 10-200 μM for their ability to induce morphological alterations in cultured human astrocytes (HAST-40), and HBMVEC (For explanation of the inhibitor names, see "Methods" section). CK2 inhibitors caused dramatic changes in shape of cultured cells with effective inhibitor concentrations between 50 and 100 μM. Attached cells retracted, acquired shortened processes, and eventually rounded up and detached. CK2 inhibitor-induced morphological alterations were completely reversible and were not blocked by caspase inhibition. However, longer treatment or higher inhibitor concentration did cause apoptosis. The speed and potency of the CK2 inhibitors' effects on cell shape and adhesion were inversely correlated with serum concentration. Western analyses showed that TBB and TBCA elicited a significant (about twofold) increase in the activation of p38 and ERK1/2 MAP kinases that may be involved in cytoskeleton regulation. This novel early biological cell response to CK2 inhibition may underlie the anti-angiogenic effect of CK2 suppression in the retina. PMID:21125314
Markov random field restoration of point correspondences for active shape modeling
NASA Astrophysics Data System (ADS)
Hilger, Klaus B.; Paulsen, Rasmus R.; Larsen, Rasmus
2004-05-01
In this paper it is described how to build a statistical shape model using a training set with a sparse set of landmarks. A well-defined model mesh is selected and fitted to all shapes in the training set using thin plate spline warping. This is followed by a projection of the points of the warped model mesh to the target shapes. When this is done by a nearest neighbour projection it can result in folds and inhomogeneities in the correspondence vector field. The novelty in this paper is the use and extension of a Markov random field regularisation of the correspondence field. The correspondence field is regarded as a collection of random variables, and using the Hammersley-Clifford theorem it is proved that it can be treated as a Markov random field. The problem of finding the optimal correspondence field is cast into a Bayesian framework for Markov random field restoration, where the prior distribution is a smoothness term and the observation model is the curvature of the shapes. The Markov random field is optimised using a combination of Gibbs sampling and the Metropolis-Hastings algorithm. The parameters of the model are found using a leave-one-out approach. The method leads to a generative model that produces highly homogeneous polygonised shapes with improved reconstruction capabilities of the training data. Furthermore, the method leads to an overall reduction in the total variance of the resulting point distribution model. The method is demonstrated on a set of human ear canals extracted from 3D laser scans.
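The Metropolis-Hastings optimisation used in this abstract (and in several records below) follows the standard accept-reject recipe: propose a perturbation of the current state, then accept it with a probability given by the ratio of target densities. As a minimal, generic sketch in plain Python on a toy one-dimensional target (not code from the paper itself), a random-walk Metropolis-Hastings sampler looks like:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=1.0, rng=random):
    """Random-walk Metropolis-Hastings with symmetric Gaussian proposals.

    Accepts a proposed state x' with probability min(1, p(x')/p(x)),
    computed in log space for numerical stability.
    """
    x = x0
    lp = log_target(x)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step)
        lp_new = log_target(x_new)
        # Symmetric proposal: the Hastings correction cancels, leaving
        # only the ratio of target densities.
        if math.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples.append(x)
    return samples

if __name__ == "__main__":
    random.seed(7)
    # Toy target: standard normal, log-density known up to a constant.
    chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=3.0, n_steps=20000)
    kept = chain[2000:]  # discard burn-in
    mean = sum(kept) / len(kept)
    var = sum((s - mean) ** 2 for s in kept) / len(kept)
    print("posterior mean/var estimates:", round(mean, 2), round(var, 2))
```

Because the Gaussian random-walk proposal is symmetric, only the target ratio appears in the acceptance test; an asymmetric proposal q would additionally require the factor q(x|x')/q(x'|x).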
Emulation of MIROC5 with a simple climate model
NASA Astrophysics Data System (ADS)
Ishizaki, Yasuhiro; Emori, Seita; Shiogama, Hideo; Takahashi, Kiyoshi; Yokohata, Tokuta; Yoshimori, Masakazu
2014-05-01
We developed a simple climate model based on MAGICC6, and investigated the ability of the simple climate model to emulate global mean surface air temperature (SAT) changes of an atmosphere-ocean general circulation model (MIROC5) in the twenty-first century under representative concentration pathways (RCPs). Previous research indicated that climate sensitivity, ocean vertical diffusion and the forcing of anthropogenic aerosols (direct and indirect effects of sulfate aerosol, black carbon and organic carbon) are important factors in emulating global mean SAT changes of the CMIP3 atmosphere-ocean general circulation models. We therefore estimated these important parameters in the simple climate model using a Metropolis-Hastings Markov chain Monte Carlo (MCMC) approach. The values of the important parameters estimated by the MCMC are physically valid, and our simple climate model can successfully emulate global mean SAT changes of MIROC5 in the RCPs with the parameters estimated by the MCMC approach. In addition, we estimated the relative contributions of each important parameter in sensitivity experiments, in which we changed the value of an important parameter from the one estimated by the MCMC to the default value of MAGICC6. As a result, we found that the estimation of climate sensitivity is the most important factor for the emulation of the AOGCM, and the estimation of ocean vertical diffusion is also an important factor. Although the estimation of the anthropogenic aerosol forcing is very important for the emulation of the AOGCM in the twentieth century, its influence on the emulation of the AOGCM in the twenty-first century is very small. This is because emissions of anthropogenic aerosols are projected to decrease in the twenty-first century, and the relative contributions of the forcing of anthropogenic aerosols also decrease. Carbon cycle models are not yet incorporated into our simple climate model. A sophisticated carbon cycle model is required to be incorporated into
Zhang, Linlin; Guindani, Michele; Versace, Francesco; Vannucci, Marina
2014-07-15
In this paper we present a novel wavelet-based Bayesian nonparametric regression model for the analysis of functional magnetic resonance imaging (fMRI) data. Our goal is to provide a joint analytical framework that allows us to detect regions of the brain which exhibit neuronal activity in response to a stimulus and, simultaneously, infer the association, or clustering, of spatially remote voxels that exhibit fMRI time series with similar characteristics. We start by modeling the data with a hemodynamic response function (HRF) with a voxel-dependent shape parameter. We detect regions of the brain activated in response to a given stimulus by using mixture priors with a spike at zero on the coefficients of the regression model. We account for the complex spatial correlation structure of the brain by using a Markov random field (MRF) prior on the parameters guiding the selection of the activated voxels, therefore capturing correlation among nearby voxels. In order to infer association of the voxel time courses, we assume correlated errors, in particular long memory, and exploit the whitening properties of discrete wavelet transforms. Furthermore, we achieve clustering of the voxels by imposing a Dirichlet process (DP) prior on the parameters of the long memory process. For inference, we use Markov Chain Monte Carlo (MCMC) sampling techniques that combine Metropolis-Hastings schemes employed in Bayesian variable selection with sampling algorithms for nonparametric DP models. We explore the performance of the proposed model on simulated data, with both block- and event-related design, and on real fMRI data. PMID:24650600
Changes in the social context and conduct of eating in four Nordic countries between 1997 and 2012.
Holm, Lotte; Lauridsen, Drude; Lund, Thomas Bøker; Gronow, Jukka; Niva, Mari; Mäkelä, Johanna
2016-08-01
How have eating patterns changed in modern life? In public and academic debate concern has been expressed that the social function of eating may be challenged by de-structuration and the dissolution of traditions. We analyzed changes in the social context and conduct of eating in four Nordic countries over the period 1997-2012. We focused on three interlinked processes often claimed to be distinctive of modern eating: delocalization of eating from private households to commercial settings, individualization in the form of more eating alone, and informalization, implying more casual codes of conduct. We based the analysis on data from two surveys conducted in Denmark, Finland, Norway and Sweden in 1997 and 2012. The surveys reported in detail one day of eating in representative samples of adult populations in the four countries (N = 4823 and N = 8242). We compared data regarding where, with whom, and for how long people ate, and whether parallel activities took place while eating. While Nordic people's primary location for eating remained the home and the workplace, the practices of eating in haste and eating while watching television increased, and using tablets, computers and smartphones while eating was frequent in 2012. Propensity to eat alone increased slightly in Denmark and Norway, and decreased slightly in Sweden. While such practices vary with socio-economic background, regression analysis showed that several changes were common across the Nordic populations. However, the new practice of using tablets, computers, and smartphones while eating was strongly associated with young age. Further, each of the practices appeared to be related to different types of meal. We conclude that while the changes in the social organization of eating were not dramatic, signs of individualization and informalization could be detected. PMID:27131417
Bayesian model selection validates a biokinetic model for zirconium processing in humans
2012-01-01
Background In radiation protection, biokinetic models for zirconium processing are of crucial importance in dose estimation and further risk analysis for humans exposed to this radioactive substance. They provide limiting values of detrimental effects and build the basis for applications in internal dosimetry, the prediction for radioactive zirconium retention in various organs as well as retrospective dosimetry. Multi-compartmental models are the tool of choice for simulating the processing of zirconium. Although easily interpretable, determining the exact compartment structure and interaction mechanisms is generally daunting. In the context of observing the dynamics of multiple compartments, Bayesian methods provide efficient tools for model inference and selection. Results We are the first to apply a Markov chain Monte Carlo approach to compute Bayes factors for the evaluation of two competing models for zirconium processing in the human body after ingestion. Based on in vivo measurements of human plasma and urine levels we were able to show that a recently published model is superior to the standard model of the International Commission on Radiological Protection. The Bayes factors were estimated by means of the numerically stable thermodynamic integration in combination with a recently developed copula-based Metropolis-Hastings sampler. Conclusions In contrast to the standard model the novel model predicts lower accretion of zirconium in bones. This results in lower levels of noxious doses for exposed individuals. Moreover, the Bayesian approach allows for retrospective dose assessment, including credible intervals for the initially ingested zirconium, in a significantly more reliable fashion than previously possible. All methods presented here are readily applicable to many modeling tasks in systems biology. PMID:22863152
Uncertainty in dual permeability model parameters for structured soils
Arora, B.; Mohanty, B. P.; McGuire, J. T.
2013-01-01
Successful application of dual permeability models (DPM) to predict contaminant transport is contingent upon measured or inversely estimated soil hydraulic and solute transport parameters. The difficulty in unique identification of parameters for the additional macropore- and matrix-macropore interface regions, and knowledge about requisite experimental data for DPM has not been resolved to date. Therefore, this study quantifies uncertainty in dual permeability model parameters of experimental soil columns with different macropore distributions (single macropore, and low- and high-density multiple macropores). Uncertainty evaluation is conducted using adaptive Markov chain Monte Carlo (AMCMC) and conventional Metropolis-Hastings (MH) algorithms while assuming 10 out of 17 parameters to be uncertain or random. Results indicate that AMCMC resolves parameter correlations and exhibits fast convergence for all DPM parameters while MH displays large posterior correlations for various parameters. This study demonstrates that the choice of parameter sampling algorithms is paramount in obtaining unique DPM parameters when information on covariance structure is lacking, or else additional information on parameter correlations must be supplied to resolve the problem of equifinality of DPM parameters. This study also highlights the placement and significance of the matrix-macropore interface in flow experiments of soil columns with different macropore densities. Histograms for certain soil hydraulic parameters display tri-modal characteristics implying that macropores are drained first, followed by the interface region, and then by pores of the matrix domain in drainage experiments. Results indicate that the hydraulic properties and behavior of the matrix-macropore interface are not only a function of the saturated hydraulic conductivity of the macropore-matrix interface (Ksa) and macropore tortuosity (lf) but also of other parameters of the matrix and macropore domains. PMID:24478531
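The advantage that the abstract above attributes to adaptive MCMC comes from re-estimating the proposal covariance from the chain history, so that correlated parameters are proposed jointly, which a fixed-proposal Metropolis-Hastings sampler handles poorly. A minimal two-parameter sketch in the spirit of adaptive Metropolis (illustrative only; the study's own sampler, parameters, and adaptation schedule are not reproduced here):

```python
import math
import random

def cholesky2(a, b, d):
    """Cholesky factor of a 2x2 covariance matrix [[a, b], [b, d]]."""
    l11 = math.sqrt(a)
    l21 = b / l11
    l22 = math.sqrt(max(d - l21 * l21, 1e-12))
    return l11, l21, l22

def empirical_cov(chain):
    """Sample covariance of a list of 2-D points."""
    n = len(chain)
    m0 = sum(p[0] for p in chain) / n
    m1 = sum(p[1] for p in chain) / n
    c00 = sum((p[0] - m0) ** 2 for p in chain) / n
    c11 = sum((p[1] - m1) ** 2 for p in chain) / n
    c01 = sum((p[0] - m0) * (p[1] - m1) for p in chain) / n
    return [[c00, c01], [c01, c11]]

def adaptive_metropolis(log_target, x0, n_steps, adapt_after=500, rng=random):
    """Adaptive Metropolis in 2-D: the Gaussian proposal covariance is
    periodically re-estimated from the chain history (scaled by the
    standard 2.38^2/d factor), so correlated parameters mix well."""
    x = list(x0)
    lp = log_target(x)
    chain = [list(x)]
    cov = [[1.0, 0.0], [0.0, 1.0]]
    scale = 2.38 ** 2 / 2.0
    for step in range(n_steps):
        if step >= adapt_after and step % 100 == 0:
            cov = empirical_cov(chain)
        l11, l21, l22 = cholesky2(scale * cov[0][0] + 1e-6,
                                  scale * cov[0][1],
                                  scale * cov[1][1] + 1e-6)
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        prop = [x[0] + l11 * z1, x[1] + l21 * z1 + l22 * z2]
        lp_prop = log_target(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(list(x))
    return chain

if __name__ == "__main__":
    random.seed(11)
    # Toy target: bivariate normal with correlation 0.9.
    rho = 0.9
    pre = 1.0 / (1.0 - rho * rho)
    chain = adaptive_metropolis(
        lambda p: -0.5 * pre * (p[0] ** 2 - 2 * rho * p[0] * p[1] + p[1] ** 2),
        x0=(0.0, 0.0), n_steps=20000)
    c = empirical_cov(chain[5000:])
    print("sampled correlation:", round(c[0][1] / math.sqrt(c[0][0] * c[1][1]), 2))
```

On this toy target the adapted proposal aligns with the 0.9-correlated posterior, whereas an isotropic random walk would need a much smaller step size to achieve comparable acceptance.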
NASA Astrophysics Data System (ADS)
Gehrmann, Romina A. S.; Schwalenberg, Katrin; Riedel, Michael; Spence, George D.; Spieß, Volkhard; Dosso, Stan E.
2016-01-01
This paper applies nonlinear Bayesian inversion to marine controlled source electromagnetic (CSEM) data collected near two sites of the Integrated Ocean Drilling Program (IODP) Expedition 311 on the northern Cascadia Margin to investigate subseafloor resistivity structure related to gas hydrate deposits and cold vents. The Cascadia margin, off the west coast of Vancouver Island, Canada, has a large accretionary prism where sediments are under pressure due to convergent plate boundary tectonics. Gas hydrate deposits and cold vent structures have previously been investigated by various geophysical methods and seabed drilling. Here, we invert time-domain CSEM data collected at Sites U1328 and U1329 of IODP Expedition 311 using Bayesian methods to derive subsurface resistivity model parameters and uncertainties. The Bayesian information criterion is applied to determine the amount of structure (number of layers in a depth-dependent model) that can be resolved by the data. The parameter space is sampled with the Metropolis-Hastings algorithm in principal-component space, utilizing parallel tempering to ensure wider and efficient sampling and convergence. Nonlinear inversion allows analysis of uncertain acquisition parameters such as time delays between receiver and transmitter clocks as well as input electrical current amplitude. Marginalizing over these instrument parameters in the inversion accounts for their contribution to the geophysical model uncertainties. One-dimensional inversion of time-domain CSEM data collected at measurement sites along a survey line allows interpretation of the subsurface resistivity structure. The data sets can be generally explained by models with 1 to 3 layers. Inversion results at U1329, at the landward edge of the gas hydrate stability zone, indicate a sediment unconformity as well as potential cold vents which were previously unknown. The resistivities generally increase upslope due to sediment erosion along the slope. Inversion
Non-linearity in Bayesian 1-D magnetotelluric inversion
NASA Astrophysics Data System (ADS)
Guo, Rongwen; Dosso, Stan E.; Liu, Jianxin; Dettmer, Jan; Tong, Xiaozhong
2011-05-01
This paper applies a Bayesian approach to examine non-linearity for the 1-D magnetotelluric (MT) inverse problem. In a Bayesian formulation the posterior probability density (PPD), which combines data and prior information, is interpreted in terms of parameter estimates and uncertainties, which requires optimizing and integrating the PPD. Much work on 1-D MT inversion has been based on (approximate) linearized solutions, but more recently fully non-linear (numerical) approaches have been applied. This paper directly compares results of linearized and non-linear uncertainty estimation for 1-D MT inversion; to do so, advanced methods for both approaches are applied. In the non-linear formulation used here, numerical optimization is carried out using an adaptive-hybrid algorithm. Numerical integration applies Metropolis-Hastings sampling, rotated to a principal-component parameter space for efficient sampling of correlated parameters, and employing non-unity sampling temperatures to ensure global sampling. Since appropriate model parametrizations are generally not known a priori, both under- and overparametrized approaches are considered. For underparametrization, the Bayesian information criterion is applied to determine the number of layers consistent with the resolving power of the data. For overparametrization, prior information is included which favours simple structure in a manner similar to regularized inversion. The data variance and/or trade-off parameter regulating data and prior information are treated in several ways, including applying fixed optimal estimates (an empirical Bayesian approach) or including them as hyperparameters in the sampling (hierarchical Bayesian). The latter approach has the benefit of accounting for the uncertainty in the hyperparameters in estimating model parameter uncertainties. Non-linear and linearized inversion results are compared for synthetic test cases and for the measured COPROD1 MT data by considering marginal probability
An implementation of differential evolution algorithm for inversion of geoelectrical data
NASA Astrophysics Data System (ADS)
Balkaya, Çağlayan
2013-11-01
Differential evolution (DE), a population-based evolutionary algorithm (EA), has been implemented to invert self-potential (SP) and vertical electrical sounding (VES) data sets. The algorithm uses three operators including mutation, crossover and selection, similar to a genetic algorithm (GA). Mutation is the most important operator for the success of DE. Three commonly used mutation strategies including DE/best/1 (strategy 1), DE/rand/1 (strategy 2) and DE/rand-to-best/1 (strategy 3) were applied together with a binomial type crossover. The evolution cycle of DE was realized without boundary constraints. For the test studies performed with SP data, in addition to both noise-free and noisy synthetic data sets, two field data sets observed over the sulfide ore body in the Malachite mine (Colorado) and over the ore bodies in the Neem-Ka Thana copper belt (India) were considered. VES test studies were carried out using synthetically produced resistivity data representing a three-layered earth model and a field data set example from Gökçeada (Turkey), which displays a seawater infiltration problem. The mutation strategies mentioned above were also extensively tested on both synthetic and field data sets under consideration. Of these, strategy 1 was found to be the most effective strategy for the parameter estimation by providing lower computational cost together with good accuracy. The solutions obtained by DE for the synthetic cases of SP were quite consistent with particle swarm optimization (PSO), which is a more widely used population-based optimization algorithm than DE in geophysics. Estimated parameters of SP and VES data were also compared with those obtained from the Metropolis-Hastings (M-H) sampling algorithm based on simulated annealing (SA) without cooling to clarify uncertainties in the solutions. Comparison to the M-H algorithm shows that DE performs a fast approximate posterior sampling for the case of low-dimensional inverse geophysical problems.
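The DE mutation strategies named above all build a mutant vector from scaled differences of population members, e.g. DE/rand/1 uses v = x_r1 + F*(x_r2 - x_r3), followed by binomial crossover and greedy selection. A schematic Python implementation of DE/rand/1/bin on a toy objective (population size, F and CR here are generic placeholders, not the values tuned in the study; like the study's evolution cycle, no boundary constraints are enforced after initialization):

```python
import random

def differential_evolution(obj, bounds, pop_size=20, f=0.8, cr=0.9,
                           gens=200, rng=random):
    """DE/rand/1 with binomial crossover and greedy selection.

    Mutant: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct
    random members different from the current index i.
    """
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [obj(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # ensure at least one mutant gene
            trial = [pop[r1][j] + f * (pop[r2][j] - pop[r3][j])
                     if (rng.random() < cr or j == jrand) else pop[i][j]
                     for j in range(dim)]
            ft = obj(trial)
            if ft <= fit[i]:  # greedy selection: keep the better vector
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

if __name__ == "__main__":
    random.seed(3)
    # Toy objective: 2-D sphere function, minimum at the origin.
    best, val = differential_evolution(lambda x: x[0] ** 2 + x[1] ** 2,
                                       bounds=[(-5.0, 5.0), (-5.0, 5.0)])
    print("best value:", val)
```

Switching to DE/best/1 (the study's strategy 1) only changes the base vector: replace `pop[r1]` with the current best member, which typically speeds convergence at some risk of premature stagnation.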
Segmentation of polycystic kidneys from MR images
NASA Astrophysics Data System (ADS)
Racimora, Dimitri; Vivier, Pierre-Hugues; Chandarana, Hersh; Rusinek, Henry
2010-03-01
Polycystic kidney disease (PKD) is a disorder characterized by the growth of numerous fluid-filled cysts in the kidneys. Measuring cystic kidney volume is thus crucial to monitoring the evolution of the disease. While T2-weighted MRI delineates the organ, automatic segmentation is very difficult due to highly variable shape and image contrast. The interactive stereology methods used currently involve a compromise between segmentation accuracy and time. We have investigated semi-automated methods: active contours and a sub-voxel morphology based algorithm. Coronal T2-weighted images of 17 patients were acquired in four breath-holds using the HASTE sequence on a 1.5 Tesla MRI unit. The segmentation results were compared to ground truth kidney masks obtained as a consensus of experts. The automatic active contour algorithm yielded an average 22% +/- 8.6% volume error. A recently developed method (Bridge Burner) based on thresholding and constrained morphology failed to separate PKD from the spleen, yielding 37.4% +/- 8.7% volume error. Manual post-editing reduced the volume error to 3.2% +/- 0.8% for active contours and 3.2% +/- 0.6% for Bridge Burner. The total time (automated algorithm plus editing) was 15 min +/- 5 min for active contours and 19 min +/- 11 min for Bridge Burner. The average volume errors for the stereology method were 5.9%, 6.2%, and 5.4% for mesh sizes of 6.6, 11, and 16.5 mm. The average processing times were 17, 7, and 4 min. These results show that a nearly two-fold improvement in PKD segmentation accuracy over the stereology technique can be achieved with a combination of active contours and post-editing.
Model and parameter uncertainty in IDF relationships under climate change
NASA Astrophysics Data System (ADS)
Chandra, Rupa; Saha, Ujjwal; Mujumdar, P. P.
2015-05-01
Quantifying distributional behavior of extreme events is crucial in hydrologic designs. Intensity Duration Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. Major sources of uncertainty in the IDF relationships are insufficient quantity and quality of data, leading to parameter uncertainty due to the distribution fitted to the data, and uncertainty as a result of using multiple GCMs. It is important to study these uncertainties and propagate them to the future for accurate assessment of future return levels. The objective of this study is to quantify the uncertainties arising from parameters of the distribution fitted to data and the multiple GCM models using a Bayesian approach. The posterior distribution of parameters is obtained from Bayes' rule and the parameters are transformed to obtain return levels for a specified return period. A Markov Chain Monte Carlo (MCMC) method using the Metropolis-Hastings algorithm is used to obtain the posterior distribution of parameters. Twenty-six CMIP5 GCMs along with four RCP scenarios are considered for studying the effects of climate change and to obtain projected IDF relationships for the case study of Bangalore city in India. GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique along with the parameter uncertainty. Scale invariance theory is employed for obtaining short duration return levels from daily data. It is observed that the uncertainty in short duration rainfall return levels is high when compared to the longer durations. Further, it is observed that parameter uncertainty is large compared to the model uncertainty.
NASA Astrophysics Data System (ADS)
Levia, Delphis
2015-04-01
advisory Environmental Council include Drs. Delphis Levia (Program Director & Chair), Nancy Targett (Dean), Frank Newton, Tracy Deliberty, Steve Hastings, John Madsen, Paul Imhoff, Jan Johnson, Jerry Kauffman, Murray Johnston.
Hardage, Bob
2013-07-01
This 3-year project was terminated at the end of Year 1 because the DOE Geothermal project-evaluation committee decided one Milestone was not met and also concluded that our technology would not be successful. The Review Panel recommended a "no-go" decision be implemented by DOE. The Principal Investigator and his research team disagreed with the conclusions reached by the DOE evaluation committee and wrote a scientifically based rebuttal to the erroneous claims made by the evaluators. We were not told if our arguments were presented to the people who evaluated our work and made the "no-go" decision. Whatever the case regarding the information we supplied in rebuttal, we received an official letter from Laura Merrick, Contracting Officer at the Golden Field Office, dated June 11, 2013, in which we were informed that project funding would cease and were instructed to prepare a final report before September 5, 2013. In spite of the rebuttal arguments we presented to DOE, this official letter repeated the conclusions of the Review Panel that we had already proven to be incorrect. This is the final report that we are expected to deliver. The theme of this report will be another rebuttal of the technical deficiencies claimed by the DOE Geothermal Review Panel about the value and accomplishments of the work we did in Phase 1 of the project. The material in this report will present images made from direct-S modes produced by vertical-force sources using the software and research findings we developed in Phase 1 that the DOE Review Panel said would not be successful. We made these images in great haste when we were informed that DOE Geothermal rejected our rebuttal arguments and still regarded our technical work to be substandard. We thought it was more important to respond quickly rather than to take additional time to create better quality images than what we present in this Final Report.
Probabilistic Graphical Model Representation in Phylogenetics
Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.
2014-01-01
Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559
CONSERT during the Philae Descent
NASA Astrophysics Data System (ADS)
Herique, Alain; Berquin, Yann; Blazquez, Alejandro; Antoine Foulon, Marc; Hahnel, Ronny; Hegler, Sebastian; Jurado, Eric; Kofman, Wlodek; Plettemeier, Dirk; Rogez, Yves; Statz, Christoph; Zine, Sonia
2014-05-01
The CONSERT experiment on board Rosetta and Philae will perform tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta spacecraft to the Philae lander, using the rotation of the 67P nucleus to cover different geometries. CONSERT will also operate during the Philae descent. This geometry differs strongly from the "nominal" bistatic tomography, in which the orbiter is on the opposite side of the nucleus with respect to the lander. During the descent, CONSERT will measure the direct wave propagating from orbiter to lander as well as waves reflected and scattered by the 67P surface and subsurface. This signal will provide information of the greatest interest for both scientific investigations of 67P and technical operations of Philae. The landing site position is known a priori only within a large dispersion ellipse, owing to uncertainties in the Rosetta velocity and in the Rosetta/Philae separation strength; this dispersion is further increased by the difference between the nominal and emergency separation strengths. An accurate estimate of the landing position as soon as possible after landing is of the greatest interest for optimizing Philae operations during the FSS. The propagation delays of the direct and reflected waves measured by CONSERT will therefore help to reconstruct the descent geometry and thus estimate the landing position more precisely. The reflected signal is determined by the surface properties: dielectric permittivity, roughness and layering. Inversion of the signal power will allow mapping of surface properties, especially in the vicinity of the landing site. This paper details the measurement configuration. It presents the data retrieval, based on Monte Carlo simulation using the Metropolis-Hastings algorithm, and the expected performance for both science and operations.
NASA Astrophysics Data System (ADS)
Naylor, Mark; Mudd, Simon; Yoo, Kyungsoo
2010-05-01
Hillslope topography and soil thickness respond to changes in river incision or deposition. For example, accelerated river incision leads to a wave of steepening and soil thinning that begins at the channel and moves upslope [1]. Because of the coupled response of topography, soil thickness and channel incision or deposition rates, it may be possible to use hillslope properties to reconstruct the erosional or depositional history of channels. A prerequisite for such inversion of hillslope properties to reconstruct historical landscape dynamics is a method that quantifies both the most likely channel history and the uncertainties in changing channel erosion or deposition rates through time. Here we present robust methods ideally suited for this purpose: Markov chain Monte Carlo (MCMC) methods. Specifically, MCMC methods [2] involve (i) taking some assumed base-level history, (ii) perturbing that history, (iii) running a forward model to estimate new hillslope profiles, (iv) choosing whether to accept the new history using a Metropolis-Hastings algorithm [3], and (v) storing the favoured history and repeating. In this way we iterate towards the most likely channel history while exploring parameter space in such a way that confidence intervals can be quantified. We demonstrate how this approach returns not only a best-estimate history but also credibility intervals, which reflect the progressive loss of information with time. These techniques are generic and should be employed more generally within geomorphology. [1] Mudd, S.M., and D.J. Furbish (2007), Responses of soil mantled hillslopes to transient channel incision rates, Journal of Geophysical Research-Earth Surface, 112, F03S18, doi:10.1029/2006JF000516. [2] Gallagher, K., Charvin, K., Nielsen, S., Sambridge, M., and Stephenson, J. (2009), Markov chain Monte Carlo (MCMC) sampling methods to determine optimal models, model resolution and model choice for Earth Science problems, Marine and Petroleum Geology 26
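Steps (i)-(v) listed in this abstract are the generic Metropolis-Hastings recipe. As a sketch only, with a toy Gaussian log-posterior standing in for the forward hillslope model (the function names and the target are our assumptions, not the authors' code):

```python
import math
import random

random.seed(1)

def log_posterior(x):
    # Toy target: standard-normal log-density, a stand-in for the
    # likelihood of simulated hillslope profiles given a channel history.
    return -0.5 * x * x

def metropolis_hastings(n_steps, step=1.0, x0=0.0):
    """Steps (i)-(v): start from a history, perturb, evaluate, accept/reject, store."""
    x, logp = x0, log_posterior(x0)
    chain = []
    for _ in range(n_steps):
        x_new = x + random.gauss(0.0, step)   # (ii) perturb the current state
        logp_new = log_posterior(x_new)       # (iii) run the forward model
        # (iv) Metropolis-Hastings acceptance rule (symmetric proposal)
        if math.log(random.random()) < logp_new - logp:
            x, logp = x_new, logp_new
        chain.append(x)                       # (v) store the favoured state, repeat
    return chain

chain = metropolis_hastings(20000)
```

Credibility intervals of the kind the abstract describes are then read off as quantiles of `chain` after discarding a burn-in.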
Essentials in the diagnosis of acid-base disorders and their high altitude application.
Paulev, P E; Zubieta-Calleja, G R
2005-09-01
This report describes the historical development of the clinical application of chemical variables for the interpretation of acid-base disturbances. The pH concept was introduced as early as 1909. Following World War II, disagreements arose concerning the definition of acids and bases, and since then two strategies have been competing. Danish scientists in 1923 defined an acid as a substance able to give off a proton at a given pH, and a base as a substance that could bind a proton, whereas the North American Singer-Hastings school in 1948 defined acids as strong non-buffer anions and bases as non-buffer cations. As a consequence of this latter definition, electrolyte disturbances were mixed up with true acid-base disorders, and the variable strong ion difference (SID) was introduced as a measure of non-respiratory acid-base disturbances. However, the SID concept is only an empirical approximation. In contrast, the Astrup/Siggaard-Andersen school of scientists, using computer strategies and the Acid-Base Chart, has made diagnosis of acid-base disorders possible at a glance on the Chart, when the data are considered in the context of the clinical development. Siggaard-Andersen introduced Base Excess (BE) or Standard Base Excess (SBE) in the extracellular fluid volume (ECF), extended to include the red cell volume (eECF), as a measure of metabolic acid-base disturbances, and recently replaced it with the term Concentration of Titratable Hydrogen Ion (ctH). These two concepts (SBE and ctH) represent the same concentration difference, but with opposite signs. Three charts modified from the Siggaard-Andersen Acid-Base Chart are presented for use at low, medium and high altitudes of 2500 m, 3500 m, and 4000 m, respectively. In this context, the authors suggest the use of the Titratable Hydrogen Ion concentration Difference (THID) in the extended extracellular fluid volume, finding it efficient and better than any other determination of the metabolic component in acid
NASA Astrophysics Data System (ADS)
Stanaway, D. J.; Flores, A. N.; Haggerty, R.; Benner, S. G.; Feris, K. P.
2011-12-01
curves (BTC), subsequently modeled by the RRADE, thereby allowing derivation of in situ rates of metabolism. RRADE parameter values are estimated through Metropolis-Hastings MCMC optimization. Unknown prior parameter distributions (PD) were constrained via a sensitivity analysis, except for the empirically estimated velocity. MCMC simulations were initiated at random points within the PD. Convergence of the target distributions (TD) is achieved when the variance of the mode values of the six RRADE parameters across independent model replications is at least 10^{-3} less than the mode value. Convergence of k12, the parameter of interest, was more resolved, with the modal variance of replicate simulations ranging from 10^{-4} less than the modal value to 0. The MCMC algorithm presented here offers a robust approach to solving the inverse RRST model and could easily be adapted to other inverse problems.
Improving carbon model phenology using data assimilation
NASA Astrophysics Data System (ADS)
Exbrayat, Jean-François; Smallman, T. Luke; Bloom, A. Anthony; Williams, Mathew
2015-04-01
Carbon cycle dynamics are significantly impacted by ecosystem phenology, leading to substantial seasonal and inter-annual variation in the global carbon balance. Representing inter-annual variability is key to predicting the response of terrestrial ecosystems to climate change and disturbance. Existing terrestrial ecosystem models (TEMs) often struggle to accurately simulate observed inter-annual variability. TEMs often use different phenological models based on plant functional type (PFT) assumptions. Moreover, because of their high computational overhead, TEMs are unable to take advantage of globally available datasets for calibration. Here we describe the novel CARbon DAta MOdel fraMework (CARDAMOM) for data assimilation. CARDAMOM is used to calibrate the Data Assimilation Linked Ecosystem Carbon version 2 (DALEC2) model using Bayes' theorem within a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) scheme. CARDAMOM provides a framework which combines knowledge from observations, such as remotely sensed LAI, with heuristic information in the form of Ecological and Dynamical Constraints (EDCs). The EDCs represent real-world processes and constrain both parameter interdependencies and carbon dynamics. We used CARDAMOM to bring together globally spanning LAI datasets and the DALEC2 and DALEC2-GSI models. These analyses allow us to investigate the sensitivity of ecosystem processes to the representation of phenology. DALEC2 uses an analytically solved model of phenology which is invariant between years. In contrast, DALEC2-GSI uses a growing season index (GSI), calculated as a function of temperature, vapour pressure deficit (VPD) and photoperiod, to determine bud-burst and leaf senescence, allowing the model to simulate inter-annual variability in response to climate. Neither model makes any PFT assumptions about the phenological controls of a given ecosystem, allowing the data alone to determine the impact of the meteorological
NASA Astrophysics Data System (ADS)
Dornmayr-Pfaffenhuemer, Marion; Pierson, Elisabeth; Janssen, Geert-Jan; Stan-Lotter, Helga
2010-05-01
Research into extreme environments has important implications for biology and other sciences. Many of the organisms found there provide insights into the history of Earth. Life exists in all niches where water is present in liquid form. Isolated environments such as caves and other subsurface locations are of interest for geomicrobiological studies, and because of their "extra-terrestrial" conditions, such as darkness and mostly extreme physicochemical states, they are also of astrobiological interest. The slightly radioactive thermal spring at Bad Gastein (Austria) was therefore examined for the occurrence of subsurface microbial communities. The surfaces of the submerged rocks in this warm spring were overgrown by microbial mats. Scanning electron microscopy (SEM) performed by the late Dr. Wolfgang Heinen revealed an interesting morphological diversity in biofilms found in this environment (1, 2). Molecular analysis of the community structure of the radioactive subsurface thermal spring was performed by Weidler et al. (3). The growth of these mats was simulated using sterile glass slides which were exposed to the water stream of the spring. These mats were analysed microscopically. Staining with fluorescent dyes such as 4',6-diamidino-2-phenylindole (DAPI) gave an overview of the microbial diversity of these biofilms. Additional SEM samples were prepared using different fixation protocols. Scanning confocal laser microscopy (SCLM) allowed a three-dimensional view of the analysed biofilms. This work presents some electron micrographs of Dr. Heinen and, additionally, new microscopic studies of the biofilms formed on the glass slides. The appearance of the new SEM micrographs was compared to that of Dr. Heinen's micrographs taken several years earlier. The morphology and small-scale distribution in the microbial mat were analyzed by fluorescence microscopy. The examination of natural biomats and biofilms grown on glass slides using several microscopical techniques
Degradation monitoring using probabilistic inference
NASA Astrophysics Data System (ADS)
Alpay, Bulent
In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup, to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime, and to optimize scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for the detection and diagnosis of degradations. By defining degradation as a random abrupt change from the nominal state to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques that modify filtering algorithms by utilizing additional data sources to improve the filter's response. We utilized a reliability degradation database, which can be constructed from plant-specific operational experience and from test and maintenance reports, to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities against the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis-Hastings algorithm, a well-known Markov chain Monte Carlo (MCMC) method. This multiple hypothesis testing
NASA Astrophysics Data System (ADS)
Pachhai, S.; Tkalčić, H.; Dettmer, J.
2014-11-01
Ultralow velocity zones (ULVZs) are small-scale structures with a sharp decrease in S and P wave velocity, and an increase in the density on the top of the Earth's core-mantle boundary. The ratio of S and P wave velocity reduction and density anomaly are important to understanding whether ULVZs consist of partial melt or chemically distinct material. However, existing methods such as forward waveform modeling that utilize 1-D and 2-D Earth-structure models face challenges when trying to uniquely quantify ULVZ properties because of inherent nonuniqueness and nonlinearity. This paper develops a Bayesian inversion for ULVZ parameters and uncertainties with rigorous noise treatment to address these challenges. The posterior probability density of the ULVZ parameters (the solution to the inverse problem) is sampled by the Metropolis-Hastings algorithm. To improve sampling efficiency, parallel tempering is applied by simulating a sequence of tempered Markov chains in parallel and allowing information exchange between chains. First, the Bayesian inversion is applied to simulated noisy data for a realistic ULVZ model. Then, measured data sampling the lowermost mantle under the Philippine Sea are considered. Cluster analysis and visual waveform inspection suggest that two distinct classes of ScP (S waves converted to, and reflected as, P waves) waves exist in this region. The distinct waves likely correspond to lateral variability in the lowermost mantle properties in a NE-SW direction. For the NE area, Bayesian model selection identifies a two-layer model with a gradual density increase as a function of depth as optimal. This complex ULVZ structure can be due to the percolation of iron-enriched, molten material in the lowermost mantle. The results for the SW area are more difficult to interpret, which may be due to the limited number of data available (too few waveforms to appropriately reduce noise) and/or complex 2-D and 3-D structures that cannot be explained properly
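Parallel tempering, as used in this abstract, runs a ladder of tempered chains and occasionally swaps states between adjacent temperatures so the cold chain can escape local modes. A minimal sketch on a toy bimodal target (the target, temperature ladder and step sizes are illustrative assumptions, not the paper's setup):

```python
import math
import random

random.seed(2)

def log_target(x):
    # Bimodal toy posterior, a stand-in for a multimodal ULVZ-parameter
    # posterior: a mixture of two well-separated Gaussians.
    return math.log(math.exp(-0.5 * (x - 4.0) ** 2) +
                    math.exp(-0.5 * (x + 4.0) ** 2))

def parallel_tempering(n_steps, betas=(1.0, 0.5, 0.2, 0.05), step=1.0):
    states = [0.0] * len(betas)
    cold_chain = []
    for _ in range(n_steps):
        # Within-chain Metropolis-Hastings update at each temperature.
        for i, beta in enumerate(betas):
            x = states[i]
            x_new = x + random.gauss(0.0, step / math.sqrt(beta))
            if math.log(random.random()) < beta * (log_target(x_new) - log_target(x)):
                states[i] = x_new
        # Propose a swap between one random pair of adjacent temperatures.
        i = random.randrange(len(betas) - 1)
        a, b = states[i], states[i + 1]
        log_alpha = (betas[i] - betas[i + 1]) * (log_target(b) - log_target(a))
        if math.log(random.random()) < log_alpha:
            states[i], states[i + 1] = b, a
        cold_chain.append(states[0])   # keep only the beta = 1 (target) chain
    return cold_chain

chain = parallel_tempering(30000)
```

The hot chains explore freely; accepted swaps pass their states down the ladder, which is the "information exchange between chains" the abstract refers to.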
NASA Astrophysics Data System (ADS)
Li, Lu; Xia, Jun; Xu, Chong-Yu; Singh, V. P.
2010-09-01
Quantification of uncertainty of hydrological models has attracted much attention in the recent hydrological literature. Different results and conclusions have been reported, resulting from the use of different methods with different assumptions. In particular, the disagreement between the Generalized Likelihood Uncertainty Estimation (GLUE) and Bayesian methods for assessing uncertainty in conceptual watershed modelling has been widely discussed. What has been most criticized is the subjective choice in the GLUE method of the threshold value, the number of sample simulations, and the likelihood function. In this study the impact of threshold values and number of sample simulations on the uncertainty assessment of GLUE is systematically evaluated, and a comprehensive evaluation of the posterior distribution, parameter uncertainty and total uncertainty estimated by GLUE and by a formal Bayesian approach using the Metropolis-Hastings (MH) algorithm is performed for two well-tested conceptual hydrological models (WASMOD and DTVGM) in an arid basin in North China. The results show that in the GLUE method, the posterior distribution of parameters and the 95% confidence interval of the simulated discharge are sensitive to the choice of the threshold value as measured by the acceptable samples rate (ASR). However, when the threshold value in the GLUE method is high enough (i.e., when the ASR value is smaller than 0.1%), the posterior distribution of parameters, the 95% confidence interval of simulated discharge and the percentage of observations bracketed by the 95% confidence interval (P-95CI) for the GLUE method approach the values estimated by the Bayesian method for both hydrological models. Second, in the GLUE method, an insufficient number of sample simulations will influence the maximum Nash-Sutcliffe (MNS) efficiency value when ASR is fixed. However, as soon as the number of sample simulations increases to 2 × 10^4 for WASMOD and to 8
Edmondson, Michael J.; Ward, Tracy R.; Maxwell, Lisa J.
2012-07-01
The Highly Active Liquor Evaporation and Storage (HALES) plant at Sellafield handles acidic fission-product-containing liquor with typical activities of the order of 18×10^9 Bq/ml. An experimental feedback approach has been used to establish a wash regime for the Post Operational Clean Out (POCO) of the oldest storage tanks for this liquor. Two different wash reagents have been identified as being potentially suitable for removal of acid-insoluble fission product precipitates. Ammonium carbamate (AC) and sodium carbonate (SC) yield similar products during the proposed wash cycle. The proposed wash reagents dissolve the caesium phosphomolybdate (CPM) and zirconium molybdate (ZM) solid phases but yield a fine, mobile precipitate of metal carbonates from the Highly Active Liquor (HAL) supernate. Addition of nitric acid to the wash effluent can cause CPM to precipitate where sufficient caesium and phosphorus are available. Where they are not present (as for ZM dissolution), the nitric acid addition initially produces a nitrate precipitate which then re-dissolves, along with the metal carbonates, to give a solid-free solution. The different behaviour of the two solids during the wash cycle has led to the proposal of an amended flowsheet. Additional studies on the potential to change the morphology of crystallising ZM have presented opportunities for changing the rheology of ZM sediments through doping with tellurium or particular organic acids. Two different wash reagents have been identified as being potentially suitable for the POCO of HALES Oldside HASTs. AC and SC both yield similar products during the proposed wash cycle. However, the different behaviour of the two principal HAL solids, CPM and ZM, during the wash cycle has led to the proposal of an amended flowsheet. Additional studies on the potential to change the morphology of crystallising ZM have presented opportunities for changing its rheology through doping with tellurium or certain
Adding a Second Ku-Band Antenna to the International Space Station
NASA Technical Reports Server (NTRS)
DuSold, Chuck; Thacker, Corey; Kwatra, Sundeep
2011-01-01
The International Space Station, as originally developed, used the Ku-Band Tracking and Data Relay Satellite System communications link to transmit non-critical data to the ground. Since becoming operational, the use of the link has evolved to include additional services that, although also not critical, were deemed necessary for the crew. The external Ku-Band antennas were designed for transport to the ISS in the shuttle cargo bay and thus are not suitable for manifesting on any current cargo vehicle. The original intent was to stow two spare antennas on orbit in a protective container until needed to replace a failing unit, a long and complicated process due to the complexity of the removal and replacement procedure. The Boeing Company proposed manifesting one of those spare antennas in an operable configuration, eliminating the need for an Extravehicular Activity (EVA) to correct the first failure and thereby reducing the time required to restore the Ku-Band communications link after a failure from weeks to hours. After the first failure, an EVA would be scheduled to replace the failed antenna with the stowed spare. Because the hot spare is activated from inside the ISS, the replacement of the failed unit can be done when convenient rather than in haste. This paper describes the methodology used to locate a suitable site for a new antenna mast on the ISS, as well as the process followed to fabricate, deliver and install the new interface hardware. Because this was not planned when the ISS was originally designed, structural, power, data and Intermediate Frequency signal connections had to be found for use. With the movement of the P6 solar array element from its initial location in the center zenith location of the ISS to the end of the port side of the truss, and the concurrent relocation of one string of S-Band communications assets, candidate power, data and structural connections became available on the Z1 Truss
Sikorski, Tomasz; Piotrowski, Dariusz; Gaszyński, Wojciech
2011-01-01
According to recent WHO reports, body traumas rank third in frequency of occurrence, right after cardiovascular diseases and tumours, and are considered one of the major medical problems. Trauma is a form of energy (mechanical, thermal or chemical) affecting the human body. Once the threshold of tissue endurance is crossed, an injury or damage occurs. A common problem of all the centres that treat traumas is a reliable and comparable assessment of injury severity. Constant improvement of trauma scores contributes to increased objectivity in the assessment of injury severity and makes trauma research easier. To a large extent, the widespread use of the scores enables the exchange of experience in treating patients after trauma. An ideal scale should be reliable, easy to use and, most of all, commonly used, thus enabling the employment of a common "traumatologic" language. In this study, the test group comprised 137 adult patients, including 113 men (82%) and 24 women (18%). Most patients were aged from 20 to 60 years, that is, of working age. Appropriate trauma treatment reduces the costs and hospitalisation time of these patients and speeds their recovery. An accident, or worse still the death, of a young person is not only a personal tragedy for the family. It is also a large economic loss for society, resulting from "lost years of life" and thus "lost years of work". Quick and appropriate treatment, provided in a proper centre with appropriately trained staff and the highest-quality equipment, will not only reduce the victim's suffering and speed the return to daily life, but also minimise the social costs connected with disability pensions, benefits and compensations. Most injuries happened at work; 61% were probably due to haste but, above all, to non-compliance with occupational health and safety regulations, which all employees should know and comply with. It involves doctors writing a sick note for the
NASA Astrophysics Data System (ADS)
Yen, H.; Arabi, M.; Records, R.
2012-12-01
The structural complexity of comprehensive watershed models continues to increase in order to incorporate inputs at finer spatial and temporal resolutions and to simulate a larger number of hydrologic and water quality responses. Hence, computational methods for parameter estimation and uncertainty analysis of complex models have gained increasing popularity. This study aims to evaluate the performance and applicability of a range of algorithms, from computationally frugal approaches to formal implementations of Bayesian statistics using Markov chain Monte Carlo (MCMC) techniques. The evaluation procedure hinges on the appraisal of (i) the quality of the final parameter solution in terms of the minimum value of the objective function corresponding to weighted errors; (ii) the algorithmic efficiency in reaching the final solution; (iii) the marginal posterior distributions of model parameters; (iv) the overall identifiability of the model structure; and (v) the effectiveness in drawing samples that can be classified as behavior-giving solutions. The proposed procedure recognizes an important and often neglected issue in watershed modeling: solutions with minimum objective function values may not necessarily reflect the behavior of the system. The general behavior of a system is often characterized by analysts, according to the goals of the study, using various error statistics such as percent bias or the Nash-Sutcliffe efficiency coefficient. Two case studies are carried out to examine the efficiency and effectiveness of four Bayesian approaches, including Metropolis-Hastings sampling (MHA), Gibbs sampling (GSA), uniform covering by probabilistic rejection (UCPR), and differential evolution adaptive Metropolis (DREAM); a greedy optimization algorithm dubbed dynamically dimensioned search (DDS); and shuffled complex evolution (SCE-UA), a widely implemented evolutionary heuristic optimization algorithm. The Soil and Water Assessment Tool (SWAT) is used to simulate hydrologic and
Online Method for Oxygen Triple Isotope Analyses of Nitrate
NASA Astrophysics Data System (ADS)
Kaiser, J.; Hastings, M. G.; Houlton, B.; Roeckmann, T.; Sigman, D. M.
2004-12-01
average isotopic composition of hydroxy/peroxy radicals and tropospheric ozone [cf. contribution by Hastings et al. in session H38]. Nitrate isotope measurements in ice core samples offer opportunities for paleoatmospheric studies. The same method can also be used to analyze N2O itself or other oxy compounds of nitrogen that can be converted to nitrate.
The Time Is Now: Bioethics and LGBT Issues.
Powell, Tia; Foglia, Mary Beth
2014-09-01
Our goal in producing this special issue is to encourage our colleagues to incorporate topics related to LGBT populations into bioethics curricula and scholarship. Bioethics has only rarely examined the ways in which law and medicine have defined, regulated, and often oppressed sexual minorities. This is an error on the part of bioethics. Medicine and law have served in the past as society's enforcement arm toward sexual minorities, in ways that robbed many people of their dignity. We feel that bioethics has an obligation to discuss that history and to help us as a society take responsibility for it. We can address only a small number of topics in this special issue of the Hastings Center Report, and we selected topics we believe will stimulate discourse. Andrew Solomon offers an elegant overview of the challenges that bioethics faces in articulating a solid basis for LGBT rights. Timothy F. Murphy asks whether bioethics still faces issues related to lesbian, gay, and bisexual people, given the deletion of homosexuality as a disease and the progress toward same-sex marriage. Jamie Lindemann Nelson's essay addresses the search for identity for transgender persons and the role of science in that search. Two articles, those by Brendan S. Abel and by Jack Drescher and Jack Pula, take up the complex issue of medical treatment for children who reject their assigned birth gender. Celia B. Fisher and Brian Mustanski address the special challenges of engaging LGBT youth in research, balancing the need for better information about this vulnerable group against the existing restrictions on research involving children. Tia Powell and Edward Stein consider the merits of legal bans on psychotherapies intended to change sexual orientation, particularly in the light of current research on orientation. Mary Beth Foglia and Karen I. Fredricksen-Goldsen highlight health disparities and resilience among LGBT older adults and then discuss the role of nonconscious bias in perpetuating
NASA Astrophysics Data System (ADS)
Traer, M. M.; Hilley, G. E.; Fildani, A.
2009-12-01
Submarine turbidity currents derive their momentum from gravity acting upon the density contrast between sediment-laden and clear water, and so unlike fluvial systems, the dynamics of such flows are inextricably linked to the rates at which they deposit and entrain sediment. We have analyzed the sensitivity of the growth and maintenance of turbidity currents to sediment entrainment and deposition using the layer-averaged equations of conservation of fluid and sediment mass, and conservation of momentum and turbulent kinetic energy. Our model results show that the dynamics of turbidity currents are extremely sensitive to the functional form and empirical constants of the relationship between sediment entrainment and friction velocity. Data on the relationship between sediment entrainment and friction velocity for submarine density flows are few and as a result, entrainment formulations are populated with data from sub-aerial flows not driven by the density contrast between clear and turbid water. If we entertain the possibility that sediment entrainment in sub-aerial rivers is different than in dense underflows, flow parameters such as velocity, height, and concentration were found nearly impossible to predict beyond a few hundred meters based on the limited laboratory data available that constrain the sediment entrainment process in turbidity currents. The sensitivity of flow dynamics to the functional relationship between friction velocity and sediment entrainment indicates that independent calibration of a sediment entrainment law in the submarine environment is necessary to realistically predict the dynamics of these flows and the resulting patterns of erosion and deposition. To calibrate such a relationship, we have developed an inverse methodology that utilizes existing submarine channel morphology as a means of constraining the sediment entrainment function parameters. We use a Bayesian Metropolis-Hastings sampler to determine the sediment entrainment
The Ground State of a Gross-Pitaevskii Energy with General Potential in the Thomas-Fermi Limit
NASA Astrophysics Data System (ADS)
Karali, Georgia; Sourdis, Christos
2015-08-01
We study the ground state which minimizes a Gross-Pitaevskii energy with a general non-radial trapping potential, under the unit mass constraint, in the Thomas-Fermi limit where a small parameter tends to 0. This ground state plays an important role in the mathematical treatment of recent experiments on the phenomenon of Bose-Einstein condensation, and in the study of various types of solutions of nonhomogeneous defocusing nonlinear Schrödinger equations. Many of these applications require delicate estimates for the behavior of the ground state near the boundary of the condensate, as the small parameter tends to 0, in the vicinity of which the ground state behaves irregularly in the form of a steep corner layer. In particular, the role of this layer is important for detecting the presence of vortices in the low-density region of the condensate and for understanding the superfluid flow around an obstacle, and it also makes a leading-order contribution to the energy. In contrast to previous approaches, we utilize a perturbation argument to go beyond the classical Thomas-Fermi approximation and accurately approximate the layer by the Hastings-McLeod solution of the Painlevé-II equation. This settles an open problem (cf. Aftalion in Vortices in Bose-Einstein Condensates, Birkhäuser Boston, Boston, 2006, pg. 13, Open Problem 8.1), answered very recently only for the special case of the model harmonic potential (Gallo and Pelinovsky in Asymptot Anal 73:53-96, 2011). In fact, we even improve upon previous results that relied heavily on the radial symmetry of the potential trap. Moreover, we show that the ground state has the maximal regularity available, namely it remains uniformly bounded in the Hölder norm that captures the exact Hölder regularity of the singular limit profile, as the small parameter tends to 0. Our study is highly motivated by an interesting open problem posed recently by Aftalion, Jerrard, and Royo-Letelier (J Funct Anal 260:2387-2406, 2011), and an open question of Gallo and Pelinovsky (J Math Anal
NASA Astrophysics Data System (ADS)
Pankratov, Oleg; Kuvshinov, Alexey
2016-01-01
Despite impressive progress in the development and application of electromagnetic (EM) deterministic inverse schemes to map the 3-D distribution of electrical conductivity within the Earth, one question remains poorly addressed: uncertainty quantification of the recovered conductivity models. Apparently, only an inversion based on a statistical approach provides a systematic framework to quantify such uncertainties. The Metropolis-Hastings (M-H) algorithm is the most popular technique for sampling the posterior probability distribution that describes the solution of the statistical inverse problem. However, all statistical inverse schemes require an enormous number of forward simulations and thus appear to be extremely demanding computationally, if not prohibitive, when a 3-D set-up is invoked. This urges the development of fast and scalable 3-D modelling codes which can run large-scale 3-D models of practical interest in fractions of a second on high-performance multi-core platforms. But even with such codes, the challenge for M-H methods is to construct proposal functions that provide a good approximation of the target density function while remaining inexpensive to sample. In this paper we address both of these issues. First we introduce a variant of the M-H method which uses information about the local gradient and Hessian of the penalty function. This, in particular, allows us to exploit the adjoint-based machinery that has been instrumental for the fast solution of deterministic inverse problems. We explain why this modification of M-H significantly accelerates sampling of the posterior probability distribution. In addition we show how Hessian handling (inverse, square root) can be made practicable by a low-rank approximation using the Lanczos algorithm. Ultimately we discuss uncertainty analysis based on stochastic inversion results. In addition, we demonstrate how this analysis can be performed within a deterministic approach. In the
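The gradient-informed proposal idea can be illustrated with a Metropolis-adjusted Langevin (MALA) step, a standard way to fold local gradient information into a Metropolis-Hastings sampler. This is a minimal sketch, not the paper's scheme: the 1-D Gaussian target, the step size, and all function names are illustrative assumptions.

```python
import numpy as np

def mala_sample(log_p, grad_log_p, x0, step=0.5, n=20000, seed=1):
    """Metropolis-adjusted Langevin: gradient-informed Metropolis-Hastings."""
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(n):
        # drift the proposal mean along the local gradient of log p
        mu_x = x + 0.5 * step**2 * grad_log_p(x)
        y = mu_x + step * rng.standard_normal()
        mu_y = y + 0.5 * step**2 * grad_log_p(y)
        # the proposal is asymmetric, so the full M-H correction is needed
        log_q_xy = -((y - mu_x) ** 2) / (2 * step**2)
        log_q_yx = -((x - mu_y) ** 2) / (2 * step**2)
        log_alpha = log_p(y) - log_p(x) + log_q_yx - log_q_xy
        if np.log(rng.random()) < log_alpha:
            x = y
        samples.append(x)
    return np.array(samples)

# Illustrative target: a standard normal "posterior"
samples = mala_sample(lambda x: -0.5 * x * x, lambda x: -x, x0=3.0)
```

Because the drift pushes proposals toward high-probability regions, such samplers typically mix faster than a plain random walk, which is the motivation behind gradient- and Hessian-aware M-H variants.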
Tinker, A
1997-01-01
The issue of housing and the wider environment for an ageing population is one where there are many unanswered questions. In this paper a number of key issues are discussed, and for each the focus is on three aspects: the current situation, its reasonableness, and the research needed to inform decisions about policy and practice. The first three issues relate to the profile of older people themselves and the importance of home to them. The changing profile of older people is not just about an ageing population but also about the growing prominence of those with dementia, women, people from black and ethnic minority groups, and one-person households, yet little is known about the type of housing which should be provided. Of equal concern is the widening gap between those with a high standard of living (including housing) and those with a low standard of living. The importance of home to older people means that research must focus on how people can be enabled to remain there, and also on the costs, financial and otherwise, to carers and to society. The next three issues relate to the type of housing older people live in and moves in later life. The startling change in the tenure pattern, with a growth of owner occupation, brings problems, as does the decline in social housing. The advantages and disadvantages of the different types of housing for older people, mainstream and specialized, are relatively well known. However, the balance between the two needs more research, as do retirement communities. While it is well known that there are peaks of migration in old age and that moves are often made in haste, little is known about the process of decision making. The final two topics concern links between housing and other aspects of older people's lives. On health, more research is needed on temperature, mortality and morbidity, homelessness and accidents, and especially on links between services. These topics have implications for planning
CMA Announces the 1996 Responsible Care Catalyst Awards Winners
NASA Astrophysics Data System (ADS)
1996-06-01
Eighteen exceptional teachers of science, chemical technology, chemistry, and chemical engineering have been selected to receive the Chemical Manufacturers Association's 1996 Responsible Care Catalyst Award. The Responsible Care Catalyst Awards Program honors individuals who inspire students toward careers in chemistry and science-related fields through excellent teaching in and out of the classroom. The program also seeks to draw public attention to the importance of quality chemistry and science teaching at the undergraduate level. Since the award was established in 1957, 502 teachers of science, chemistry, and chemical engineering have been honored. Winners are selected from a wide range of nominations submitted by colleagues, friends, and administrators. All pre-high school, high school, two- and four-year college, or university teachers in the United States and Canada are eligible. Each award winner will be presented with a medal and citation. National award winners receive $5,000; regional award winners receive $2,500. National Winners: Martin N. Ackermann, Oberlin College, Oberlin, OH; Kenneth R. Jolls, Iowa State University, Ames, IA; Suzanne Zobrist Kelly, Warren H. Meeker Elementary School, Ames, IA; John V. Kenkel, Southeast Community College, Lincoln, NE; George C. Lisensky, Beloit College, Beloit, WI; James M. McBride, Yale University, New Haven, CT; Marie C. Sherman, Ursuline Academy, St. Louis, MO; Dwight D. Sieggreen, Cooke Middle School, Northville, MI. Regional Winners, Two-Year College: East, Georgianna Whipple-VanPatter, Central Community College, Hastings, NE; West, David N. Barkan, Northwest College, Powell, WY. High School: East, John Hnatow, Jr., Emmaus High School, Northampton, PA; South, Carole Bennett, Gaither High School, Tampa, FL; Midwest, Kenneth J. Spengler, Palatine High School, Palatine, IL; West, Ruth Rand, Albuquerque, Albuquerque, NM. Middle School: East, Thomas P. Kelly, Grandville Public Schools, Grandville, NH
NASA Astrophysics Data System (ADS)
Ekinci, Yunus Levent; Balkaya, Çağlayan; Göktürkler, Gökhan; Turan, Seçil
2016-06-01
An efficient approach to estimate model parameters from residual gravity data based on differential evolution (DE), a stochastic vector-based metaheuristic algorithm, has been presented. We have shown the applicability and effectiveness of this algorithm on both synthetic and field anomalies. To our knowledge, this is the first attempt at applying DE to the parameter estimation of residual gravity anomalies due to isolated causative sources embedded in the subsurface. The model parameters dealt with here are the amplitude coefficient (A), the depth and exact origin of the causative source (zo and xo, respectively) and the shape factors (q and η). The error energy maps generated for some parameter pairs have successfully revealed the nature of the parameter estimation problem under consideration. Noise-free and noisy synthetic single gravity anomalies have been evaluated with success via DE/best/1/bin, a widely used strategy in DE. Additionally, some complicated gravity anomalies caused by multiple source bodies have been considered, and the results obtained demonstrate the efficiency of the algorithm. Then, using the strategy applied in the synthetic examples, some field anomalies observed in various mineral explorations, namely a chromite deposit (Camaguey district, Cuba), a manganese deposit (Nagpur, India) and a base metal sulphide deposit (Quebec, Canada), have been considered to estimate the model parameters of the ore bodies. The applications show that the obtained results, such as the depths and shapes of the ore bodies, are quite consistent with those published in the literature. Uncertainty in the solutions obtained from the DE algorithm has also been investigated by the Metropolis-Hastings (M-H) sampling algorithm based on simulated annealing without a cooling schedule. Based on the resulting histogram reconstructions of both synthetic and field data examples, the algorithm has provided reliable parameter estimates within the sampling limits of
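For readers unfamiliar with the notation, DE/best/1/bin denotes mutation about the current best population member using one difference vector, followed by binomial crossover. A minimal sketch on a toy misfit function follows; the population size, F, CR, generation count, and the sphere objective are illustrative assumptions, not the settings used in the study.

```python
import numpy as np

def de_best_1_bin(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=2):
    """DE/best/1/bin: mutate around the best member, binomial crossover."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    d = len(lo)
    pop = lo + rng.random((pop_size, d)) * (hi - lo)
    cost = np.array([f(x) for x in pop])
    for _ in range(gens):
        best = pop[cost.argmin()]
        for i in range(pop_size):
            r1, r2 = rng.choice([j for j in range(pop_size) if j != i],
                                2, replace=False)
            mutant = best + F * (pop[r1] - pop[r2])   # "best/1" mutation
            cross = rng.random(d) < CR                # "bin" (binomial) crossover
            cross[rng.integers(d)] = True             # keep at least one mutant gene
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            ft = f(trial)
            if ft <= cost[i]:                         # greedy selection
                pop[i], cost[i] = trial, ft
    return pop[cost.argmin()], cost.min()

# Toy misfit: sphere function with its minimum at the origin
x_best, f_best = de_best_1_bin(lambda x: float(np.sum(x**2)), [(-5, 5)] * 3)
```

In a gravity-inversion setting the objective would instead be the error energy between observed and model-predicted anomalies over (A, zo, xo, q, η).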
Moore, Edward W.
1970-01-01
Ion-exchange calcium electrodes represent the first practical method for the direct measurement of ionized calcium [Ca++] in biologic fluids. Using both “static” and “flow-through” electrodes, serum [Ca++] fell within a rather narrow range: 0.94-1.33 mmoles/liter (mean, 1.14 mmoles/liter). Within a given individual, [Ca++] varied only about 6% over a several-month period. Consistent pH effects on [Ca++] were observed in serum and whole blood, [Ca++] varying inversely with pH. Less consistent pH effects were also noted in ultrafiltrates, believed largely to represent precipitation of certain calcium complexes from a supersaturated solution. Heparinized whole-blood [Ca++] was significantly lower than in corresponding serum at normal blood pH, related to the formation of a calcium-heparin complex. [Ca++] in ultrafiltrates represented a variable fraction (66.7-90.2%) of total diffusible calcium. There was no apparent correlation between serum ionized and total calcium concentrations. Thus, neither serum total calcium nor total ultrafiltrable calcium provided a reliable index of serum [Ca++]. Change in serum total calcium was almost totally accounted for by corresponding change in protein-bound calcium [CaProt]. About 81% of [CaProt] was estimated to be bound to albumin and about 19% to globulins. From observed pH, serum protein, and [CaProt] data, a nomogram was developed for estimating [CaProt] without ultrafiltration. Data presented elsewhere indicate that calcium binding by serum proteins obeys the mass-law equation for a monoligand association. This was indicated in the present studies by a close correspondence of observed serum [Ca++] values with those predicted by the McLean-Hastings nomogram. While these electrodes allow study of numerous problems not possible previously, they have not been perfected to the same degree of reliability obtainable with current pH electrodes. The commercial (Orion flow-through) electrode is (a) expensive and (b) requires
Making Space Travel to Jupiter Possible
NASA Technical Reports Server (NTRS)
Barker, Samuel P.
2004-01-01
into the Nb1Zr, causing embrittlement and possibly major failure. I will be testing the effects of Hast-X on Nb1Zr at high temperature for 10, 50, 100, and 500 hours. After the samples are run through the heat treatment, strength and chemistry will be tested and reported. My appreciation for the research that goes behind every project has grown and will continue to grow. By digging through old documents written in the '50s and '60s, scouring forgotten closets, and learning from those with experience in refractory metals, I am bound to have an incredible learning experience here at NASA.
Bayesian inversion of microtremor array dispersion data in southwestern British Columbia
NASA Astrophysics Data System (ADS)
Molnar, Sheri; Dosso, Stan E.; Cassidy, John F.
2010-11-01
This paper applies Bayesian inversion, with evaluation of data errors and model parametrization, to produce the most-probable shear-wave velocity profile together with quantitative uncertainty estimates from microtremor array dispersion data. Generally, the most important property for characterizing earthquake site response is the shear-wave velocity (VS) profile. The microtremor array method determines phase velocity dispersion of Rayleigh surface waves from multi-instrument recordings of urban noise. Inversion of dispersion curves for VS structure is a non-unique and non-linear problem such that meaningful evaluation of confidence intervals is required. Quantitative uncertainty estimation requires not only a non-linear inversion approach that samples models proportional to their probability, but also rigorous estimation of the data error statistics and an appropriate model parametrization. This paper applies a Bayesian formulation that represents the solution of the inverse problem in terms of the posterior probability density (PPD) of the geophysical model parameters. Markov-chain Monte Carlo methods are used with an efficient implementation of Metropolis-Hastings sampling to provide an unbiased sample from the PPD to compute parameter uncertainties and inter-relationships. Nonparametric estimation of a data error covariance matrix from residual analysis is applied with rigorous a posteriori statistical tests to validate the covariance estimate and the assumption of a Gaussian error distribution. The most appropriate model parametrization is determined using the Bayesian information criterion, which provides the simplest model consistent with the resolving power of the data. Parametrizations considered vary in the number of layers, and include layers with uniform, linear and power-law gradients. Parameter uncertainties are found to be underestimated when data error correlations are neglected and when compressional-wave velocity and/or density (nuisance
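The model-selection step described above can be made concrete: under a Gaussian error assumption, the Bayesian information criterion trades data misfit against the number of free parameters, favoring the simplest parametrization the data can resolve. This is a hedged sketch, not the paper's implementation; the Gaussian-error likelihood, `sigma`, and the toy residuals are illustrative assumptions.

```python
import numpy as np

def bic(residuals, sigma, k):
    """Bayesian information criterion for a Gaussian-error misfit:
    BIC = -2 ln L + k ln n; the lower value marks the preferred model."""
    n = len(residuals)
    log_l = (-0.5 * np.sum((residuals / sigma) ** 2)
             - n * np.log(sigma * np.sqrt(2.0 * np.pi)))
    return -2.0 * log_l + k * np.log(n)

# Illustrative: two parametrizations fitting the same data equally well;
# the k*ln(n) penalty rejects the extra layers of the 5-parameter model
rng = np.random.default_rng(0)
res = 0.1 * rng.standard_normal(50)        # residuals of either fit
b_simple = bic(res, sigma=0.1, k=2)        # e.g. two uniform layers
b_complex = bic(res, sigma=0.1, k=5)       # e.g. added gradient layers
```

When two parametrizations achieve comparable misfit, their BIC values differ by exactly the penalty term, which is why added layers must earn their keep by materially reducing the residuals.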
Geochemical and Isotopic Composition of Aerosols in Tucson
NASA Astrophysics Data System (ADS)
Riha, K. M.; Michalski, G. M.; Lohse, K. A.; Gallo, E. L.; Brooks, P. D.; Meixner, T.
2010-12-01
isotopic analyses have been conducted on these samples using the denitrifier method (Casciotti et al., 2002). Observed elevated δ18O values correspond to atmospheric oxidation processes and varying δ15N are possibly linked to different N sources. These isotopic values will be used as a proxy for deposition in a mass balance mixing model for nitrogen in arid streams. References: Casciotti, K. L., D. M. Sigman, M. G. Hastings, J. K. Böhlke and A. Hilkert, Measurement of the oxygen isotopic composition of nitrate in seawater and freshwater using the denitrifier method, Anal. Chem., 74(19), 4905-4912, 2002. Michalski, G., Z. Scott, M. Kabiling and M. Thiemens, First Measurements and Modeling of Δ17O in Atmospheric Nitrate, Geophys. Res. Lett., 30(16), (1870), 2003.
NASA Astrophysics Data System (ADS)
Oikawa, P. Y.; Baldocchi, D. D.; Knox, S. H.; Sturtevant, C. S.; Verfaillie, J. G.; Dronova, I.; Jenerette, D.; Poindexter, C.; Huang, Y. W.
2015-12-01
We use multiple data streams in a model-data fusion approach to reduce uncertainty in predicting CO2 and CH4 exchange in drained and flooded peatlands. Drained peatlands in the Sacramento-San Joaquin River Delta, California are a strong source of CO2 to the atmosphere, and flooded peatlands or wetlands are a strong CO2 sink. However, wetlands are also large sources of CH4 that can offset the greenhouse gas mitigation potential of wetland restoration. Reducing uncertainty in model predictions of annual CO2 and CH4 budgets is critical for including wetland restoration in Cap-and-Trade programs. We have developed and parameterized the Peatland Ecosystem Photosynthesis, Respiration, and Methane Transport model (PEPRMT) in a drained agricultural peatland and a restored wetland. Both ecosystem respiration (Reco) and CH4 production are functions of two soil carbon (C) pools (i.e., recently fixed C and soil organic C), temperature, and water table height. Photosynthesis is predicted using a light use efficiency model. To estimate parameters we use a Markov chain Monte Carlo approach with an adaptive Metropolis-Hastings algorithm. Multiple data streams are used to constrain model parameters, including eddy covariance of CO2, 13CO2 and CH4, continuous soil respiration measurements, and digital photography. Digital photography is used to estimate leaf area index, an important input variable for the photosynthesis model. Soil respiration and 13CO2 fluxes allow partitioning of eddy covariance data between Reco and photosynthesis. Partitioned fluxes of CO2 with associated uncertainty are used to parameterize the Reco and photosynthesis models within PEPRMT. Overall, PEPRMT model performance is high; for example, we observe close agreement between modeled and observed partitioned Reco (r2 = 0.68; slope = 1; RMSE = 0.59 g C-CO2 m-2 d-1). Model validation demonstrated the model's ability to accurately predict annual budgets of CO2 and CH4 in a wetland system (within 14% and 1
NASA Astrophysics Data System (ADS)
Rupa, Chandra; Mujumdar, Pradeep
2016-04-01
In urban areas, quantification of extreme precipitation is important in the design of storm water drains and other infrastructure. Intensity-Duration-Frequency (IDF) relationships are generally used to obtain the design return level for a given duration and return period. Because extreme precipitation records are rarely available for a sufficiently large number of years, estimating the probability of extreme events is difficult. Typically, data from a single station are used to obtain the design return levels for various durations and return periods, which are then used in the design of urban infrastructure for the entire city. In an urban setting, the spatial variation of precipitation can be high; precipitation amounts and patterns often vary within short distances of less than 5 km. It is therefore crucial to study the uncertainties in the spatial variation of return levels for various durations. In this work, extreme precipitation is modeled spatially using Bayesian hierarchical analysis and the spatial variation of return levels is studied. The analysis is carried out with a block maxima approach for defining extreme precipitation, using the Generalized Extreme Value (GEV) distribution, for Bangalore city, Karnataka state, India. Daily data for nineteen stations in and around Bangalore city are considered in the study. The analysis is carried out for the summer maxima (March-May), the monsoon maxima (June-September) and the annual maxima rainfall. In the hierarchical analysis, the statistical model is specified in three layers. The data layer models the block maxima, pooling the extreme precipitation from all the stations. In the process layer, the latent spatial process, characterized by geographical and climatological covariates (latitude-longitude, elevation, mean temperature, etc.) which drive the extreme precipitation, is modeled, and at the prior level, the prior distributions that govern the latent process are specified. Markov Chain Monte Carlo (MCMC) algorithm (Metropolis-Hastings
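For reference, once GEV parameters have been estimated for a block-maxima series, the T-year return level follows by inverting the GEV distribution function. A minimal sketch follows; the parameter values are purely hypothetical, not those estimated for Bangalore.

```python
import numpy as np

def gev_return_level(mu, sigma, xi, T):
    """T-year return level z_T from GEV(mu, sigma, xi): solves F(z_T) = 1 - 1/T."""
    y = -np.log(1.0 - 1.0 / T)          # reduced variate
    if abs(xi) < 1e-9:                   # Gumbel limit as the shape xi -> 0
        return mu - sigma * np.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Hypothetical block-maxima parameters (daily rainfall in mm, for illustration)
z100 = gev_return_level(mu=60.0, sigma=15.0, xi=0.1, T=100)
```

In an IDF analysis this inversion is repeated per duration and per return period; in the Bayesian hierarchical setting it is evaluated over the posterior samples of (mu, sigma, xi) at each location, which is what yields the spatial uncertainty in return levels.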
MAGNOX:BUTEX URANIUM BEARING GLASSES PHYSICAL AND CHEMICAL ANALYSIS DATA PACKAGE
Peeler, D.; Imrich, K.; Click, D.
2011-03-08
Sellafield Ltd (United Kingdom) has requested technical support from the Savannah River National Laboratory (SRNL) to characterize a series of uranium-bearing, mixed alkali borosilicate glasses [WFO (2010)]. The specific glasses to be characterized are based on different blends of Magnox (WRW17 simulant) and Butex (or HASTs 1 and 2) waste types as well as different incorporation rates (or waste loadings) of each blend. Specific Magnox:Butex blend ratios of interest include 75:25, 60:40, and 50:50. Each of these waste blend ratios will be mixed with a base glass additive composition targeting waste loadings (WLs) of 25, 28, and 32%, which will result in nine different glasses. The nine glasses are to be fabricated and physically characterized to provide Sellafield Ltd with the technical data to evaluate the impacts of the various Magnox:Butex blend ratios and WLs on key glass properties of interest. It should be noted that the use of 'acceptable' in the Work for Other (WFO) was linked to the results of a durability test (more specifically the Soxhlet leach test). Other processing (e.g., viscosity (η), liquidus temperature (T_L)) or product performance (e.g., Product Consistency Test (PCT) results, in addition to the Soxhlet leach test) property constraints were not identified. For example, a critical hold point in the classification of an 'acceptable glass' prior to processing high-level waste (HLW) through the Defense Waste Processing Facility (DWPF) is an evaluation of specific processing and product performance properties against pre-defined constraints. This process is referred to as Slurry Mix Evaporator (SME) acceptability, in which predicted glass properties (based on compositional measurements) are compared to predefined constraints to determine whether the glass is acceptable [Brown and Postles (1995)]. As an example, although the nominal melter temperature at DWPF is 1150 °C, there is a T_L constraint (without uncertainties applied) of 1050 °C. Any
Katupitiya, A.; Eisenhauer, D.E.; Ferguson, R.B.; Spalding, R.F.; Roeth, F.W.; Bobier, M.W.
1997-01-01
Tillage influences the physical and biological environment of soil. Rotation of crops with a legume affects the soil N status. A furrow-irrigated site was investigated for long-term tillage and crop rotation effects on leaching of nitrate from the root zone and accumulation in the intermediate vadose zone (IVZ). The investigated tillage systems were disk-plant (DP), ridge-till (RT) and slot-plant (SP). These tillage treatments have been maintained on the Hastings silt loam (Udic Argiustoll) and Crete silt loam (Pachic Argiustoll) soils since 1976. Continuous corn (CC) and corn-soybean (CS) rotations were the subtreatments. Since 1984, soybeans have been grown in CS plots in even calendar years. All tillage treatments received the same N rate. The N rate varied annually depending on the root zone residual N. Soybeans were not fertilized with N-fertilizer. Samples for residual nitrate in the root zone were taken in 8 of the 15 years of the study, while the IVZ was only sampled at the end of the study. In seven of eight years, root zone residual soil nitrate-N levels were greater with DP than with RT and SP. Residual nitrate-N amounts were similar in RT and SP in all years. Despite high residual nitrate-N with DP and the same N application rate, crop yields were higher in RT and SP except when DP had an extremely high root zone nitrate level. By applying the same N rates to all tillage treatments, DP may have been fertilized in excess of crop need. Higher residual nitrate-N in DP was most likely due to a combination of increased mineralization with tillage and lower yield compared to RT and SP. Because of higher nitrate availability with DP, the potential for nitrate leaching from the root zone was greater with DP than with the RT and SP tillage systems. Spring residual nitrate-N contents of DP were larger than those of RT and SP in both crop rotations. Ridge-till and SP systems had greater nitrate-N with CS than with CC rotations. Nitrate accumulation in IVZ at the upstream end of the
NASA Astrophysics Data System (ADS)
Niemeijer, André R.; Boulton, Carolyn; Toy, Virginia; Townend, John; Sutherland, Rupert
2015-04-01
has occurred. Thus, depending on the background (nucleation) strain rate, our data indicate that the Alpine Fault should be able to generate earthquakes at all temperatures above room temperature. However, at the highest temperature investigated (600 °C), the transition to velocity-weakening is postponed to slip rates above 10 mm/s (strain rate ~10⁻² s⁻¹). This observation, combined with the absence of strength recovery after long holds, suggests that seismic slip may propagate into regions of the fault unlikely to nucleate earthquakes. We propose that in our porous gouges, thermally activated processes operate simultaneously with granular flow, postponing ductile flow to higher temperatures or lower strain rates. Sutherland, R., V.G. Toy, J. Townend, S.C. Cox, J.D. Eccles, D.R. Faulkner, D.J. Prior, R.J. Norris, E. Mariani, C. Boulton, B.M. Carpenter, C.D. Menzies, T.A. Little, M. Hasting, G. De Pascale, R.M. Langridge, H.R. Scott, Z. Reid-Lindroos, B. Fleming (2012), Drilling reveals fluid control on architecture and rupture of the Alpine Fault, New Zealand, Geology, 40, 1143-1146, doi:10.1130/G33614.1. Toy, V.G., Craw, D., Cooper, A.F., and R.J. Norris (2010), Thermal regime in the central Alpine Fault zone, New Zealand: Constraints from microstructures, biotite chemistry and fluid inclusion data, Tectonophysics, doi:10.1016/j.tecto.2009.12.013
Leib, Thomas; Cole, Dan
2015-06-30
, construction labor, engineering, and other costs. The CCS Project Final Technical Report is based on a Front End Engineering and Design (FEED) study prepared by SK E&C, completed in [June] 2014. Subsequently, Fluor Enterprises completed a FEED validation study in mid-September 2014. The design analyses indicated that the FEED package was sufficient and as expected. However, Fluor considered the construction risk of a stick-build approach to be unacceptable; construction risk would be substantially mitigated through modular construction, where site labor and schedule uncertainty are minimized. Fluor's estimate of the overall EPC project cost under the revised construction plan was comparable to SK E&C's value after reflecting Fluor's assessment of project scope and risk characteristics. Development was halted upon conclusion of the Phase 2A FEED and the project was not constructed. Transport and Sequestration – The overall objective of the pipeline project was to construct a pipeline to transport captured CO_{2} from the Lake Charles Clean Energy project to the existing Denbury Green Line and then to the Hastings Field in Southeast Texas to demonstrate effective geologic sequestration of captured CO_{2} through commercial EOR operations. The overall objective of the MVA portion of the project was to demonstrate effective geologic sequestration of captured CO_{2} through commercial Enhanced Oil Recovery (EOR) operations in order to evaluate costs, operational processes and technical performance. The DOE target for the project was to capture CO_{2} and implement a research MVA program to demonstrate the sequestration through EOR of approximately one million tons of CO_{2} per year as an integral component of commercial operations.
Asteroid orbital inversion using uniform phase-space sampling
NASA Astrophysics Data System (ADS)
Muinonen, K.; Pentikäinen, H.; Granvik, M.; Oszkiewicz, D.; Virtanen, J.
2014-07-01
a set of virtual observations; second, corresponding virtual least-squares orbital elements are derived using the Nelder-Mead downhill simplex method; third, repeating the procedure two times allows for a computation of a difference for two sets of virtual orbital elements; and, fourth, this orbital-element difference constitutes a symmetric proposal in a random-walk Metropolis-Hastings algorithm, avoiding the explicit computation of the proposal p.d.f. In a discrete approximation, the allowed proposals coincide with the differences that are based on a large number of pre-computed sets of virtual least-squares orbital elements. The virtual-observation MCMC method is thus based on the characterization of the relevant volume in the orbital-element phase space. Here we utilize MCMC to map the phase-space domain of acceptable solutions. We can make use of the proposal p.d.f.s from the MCMC ranging and virtual-observation methods. The present phase-space mapping produces, upon convergence, a uniform sampling of the solution space within a pre-defined χ^2-value. The weights of the sampled orbital elements are then computed on the basis of the corresponding χ^2-values. The present method resembles the original ranging method. On one hand, MCMC mapping is insensitive to local extrema in the phase space and efficiently maps the solution space. This is somewhat contrary to the MCMC methods described above. On the other hand, MCMC mapping can suffer from producing a small number of sample elements with small χ^2-values, in resemblance to the original ranging method. We apply the methods to example near-Earth, main-belt, and transneptunian objects, and highlight the utilization of the methods in the data processing and analysis pipeline of the ESA Gaia space mission.
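The key simplification noted above is that a symmetric proposal cancels the Hastings correction, so acceptance depends only on the target-density ratio. A minimal sketch of such a random-walk sampler, drawing steps from a precomputed bank of solution differences, follows; the 2-D Gaussian target and all sizes are illustrative assumptions, not the orbital-element setup of the paper.

```python
import numpy as np

def rw_metropolis(log_p, x0, proposal_bank, n=30000, seed=3):
    """Random-walk Metropolis with a symmetric proposal: steps are drawn from a
    precomputed bank of differences, taken with random sign so that q(y|x) =
    q(x|y); the q-ratio cancels and only log_p(y) - log_p(x) remains."""
    rng = np.random.default_rng(seed)
    x, out = np.asarray(x0, float), []
    for _ in range(n):
        step = proposal_bank[rng.integers(len(proposal_bank))]
        y = x + rng.choice([-1.0, 1.0]) * step   # random sign keeps symmetry
        if np.log(rng.random()) < log_p(y) - log_p(x):
            x = y
        out.append(x.copy())
    return np.array(out)

# Bank of differences between pairs of "virtual" solutions (illustrative)
rng = np.random.default_rng(0)
bank = rng.standard_normal((500, 2)) - rng.standard_normal((500, 2))
chain = rw_metropolis(lambda v: -0.5 * v @ v, [2.0, -2.0], bank)
```

The appeal of this construction is exactly what the abstract notes: the proposal density never has to be written down, because drawing differences of precomputed least-squares solutions with random sign is symmetric by design.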
Carbon Capture and Sequestration from a Hydrogen Production Facility in an Oil Refinery
Engels, Cheryl; Williams, Bryan, Valluri, Kiranmal; Watwe, Ramchandra; Kumar, Ravi; Mehlman, Stewart
2010-06-21
The project proposed a commercial demonstration of advanced technologies that would capture and sequester CO2 emissions from an existing hydrogen production facility in an oil refinery into underground formations in combination with Enhanced Oil Recovery (EOR). The project is led by Praxair, Inc., with other project participants: BP Products North America Inc., Denbury Onshore, LLC (Denbury), and Gulf Coast Carbon Center (GCCC) at the Bureau of Economic Geology of The University of Texas at Austin. The project is located at the BP Refinery at Texas City, Texas. Praxair owns and operates a large hydrogen production facility within the refinery. As part of the project, Praxair would construct a CO2 capture and compression facility. The project aimed at demonstrating a novel vacuum pressure swing adsorption (VPSA) based technology to remove CO2 from the Steam Methane Reformers (SMR) process gas. The captured CO2 would be purified using refrigerated partial condensation separation (i.e., cold box). Denbury would purchase the CO2 from the project and inject the CO2 as part of its independent commercial EOR projects. The Gulf Coast Carbon Center at the Bureau of Economic Geology, a unit of University of Texas at Austin, would manage the research monitoring, verification and accounting (MVA) project for the sequestered CO2, in conjunction with Denbury. The sequestration and associated MVA activities would be carried out in the Hastings field at Brazoria County, TX. The project would exceed DOE's target of capturing one million tons of CO2 per year (MTPY) by 2015. Phase 1 of the project (Project Definition) is being completed. The key objective of Phase 1 is to define the project in sufficient detail to enable an economic decision with regard to proceeding with Phase 2. This topical report summarizes the administrative, programmatic and technical accomplishments completed in Phase 1 of the project. It describes the work relative to project technical and design activities
NASA Astrophysics Data System (ADS)
Sohn, Y. K.; Rhee, C. W.; Shon, H.
2001-09-01
-level fluctuations. We suggest that the four alternations of conglomerates (lowstand systems) and hemipelagic mudstones (condensed intervals) most probably resulted from 3rd-order glacioeustatic cycles during the middle Miocene. This finding implies that the signatures of global sea-level fluctuations can be deciphered from a tectonically active sedimentary basin if the timing of regional tectonic development is well constrained, and that the global sea-level chart of Haq et al. (Haq, B.U., Hardenbol, J., Vail, P.R., 1987. Chronology of fluctuating sea levels since the Triassic. Science 235, 1156-1167; Haq, B.U., Hardenbol, J., Vail, P.R., 1988. Mesozoic and Cenozoic chronostratigraphy and eustatic cycles. In: Wilgus, C.K., Hastings, B.S., Posamentier, H., Van Wagoner, J., Ross, C.A., Kendall, C.G.S.C. (Eds.), Sea-Level Changes: An Integrated Approach. Soc. Econ. Paleont. Mineral. Spec. Publ. 42, pp. 71-108) may serve as a guide to basin-fill interpretation even in tectonically active sedimentary basins.
Statistical Mechanics of the Shallow Water and Primitive Equations
NASA Astrophysics Data System (ADS)
Potters, M.; Bouchet, F.
2012-04-01
Geophysical flows are highly turbulent, yet embody large-scale coherent structures, such as ocean rings, jets, and large-scale circulations. Understanding how these structures appear and predicting their shape are major theoretical challenges. The statistical mechanics approach to geophysical flows is a powerful complement to more conventional theoretical and numerical methods [1]. In the inertial limit, it makes it possible to describe, with only a few thermodynamical parameters, the long-time behavior of the largest scales of the flow. Recent studies in quasi-geostrophic models provide encouraging results: a model of the Great Red Spot of Jupiter [2], an explanation of the drift properties of ocean rings [3], the inertial structure of mid-basin eastward jets [3], bistability phenomena in complex turbulent flows [4], and so on. Generalization to more comprehensive hydrodynamical models, which include gravity wave dynamics and allow for the possibility of energy transfer through wave motion, would be extremely interesting: both are essential in understanding the geophysical flow energy balance. However, due to difficulties in essential theoretical parts of the statistical mechanics approach, previous methods describing statistical equilibria were until now limited to quasi-geostrophic models. The current study fills this gap. The new theory we propose describes geophysical phenomena using statistical mechanics applied to the shallow water model, and is easily generalizable to the primitive equations. Invariant measures of the shallow water model are built from the Hamiltonian structure and the Liouville theorem. In parallel with the development of the theory, we devised an algorithm based on the Creutz algorithm [5] (a generalization of the Metropolis-Hastings algorithm) to sample microcanonical measures. Numerical simulations are compared with the theoretical predictions [6]. We apply these new tools to describe vortex solutions similar
Statistical Mechanics of the Shallow Water and Primitive Equations
NASA Astrophysics Data System (ADS)
Potters, M.; Bouchet, F.
2011-12-01
Geophysical flows are highly turbulent, yet embody large-scale coherent structures, such as ocean rings, jets, and large-scale circulations. Understanding how these structures appear and predicting their shape are major theoretical challenges. The statistical mechanics approach to geophysical flows is a powerful complement to more conventional theoretical and numerical methods [1]. In the inertial limit, it makes it possible to describe, with only a few thermodynamical parameters, the long-time behavior of the largest scales of the flow. Recent studies with quasi-geostrophic models provide encouraging results: a model of the Great Red Spot of Jupiter [2], an explanation of the drift properties of ocean rings [3], the inertial structure of mid-basin eastward jets [3], bistability phenomena in complex turbulent flows [4], and so on. Generalization to more comprehensive hydrodynamical models, which include gravity wave dynamics and the possibility of energy transfer through wave motion, would be extremely interesting: both are essential in understanding the geophysical flow energy balance. However, due to difficulties in essential theoretical parts of the statistical mechanics approach, previous methods describing statistical equilibria were until now limited to quasi-geostrophic models. The current study fills this gap. The new theory we propose describes geophysical phenomena using statistical mechanics applied to the shallow water model, and is easily generalizable to the primitive equations. Invariant measures of the shallow water model are built from the Hamiltonian structure and the Liouville theorem. In parallel with the development of the theory, we devised an algorithm based on the Creutz algorithm [5] (a generalization of the Metropolis-Hastings algorithm) to sample microcanonical measures. Numerical simulations are compared with the theoretical predictions [6]. We apply these new tools to describe vortex solutions similar to the
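A minimal sketch of the Creutz idea referenced above (microcanonical Monte Carlo via an energy-carrying "demon") is easy to state on a toy system. The sketch below uses a 1D Ising chain rather than the shallow water model; the chain length, step count, coupling J = 1, and initial demon energy are all illustrative assumptions:

```python
import random

def ising_energy(spins):
    """Nearest-neighbour Ising energy with J = 1 and periodic boundaries."""
    n = len(spins)
    return -sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def creutz_ising_1d(n=200, steps=100_000, demon0=8, seed=1):
    """Creutz's microcanonical 'demon' Monte Carlo on a 1D Ising chain.

    The demon carries a non-negative energy reservoir; a spin flip is
    accepted only if the demon can pay for it, so system + demon energy
    is exactly conserved.  (A toy stand-in for the microcanonical
    sampling described in the abstract.)
    """
    random.seed(seed)
    spins = [1] * n           # start in a ground state
    demon = demon0            # initial demon energy (arbitrary choice)
    demon_trace = []
    for _ in range(steps):
        i = random.randrange(n)
        # energy cost of flipping spin i (periodic neighbours)
        dE = 2 * spins[i] * (spins[(i - 1) % n] + spins[(i + 1) % n])
        if dE <= demon:       # demon pays for uphill moves, absorbs downhill energy
            spins[i] = -spins[i]
            demon -= dE
        demon_trace.append(demon)
    return spins, demon_trace
```

Because the demon can only spend energy it holds, total system-plus-demon energy is conserved exactly, which is the defining property of the microcanonical sampler; the distribution of demon energies can then be used to estimate an effective temperature.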
Analysis of the Thermo-Elastic Response of Space Reflectors to Simulated Space Environment
NASA Astrophysics Data System (ADS)
Allegri, G.; Ivagnes, M. M.; Marchetti, M.; Poscente, F.
2002-01-01
high pressure Xenon lamps to simulate the direct solar irradiation and a cryogenic heat exchanger to reproduce the earth shadowing of sunlight. The temperature of the thermal cycles ranges from -80°C up to 100°C: the thermo-elastic response of the antenna has been surveyed by employing strain gauges placed on the structure at several different locations. The structure has been subjected to 100 thermal cycles, each lasting two hours: the total duration of exposure to the vacuum environment was 300 hours. Finally the antenna was disassembled and its elements were examined to evaluate the effects of the simulated exposure on each of them: the total mass loss and the final thermo-mechanical properties of the polymer-based materials which constitute the structural core of the antenna were surveyed. The experimental results have been compared to numerical simulations performed with the NASTRAN code: the basic FEM model, developed for the unexposed antenna, has been updated to take into account the thermo-mechanical degradation of the structural elements and materials. This made it possible to obtain, by extrapolation, a FEM-based prediction of the antenna's thermo-elastic response under long-term operating conditions. References: [1] D. Hastings, H. Garrett, "Spacecraft Environment Interactions", Cambridge University Press, Atmospheric Series, Cambridge, 1996. [2] IAF-01-I.6.05, "On the Reliability of Honeycomb Core Bonding Joint in Sandwich Composite Materials for Space Applications", G. Allegri, U. Lecci, M. Marchetti, F. Poscente, 52nd IAF Congress, 2001. [3] Meguro A. et al., "Technology status of the 13 m aperture deployment antenna reflectors for Engineering Test Satellite VIII", Acta Astronautica, Volume 47, Issues 2-9, July-November 2000, pp. 147-152. [4] Novikov L. S., "Contemporary state of spacecraft/environment interaction research", Radiation Measurements, Volume 30, Issue 5, October 1999, pp. 661-667. [5] IAF-01-I.1
Bacteriocins: modes of action and potentials in food preservation and control of food poisoning.
Abee, T; Krockel, L; Hill, C
1995-12-01
-negative bacteria possess an additional layer, the so-called outer membrane which is composed of phospholipids, proteins and lipopolysaccharides (LPS), and this membrane is impermeable to most molecules. Nevertheless, the presence of porins in this layer will allow the free diffusion of molecules with a molecular mass below 600 Da. The smallest bacteriocins produced by lactic acid bacteria are approximately 3 kDa and are thus too large to reach their target, the cytoplasmic membrane (Klaenhammer, 1993; Stiles and Hastings, 1991). However, Stevens et al. (1991) and Ray (1993) have demonstrated that Salmonella species and other Gram-negative bacteria become sensitive to nisin after exposure to treatments that change the permeability barrier properties of the outer membrane (see below). This review will focus on the mode of action of lantibiotics (class I) and class II LAB bacteriocins and their potentials in food preservation and control of food poisoning. PMID:8750665
Vertically Integrated Seismological Analysis II : Inference
NASA Astrophysics Data System (ADS)
Arora, N. S.; Russell, S.; Sudderth, E.
2009-12-01
Methods for automatically associating detected waveform features with hypothesized seismic events, and localizing those events, are a critical component of efforts to verify the Comprehensive Test Ban Treaty (CTBT). As outlined in our companion abstract, we have developed a hierarchical model which views detection, association, and localization as an integrated probabilistic inference problem. In this abstract, we provide more details on the Markov chain Monte Carlo (MCMC) methods used to solve this inference task. MCMC generates samples from a posterior distribution π(x) over possible worlds x by defining a Markov chain whose states are the worlds x, and whose stationary distribution is π(x). In the Metropolis-Hastings (M-H) method, transitions in the Markov chain are constructed in two steps. First, given the current state x, a candidate next state x′ is generated from a proposal distribution q(x′ | x), which may be (more or less) arbitrary. Second, the transition to x′ is not automatic, but occurs with acceptance probability α(x′ | x) = min(1, π(x′)q(x | x′) / (π(x)q(x′ | x))). The seismic event model outlined in our companion abstract is quite similar to those used in multitarget tracking, for which MCMC has proved very effective. In this model, each world x is defined by a collection of events, a list of properties characterizing those events (times, locations, magnitudes, and types), and the association of each event to a set of observed detections. The target distribution is π(x) = P(x | y), the posterior distribution over worlds x given the observed waveform data y at all stations. Proposal distributions then implement several types of moves between worlds. For example, birth moves create new events; death moves delete existing events; split moves partition the detections for an event into two new events; merge moves combine event pairs; swap moves modify the properties and associations for pairs of events. Importantly, the rules for
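As a concrete illustration of the acceptance rule (not of the seismic model itself), the sketch below runs M-H on a toy Exp(1) target with an asymmetric multiplicative proposal; for this proposal the Hastings ratio q(x | x′)/q(x′ | x) reduces to x′/x, so dropping the correction would bias the chain. All constants are illustrative:

```python
import math
import random

def mh_exponential(n_samples=50_000, step=0.5, seed=0):
    """Metropolis-Hastings with an asymmetric multiplicative proposal.

    Target: the Exp(1) density pi(x) proportional to exp(-x) on x > 0
    (a toy stand-in for the posterior over 'worlds').  Proposal:
    x' = x * exp(z) with z ~ N(0, step^2), a log-normal random walk for
    which q(x | x') / q(x' | x) = x'/x, so the Hastings correction matters.
    """
    random.seed(seed)
    x = 1.0
    samples = []
    for _ in range(n_samples):
        xp = x * math.exp(random.gauss(0.0, step))
        # alpha = min(1, pi(xp) q(x|xp) / (pi(x) q(xp|x)))
        alpha = min(1.0, math.exp(x - xp) * (xp / x))
        if random.random() < alpha:
            x = xp
        samples.append(x)
    return samples
```

With the correction in place the chain's long-run mean approaches the Exp(1) mean of 1; omitting the factor x′/x would make the sampler target a different distribution.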
Can we reliably estimate managed forest carbon dynamics using remotely sensed data?
NASA Astrophysics Data System (ADS)
Smallman, Thomas Luke; Exbrayat, Jean-Francois; Bloom, A. Anthony; Williams, Mathew
2015-04-01
Forests are an important part of the global carbon cycle, serving as both a large store of carbon and currently as a net sink of CO2. Forest biomass varies significantly in time and space, linked to climate, soils, natural disturbance and human impacts. This variation means that the global distribution of forest biomass and its dynamics are poorly quantified. Terrestrial ecosystem models (TEMs) are rarely evaluated for their predictions of forest carbon stocks and dynamics, due to a lack of knowledge of site-specific factors such as disturbance dates and/or management interventions. In this regard, managed forests present a valuable opportunity for model calibration and improvement. Spatially explicit datasets of planting dates, species and yield classification, in combination with remote sensing data and an appropriate data assimilation (DA) framework, can reduce prediction uncertainty and error. We use a Bayesian approach to calibrate the data assimilation linked ecosystem carbon (DALEC) model using a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) framework. Forest management information is incorporated into the data assimilation framework as part of ecological and dynamic constraints (EDCs). The key advantage here is that DALEC simulates a full carbon balance, not just the living biomass, and that both parameter and prediction uncertainties are estimated as part of the DA analysis. DALEC has been calibrated at two managed forests, in the USA (Pinus taeda; Duke Forest) and the UK (Picea sitchensis; Griffin Forest). At each site DALEC is calibrated twice (exp1 & exp2). Both calibrations assimilated MODIS LAI and HWSD estimates of carbon stored in soil organic matter, in addition to common management information and prior knowledge included in parameter priors and the EDCs. Calibration exp1 also utilises multiple site-level estimates of carbon storage in multiple pools. By comparing simulations we determine the impact of site
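The calibration idea can be caricatured with a one-parameter toy (this is not DALEC, which carries many pools and parameters): fit the turnover rate of a single carbon pool to noisy stock observations by MH-MCMC, with a flat prior bound standing in for an EDC. All model choices and numbers below are illustrative assumptions:

```python
import math
import random

def calibrate_turnover(obs, gpp=2.0, c0=100.0, dt=1.0, n_iter=20_000, seed=2):
    """Toy MH-MCMC calibration of a one-pool carbon model.

    Model: C[t+1] = C[t] + dt * (gpp - k * C[t]).  We sample the turnover
    rate k from its posterior given noisy stock observations `obs`, with a
    flat prior on [1e-4, 0.5] playing the role of an ecological constraint.
    """
    def simulate(k):
        c, traj = c0, []
        for _ in obs:
            c = c + dt * (gpp - k * c)
            traj.append(c)
        return traj

    def log_post(k):
        if not (1e-4 <= k <= 0.5):        # EDC-style hard bound on the parameter
            return float("-inf")
        # Gaussian likelihood with unit observation error (assumption)
        return -0.5 * sum((m - o) ** 2 for m, o in zip(simulate(k), obs))

    random.seed(seed)
    k = 0.1                                # arbitrary starting value
    lp = log_post(k)
    chain = []
    for _ in range(n_iter):
        kp = k + random.gauss(0.0, 0.01)   # symmetric random-walk proposal
        lpp = log_post(kp)
        if random.random() < math.exp(min(0.0, lpp - lp)):
            k, lp = kp, lpp
        chain.append(k)
    return chain
```

Discarding the first half of the chain as burn-in, the remaining samples approximate the posterior of k, giving both a central estimate and an uncertainty, which is the appeal of the MH-MCMC approach described in the abstract.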
NASA Astrophysics Data System (ADS)
Jasoni, Richard; Arnone, John; Fenstermaker, Lynn; Wohlfahrt, Georg
2014-05-01
Eddy covariance measurements of net ecosystem CO2 exchange (NEE) in the Mojave Desert (Jasoni et al. 2005, Global Change Biology 11:749-756; Wohlfahrt et al. 2008, Global Change Biology 14:1475-1487), and in other deserts of the world (e.g., Hastings et al. 2005, Global Change Biology 14:927-939), indicate greater rates of net CO2 uptake (more negative NEE values) and net ecosystem productivity (NEP) than would have been expected for deserts (as high as -120 g C m-2 year-1). We continue to observe high rates of NEE and NEP and seek explanations for these findings at interannual, seasonal, and sub-seasonal time scales. Because moisture availability most strongly constrains biological activity in deserts, responses to rains probably play a significant role in defining the components of NEE, namely net primary productivity (NPP, or roughly net photosynthesis by vascular and non-vascular plants) and heterotrophic respiration (Rh, mainly by soil microorganisms). Most precipitation in the Mojave Desert falls from October through April and periodically in the summer as convective storms. The main objective of this study was to quantify the extent to which NEE and the net flux of CO2 from/to biological soil crust (BSC) covered soil surfaces respond to rain pulses occurring during cool/cold and warm/hot times of the year. Flux data from 7 years (2005-2011) of measurements at our shrubland desert site (average 150 mm rain per year) located 120 km northwest of Las Vegas showed a range in NEP from -111±34 to -47±28 g C m-2 year-1. Cool season rains usually stimulated NEE (more negative NEE values or net CO2 uptake) while warm season rains reversed this effect and led to positive NEE values (net ecosystem CO2 efflux). Cool season stimulation of NEE often occurred in the absence of green leaves on vascular plants, suggesting that photosynthesis by BSCs (up to 70% of the soil surface covered by cyanobacteria, mosses, and lichens) was responsible for this net uptake. At other times during
Pugmire, Brian S; Guimaraes, Alexander R; Lim, Ruth; Friedmann, Alison M; Huang, Mary; Ebb, David; Weinstein, Howard; Catalano, Onofrio A; Mahmood, Umar; Catana, Ciprian; Gee, Michael S
2016-01-01
AIM: To describe our preliminary experience with simultaneous whole body 18F-fluorodeoxyglucose (18F-FDG) positron emission tomography and magnetic resonance imaging (PET-MRI) in the evaluation of pediatric oncology patients. METHODS: This prospective, observational, single-center study was Health Insurance Portability and Accountability Act-compliant, and institutional review board approved. To be eligible, a patient was required to: (1) have a known or suspected cancer diagnosis; (2) be under the care of a pediatric hematologist/oncologist; and (3) be scheduled for clinically indicated 18F-FDG positron emission tomography-computed tomography (PET-CT) examination at our institution. Patients underwent PET-CT followed by PET-MRI on the same day. PET-CT examinations were performed using standard department protocols. PET-MRI studies were acquired with an integrated 3 Tesla PET-MRI scanner using whole body T1 Dixon, T2 HASTE, EPI diffusion-weighted imaging (DWI) and STIR sequences. No additional radiotracer was given for the PET-MRI examination. Both PET-CT and PET-MRI examinations were reviewed by consensus by two study personnel. Test performance characteristics of PET-MRI, for the detection of malignant lesions, including FDG maximum standardized uptake value (SUVmax) and minimum apparent diffusion coefficient (ADCmin), were calculated on a per lesion basis using PET-CT as a reference standard. RESULTS: A total of 10 whole body PET-MRI exams were performed in 7 pediatric oncology patients. The mean patient age was 16.1 years (range 12-19 years) including 6 males and 1 female. A total of 20 malignant and 21 benign lesions were identified on PET-CT. PET-MRI SUVmax had excellent correlation with PET-CT SUVmax for both benign and malignant lesions (R = 0.93). PET-MRI SUVmax > 2.5 had 100% accuracy for discriminating benign from malignant lesions using PET-CT reference. Whole body DWI was also evaluated: the mean ADCmin of malignant lesions (780.2 ± 326.6) was
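The per-lesion test-performance calculation described above (an SUVmax cutoff judged against a reference standard) is straightforward to reproduce; the sketch below uses made-up lesion values, not the study's data:

```python
def lesion_performance(suv_max, is_malignant_ref, threshold=2.5):
    """Per-lesion performance of an SUVmax cutoff against a reference standard.

    `suv_max` are per-lesion SUVmax values, `is_malignant_ref` the reference
    labels (in the study, PET-CT served as the reference).  Values here are
    hypothetical; only the bookkeeping mirrors the abstract.
    """
    tp = fp = tn = fn = 0
    for suv, malignant in zip(suv_max, is_malignant_ref):
        predicted = suv > threshold           # call the lesion malignant
        if predicted and malignant:
            tp += 1
        elif predicted and not malignant:
            fp += 1
        elif not predicted and malignant:
            fn += 1
        else:
            tn += 1
    total = tp + fp + tn + fn
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
        "specificity": tn / (tn + fp) if tn + fp else float("nan"),
        "accuracy": (tp + tn) / total if total else float("nan"),
    }
```

Sweeping `threshold` over the observed SUVmax range is the usual way to check whether a single cutoff (such as 2.5) separates the two classes on a given lesion set.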
NASA Astrophysics Data System (ADS)
Smallman, Luke; Williams, Mathew
2016-04-01
Forests are a critical component of the global carbon cycle, storing significant amounts of carbon, split between living biomass and dead organic matter. The carbon budget of forests is the most uncertain component of the global carbon cycle - it is currently impossible to quantify accurately the carbon source/sink strength of forest biomes due to their heterogeneity and complex dynamics. It has been a major challenge to generate robust carbon budgets across landscapes due to data scarcity. Models have been used, but their outputs have lacked an assessment of uncertainty, making a robust assessment of their reliability and accuracy challenging. Here a Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) data assimilation framework has been used to combine remotely sensed leaf area index (MODIS), biomass (where available) and deforestation estimates, in addition to forest planting and clear-felling information from the UK's national forest inventory, an estimate of soil carbon from the Harmonized World Soil Database (HWSD) and plant trait information, with a process model (DALEC) to produce a constrained analysis, with a robust estimate of uncertainty, of the UK forestry carbon budget between 2000 and 2010. Our analysis estimates the mean annual UK forest carbon sink at -3.9 MgC ha-1 yr-1, with a 95% confidence interval between -4.0 and -3.1 MgC ha-1 yr-1. The UK national forest inventory (NFI) estimates the mean UK forest carbon sink to be between -1.4 and -5.5 MgC ha-1 yr-1. The analysis estimates the total forest biomass carbon stock in 2010 at 229 (177/232) TgC, while the NFI estimates a total forest biomass carbon stock of 216 TgC. Leaf carbon area (LCA) is a key plant trait which we are able to estimate in our analysis. Comparison of median estimates for LCA retrieved from the analysis with a UK land cover map shows that higher and lower values of LCA are estimated in areas dominated by needleleaf and broadleaf forests respectively, consistent with
NASA Astrophysics Data System (ADS)
Silva, Elsa A.; Miranda, J. M.; Luis, J. F.; Galdeano, A.
2000-05-01
The Ibero-Armorican Arc (IAA) is a huge geological structure of Pre-Cambrian origin, tightened during hercynian times and deeply affected by the opening of the Atlantic Ocean and the Bay of Biscay. Its remnants now lie in Iberia, north-western France and the Canadian Grand Banks margins. The qualitative correlation between these three blocks has been attempted by several authors (e.g. Lefort, J.P., 1980. Un 'Fit' structural de l'Atlantique Nord: arguments geologiques pour correler les marqueurs geophysiques reconnus sur les deux marges. Mar. Geol. 37, 355-369; Lefort, J.P., 1983. A new geophysical criterion to correlate the Acadian and Hercynian orogenies of Western Europe and Eastern America. Mem. Geol. Soc. Am. 158, 3-18; Galdeano, A., Miranda, J.M., Matte, P., Mouge, P., Rossignol, C., 1990. Aeromagnetic data: A tool for studying the Variscan arc of Western Europe and its correlation with transatlantic structures. Tectonophysics 177, 293-305) using magnetic anomalies, mainly because they seem to preserve the hercynian zonation, in spite of the strong thermal and mechanical processes that took place during rifting and ocean spreading. In this paper, we present a new contribution to the study of the IAA structure based on the processing of a compilation of magnetic data from Iberia and Grand Banks margins. To interpret the magnetic signature, a Fourier-domain-based inversion technique was applied, considering a layer with a constant thickness of 10 km, and taking into account only the induced field. The digital terrain model was derived from ETOPO5 (ETOPO5, 1986. Relief map of the earth's surface. EOS 67, 121) and TerrainBase (TerrainBase, 1995. In: Row III, L.W., Hastings, D.A., Dunbar, P.K. (Eds.), Worldwide Digital Terrain Data, Documentation Manual, CD-ROM Release 1.0. GEODAS-NGDC Key to Geophysical Records. Documentation N. 30, April) databases. The pseudo-susceptibility distribution obtained was repositioned for the 156.5 Ma epoch, using the Srivastava and
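The Fourier-domain inversion step can be caricatured as a regularised spectral division. The sketch below is a generic 1-D deconvolution with Tikhonov damping, not the specific constant-thickness magnetic-layer operator used in the paper; the kernel and damping value are illustrative:

```python
import numpy as np

def invert_layer_fourier(anomaly, transfer, damping=1e-3):
    """Schematic Fourier-domain inversion: recover a layer property from a
    potential-field anomaly by dividing spectra, with Tikhonov damping to
    stabilise wavenumbers where the transfer function is small.
    """
    A = np.fft.fft(anomaly)
    T = np.fft.fft(transfer)
    # regularised deconvolution: X = A * conj(T) / (|T|^2 + damping)
    X = A * np.conj(T) / (np.abs(T) ** 2 + damping)
    return np.real(np.fft.ifft(X))
```

The damping term trades resolution for stability: with no damping the division blows up wherever the transfer spectrum is small, while heavy damping suppresses the short-wavelength part of the recovered distribution.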
Bennett, Scott E K.; Oskin, Michael; Dorsey, Rebecca; Iriondo, Alexander; Kunk, Michael J.
2015-01-01
Accurate information on the timing of earliest marine incursion into the Gulf of California (northwestern México) is critical for paleogeographic models and for understanding the spatial and temporal evolution of strain accommodation across the obliquely divergent Pacific-North America plate boundary. Marine strata exposed on southwest Isla Tiburón (SWIT) have been cited as evidence for a middle Miocene marine incursion into the Gulf of California at least 7 m.y. prior to plate boundary localization ca. 6 Ma. A middle Miocene interpretation for SWIT marine deposits has played a large role in subsequent interpretations of regional tectonics and rift evolution, the ages of marine basins containing similar fossil assemblages along ~1300 km of the plate boundary, and the timing of marine incursion into the Gulf of California. We report new detailed geologic mapping and geochronologic data from the SWIT basin, an elongate sedimentary basin associated with deformation along the dextral-oblique La Cruz fault. We integrate these results with previously published biostratigraphic and geochronologic data to bracket the age of marine deposits in the SWIT basin and show that they have a total maximum thickness of ~300 m. The 6.44 ± 0.05 Ma (Ar/Ar) tuff of Hast Pitzcal is an ash-flow tuff stratigraphically below the oldest marine strata, and the 6.01 ± 0.20 Ma (U/Pb) tuff of Oyster Amphitheater, also an ash-flow tuff, is interbedded with marine conglomerate near the base of the marine section. A dike-fed rhyodacite lava flow that caps all marine strata yields ages of 3.51 ± 0.05 Ma (Ar/Ar) and 4.13 ± 0.09 Ma (U/Pb) from the base of the flow, consistent with previously reported ages of 4.16 ± 1.81 Ma (K-Ar) from the flow top and 3.7 ± 0.9 Ma (K-Ar) from the feeder dike. Our new results confirm a latest Miocene to early Pliocene age for the SWIT marine basin, consistent with previously documented latest Miocene to early Pliocene (ca. 6.2-4.3 Ma) planktonic and benthic
LCLS-II New Instruments Workshops Report
Baradaran, Samira; Bergmann, Uwe; Durr, Herrmann; Gaffney, Kelley; Goldstein, Julia; Guehr, Markus; Hastings, Jerome; Heimann, Philip; Lee, Richard; Seibert, Marvin; Stohr, Joachim; /SLAC
2012-08-08
The LCLS-II New Instruments workshops chaired by Phil Heimann and Jerry Hastings were held on March 19-22, 2012 at the SLAC National Accelerator Laboratory. The goal of the workshops was to identify the most exciting science and corresponding parameters which will help define the LCLS-II instrumentation. This report gives a synopsis of the proposed investigations and an account of the workshop. Scientists from around the world have provided short descriptions of the scientific opportunities they envision at LCLS-II. The workshops focused on four broadly defined science areas: biology, materials sciences, chemistry and atomic, molecular and optical physics (AMO). Below we summarize the identified science opportunities in the four areas. The frontiers of structural biology lie in solving the structures of large macromolecular biological systems. Most large protein assemblies are inherently difficult to crystallize due to their numerous degrees of freedom. Serial femtosecond protein nanocrystallography, using the 'diffraction-before-destruction' approach to outrun radiation damage has been very successfully pioneered at LCLS and diffraction patterns were obtained from some of the smallest protein crystals ever. The combination of femtosecond x-ray pulses of high intensity and nanosized protein crystals avoids the radiation damage encountered by conventional x-ray crystallography with focused beams and opens the door for atomic structure determinations of the previously largely inaccessible class of membrane proteins that are notoriously difficult to crystallize. The obtained structures will allow the identification of key protein functions and help in understanding the origin and control of diseases. Three dimensional coherent x-ray imaging at somewhat lower resolution may be used for larger objects such as viruses. The chemistry research areas of primary focus are the predictive understanding of catalytic mechanisms, with particular emphasis on photo- and
EDITORIAL: Focus on Quantum Information and Many-Body Theory
NASA Astrophysics Data System (ADS)
Eisert, Jens; Plenio, Martin B.
2010-02-01
and F Verstraete

SIMULATION AND DYNAMICS
A quantum differentiation of k-SAT instances (B Tamir and G Ortiz)
Classical Ising model test for quantum circuits (Joseph Geraci and Daniel A Lidar)
Exact matrix product solutions in the Heisenberg picture of an open quantum spin chain (S R Clark, J Prior, M J Hartmann, D Jaksch and M B Plenio)
Exact solution of Markovian master equations for quadratic Fermi systems: thermal baths, open XY spin chains and non-equilibrium phase transition (Tomaž Prosen and Bojan Žunkovič)
Quantum kinetic Ising models (R Augusiak, F M Cucchietti, F Haake and M Lewenstein)

ENTANGLEMENT AND SPECTRAL PROPERTIES
Ground states of unfrustrated spin Hamiltonians satisfy an area law (Niel de Beaudrap, Tobias J Osborne and Jens Eisert)
Correlation density matrices for one-dimensional quantum chains based on the density matrix renormalization group (W Münder, A Weichselbaum, A Holzner, Jan von Delft and C L Henley)
The invariant-comb approach and its relation to the balancedness of multipartite entangled states (Andreas Osterloh and Jens Siewert)
Entanglement scaling of fractional quantum Hall states through geometric deformations (Andreas M Läuchli, Emil J Bergholtz and Masudul Haque)
Entanglement versus gap for one-dimensional spin systems (Daniel Gottesman and M B Hastings)
Entanglement spectra of critical and near-critical systems in one dimension (F Pollmann and J E Moore)
Macroscopic bound entanglement in thermal graph states (D Cavalcanti, L Aolita, A Ferraro, A García-Saez and A Acín)
Entanglement at the quantum phase transition in a harmonic lattice (Elisabeth Rieper, Janet Anders and Vlatko Vedral)
Multipartite entanglement and frustration (P Facchi, G Florio, U Marzolino, G Parisi and S Pascazio)
Entropic uncertainty relations—a survey (Stephanie Wehner and Andreas Winter)
Entanglement in a spin system with inverse square statistical interaction (D Giuliano, A Sindona, G Falcone, F Plastina and L Amico)

APPLICATIONS
Time-dependent currents of one-dimensional bosons
NASA Astrophysics Data System (ADS)
Labat, J.
2010-07-01
physics of ionized gases should be done in a more organized manner. As early as the summer of 1964, the "Summer School on the Physics of Ionized Gases" was held in Herceg Novi, a small town on the Adriatic coast. Six internationally recognized lecturers were invited to give series of lectures in various fields: prof. J. D. Craggs (Univ. of Liverpool, UK, 3 lectures), prof. A. L. Cullen (Univ. of Sheffield, UK, 3 lectures), prof. Yu. H. Demkov (Univ. of Leningrad, 3 lectures), prof. A. von Engel (Univ. of Oxford, UK, 8 lectures), dr. R. Herman (Obs. de Paris, France, 2 lectures), and prof. J. B. Hasted (Univ. College, London, UK, 6 lectures). They were in effect the first real teachers of the young and growing generation of Yugoslav scientists working in the field of ionized gases, and their names should be remembered with dignity and gratitude. The success of this summer school suggested that a school of this type should be organized on a regular basis, possibly combined with a symposium. This idea was accepted by all the participants, and as a result, in 1968 the first meeting in a long-lasting series was held under the full name "Yugoslav Symposium and International Summer School on the Physics of Ionized Gases", now known worldwide as SPIG. Foreign participants in particular insisted that it should be held somewhere on the Adriatic coast. Until 1990, with the exception of XIV SPIG (held in Sarajevo), all meetings were organized in attractive summer resorts along the Adriatic coast on a regular two-year basis. Yugoslavia fell apart in 1991, and the regular 1992 term was omitted. The renewed XVI SPIG meeting was held in Belgrade in spite of the general crisis and isolation of the newly formed Federal Republic of Yugoslavia. The next one, for the same reason, was also organized in Belgrade. The number of foreign participants dropped sharply due to the surrounding war and the largely unsettled situation. However, the general situation in the country
NASA Astrophysics Data System (ADS)
1996-04-01
acquisition of data has made these techniques desirable even at the high school level. Students who are used to surfing the Internet on their home computers are ready to collect information via PC in their school labs as well. Bindel (page 356) takes advantage of the Personal Science Laboratory, an affordable package of probes and software for PC interfacing, to provide an experiment using the eye-catching lightstick as its object. Students use two methods to determine the activation energy of the reaction that produces the luminescence and explore concepts of kinetics as well as learn about computer-interfaced experimentation. Addendum. The engaging photograph of Linus Pauling on the cover of the January issue was taken by Joseph McNally and is copyright Joseph McNally Photography, 52 Villard Avenue, Hastings-on-Hudson, NY 10706.
EDITORIAL: Focus on Quantum Information and Many-Body Theory
NASA Astrophysics Data System (ADS)
Eisert, Jens; Plenio, Martin B.
2010-02-01
and F Verstraete SIMULATION AND DYNAMICS A quantum differentiation of k-SAT instances B Tamir and G Ortiz Classical Ising model test for quantum circuits Joseph Geraci and Daniel A Lidar Exact matrix product solutions in the Heisenberg picture of an open quantum spin chain S R Clark, J Prior, M J Hartmann, D Jaksch and M B Plenio Exact solution of Markovian master equations for quadratic Fermi systems: thermal baths, open XY spin chains and non-equilibrium phase transition Tomaž Prosen and Bojan Žunkovič Quantum kinetic Ising models R Augusiak, F M Cucchietti, F Haake and M Lewenstein ENTANGLEMENT AND SPECTRAL PROPERTIES Ground states of unfrustrated spin Hamiltonians satisfy an area law Niel de Beaudrap, Tobias J Osborne and Jens Eisert Correlation density matrices for one-dimensional quantum chains based on the density matrix renormalization group W Münder, A Weichselbaum, A Holzner, Jan von Delft and C L Henley The invariant-comb approach and its relation to the balancedness of multipartite entangled states Andreas Osterloh and Jens Siewert Entanglement scaling of fractional quantum Hall states through geometric deformations Andreas M Läuchli, Emil J Bergholtz and Masudul Haque Entanglement versus gap for one-dimensional spin systems Daniel Gottesman and M B Hastings Entanglement spectra of critical and near-critical systems in one dimension F Pollmann and J E Moore Macroscopic bound entanglement in thermal graph states D Cavalcanti, L Aolita, A Ferraro, A García-Saez and A Acín Entanglement at the quantum phase transition in a harmonic lattice Elisabeth Rieper, Janet Anders and Vlatko Vedral Multipartite entanglement and frustration P Facchi, G Florio, U Marzolino, G Parisi and S Pascazio Entropic uncertainty relations—a survey Stephanie Wehner and Andreas Winter Entanglement in a spin system with inverse square statistical interaction D Giuliano, A Sindona, G Falcone, F Plastina and L Amico APPLICATIONS Time-dependent currents of one-dimensional bosons
NASA Astrophysics Data System (ADS)
Labat, J.
2010-07-01
physics of ionized gases should be done in a more organized manner. Already in the summer of 1964, the "Summer School on the Physics of Ionized Gases" was held in Herceg Novi, a small town on the Adriatic coast. Six internationally recognized lecturers were invited to give series of lectures in various fields: Prof. J. D. Craggs (Univ. of Liverpool, UK, 3 lectures), Prof. A. L. Cullen (Univ. of Sheffield, UK, 3 lectures), Prof. Yu. N. Demkov (Univ. of Leningrad, 3 lectures), Prof. A. von Engel (Univ. of Oxford, UK, 8 lectures), Dr. R. Herman (Obs. de Paris, France, 2 lectures), and Prof. J. B. Hasted (Univ. College, London, UK, 6 lectures). They were in fact the first real teachers of the young and growing generation of Yugoslav scientists working in the field of ionized gases, and their names should be remembered with dignity and gratitude. The good results of this summer school suggested that a school of this type should be organized on a regular basis, possibly combined with a symposium. This idea was accepted by all the participants, and as a result the first meeting in a long-lasting series was held in 1968 under the full name "Yugoslav Symposium and International Summer School on the Physics of Ionized Gases", now known worldwide as SPIG. Mainly at the insistence of foreign participants, it was held on the Adriatic coast. Until 1990, with the exception of XIV SPIG (held in Sarajevo), all meetings were organized in attractive summer resorts along the Adriatic coast on a regular two-year basis. Yugoslavia fell apart in 1991, and the regular 1992 term was omitted. The renowned XVI SPIG meeting was held in Belgrade in spite of the general crisis and the isolation of the newly formed Federal Republic of Yugoslavia. The next one, for the same reason, was also organized in Belgrade. The number of foreign participants dropped sharply due to the surrounding war and the largely unsettled situation. However, the general situation in the country
BOOK REVIEW: Seeking Ultimates. An Intuitive Guide to Physics
NASA Astrophysics Data System (ADS)
Brown, Neil
2000-05-01
: entropy. It is physicists who can benefit most from discarding mathematics and seeking intuitive understanding. It is often too easy to put the numbers into a formula, with little real comprehension of the underlying physics. For layman or physicist the book is hard work. It is not a volume to be read from cover to cover; each section needs to be considered and digested, with frequent turning backwards (or sometimes forwards) to other pages. Even then the outcome may leave questions that can only be answered by access to an academic library to look up some of the copious references to original papers (which, of course, do not eschew mathematics or make concessions to conceptual difficulties). Unfortunately the book is marred by an impression of haste and lack of care, leading to errors that should not have reached the final print. For example, a graph of increase of population with generation number is shown as, and stated to be, a straight line. It should be exponential. This sort of thing undermines confidence in the whole text. High temperature superconductivity may have a revolutionary effect on electrical machines in the future, but for the time being magnets for magnetic resonance imaging machines and the like still use the old superconductors. Amusing anecdotes make for interesting reading, but the one about Faraday is garbled: he had nothing to do with frogs' legs (that was Galvani), and the quip about taxing electricity one day, if not apocryphal, was made either to Peel or to Gladstone, not to the King. In at least one case a topic mentioned in the index and glossary does not appear on the stated page in the text, apparently having been cut out at a late stage. Personally I did not find the book satisfying, but others will differ. Especially when dealing with intuitive appreciation, what is straightforward to one person may be utterly opaque to another.
Making physics comprehensible and conveying its fascination is a daunting and often thankless task, but a very
The importance of being informed
NASA Astrophysics Data System (ADS)
Draganova, Tamara
2013-04-01
other schools in the region and the country. The campaigns were covered in regional and national media. However, the capital invested in young people is our responsibility: students, teachers, researchers, parents and citizens should all be informed. This is the strength of us all today, enabling us to face the future calmly and confidently, with knowledge, the right attitudes and respect for our planet Earth. And all of us teachers are obliged, and responsible, to be conductors of the geosciences in the classroom today, for the future of our children... "If thou hast Knowledge, let others light their candle at thine..." - Thomas Fuller
Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science
NASA Astrophysics Data System (ADS)
de Rigo, Daniele
2013-04-01
. Semantic Array Programming for Environmental Modelling: Application of the Mastrave library. In: Seppelt, R., Voinov, A. A., Lange, S., Bankamp, D. (Eds.), International Environmental Modelling and Software Society (iEMSs) 2012 International Congress on Environmental Modelling and Software. Managing Resources of a Limited Planet: Pathways and Visions under Uncertainty, Sixth Biennial Meeting. pp. 1167-1176. http://www.iemss.org/iemss2012/proceedings/D3_1_0715_deRigo.pdf
de Rigo, D., 2012. Semantic Array Programming with Mastrave - Introduction to Semantic Computational Modelling. http://mastrave.org/doc/MTV-1.012-1
Free Software Foundation, 2012. What is free software? http://www.gnu.org/philosophy/free-sw.html (revision 1.118 archived at http://www.webcitation.org/6DXqCFAN3 )
Stallman, R. M., 2009. Viewpoint: Why "open source" misses the point of free software. Communications of the ACM 52 (6), 31-33. http://dx.doi.org/10.1145/1516046.1516058 (free access version: http://www.gnu.org/philosophy/open-source-misses-the-point.html )
Lempert, R., Schlesinger, M. E., Jul. 2001. Climate-change strategy needs to be robust. Nature 412 (6845), 375. http://dx.doi.org/10.1038/35086617
Shell, K. M., Nov. 2012. Constraining cloud feedbacks. Science 338 (6108), 755-756. http://dx.doi.org/10.1126/science.1231083
van der Sluijs, J. P., 2012. Uncertainty and dissent in climate risk assessment: A Post-Normal perspective. Nature and Culture 7 (2), 174-195. http://dx.doi.org/10.3167/nc.2012.070204
Lenton, T. M., Held, H., Kriegler, E., Hall, J. W., Lucht, W., Rahmstorf, S., Schellnhuber, H. J., Feb. 2008. Tipping elements in the earth's climate system. Proceedings of the National Academy of Sciences 105 (6), 1786-1793. http://dx.doi.org/10.1073/pnas.0705414105
Hastings, A., Wysham, D. B., Apr. 2010. Regime shifts in ecological systems can occur with no warning. Ecology Letters 13 (4), 464-472. http://dx.doi.org/10.1111/j.1461-0248.2010.01439.x
Barnosky, A. D., Hadly, E. A., Bascompte, J
Once a myth, now an object of study - How the perception of comets has changed over the centuries
NASA Astrophysics Data System (ADS)
2004-02-01
symbol of the prophet's empowerment. Or again Luke 21:11: "And great earthquakes shall be in divers places, and famines, and pestilences; and fearful sights and great signs shall there be from heaven." In 1066, Halley's Comet appeared to many as a harbinger of the Norman conquest of Britain, so vividly portrayed in the Bayeux tapestry, with its scenes from the Battle of Hastings. The decisive step towards overturning the view that comets are atmospheric phenomena was taken in 1577 by the Danish astronomer Tycho Brahe. For two and a half months he observed from his observatory at Uraniborg the progress of a comet across the heavens. Relying on the phenomenon of diurnal parallax - an apparent shift in the position of heavenly bodies attributable to the observer's changing vantage point on the rotating Earth - he was able to establish that the comet had to be located beyond the lunar orbit.
Halley discovers an elliptical orbit
The scientific description of comets took another major step forward in 1705 thanks to the work of the British astronomer and physicist Edmond Halley, a friend and patron of Isaac Newton. Investigating recorded comet measurements, he observed that the orbits of a number of bright comets were very similar: his own calculation of the orbit of a comet observed in 1682 coincided with the data recorded by Johannes Kepler in 1607 and by Apianus in 1531. He concluded that these various comet observations were attributable to one and the same comet. Halley was proved right when, in December 1758, the comet whose return he had predicted, thenceforth named after him, did indeed make a repeat appearance. This confirmed his theory that apparently parabolic comet orbits were in fact "simply" sections of one enormous elliptical orbit. Since then, observations recorded in China in 240 BC have been identified as relating to a sighting of Halley's comet, the oldest known document dealing with this phenomenon. What was described in the Bible as a sign from God
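Both inferences above can be checked with back-of-the-envelope arithmetic. The sketch below uses two round-number inputs that are assumptions, not taken from the text: a diurnal parallax of about 57 arcminutes for the Moon, and a period near 76 years for Halley's comet. It bounds a body's distance from its diurnal parallax (Tycho's argument) and recovers the scale of Halley's ellipse from Kepler's third law.

```python
import math

EARTH_RADIUS_KM = 6371.0  # the baseline of a diurnal parallax is roughly one Earth radius


def distance_from_diurnal_parallax(parallax_arcmin: float) -> float:
    """Distance (km) of a body whose apparent position shifts by the given diurnal parallax."""
    return EARTH_RADIUS_KM / math.tan(math.radians(parallax_arcmin / 60.0))


def semi_major_axis_au(period_years: float) -> float:
    """Kepler's third law for solar orbits: T^2 = a^3, with T in years and a in AU."""
    return period_years ** (2.0 / 3.0)


# The Moon's diurnal parallax is about 57 arcminutes, which gives its familiar distance;
# a comet showing a measurably smaller parallax must therefore lie beyond the lunar orbit.
moon_km = distance_from_diurnal_parallax(57.0)  # ~384,000 km

# A period near 76 years implies a semi-major axis of roughly 18 AU, well past Saturn.
halley_au = semi_major_axis_au(76.0)
```

Tycho did not need a precise value for the comet's parallax: observing that it was smaller than the Moon's was already enough for the qualitative conclusion.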
NASA Astrophysics Data System (ADS)
László, M.
2009-04-01
%. Of mixtures 1 and 2, mixture 2 (80% new red latosol, 10% burnt rice straw, 10% farmyard manure) was the better. Of the 15 factors examined, in 11 cases mixture 2 proved better than mixture 1 (70% new red latosol, 20% burnt rice straw, 10% farmyard manure). For tubers of 0-20 mm, mixture 2 generally increased tuber numbers by 77% relative to the standard mixture. Effects of fertilization:
1. Leaf area per plant was significantly highest at a dose of 3.6 g of complex fertilizer per pot (3103 cm2 per plant).
2. The fresh weight of leaves and of stems per plant followed trends similar to those of leaf area.
3. The fresh weight of roots per plant decreased up to 7.2 g per pot and then increased.
4. Total fresh tuber weight per plant: increasing doses strongly reduced tuber production, by 160% between 0 and 18.0 g per pot, in both mixtures.
5. Fresh phytomass weight per plant was highest at 3.6 g per pot (239 g per plant, averaged over the two mixtures), after which the values declined.
6. Fresh biomass production per plant reached its maximum (188 g per plant) at 3.6 g per pot; beyond this point production fell sharply.
7. The dry-matter weight of leaves, stems and roots per plant differed significantly from the other treatments only for the standard mixture.
8. The total dry-matter weight of tubers per plant decreased significantly (0 versus 18.0 g per pot = 360%), averaged over the two mixtures.
9. Dry-matter biomass production per plant decreased significantly under high doses of complex fertilizer (0 versus 18.0 g per pot = 158%), averaged over the two mixtures.
10. The fresh weight of 0-20 mm tubers: doses increasing from 0 to 18.0 g per pot reduced production by 213%, averaged over the two mixtures.
11. The fresh weight of tubers above 20 mm: the